What an incredible event. On behalf of the entire winning team, ‘Table 1,’ I want to start by extending our sincere thanks to the Open Data Institute organisers for arranging the Volunteering Hackathon 2025! The atmosphere was lively and fun, and it was genuinely inspiring to see so many dedicated people tackling the challenges of open data infrastructure and improving volunteer systems.
This victory, and all the outcomes from the day, are a testament to the fact that when passion meets technology, real community impact is indeed inevitable.
I am immensely proud of our team: Chris Martin, Dr. Amy Burnett, Andrew Mene-Otubu, Matt Parker, Murphy Campbell, Aaron Amato, and Nyaha Duri. To my teammates – thank you. Your collective brilliance, from deep domain expertise to sharp technical skills, made our prototype a reality in such a short time.
The Challenge: Making Volunteering Human-Centric
We focused on user-centric discovery and experience, and specifically tackled the question: “Can finding a volunteer role be as easy as asking a friend?”
We aimed to create a conversational, generative search agent that leverages standardised open data to move beyond simple keyword filtering and make volunteering opportunities effortlessly accessible.
The truth is, traditional search forces people to translate their human needs into rigid database queries. We saw an opportunity to build a system that understands the nuance of intent. Our solution, Project Alpha, is a natural language interface for volunteer opportunities that delivers a “Humanistic Search for Humans”.
Building the Brain: A Training Context for the LLM
Our core innovation was not just using a Large Language Model (LLM) – we linked the open dataset to the LLM via the Model Context Protocol (MCP) – but how we trained it. We knew that to achieve truly empathetic and accurate matching, our model couldn’t be a generic bot; it needed a soul.
This is where the groundbreaking research of our own Dr. Amy Burnett (and Catherine Wilder) was instrumental. We adapted their framework, developed under a British Academy Policy Innovation Fellowship, to create a deep training context for our AI agent. This framework moves beyond simple transactional data like “skills” and “location” to probe the user’s purpose and story.
The LLM’s training wasn’t just about indexing 10,000+ opportunities; it was about internalising human motivation. The key questions we built into the agent’s context are listed below, with a minimal sketch after the list showing how they shape the agent’s prompt:
- Who are you?
- Where have you come from?
- What is your purpose and story?
- What value do you hope to create?
- What prior knowledge and experience can we call upon?
- What does transformation look like and feel like to you?
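For the technically curious, here is a rough sketch of how questions like these can be folded into the agent’s working context. The prompt wording and function names are illustrative, not our production code:

```python
# Illustrative sketch: seeding an agent's system prompt with motivation-centred
# questions. Names, wording, and structure are hypothetical.

GUIDING_QUESTIONS = [
    "Who are you?",
    "Where have you come from?",
    "What is your purpose and story?",
    "What value do you hope to create?",
    "What prior knowledge and experience can we call upon?",
    "What does transformation look like and feel like to you?",
]

def build_agent_context(sector_notes: str) -> str:
    """Compose a system prompt that keeps the agent probing for motivation,
    not just skills and postcodes."""
    question_block = "\n".join(f"- {q}" for q in GUIDING_QUESTIONS)
    return (
        "You help people find volunteer roles. Hold a natural conversation; "
        "never present a form.\n"
        "Gently work towards answers to these questions, one at a time:\n"
        f"{question_block}\n"
        "Infer whatever you can from context rather than asking directly.\n"
        f"Sector-specific guidance:\n{sector_notes}"
    )
```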
By forcing the model to engage with these profound human questions, we could elicit the necessary data, like passion & cause, logistics, and skills & mood, in a conversational, non-intrusive way. For example, when a user persona like Sarah says, “I have free time on Sunday afternoons, and I love being outdoors. I want to help somewhere that feels calm”, the system instantly understands her schedule, interest, and desired vibe.
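To make the Sarah example concrete, the sketch below shows the kind of structured, ephemeral profile the agent distils from a single utterance. The schema and the stubbed extraction are assumptions for illustration; in the live system the LLM performs this mapping:

```python
from dataclasses import dataclass, field

@dataclass
class EphemeralProfile:
    """Short-lived profile distilled from the conversation (illustrative schema)."""
    passion_and_cause: list[str] = field(default_factory=list)  # causes and interests
    logistics: dict[str, str] = field(default_factory=dict)     # availability, location, travel
    skills_and_mood: list[str] = field(default_factory=list)    # capabilities and desired "vibe"

def extract_profile(utterance: str) -> EphemeralProfile:
    """The LLM performs this extraction in the real system; this stub simply
    shows the expected output for Sarah's message."""
    # "I have free time on Sunday afternoons, and I love being outdoors.
    #  I want to help somewhere that feels calm."
    return EphemeralProfile(
        passion_and_cause=["outdoors", "nature"],
        logistics={"availability": "Sunday afternoons"},
        skills_and_mood=["calm environment"],
    )
```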
We also ensured the model was fine-tuned on the specific taxonomies, emotional sensitivities, and operational realities of 23 distinct impact sectors, so it can understand that “working with hands” has a very different meaning in ‘Art & Culture’ versus ‘Disaster Relief’.


Universal Access: Multi-Channel Distribution with Cisco Webex Connect
A fantastic conversational agent is useless if no one can access it. Our architecture, which includes a User Chat that extracts data into an ‘Ephemeral Profile’ and posts it to our MCP Server, was designed for scalability and integration.
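As a rough illustration of that hand-off, the snippet below posts an ephemeral profile to the MCP server and collects candidate opportunities. The endpoint URL, payload shape, and response fields are placeholders, not the actual service contract:

```python
import requests  # any HTTP client would do

# Placeholder URL: in our architecture the chat front end hands the ephemeral
# profile to the MCP server, which searches the standardised open dataset.
MCP_SERVER_URL = "https://example.org/mcp/volunteer-search"

def post_ephemeral_profile(profile: dict) -> list[dict]:
    """Send the distilled profile to the MCP server and return candidate roles."""
    response = requests.post(MCP_SERVER_URL, json=profile, timeout=10)
    response.raise_for_status()
    return response.json().get("opportunities", [])

# Example: Sarah's profile as extracted by the agent
sarah = {
    "passion_and_cause": ["outdoors", "nature"],
    "logistics": {"availability": "Sunday afternoons"},
    "skills_and_mood": ["calm environment"],
}
# opportunities = post_ephemeral_profile(sarah)
```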
To ensure Universal Access, we designed the system to use Webex Connect. By connecting our MCP server to Webex Connect, we successfully bridged the gap between our intelligent agent and the platforms people use every day.
This seamless omnichannel integration means our natural language engine can power interactions across vital communication channels, making the search for volunteer opportunities truly accessible to everyone, regardless of their preferred platform:
- WhatsApp for global reach
- Facebook Messenger for social connectivity
- SMS for universal accessibility
- Apple Messages for iOS integration
This architecture ensures that finding a role is not confined to a single app or website, but is available wherever the user is, supporting a seamless, cross-platform volunteer journey.
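We won’t reproduce the Webex Connect configuration here, but conceptually the integration reduces to normalising every inbound message, whatever the channel, into one shape before it reaches the agent. The sketch below is an assumption-laden illustration of that normalisation, not Webex Connect’s actual API:

```python
from dataclasses import dataclass

@dataclass
class InboundMessage:
    """Channel-agnostic message shape handed to the agent (illustrative)."""
    channel: str   # e.g. "whatsapp", "messenger", "sms", "apple_messages"
    user_id: str   # channel-specific sender identifier
    text: str      # the user's natural-language request

def normalise_inbound(raw: dict) -> InboundMessage:
    """Map a webhook payload from the omnichannel layer onto the agent's input.
    The payload field names here are assumptions."""
    return InboundMessage(
        channel=raw.get("channel", "unknown"),
        user_id=raw.get("sender", ""),
        text=raw.get("message", ""),
    )
```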
The Next Chapter
This Hackathon was a massive success for open data in volunteering. Our team has demonstrated that we can move beyond filtering by postcodes and categories and connect individuals to purpose through conversation.
We’re incredibly excited about the downstream opportunities Project Alpha opens up, from generating user data, to optimising the AI’s performance, to connecting outcome metrics to personal profiles.
This is the start of these amazing technologies becoming part of TeamKinetic.
A final, heartfelt thank you again to the amazing people on Table 1. It was an honour to build this with you.
Connect with TeamKinetic: