Robot Dog Hackathon

Hello everyone, Andrew here! I'm the Lead Data Scientist at Allan Webb.

My colleagues Dean Kinch and Richard Doyle and I had an amazing time at the Robot Dog Hackathon a couple of weeks ago, and I wanted to share my experience.

This year, the Robot Dog Hackathon’s main challenge was to use Spot, the Boston Dynamics robot dog, to detect and safely detonate explosive devices. The event was organised by the Defence AI Centre (DAIC) in collaboration with the Battlelab (who hosted us), representatives from 11 Explosive Ordnance Disposal and Search Regiment (EOD), MoD dog handlers, and DE&S.

The participants were five teams of seven to nine people, combining diverse talents – with teams including data scientists, analysts, software engineers, project managers, business analysts and more.

The first day was a whirlwind of activities. We began with a comprehensive briefing about the event’s aims and safety protocols, setting the stage for what was to come. Then came the exciting part – meeting our team members. The diversity in skills and backgrounds in each team promised an exciting collaboration, and it was great to meet new people and work on an innovative, challenging project with a like-minded group.

A well-deserved lunch break provided not only great food but also time for informal networking and discussion, and I believe that is one of the most useful parts of events like this: being able to meet other professionals in the industry and hear what they are doing.

We dedicated the afternoon to getting acquainted with our new ‘colleagues’: the robot dogs. We spent a few hours getting to grips with the Boston Dynamics SDK, learning about the capabilities and limitations of man’s mechanical best friend.

Programming Robot Dogs to Perform Tasks

Day two was where the real challenge began. Our focus was programming the dogs to autonomously perform tasks such as moving, navigating, and handling objects. A significant portion of the day was dedicated to data collection: gathering and labelling data directly from the robot dogs.
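For context, YOLO-family detectors (the kind we used, as described below) are trained on plain-text labels of the form ‘class x_center y_center width height’, with coordinates normalised to the image size. A minimal sketch of that conversion from a pixel-space bounding box (the function name and numbers here are illustrative, not from our actual pipeline):

```python
def to_yolo_label(class_id, x_min, y_min, x_max, y_max, img_w, img_h):
    """Convert a pixel-space bounding box to a YOLO label line:
    'class x_center y_center width height', all normalised to [0, 1]."""
    x_c = (x_min + x_max) / 2.0 / img_w
    y_c = (y_min + y_max) / 2.0 / img_h
    w = (x_max - x_min) / img_w
    h = (y_max - y_min) / img_h
    return f"{class_id} {x_c:.6f} {y_c:.6f} {w:.6f} {h:.6f}"

# A 100x50-pixel box in the top-left corner of a 640x480 frame:
print(to_yolo_label(0, 0, 0, 100, 50, 640, 480))
# → 0 0.078125 0.052083 0.156250 0.104167
```

One such line per object, in a text file per image, is all the labelling format amounts to – the time-consuming part is drawing the boxes.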

This data was crucial for our next big task: model experimentation. Our team worked on fine-tuning a machine learning object detection model, specifically YOLOv8. Essentially, this involves building a model to analyse visual data and accurately identify and locate various objects within that data, a process akin to teaching a computer to ‘see’ and understand images. Whilst our model showed promise, another team using the same base model achieved much better performance. They labelled more images and augmented their training set, ending up with ten times as many images as us, and this paid dividends: their model outperformed ours by some margin. They graciously shared their model with the other teams, underlining the collaborative nature of the event (thanks, Mattia!).
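To give a flavour of what that augmentation step involves (a generic illustration, not the winning team’s actual pipeline – the function names are my own): the simplest augmentation, a horizontal flip, doubles a labelled dataset, as long as each bounding-box label is mirrored to match its image.

```python
def hflip_image(pixels):
    """Mirror an image (given as rows of pixel values) left-to-right."""
    return [row[::-1] for row in pixels]

def hflip_label(x_c, y_c, w, h):
    """Mirror a normalised YOLO-style box to match the flipped image:
    only the x-centre changes; size and vertical position are unaffected."""
    return (1.0 - x_c, y_c, w, h)

tiny_image = [[1, 2, 3],
              [4, 5, 6]]
print(hflip_image(tiny_image))            # [[3, 2, 1], [6, 5, 4]]
print(hflip_label(0.2, 0.5, 0.1, 0.3))    # (0.8, 0.5, 0.1, 0.3)
```

Stacking a handful of such transforms (flips, small rotations, brightness shifts) is how a labelled set can grow by an order of magnitude without collecting any new footage.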

As any self-respecting data scientist would agree, while experimenting and developing a solution is fun, adopting a superior solution from others (when they are happy to share) is a no-brainer. We are a pragmatic people. 

The final day was a race against time. Integrating the different components was our main challenge, and it proved to be as exhilarating as it was demanding. For me, this part was particularly rewarding. Watching the individual pieces our team had worked on come together into a cohesive solution was a great experience. Then, seeing the robot dog in action, implementing this solution, was nothing short of amazing. 

The culmination of our efforts was the test – a tense but incredible moment where we watched the dogs navigate the course. It was a proud moment for all teams, seeing our collective work in action, showing remarkable capabilities developed in such a short span of time.

In summary, the event was a showcase of talent and innovation. While no team completed every aspect of the challenge, the collective achievements were impressive: every task in the challenge was achieved by at least one team. Team 2’s navigation demo and Team 4’s model for mortar detection were particularly noteworthy. The event proved that by combining the best elements from each team, a comprehensive solution to the challenge is within reach. This hackathon was more than a technological showcase: it was a testament to what can be achieved with collaboration, creativity, and technology.

As I reflect on the event, I am struck by the potential applications of these technologies in keeping people safe, preservation of life being the EOD’s number one priority. The AI detection model could be extended to different types of ordnance, and even used to classify or categorise different items to give more information to EOD units. The hackathon was not just about technology; it was about people coming together to solve real-world problems. I’m excited to see what the future holds in this space and can’t wait to hopefully be part of it again.

For us here at Allan Webb, AI detection and automation hold tremendous potential for our work in supportability services – and I would love to hear how those from industries like ours envision the future of this technology.

Stay tuned for more from us. If anything I have shared today has sparked your curiosity, or if you’re keen on discussing the innovative potential of data and AI technologies, reach out! Connect with me on LinkedIn or by email. Thanks for reading!

#AI #Robotics #Innovation #DataScience #DAIC #Allanwebb