Google Research made Project Guideline open source, and it’s a big deal
In 2019, Thomas Panek, president and CEO of Guiding Eyes for the Blind, asked a question at Google’s annual Hackathon: “Can we help a blind runner navigate? Can we make navigation possible for a blind person?”
“That’s the first time I ran alone in decades”.
Fast forward to Tuesday the 21st of November, 2023. The Google Research team has made Project Guideline open source, which you can access today on GitHub and include in your own projects.
What is “Project Guideline”?
Project Guideline is a research project that leverages on-device machine learning (ML) to enable people who are blind or low-vision to walk or run for exercise independently. The project was developed by researchers at Google AI and was first announced in 2021.
Project Guideline uses a combination of computer vision and machine learning to detect and track obstacles in the user’s path. The app then uses this information to provide real-time audio cues to help the user avoid obstacles and stay on track.
Here are some of the key features of Project Guideline:
On-device ML:
Project Guideline uses on-device ML. For the less tech-savvy, “on-device machine learning” means the trained models run directly on the phone itself rather than on a remote server, and that’s a pretty big deal. In this case, it ensures that the app is always available and responsive, even when there is no internet connection.
Real-time audio cues:
Project Guideline provides real-time audio cues to the user via bone-conduction headphones, letting a runner know if there’s a crowd of people or an obstacle in the way and ensuring they stay on track.
Customisable settings:
Project Guideline allows users to customise the app’s settings to fit their individual needs, such as their running pace or personalised running plans. Project Guideline can also integrate with fitness trackers or GPS devices.
How can Project Guideline help?
Using a device attached to the runner’s belt, the application uses the phone’s camera to evaluate the track ahead, specifically looking at the guideline painted on the path to make sure the runner is following it safely, without diverging. Feedback is given to the user in both haptic (physical, such as vibration) and audio form to mark an event such as pace, an obstacle or distance to the finish.
Haptic feedback is particularly useful for users with hearing difficulties; it can, for example, provide increasing vibration intensity to alert the runner to potential dangers or collisions.
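The line-following feedback described above can be illustrated with a small sketch. This is purely illustrative Python, not the actual Project Guideline implementation; the function name and the normalised-offset convention are assumptions for the example.

```python
# Illustrative sketch: map the detected guideline's horizontal offset from
# the centre of the camera frame to a stereo audio pan and a haptic intensity.

def feedback_for_offset(offset: float, max_offset: float = 1.0):
    """offset: signed distance of the guideline from the frame centre,
    normalised to [-1.0, 1.0] (negative = line is to the runner's left)."""
    # Clamp to the valid range so a noisy detection can't overdrive the cues.
    offset = max(-max_offset, min(max_offset, offset))
    # Pan the audio cue toward the side the runner should move to:
    # if the line drifts left, the cue plays in the left ear.
    pan = offset  # -1.0 = full left, +1.0 = full right
    # Haptic intensity grows with how far the runner has strayed.
    intensity = abs(offset) / max_offset
    return pan, intensity

# A runner drifting half a frame-width right of the line would hear a
# half-left-panned cue at medium vibration intensity.
pan, intensity = feedback_for_offset(-0.5)
```

In the real system these cues would be recomputed continuously from each camera frame, but the core idea is this simple mapping from “how far off the line am I?” to “how strongly, and in which ear, do I warn the runner?”.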
Does it work?
It was Thomas Panek who asked, at Google’s 2019 Hackathon, whether it was possible to help a blind person navigate. Panek talks about losing his sight at a young age in this moving Google Accessibility blog post, and about the unexpected impact of his question.
In the video of Thomas running for the first time without his guide dog, aided only by this liberating technology, you can feel the overwhelming emotion after a successful mile as he expresses “the first time I ran alone in decades”.
The success of this first trial has led to exciting participation in the ASICS World Ekiden 2022, where Project Guideline helped visually impaired runners take part in a virtual relay that connects teams of six via digital sashes, without any escort runners. Phenomenally, all six runners ran their segments with the help of Project Guideline, completing the 42.195 km in 4 hours 29 minutes 44 seconds and competing equally against teams of able-bodied athletes from all over the world.
Innovation and Digital Accessibility
This kind of technology is incredible; the collaboration between Thomas Panek and the Google Research team has produced a genuinely feasible product to help millions of people engage in exercise and navigate busy, complex environments.
Our sister company Midlands Online recently worked with Travel South Yorkshire to bring their Live Tracking feature and Journey Planner up to WCAG AA standards.
With this project embedded within a TSY app, the possibility of Guideline allowing a visually impaired user to navigate the cityscape to their bus, tram or train stop would be so liberating.
Fairness believes in creating a digitally inclusive world that enables equality in the physical one. We’re currently looking at ways to use AI to evaluate and report on your website’s accessibility, and, more importantly, we want this tool to be free for everyone.
Services offered by Fairness
While we plan and evaluate our expert consultancy services, which will include:
- Accessibility consultation
- Future WCAG 2.2 preparation for Public Sector clients
- On-site consultancy providing manual testing and automated testing in your pipeline
our existing services provide guidance on what’s going wrong for your service and its users of assistive technology, and on how to fix and prevent those issues.
Find
Fairness conduct thorough accessibility audits to identify potential issues within your digital platforms. Our auditing process uses a range of devices and evaluation methods, the output of which is a detailed report on your services and applications, pinpointing accessibility barriers up to WCAG 2.1 AAA.
Fix
After identifying accessibility issues, our team implements corrective measures to enhance the usability of your website. We focus on making your platforms compliant with accessibility standards, ensuring a fair and inclusive experience for all users.
Prevent
Prevention is key. Fairness provides comprehensive, customised accessibility training programmes for your team. Our training sessions empower your staff to understand and implement accessibility best practices in your Content Management System, reducing the likelihood of recurring issues.
Follow us for more exciting news and to see what’s next for Fairness.
Thanks for reading this article by Terry Purser-Barriff, Director & Founder of Fairness