13 June 2024

AI-powered app helps blind man run 5 km solo

By: Gurbir Singh

A blind man successfully completed a solo 5 km run in New York’s Central Park last week without help from a human guide or a guide dog.

The only navigation aid 50-year-old Thomas Panek had was an artificial intelligence (AI)-powered app, linked to his headphones through a smartphone attached to his harness.

Thomas Panek, who is President & CEO of Guiding Eyes for the Blind, began losing his vision to a genetic condition at the age of 8 and was legally blind by his early 20s.

“To be able to be here, it’s real emotional,” Panek remarked after his successful run in an event sponsored by Google and the New York Road Runners Club on Thursday, November 19.

“It’s a real feeling of not only freedom and independence, but also you get that sense that you’re just like anybody else.”

Thomas Panek, CEO, Guiding Eyes for the Blind (Photo courtesy: Google Research)

Still in the prototype phase, the AI-powered app, Project Guideline, uses a phone’s camera to detect and track lines on the ground and then guides users with audio cues via headphones.

If the runner strays too far from the centre, the sound gets louder on whichever side they’re straying.

Although a marathon enthusiast and a believer that “humans are born to run,” Panek had been forced to give up running on his own as his sight failed.

Initially, he ran with human guides tethered in front and even “qualified for the New York City and Boston Marathons five years in a row”, but found them slow.

“But as grateful as I was to my human guides, I wanted more independence,” he wrote in a Google blog post.

“The safest thing for a blind man is to sit still. I ain’t sitting still,” he said.

At a Google hackathon in 2019, Panek asked computer vision designers if they could “help a blind runner navigate”.

He didn’t expect much, but by the end of the day they had built a demo that allowed a phone to recognize a line taped to the ground and give audio cues.

AI tracks marker on ground & sends audio signals (Photo courtesy: Google Research)

Eventually a more sophisticated prototype was produced. The phone camera, harnessed to Panek’s chest, uses AI to look for a marker on the ground and sends audio signals to the wearer depending on their position relative to it.

“If I drifted to the left of the line, the sound would get louder and more dissonant in my left ear,” he said. “If I drifted to the right, the same thing would happen, but in my right ear.”
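The drift-to-audio mapping Panek describes can be sketched in a few lines of code. This is a minimal illustration only: the function name, the linear gain scaling, and the clamping range are assumptions for the sketch, not Project Guideline’s actual implementation.

```python
def guidance_cues(offset, max_offset=1.0):
    """Map the runner's lateral offset from the guideline to stereo cue
    intensities: drifting left raises the cue in the LEFT ear, drifting
    right raises it in the RIGHT ear, as described in the article.

    offset: distance from the line; negative = left of centre,
            positive = right of centre.
    Returns (left_gain, right_gain), each in [0.0, 1.0].
    """
    # Clamp the drift to the range the audio cue can represent.
    drift = max(-max_offset, min(max_offset, offset))
    if drift < 0:   # drifted left -> cue grows in the left ear
        return (abs(drift) / max_offset, 0.0)
    else:           # drifted right -> cue grows in the right ear
        return (0.0, drift / max_offset)

# Centred on the line: silence in both ears.
print(guidance_cues(0.0))   # (0.0, 0.0)
# Halfway off to the left: half-strength cue in the left ear.
print(guidance_cues(-0.5))  # (0.5, 0.0)
```

In practice the app would run this mapping on every camera frame, so the cue fades in and out smoothly as the runner corrects course.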

After a few months, and some adjustments, Panek was able to run laps on an indoor track without assistance from a human or a guide dog. “It was the first unguided mile I had run in decades,” he said.

Subsequently, the Google Research developers worked to adapt the computer-vision technology outdoors, where the obstacles are entirely different.

Panek felt ‘free’ for first time in his life after solo run (Photo courtesy: Reuters)

The system kept him on course with every stride. “For the first time in a lifetime, I didn’t feel like a blind man. I felt free, like I was effortlessly running through the clouds,” he wrote in his blog.

Project Guideline doesn’t need an internet connection to work and can account for weather conditions.

NewsViews – Proudly supports GlaucomaNZ

One comment:

  1. As rightly said in story, this app would make life live-able for visually-impaired, and they may not have to depend upon anyone to jog or to do exercises. Thanks, NewsViews for this story & is a great service to your readers (& community)
