A class project has evolved into a startup for Human-Computer Interaction researcher and computer science Ph.D. candidate Mike Karlesky.
“It's much easier to understand when you try it than to describe it in words,” Karlesky explains. “We say it's ‘eyes-free, ears-free, hands-free navigation that you feel.’ Basically, we're hacking the brain-body connection, using vibration prompts to extend our natural sense of direction.”
The device is a flexible band worn around the neck. It senses where the wearer is looking and, together with an accompanying smartphone app, transforms users into human compasses. Vibration prompts can orient the wearer to north, for instance, or provide full turn-by-turn directions to a destination. Karlesky has tested the early prototype with cyclists, runners, sailors, the blind, and gamers. He believes a variation could be of interest to fire/rescue personnel as well.
According to Karlesky, “Anytime a screen or audio prompts are impractical or unsafe for navigation, our approach is valuable. We think our first customers will be cyclists and runners who are telling us they want variations on their normal routes that they don't have to think about.”
The project placed third in NYU's Inno/Vention competition and was showcased at Poly's recent Research Expo. Karlesky and one of his original class project teammates, Ph.D. candidate Xiaochang Li, were recently selected for NYU's Summer Launchpad incubator program, where they will spend the summer developing the concept into a business. Karlesky says he is looking forward to working on the business full time “instead of soldering until 2 o'clock in the morning at the Game Innovation Lab.”