Two computer science students from the University of Pennsylvania, Eric Berdinis and Jeff Kiske, have hacked together an impressive tactile feedback system for the visually impaired using a Microsoft Kinect and a set of vibration actuators. The Kinecthesia is a belt-worn camera system that detects the location and distance of objects in front of the wearer using depth information from the Kinect sensor. This information is processed on a BeagleBoard, an open-source single-board computer, and used to drive six vibration motors positioned to the left, center, and right of the user. The video below shows a demo of the system in use and gives a quick explanation of its operation.
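The students haven't published their BeagleBoard code, but the core idea described above is simple to sketch: split each depth frame into left, center, and right regions, find the nearest obstacle in each, and vibrate the corresponding motors harder the closer that obstacle is. The function below is a hypothetical illustration of that mapping, assuming a NumPy array of depth readings in millimeters (zero meaning "no reading", as the Kinect reports for out-of-range pixels) and made-up near/far thresholds:

```python
import numpy as np

def motor_intensities(depth_frame, near=500.0, far=3000.0):
    """Map a depth frame (mm) to three vibration intensities in [0, 1].

    Splits the frame into left/center/right thirds, takes the nearest
    valid reading in each third, and scales it so that an obstacle at
    `near` mm gives intensity 1.0 and one at `far` mm or beyond gives 0.0.
    Zero-valued pixels (no depth reading) are ignored.
    """
    thirds = np.array_split(depth_frame, 3, axis=1)
    intensities = []
    for region in thirds:
        valid = region[region > 0]
        if valid.size == 0:
            intensities.append(0.0)  # nothing detected in this zone
            continue
        d = float(valid.min())
        d = min(max(d, near), far)  # clamp to the usable range
        intensities.append((far - d) / (far - near))
    return intensities

if __name__ == "__main__":
    # Simulated frame: a near obstacle (500 mm) on the wearer's left,
    # open space (3000 mm) ahead and to the right.
    frame = np.full((4, 6), 3000.0)
    frame[:, :2] = 500.0
    print(motor_intensities(frame))  # left motor at full strength
```

In the real belt, each of the three intensities would presumably be converted to a PWM duty cycle for a pair of motors; the thresholds and the per-zone minimum are assumptions for the sake of illustration, not details from the project itself.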
*This blog post was originally published at Medgadget*