A research team at the University of Cambridge has used machine learning to teach a robotic sensor to read lines of braille text quickly. The robot reached a reading speed of 315 words per minute at 87% accuracy, pointing to applications well beyond its original purpose.

While not designed as assistive technology, the robot's sensitivity in reading braille makes it an ideal test bed for developing robot hands or prosthetics with human-like fingertip sensitivity. Human fingertips are remarkably sensitive and provide crucial information about the world around us; replicating that sensitivity in a robotic hand is a significant engineering challenge.
Professor Fumiya Iida's lab in Cambridge's Department of Engineering works on exactly this problem: reproducing human-like fingertip sensitivity in robotic hands, and doing so efficiently. Human fingertips are soft, but packing a soft fingertip with enough sensors to gather rich information, particularly about flexible or deformable surfaces, is difficult.

The researchers built their braille reader around an off-the-shelf sensor with a camera in its 'fingertip.' Unlike existing robotic braille readers, which work statically, touching one letter pattern at a time, the new approach reads dynamically, sliding the sensor along the line of text.
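To make the static-versus-dynamic distinction concrete, here is a minimal sketch in Python with NumPy. Everything in it is an assumption for illustration, not the Cambridge team's code: the frame size, the box-blur model of motion blur, and the `capture_frames` and `classify_frame` names are all hypothetical stand-ins.

```python
import numpy as np

FRAME_H, FRAME_W = 32, 32  # size of one tactile camera frame (assumption)

def capture_frames(num_frames: int, speed: float) -> list:
    """Simulate the stream of frames a camera-based fingertip sees while
    sliding along a braille line. Faster sliding means more motion blur,
    modelled here as a simple horizontal box blur."""
    rng = np.random.default_rng(0)
    kernel_width = max(1, int(speed))      # blur grows with sliding speed
    kernel = np.ones(kernel_width) / kernel_width
    frames = []
    for _ in range(num_frames):
        sharp = rng.random((FRAME_H, FRAME_W))  # stand-in for a dot pattern
        blurred = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), 1, sharp)
        frames.append(blurred)
    return frames

def classify_frame(frame) -> str:
    """Placeholder for a learned braille-character classifier; a trained
    model (sketched at the end of this piece) would replace this."""
    return "?"

# A static reader stops on each character before classifying it; the
# dynamic approach keeps the sensor moving and classifies the stream.
text = "".join(classify_frame(f) for f in capture_frames(10, speed=4.0))
```

The design point the sketch tries to capture is that sliding trades clean, stationary images for speed: the faster the sensor moves, the blurrier each frame, which is where machine learning comes in.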
The team used machine learning to handle challenges such as motion blur, enabling the robotic reader to reach 315 words per minute at 87% accuracy (see the illustrative classifier sketch below).

Replicating human braille reading speeds opens the door to broader applications in tactile sensing. The researchers envision scaling the technology to the size of a humanoid hand or skin, promising advances in robotics and prosthetics. The breakthrough could change how robots interact with their surroundings, making them more adaptable and efficient at handling delicate tasks.
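As a closing illustration of the motion-blur point above, here is a hedged sketch of what a frame classifier might look like. The architecture, frame size, and 26-letter output below are assumptions chosen for the example, not the published Cambridge model.

```python
import torch
import torch.nn as nn

# Illustrative stand-in: a small CNN mapping a blurred 32x32 tactile
# frame to one of 26 braille letters. Training on frames blurred by
# fast sliding is what lets the reader keep moving at speed without
# sacrificing much accuracy.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),                              # 32x32 -> 16x16
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),                              # 16x16 -> 8x8
    nn.Flatten(),
    nn.Linear(32 * 8 * 8, 26),                    # logits over a-z
)

blurred_frame = torch.rand(1, 1, 32, 32)          # one simulated frame
logits = model(blurred_frame)
predicted_letter = chr(ord("a") + int(logits.argmax()))
print(predicted_letter)
```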