Engineering Student Invents Robotic Arm Controlled By Facial Expressions.

The Business Insider (5/16) reports that an engineering student at the University of Toronto has developed a robotic arm that responds to signals from a person's brain. The user wears an "emotive headset" that reads facial expressions in order to move the robotic arm. The inventor hopes that the technology, which appeared at the University of Toronto's Design Fair, can help advance mind-controlled prosthetic limbs.
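The article does not describe how the expression-to-motion mapping works, but the basic idea of translating detected facial expressions into arm commands can be illustrated with a minimal sketch. The code below is purely hypothetical: the expression names, the RoboticArm class, and the event stream are illustrative placeholders, not the student's actual implementation or any real headset SDK.

```python
# Hypothetical sketch: dispatching facial-expression events detected by a
# headset to robotic-arm commands. All names here are illustrative only.

class RoboticArm:
    """Stand-in for a robotic arm controller (e.g., one driving hobby servos)."""

    def move_joint(self, joint: str, degrees: float) -> None:
        # A real controller would send a command over serial/PWM here.
        print(f"Moving {joint} by {degrees:+.1f} degrees")

    def toggle_gripper(self) -> None:
        print("Toggling gripper")


# Map each detected facial expression to an arm action.
EXPRESSION_ACTIONS = {
    "raise_brow": lambda arm: arm.move_joint("shoulder", +10.0),
    "furrow_brow": lambda arm: arm.move_joint("shoulder", -10.0),
    "smile": lambda arm: arm.move_joint("elbow", +10.0),
    "clench_jaw": lambda arm: arm.toggle_gripper(),
}


def handle_expression_event(arm: RoboticArm, expression: str) -> None:
    """Dispatch a single detected expression to its corresponding arm command."""
    action = EXPRESSION_ACTIONS.get(expression)
    if action is not None:
        action(arm)


if __name__ == "__main__":
    arm = RoboticArm()
    # Simulated stream of expression detections standing in for headset output.
    for event in ["raise_brow", "smile", "clench_jaw", "furrow_brow"]:
        handle_expression_event(arm, event)
```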
