Arama They Didn't

10:34 am - 04/10/2012

Tongue-controlled Kinect interface under development in Japan



A research group at The University of Electro-Communications is developing an interface to detect tongue movement using the Kinect.

This interface is intended mainly for people with oral motor function disorders that affect their ability to speak or swallow, to help them train the oral muscles, including the tongue. The research group proposes it as a hygienic detection method, as it doesn't require attaching a device to the tongue.

"First of all, the face and eyes are detected using Kinect. When both eyes are recognized, the system can estimate the position of the nose, and based on the position of the nose, movement values are obtained. The minimum movement value indicates the tip of the nose. Based on that position, the system determines the mouth area. Based on the mouth area, movement values are obtained, like before. So finally, the position of the tongue is obtained."

"One way of training the tongue is to move it left and right. We've created a shooting game based on that motion."

"When you stick your tongue out, a bullet is shot from the player in the middle. The bullet's trajectory depends on which way the tongue is pointing. That's how you aim at the targets in this game."

"The first problem is that detection isn't very precise, the system isn't very robust. So we're considering how to improve the precision. Also, we'd like to develop an interface that uses the motion of the lips as well as the tongue, so training can include a combination of mouth motions."

Just "kissing" the posters wasn't enough.

Source: DigInfo TV
