
The Kinect-based system could enable game devs to turn tables or other flat surfaces into touch "controllers"

Pinch-to-zoom; the two-fingered swipe; the scissors gesture -- our fingers are marvelous at multi-touch gestures on tablets and other devices.  But what if you could take that concept and add multi-touch gestures to any surface -- no capacitive touch screen necessary?

Purdue University Electrical and Computer Engineering Professor Niklas Elmqvist demonstrated the patent-pending system, which uses Microsoft Corp.'s (MSFT) Kinect depth-sensing camera and can detect hand posture with 98 percent accuracy.

Using the Kinect sensor, which uses imaging technologies to determine the position of objects in 3D space, the new software transforms everyday surfaces into virtual multi-touch screens.  Comments co-author and Purdue Mechanical Engineering Professor Karthik Ramani, "We project a computer screen on any surface, just a normal table covered with white paper.  The camera sees where your hands are, which fingers you are pressing on the surface, tracks hand gestures and recognizes whether there is more than one person working at the same time."
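The paper's actual pipeline isn't published here, but the core idea -- comparing a live Kinect depth frame against a calibrated "empty table" depth map to find hands above the surface -- can be sketched roughly as follows. Function names, thresholds, and the averaging-based calibration are illustrative assumptions, not the researchers' implementation.

```python
import numpy as np

def calibrate_surface(depth_frames):
    """Estimate the per-pixel depth of the bare surface (in mm)
    by averaging a few frames captured with no hands in view."""
    return np.mean(np.stack(depth_frames), axis=0)

def segment_hands(depth, surface, min_height_mm=5.0, max_height_mm=150.0):
    """Return a boolean mask of pixels likely belonging to hands.

    Kinect depth values grow with distance from the camera, so a point
    above the table reads a *smaller* depth than the table itself.
    """
    height = surface - depth  # mm above the table plane
    return (height > min_height_mm) & (height < max_height_mm)
```

With the hand pixels isolated this way, later stages (finger tracking, left/right classification) can operate on a much smaller region of each frame.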

Using a proprietary computer model of the human hand, the researchers can use the pixel processing power of the Kinect sensor to pick out finger location, where the hand is in relation to the surface, and left vs. right hand.  Comments Professor Elmqvist, "We can isolate different parts of a hand or finger to show how far they are from the surface.  We can see which fingers are touching the surface. With this technology, you could potentially call up a menu by positioning your hand just above the surface."
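The touch-versus-hover distinction Elmqvist describes -- pressing the surface versus holding a hand just above it to call up a menu -- amounts to binning a fingertip's height above the surface. A minimal sketch, with thresholds and names invented purely for illustration:

```python
def classify_fingertip(height_mm, touch_thresh=10.0, hover_thresh=60.0):
    """Map a fingertip's estimated height above the surface (mm)
    to an interaction state. Thresholds are illustrative guesses."""
    if height_mm <= touch_thresh:
        return "touch"   # finger pressing the surface
    if height_mm <= hover_thresh:
        return "hover"   # e.g., trigger an above-surface menu
    return "none"        # too far away to count as interacting
```

In practice a real system would smooth the depth estimates over time before classifying, since raw Kinect depth readings are noisy near object edges.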

Purdue researchers have developed virtual multi-touch, much like that depicted in the movie Minority Report.

The professors are currently working to refine the accuracy of their method, which was presented in a research paper at the Association for Computing Machinery Symposium on User Interface Software and Technology (ACM UIST 2012) in Cambridge, Mass. 

The technology could be used to add inexpensive multi-touch to a variety of traditional electronic devices including TVs, computers, gaming consoles, and appliances.  Comments Professor Elmqvist, "Imagine having giant iPads everywhere, on any wall in your house or office, every kitchen counter, without using expensive technology.  You can use any surface, even a dumb physical surface like wood. You don't need to install expensive LED displays and touch-sensitive screens."
The researchers use a proprietary multi-touch hand model. [Image Source: Purdue]

It certainly is an exciting possibility to think about, and Microsoft is surely pleased that this is all made possible with its proprietary, patent-protected Kinect sensor.

Sources: ACM [Paper], Purdue University [Press Release]



Comments



RE: Touch vs Mouse
By ET on 10/11/2012 2:58:00 AM , Rating: 2
It would be interesting to see a mouse vs. finger speed test. At least when playing Plants vs. Zombies (which I have on my phone and on my PC) the touch interface feels faster and more natural.

I think that the issue is with certain types of interaction, which is why lolmuly talked about dragging and not just selecting.

But I think that the problem with touch on large surfaces is mainly fatigue.


RE: Touch vs Mouse
By StevoLincolnite on 10/12/2012 3:14:51 AM , Rating: 2
No. Not a silly casual game like Plants vs Zombies.

Try testing it with a game like StarCraft 2 where you can actually measure APM (Actions Per Minute).

