
Gesture Interface at CES Taps Ultrasound

Gesture recognition technology on display at 2014 International CES may be the cure for those frustrated by sensitive touchscreen sensors. For the first time in the United States, Norwegian company Elliptic Labs will exhibit its Elliptic SDK, which uses ultrasound for touchless gesturing on Android and Windows operating systems.

“A lot of consumers are looking for new ways to interact with their devices, whether it's for how to answer the phone, browse pictures, or play games,” said Elliptic CEO Laila Danielsen.

The Elliptic SDK leverages low-power components that are already standard in many phones: Knowles or Wolfson MEMS microphones capable of operating above 20 kHz, ultrasonic transducers, and Murata ultrasonic speakers. Sound waves emitted by the speakers reflect off the user’s hand and return to the microphone, providing position data for moving objects on a display.

Time-of-flight measurement and distributed sensing allow users to interact with their devices above, below, and to the side of the screen. The software can recognize gestures within 180 degrees of the front of a display, at a range of 50 centimeters to a meter from it.
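The time-of-flight principle itself is simple: the distance to the hand is proportional to how long an ultrasonic pulse takes to bounce back. Below is a minimal sketch of that calculation, not Elliptic’s actual API, assuming a speed of sound of roughly 343 m/s at room temperature.

```kotlin
// Hypothetical sketch: estimating hand distance from the round-trip delay
// of an ultrasonic pulse. Assumes sound travels at about 343 m/s.
const val SPEED_OF_SOUND_M_PER_S = 343.0

/**
 * Estimated distance to the reflecting hand in meters, given the time
 * between emitting the pulse and detecting its echo.
 */
fun estimateDistanceMeters(roundTripSeconds: Double): Double =
    SPEED_OF_SOUND_M_PER_S * roundTripSeconds / 2.0  // halved: the pulse travels out and back

fun main() {
    // A hand held about 0.5 m from the device returns an echo after ~2.9 ms.
    val echoDelaySeconds = 0.0029
    println("Estimated distance: %.2f m".format(estimateDistanceMeters(echoDelaySeconds)))
}
```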

Developers or users can create their own gestures in ways that help them navigate quickly.

“The gesture could be a flick with your hand, [or] you can point your hand straight toward [the display] and do a little tap [in the air],” Danielsen said. “If I do a circle on right side of device, it could mean turn off sound. If I do a circle on top of the device, it could mean bring up my Facebook profile.”
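The article does not describe how the SDK exposes this to developers, but the idea of binding user-defined gestures to actions can be sketched as a simple lookup table. The names below (Gesture, GestureBindings) are invented for illustration and are not part of the Elliptic SDK.

```kotlin
// Hypothetical sketch only: binding user-defined gestures to actions,
// in the spirit of the examples Danielsen gives in the article.
enum class Gesture { FLICK, AIR_TAP, CIRCLE_RIGHT_OF_DEVICE, CIRCLE_ABOVE_DEVICE }

class GestureBindings {
    private val actions = mutableMapOf<Gesture, () -> Unit>()

    // Associate a gesture with an arbitrary action.
    fun bind(gesture: Gesture, action: () -> Unit) { actions[gesture] = action }

    // Run the action bound to a detected gesture, if any.
    fun onGestureDetected(gesture: Gesture) = actions[gesture]?.invoke()
}

fun main() {
    val bindings = GestureBindings()
    // Examples from the article: a circle beside the device turns off sound,
    // a circle above it brings up a profile.
    bindings.bind(Gesture.CIRCLE_RIGHT_OF_DEVICE) { println("Turning off sound") }
    bindings.bind(Gesture.CIRCLE_ABOVE_DEVICE) { println("Opening Facebook profile") }

    bindings.onGestureDetected(Gesture.CIRCLE_RIGHT_OF_DEVICE)  // prints "Turning off sound"
}
```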

“We're selling a software platform for various OEMs to build various applications,” Danielsen said. “Some OEMs tried to focus on building a few games, some integrated it so users can answer the phone and browse email.”

The software will first appear in handsets, but Danielsen said it will also be used in tablets, laptops, and smart TVs in 2014. In addition, Elliptic is talking to companies about use in wearables and cars.

Danielsen said the software is “scalable, adjustable and customizable,” so supporting other operating systems would “not be an obstacle.” “The key is making sure that we pick a few features and make sure they work really, really well,” she added.

This article was originally published on EE Times.

2 comments on “Gesture Interface at CES Taps Ultrasound”

  1. t.alex
    January 15, 2014

    I have long thought that for gesture recognition, some form of powerful image processing is used. With ultrasound, that seems to be a very low-cost and low-power solution. Can this SDK be deployed to any smart phone?

  2. prabhakar_deosthali
    January 16, 2014

The key feature of this gesture recognition, as I see it, is the ability to define one's own gestures and associate them with the functions one wants.

    This is a major advantage: it makes your device more personalized and, as a result, more secure as well.
