Ultrasound technology that enables mobiles and tablets to be controlled by gesture could go into production as early as next year.
The move towards gesture control has gathered pace and there are now many such products on the market.
What sets Elliptic’s gesture-control system apart from others is its range: it can accurately identify mid-air gestures up to a metre away from the phone.
Because it uses sound rather than sight, the sensor can recognise gestures from a 180-degree field. It also consumes less power and works in the dark.
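The sensing principle behind such a system can be sketched in a few lines. This is an illustrative example only, not Elliptic's actual implementation: an ultrasound transmitter emits a pulse, and the delay before the echo returns from a hand gives its distance. All names and values here are hypothetical.

```python
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20C

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to a reflecting hand, given the round-trip echo delay.

    The pulse travels out and back, so the one-way distance is half
    the total path covered in the measured time.
    """
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# A hand one metre away produces a round-trip delay of about 5.8 ms:
delay_s = 2.0 * 1.0 / SPEED_OF_SOUND_M_S
print(f"{echo_distance_m(delay_s):.2f} m")
```

Because sound waves spread in all directions rather than travelling in a narrow beam like infrared light, echoes return from anywhere in front of the device, which is what gives the sensor its wide 180-degree field.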
By contrast, Samsung’s Galaxy S4 uses an infrared sensor that can only interpret hand movements within a very small zone.
“The user needs to learn the exact spot to gesture to instead of having a large interactive space around the device,” he said.
[Image: The ultrasound system in action]
Allowing users more freedom in how they gesture is vital if such products are to become mainstream, he thinks.
“With a small screen such as a phone or a tablet, the normal body language is not that precise. You need a large zone in which to gesture.”
If consumers can quickly see the effects their gestures have on screen, he thinks, “it is quite likely that this is the next step within mobile”. The technology was recently shown off at Japanese tech show Ceatec.
In the demonstration, an Android smartphone was housed in a case containing the ultrasound transmitters.
“It is ideal if you have dirty or sweaty hands”
Ben Wood, CCS Insight
Increasingly firms are experimenting with gesture control. PrimeSense, the company that developed gesture control for Microsoft’s Kinect console, has also made strides towards bringing the technology to mobile.
By shrinking down the sensor used in the Kinect, the firm showed it working with a Nexus 10.
Meanwhile Disney is testing technology that allows users to “feel” the texture of objects on a flat touchscreen. The technique involves sending tiny vibrations through the display that let people “feel” the shallow bumps, ridges and edges of an object.
“Ultrasonic is particularly interesting as you don’t need to touch the screen, which can be an almost magical experience,” said Ben Wood of CCS Insight.
“It is ideal if you have dirty or sweaty hands. A common example people use is flicking through a recipe when cooking. Other examples include transitioning through a slideshow of photos or flicking through music tracks or turning the page on an ebook,” he said. “The big challenge that remains is how you make users aware of the capability.”