Sure, the new iPhone 6 is equipped with a Retina HD display, an A8 chip for greater power and a better battery for more juice, all tucked into a thinner body. However, these are not buzzwords we are hearing for the first time. A prettier display, a faster processor and longer battery life have been constantly renewed promises from manufacturers since the advent of cell phones. The iterative improvements in resolution and chips have become a tad mundane for consumers itching for the unpredictable. Where has the novelty of touch screens and double-digit-megapixel cameras gone?
Fortunately, researchers at the University of Washington are working on something we can all look forward to as they attempt to add a new feature to cell phones: in-air gestures. They have created a way for our movements to translate into specific functions without our even taking our phones out of our pockets. And this is all done by interacting with the 3G or 4G signals our phones emit.
Gestures are non-vocal forms of communication in which our bodily movements express particular messages to other people. We have been socially trained to walk over to a friend waving to us or run away from a sibling waving his fists. In the 21st century, however, gestures can mean something else: a way to communicate with our technologies. Our generation is accustomed to nimbly swiping and pinching our phones to browse and zoom. Some of our cell phones are also accompanied by the loving voices of Siri and Google, as well as the Smart Pause function, which enables a phone to pause a video when its user looks away from the screen.
Programmed gestures have become a convenient and common way to talk to our phones. However, these gestures have so far been limited to moments when the phone is in a user’s hand. Gestures like recalibrating the compass or shuffling songs with a shake are usually performed while users are actively using the phone. “SideSwipe,” the gesture program the researchers are developing, bypasses this requirement by creating an invisible, interactive bubble around the phone.
Our mobile phones are constantly exchanging data with the Internet. We are updated with new emails and Facebook notifications in real time because our phones constantly transmit radio signals on a 3G or 4G network to communicate with the base station. SideSwipe allows our bodies to interact with these 3G or 4G signals. While the waves freely travel through the material of our clothes or handbags, our bodies interfere slightly with the signal by reflecting some of it back to the phone. SideSwipe can analyze the returning signals to detect very specific gestures that we produce.
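The core idea, stripped to its simplest form, is anomaly detection: when a hand enters the bubble around the phone, the reflected signal's amplitude deviates from its idle baseline. The sketch below is a hypothetical illustration of that principle, not SideSwipe's actual pipeline; the readings and the three-sigma threshold are invented for the example.

```python
# Hypothetical sketch: treat a nearby hand as a perturbation in the
# amplitude of the signal reflected back to the phone. Sample values
# and the threshold are invented; the real SideSwipe pipeline analyzes
# the GSM waveform in far more detail.
from statistics import mean, stdev

def detect_perturbation(baseline, samples, n_sigmas=3.0):
    """Flag samples deviating from the idle baseline by > n_sigmas."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [abs(s - mu) > n_sigmas * sigma for s in samples]

# Idle readings (no hand nearby) vs. readings as a hand sweeps past.
idle = [-51.0, -50.8, -51.2, -50.9, -51.1, -50.7, -51.3]
active = [-51.0, -48.2, -45.5, -47.9, -51.1]

flags = detect_perturbation(idle, active)
print(flags)  # the middle samples stand out as a reflection event
```

In practice the hard part is the opposite direction: rejecting the constant small perturbations of everyday movement, which is exactly the accidental-gesture concern raised later in this article.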
Signals from our phones reflect off of our bodies because of a change in medium. When waves pass from air into a different material, they can refract or reflect depending on the nature of that material. For example, when light travels into a glass prism, it tends to refract, as manifested by the rainbow that emerges. This occurs when there is a change in the index of refraction, a measure of the electric properties of a specific material. The material’s electric properties can change the wavelength and velocity of the wave, which, according to Snell’s Law, changes its direction.
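Snell's Law can be stated compactly: n1 sin(θ1) = n2 sin(θ2), where n is each medium's index of refraction and θ is the angle from the surface normal. A short worked example makes the prism case concrete (the indices used, air at roughly 1.0 and glass at roughly 1.5, are textbook approximations):

```python
# Worked example of Snell's Law: n1*sin(theta1) = n2*sin(theta2).
# Indices of refraction here (air ~1.0, glass ~1.5) are the usual
# textbook approximations.
import math

def refraction_angle(n1, n2, incidence_deg):
    """Return the transmitted angle in degrees, or None if the ray
    is totally reflected instead of transmitted."""
    s = n1 * math.sin(math.radians(incidence_deg)) / n2
    if abs(s) > 1.0:
        return None  # total internal reflection: no refracted ray
    return math.degrees(math.asin(s))

# Light entering glass from air at 30 degrees bends toward the normal.
theta2 = refraction_angle(1.0, 1.5, 30.0)
print(round(theta2, 1))  # ~19.5 degrees
```

The same physics is why a hand near the phone matters: skin and tissue have very different electric properties from air, so some of the phone's radio signal is bent and bounced back rather than passing through.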
When users move a hand near their phone, their skin, muscles and bones effectively change the wave’s path. SideSwipe can be trained to detect these changes and learn what each gesture means. For instance, if your phone goes off in the middle of a meeting, you can train it to go silent when you make a subtle thumbs-up over your pocket, without anyone noticing.
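One simple way to picture this "training" is template matching: store a feature vector for each labeled gesture, then classify a new signal trace by whichever template it most resembles. The sketch below is purely illustrative; the gesture names, the three features and their values are invented, and the published system uses a more sophisticated classifier than nearest neighbor.

```python
# Hypothetical sketch of gesture training as template matching.
# Feature values are invented; a real system would derive them from
# the reflected-signal waveform (peak amplitude, duration, etc.).
import math

def nearest_gesture(templates, features):
    """Return the label whose template is closest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda label: dist(templates[label], features))

templates = {
    "thumbs_up": [0.9, 0.1, 0.2],
    "swipe":     [0.4, 0.8, 0.6],
    "tap":       [0.2, 0.2, 0.1],
}

# A new trace whose features sit closest to the thumbs-up template.
print(nearest_gesture(templates, [0.85, 0.15, 0.25]))  # thumbs_up
```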
An added benefit of a program like SideSwipe, and quite an important feature for many, is the conservation of battery life. Many gesture-recognition programs available on contemporary mobile phones use the front-facing camera to detect movement. Since SideSwipe relies on low-power receivers and simpler signal processing than video input, it is a considerably more battery-friendly option.
While the program is still in its infancy, the researchers have conducted a ten-person study to determine the effectiveness of fourteen different gestures. These included the swiping and tapping gestures we are already accustomed to. The team found that SideSwipe recognized about 87 percent of the gestures and successfully converted them into useful functions on the phone.
Mike Yamakawa, a technology enthusiast and recent graduate of Hopkins, gave his input on the program. “I wonder whether SideSwipe can distinguish between intended and accidental gestures?” Yamakawa said. “But, if it becomes more accurate than voice recognition, I think it could be something I use a lot.”