Carnegie Mellon University researchers have developed technology to enable smartwatches to detect taps, scratches and flicks against the wearer’s body, making possible new types of interactions with wearable devices.
Credit: CMU Future Interfaces Group
This new functionality makes possible new applications that use common gestures to control the smartwatch and, ultimately, other objects connected through the internet of things. By monitoring vibrations that occur when people hold objects or use tools, the smartwatch also would be capable of recognizing objects and activities.
It could even be used to help tune a guitar, with the smartwatch displaying the note transmitted as the guitarist plucks and adjusts each string.
Normally, a smartwatch accelerometer is used to detect when a person lifts an arm so the screen can activate, or sometimes to count footsteps. To do so, the accelerometer only needs to take measurements about 100 times a second. But when researchers increased the sampling rate to 4,000 times a second (4 kHz), they found the accelerometer acted like a vibrational microphone. Rather than detecting sounds transmitted through the air, however, it coupled with the body to detect bio-acoustic signals.
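To illustrate why the higher rate matters (this is a hypothetical sketch, not the researchers' code), a window of 4 kHz accelerometer samples can be analyzed with a standard FFT; at that rate the sensor resolves vibrations up to the 2 kHz Nyquist limit, far beyond the ~50 Hz ceiling of a 100 Hz sampler. The 300 Hz simulated "tap" signal and all constants below are invented for demonstration:

```python
import numpy as np

FS = 4000          # assumed sampling rate in Hz (4 kHz)
WINDOW = 4096      # roughly one second of samples per analysis window

t = np.arange(WINDOW) / FS
# Simulated bio-acoustic signal: a 300 Hz tap resonance plus sensor noise.
signal = np.sin(2 * np.pi * 300 * t) + 0.1 * np.random.randn(WINDOW)

# Windowed FFT: find the dominant vibration frequency in this window.
spectrum = np.abs(np.fft.rfft(signal * np.hanning(WINDOW)))
freqs = np.fft.rfftfreq(WINDOW, d=1 / FS)
peak_hz = freqs[np.argmax(spectrum)]
print(f"dominant vibration near {peak_hz:.0f} Hz")
```

A 100 Hz stream analyzed the same way could only report peaks below 50 Hz, which is why the standard sampling rate misses bio-acoustic detail entirely.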
“ViBand isn’t just a way to control your smartwatch,” Harrison said. “It enables you to augment your arm. It’s a powerful interface that’s always available to you.”
A ViBand-enabled watch can tell if someone is tapping on the forearm, the palm of the hand or the back of the hand. It can detect finger flicks, scratches and other motions. It also can sense if a person is holding various mechanical and electrical tools, such as an electric toothbrush, power drill or handsaw. Each body tap, device or activity has distinctive bio-acoustic signals.
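Conceptually, those distinctive bio-acoustic signatures are what make recognition possible: each tap location or held tool concentrates vibration energy at different frequencies, so spectral features can separate the classes. The sketch below is an invented illustration of that idea using band-power features and a simple nearest-centroid rule; the class names, frequencies, and helper functions are all assumptions, not the system's actual pipeline:

```python
import numpy as np

def band_powers(window, n_bands=8):
    """Summarize a signal window as mean power in n_bands frequency bands."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    bands = np.array_split(spectrum, n_bands)
    return np.array([b.mean() for b in bands])

def nearest_centroid(features, centroids):
    """Return the label whose centroid lies closest to the feature vector."""
    dists = {label: np.linalg.norm(features - c) for label, c in centroids.items()}
    return min(dists, key=dists.get)

rng = np.random.default_rng(0)
t = np.arange(1024) / 4000  # quarter-second window at 4 kHz
make = lambda hz: np.sin(2 * np.pi * hz * t) + 0.05 * rng.standard_normal(1024)

# Pretend these centroids were learned from labeled training windows.
centroids = {
    "palm tap": band_powers(make(200)),
    "power drill": band_powers(make(900)),
}

# A new window near 900 Hz should match the drill's signature.
print(nearest_centroid(band_powers(make(910)), centroids))
```

A deployed recognizer would use many labeled examples and a stronger classifier, but the principle is the same: distinct vibration spectra map to distinct events.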
To increase the accelerometer’s sampling rate, the research team developed a custom kernel, the core of the smartwatch’s operating system. That is the only modification required, and it can be performed as a software update, Laput said.
The team developed several demonstration apps for ViBand, including the use of hand gestures in the area around the watch to control apps on the watch. Similar gestures could be used to control remote devices, such as lights, a TV or other appliances connected via the internet of things.
They also showed it could be used for object-aware apps, such as monitoring meal preparations or providing visual feedback while tuning an acoustic guitar.
Finally, they propose an application using what they call a vibro-tag, which is a small object that emits inaudible, structured vibrations that contain data. A vibro-tag on a person’s office door, for instance, might transmit information about office hours or alternative contact information.
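The article does not specify how a vibro-tag encodes its data. One plausible scheme, sketched below purely as an assumption, is binary frequency-shift keying: each bit is a short burst at one of two vibration frequencies, and the receiver decodes a bit by comparing energy near the two candidate tones. The tone frequencies, bit duration and sample rate are all invented:

```python
import numpy as np

FS, BIT_LEN = 4000, 400            # assumed: 4 kHz sampling, 0.1 s per bit
F0, F1 = 600, 1000                 # assumed tones for bit 0 and bit 1

def encode(bits):
    """Emit one sine burst per bit, tone chosen by the bit value."""
    t = np.arange(BIT_LEN) / FS
    return np.concatenate([np.sin(2 * np.pi * (F1 if b else F0) * t) for b in bits])

def decode(signal):
    """Recover bits by comparing spectral energy near each candidate tone."""
    bits = []
    for i in range(0, len(signal), BIT_LEN):
        chunk = signal[i:i + BIT_LEN]
        spectrum = np.abs(np.fft.rfft(chunk))
        freqs = np.fft.rfftfreq(len(chunk), d=1 / FS)
        e0 = spectrum[np.argmin(np.abs(freqs - F0))]
        e1 = spectrum[np.argmin(np.abs(freqs - F1))]
        bits.append(1 if e1 > e0 else 0)
    return bits

message = [1, 0, 1, 1, 0]
print(decode(encode(message)))  # → [1, 0, 1, 1, 0]
```

Because both tones sit well within the 2 kHz band a 4 kHz accelerometer can resolve, such a tag could remain inaudible in the acoustic sense while still being readable through contact vibration.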
This research was supported by the David and Lucile Packard Foundation, a Google Faculty Research Award and Qualcomm.