As marvelous as smartphones are, sometimes their persnickety touchscreens and minuscule menus lead me down virtual corridors I never intended to tap. Chalk it up to big fingertips or lack of dexterity, but occasionally navigating my Droid makes me feel like I'm wearing oven mitts.
In an effort to bring a little precision back to the delicate ballet that is touchscreen navigation, some researchers at Carnegie Mellon's Human-Computer Interaction Institute have developed a prototype called TapSense. The system can differentiate between touchscreen taps from different parts of the finger — such as the tip, nail, pad and knuckle — and could therefore be used to perform different functions depending on what part of the finger touched the screen.
TapSense uses a small microphone attached to the device to evaluate the sound of your finger touching the glass, potentially opening up an entirely new range of user options. Think of it as a mouse's 'right click' option for your smartphone, only using your knuckle instead.
For example, in one application, researchers demonstrated how a knuckle tap on an email heading displayed a list of options instead of opening the message. Another application of TapSense granted users access to alternate keyboard characters by typing with the fingertip instead of the pad. Users could backspace by using their fingernail.
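The interaction model those examples describe boils down to a simple dispatch: once the system has classified which part of the finger struck the glass, the app maps that touch type to an action. Here's a minimal sketch of that idea in Python — the touch-type labels and action names are illustrative assumptions, not TapSense's actual API:

```python
# Hypothetical sketch of TapSense-style dispatch: the touch classifier
# (not shown) reports which part of the finger tapped, and the app
# maps that label to a different action. Labels and actions are
# illustrative, not the researchers' real interface.

ACTIONS = {
    "pad": "open_message",      # an ordinary tap opens the email
    "knuckle": "show_options",  # a knuckle tap pops up a context menu
    "tip": "alt_character",     # a fingertip types an alternate character
    "nail": "backspace",        # a fingernail deletes the last character
}

def dispatch(touch_type: str) -> str:
    """Return the action for a classified touch; unknown types fall back
    to behaving like an ordinary tap."""
    return ACTIONS.get(touch_type, "open_message")
```

Under this scheme, a single on-screen target can carry several behaviors at once, which is exactly the 'right click for your finger' idea described above.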
TapSense can also differentiate between styluses made of various materials, such as wood, acrylic and polystyrene foam. This could allow multiple people to use the same screen, with the system identifying each user by the stylus they're holding.
Check out the video below to see TapSense in action:
Image: Carnegie Mellon