Touchscreen moves out of the Screen


While the world was oohing and aahing over Microsoft Surface some time ago for its engaging and intuitive interaction, researchers are already moving on to yet another interesting interface – touch control, but outside the screen.

Called SideSight, the interface lets you control a phone placed on a table by wiggling your fingers in the space around it. This helps solve a fundamental problem with touchscreens – your fingers have to touch the screen, which in turn limits how small the screen can go.
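To get a feel for how this kind of "around-device" sensing might work, here's a minimal sketch: assume a strip of infrared proximity sensors along the phone's edge, each reporting how strongly a nearby finger reflects its light, and estimate the finger's position along that edge as an intensity-weighted centroid. The function name, sensor model, and spacing are all assumptions for illustration – this is not SideSight's actual implementation.

```python
def finger_position(readings, edge_length_mm=100.0):
    """Estimate a finger's position (in mm along the device edge) from
    a strip of proximity sensor readings.

    readings: one float per sensor, higher = finger closer (assumed model).
    Returns None when no finger is detected (all readings below threshold).
    """
    threshold = 0.1  # assumed noise floor for "no finger present"
    total = sum(r for r in readings if r >= threshold)
    if total == 0:
        return None
    # Sensors assumed evenly spaced along the edge.
    spacing = edge_length_mm / (len(readings) - 1)
    weighted = sum(i * spacing * r
                   for i, r in enumerate(readings) if r >= threshold)
    return weighted / total

# A finger hovering near the middle of a 5-sensor strip:
pos = finger_position([0.0, 0.3, 0.9, 0.3, 0.0])  # → 50.0 (mm), mid-edge
```

Feed that position into the same event stream a trackpad would produce and the table around the phone effectively becomes the touch surface.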

Personally, I see applications for this more outside of phones, though – how often do you actually set a phone down on a table to use it? But think about things like ultramobile laptops – a virtual trackpad, if you will – and things start getting more interesting.

[via New Scientist]
