Tired of mucking about with your touchscreen? Constantly having to worry about scratching the screen in your pocket? Wiping the face of it with your t-shirt to get your greasy finger marks off it? Microsoft may have an answer.
SideSight, a prototype by Microsoft, uses infrared proximity sensors to detect which way you want to spin or expand what's on your smartphone's screen. "The sensors can read inputs up to 10 centimeters away, just through reflected infrared light." This way you can browse your phone without worrying about mucking up the screen.
While the technology has its limits (for instance, the sensors need a flat surface to work on), it shows amazing potential for future phone interactions. With sensors placed all around the phone, you could use your hands directly in front of the screen to shuffle through images or browse sites. Knowing exactly where your hands are adds the bonus of controlling the interface with individual fingers or the position of your hand itself, something a touchscreen can only do through physical contact.
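To get a feel for how side-mounted proximity sensing might translate into gestures, here's a minimal sketch. This is not Microsoft's actual SideSight code; the sensor layout (one left and one right infrared sensor, each reporting finger distance in centimeters or `None` when nothing is within range) is an assumption for illustration.

```python
def detect_swipe(samples):
    """Infer a horizontal swipe from two side-mounted IR proximity sensors.

    samples: list of (left_cm, right_cm) readings over time; None means
    nothing was detected within the sensor's ~10 cm range.
    Returns 'left-to-right', 'right-to-left', or None.
    """
    # Indices of the samples where each sensor first saw the finger.
    left_seen = [i for i, (l, r) in enumerate(samples) if l is not None]
    right_seen = [i for i, (l, r) in enumerate(samples) if r is not None]
    if not left_seen or not right_seen:
        return None  # the finger never crossed both sides
    # Whichever sensor fired first tells us where the swipe started.
    if left_seen[0] < right_seen[0]:
        return "left-to-right"
    if right_seen[0] < left_seen[0]:
        return "right-to-left"
    return None
```

A swipe that enters on the left and exits on the right, for example, would look like `detect_swipe([(5, None), (4, 6), (None, 5)])` and come back as `"left-to-right"`.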
As touchscreen interfaces become more responsive and computers get smarter, we're bound to see faster, more reactive, and more forgiving interfaces. Case in point: a new product called Swype that lets users intuitively swipe through the letters on a touchscreen keyboard in a single fluid motion, then statistically calculates what you intended to type.
If it sounds a lot like the next generation of T9, that's because one of Swype's founders, Cliff Kushler, also invented that huge time-saver. But make no mistake about it: Swype marks a big leap in next-gen productivity. Already garnering rave reviews, it works
"across a variety of devices such as phones, tablets, game consoles, kiosks, televisions, and virtual screens" and lets formerly slow texters achieve input speeds of over 50 words per minute. That's right – many people can't even type that quickly on a regular keyboard.
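The core idea behind that statistical calculation can be sketched in a few lines. This is a toy illustration, not Swype's real algorithm: it assumes you have the sequence of keys the finger passed over and a word-frequency dictionary, then keeps only words whose letters appear in order along that path and ranks them by frequency. (A real engine also handles double letters, path geometry, and far more.)

```python
def is_subsequence(word, path):
    """True if word's letters appear in order within the key path."""
    it = iter(path)
    return all(ch in it for ch in word)  # 'ch in it' consumes the iterator

def candidates(path, dictionary):
    """Rank plausible words for a swipe path.

    path: string of keys the finger crossed, in order (e.g. 'hgfdsalko')
    dictionary: {word: frequency} -- a hypothetical frequency table
    """
    matches = [w for w in dictionary
               if w[0] == path[0]              # swipe starts on first letter
               and w[-1] == path[-1]           # and ends on the last letter
               and is_subsequence(w, path)]    # letters crossed in order
    return sorted(matches, key=lambda w: -dictionary[w])
```

For instance, a swipe that crosses `h-g-f-d-s-a-l-k-o` matches both "ho" and "halo" but not "hello" (no `e` on the path), and the more frequent word is suggested first.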
The C3 Loops touchscreen, developed by Rikard Lindell, takes an interesting approach to zooming: circles. The user traces a finger in circles, clockwise to zoom in and counterclockwise to zoom out. That's quite different from the now-familiar gesture of spreading two fingers apart or pinching them together. In fact, you might say this guy has circles on the brain, especially once you see his website.
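Telling a clockwise circle from a counterclockwise one is a neat little geometry problem. Here's one rough way to do it (my own sketch, not Lindell's implementation): compute the signed area of the polygon traced by the touch points using the shoelace formula. In screen coordinates, where y grows downward, a positive signed area means the finger went clockwise.

```python
def zoom_gesture(points):
    """Classify a roughly circular touch path as zoom-in or zoom-out.

    points: list of (x, y) touch samples in screen coordinates (y grows
    downward). Uses the shoelace formula: twice the signed area of the
    traced polygon is positive for a clockwise loop in screen coordinates.
    """
    area2 = sum(x1 * y2 - x2 * y1
                for (x1, y1), (x2, y2) in zip(points, points[1:] + points[:1]))
    if area2 > 0:
        return "zoom-in"    # clockwise circle
    if area2 < 0:
        return "zoom-out"   # counterclockwise circle
    return None             # degenerate path (straight line or single point)
```

A square traced right, down, left, up (clockwise on screen) comes back as `"zoom-in"`; the same square traced the other way returns `"zoom-out"`.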
Lindell believes circles are the key to just about any design and bases most of his research around them, calling it ConCentric CRAFT. "From experience we know that users want to work and collaborate content centric in an unbroken activity flow." For some users, the circular design might lend a Zen-like feel to the interactive experience. Or it might just make you dizzy.
A while back I reported on SideSight, a Microsoft prototype cellphone that uses infrared sensors to track your hand movements. Now it turns out Apple has applied for a patent on just that.
Sure, the image isn't too clear, but what you see is a possible infrared sensor array using LEDs or OLEDs as the sensors. Bring your hand close to the screen and you'll be able to rotate the display with just a wave. And it doesn't stop there: apparently Apple is looking into the same technology for a possible OLED iPhone that would feature these sensors. Now THAT would be awesome.