Tired of mucking about with your touchscreen? Constantly having to worry about scratching the screen in your pocket? Wiping the face of it with your t-shirt to get your greasy finger marks off it? Microsoft may have an answer.
SideSight, a prototype by Microsoft, uses infrared proximity sensors to determine which way you want to spin or expand the screen of your smartphone. “The sensors can read inputs up to 10 centimeters away, just through reflected infrared light.” This way you can browse on your phone without having to worry about mucking up your screen.
While this technology is limited (for instance, you need a flat surface for the sensors to work), it shows some amazing potential for future phone interactions. With sensors placed all around the phone, you could use your hands directly in front of the screen to shuffle through images or browse sites. Knowing exactly where your hands are adds the bonus of controlling the interface with individual fingers or your hand position itself, something a touchscreen can only do through physical contact.
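To make the idea concrete, here is a toy sketch of how a strip of infrared proximity sensors along one edge could be turned into a swipe gesture: track the centroid of the reflected-light intensity across the sensor strip over time and classify the direction of travel. The function names, thresholds, and sensor layout are illustrative assumptions, not SideSight’s actual implementation.

```python
# Hypothetical sketch: classifying a swipe from a strip of infrared
# proximity sensors. Each frame is a list of reflected-light readings,
# one per sensor, sampled at one instant.

def centroid(readings):
    """Weighted average sensor index, or None if nothing is close enough."""
    total = sum(readings)
    if total < 0.5:  # ignore frames with no nearby object
        return None
    return sum(i * r for i, r in enumerate(readings)) / total

def detect_swipe(frames, min_travel=1.5):
    """Classify a sequence of sensor frames as 'left', 'right', or None."""
    positions = [c for c in (centroid(f) for f in frames) if c is not None]
    if len(positions) < 2:
        return None
    travel = positions[-1] - positions[0]  # net movement along the strip
    if travel > min_travel:
        return "right"
    if travel < -min_travel:
        return "left"
    return None

# A finger moving left-to-right across a 5-sensor strip, frame by frame:
frames = [
    [0.9, 0.2, 0.0, 0.0, 0.0],
    [0.2, 0.9, 0.3, 0.0, 0.0],
    [0.0, 0.3, 0.9, 0.3, 0.0],
    [0.0, 0.0, 0.2, 0.9, 0.4],
]
print(detect_swipe(frames))  # right
```

Playing the same frames backwards yields a left swipe; a hand hovering in place never accumulates enough travel and returns None.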
Systems theorist, futurist and Acceleration Studies Foundation Executive Director John Smart presents a near-term scenario in which new comm technologies enable remote peer networks to effectively bond with and support mental patients, assisting in socialization and treatment from a safe distance.
Such “Symbiont Networks,” as Smart calls them, could be highly effective drivers of mental health, among other things, by augmenting standard treatment that, for certain individuals, can consist of heavy medication and little face time.
Here’s a short clip from my recent interview with John in which he describes a Symbiont Scenario:
Detractors of the Symbiont Scenario will likely critique the “dehumanizing” aspects of distance communication and also point their fingers at unintended consequences. I agree it’s highly probable that whole new classes of disorders (like autism, ADHD, etc.) will continue to emerge as we co-evolve with the changing environment. But I also fundamentally believe that, because there’s no such thing as standing still in an environment of accelerating change, it is incumbent upon us to use new technologies to help people, and our system, to self-actualize better.
MIT Technology Review has a great post on the use of bee-swarm-inspired algorithms to reduce energy consumption of networked appliances like air conditioners, computers and heating systems. Toronto-based startup REGEN ENERGY is building smart energy platforms using new technology standards like ZigBee and micro-controllers to ‘maximize collective efficiency’. Their trick is to enable ‘bottom-up’, self-organized smart grids of appliances without having to actively manage their energy consumption with a ‘single order’.
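The bottom-up idea can be sketched in a few lines: each appliance sees only its immediate peers (as on a ZigBee mesh) and applies a purely local rule, deferring its run when too many neighbours are already drawing power, so the peak load stays capped without any central controller issuing a ‘single order’. The class names and the deferral rule below are illustrative assumptions, not REGEN ENERGY’s actual algorithm.

```python
import random

class Appliance:
    """A toy appliance that coordinates with peers using only local state."""

    def __init__(self, name):
        self.name = name
        self.running = False
        self.peers = []  # neighbours visible on the mesh

    def wants_to_run(self):
        return True  # e.g. a thermostat calling for cooling

    def step(self, max_active_peers=1):
        """Local rule: run only if few enough peers are already running."""
        active = sum(p.running for p in self.peers)
        self.running = self.wants_to_run() and active <= max_active_peers

def simulate(n=6, rounds=10, seed=0):
    """Return the peak number of appliances running at once."""
    random.seed(seed)
    units = [Appliance(f"ac-{i}") for i in range(n)]
    for u in units:
        u.peers = [p for p in units if p is not u]  # fully meshed here
    peak = 0
    for _ in range(rounds):
        for u in random.sample(units, n):  # appliances act in random order
            u.step()
        peak = max(peak, sum(u.running for u in units))
    return peak

print(simulate())  # 2 — load stays capped with no central controller
```

Even though all six units want to run, the local rule keeps simultaneous demand at two: each running unit tolerates one active peer, and everyone else backs off, regardless of the order in which the appliances act.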
The ‘Big Grid’ is based upon a mass distribution model from the 1930s and technology from even earlier. But industry and the Department of Energy are beginning to develop standards to transform the Big Grid into the Smart Grid so that it can handle renewable energy sources, electric vehicles, distributed energy generation, demand-side management, and information about it all. The sale of electric vehicle charging technology company V2Green to Smart Grid technology company GridPoint marks the beginning of a market where hi-tech geeks meet energy geeks.
Could there be a collision of paradigms between geeks who've grown up under Moore's law and those whose basic technology hasn't changed in 70 years?