Huffing at your computer might actually get you somewhere if research at the Georgia Institute of Technology comes to fruition. Shwetak Patel and Gregory Abowd from Georgia Tech have published a paper describing how to use a computer microphone to determine where on a screen a person is blowing. The technique, which they call BLUI for Blowable and Localized User Interaction, can distinguish between the different sounds air makes depending on where the breath is directed. Note: This won’t be part of Grow. Just so you know.
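The paper itself covers the acoustic details; purely to illustrate the general idea of telling blow locations apart by their sound, here is a minimal Python sketch of one way it could work, not the authors’ actual method. It compares the FFT magnitude spectrum of a microphone frame against per-location templates recorded during a calibration pass; all names and the calibration scheme are hypothetical.

import numpy as np

FRAME_SIZE = 2048  # samples per analysis frame (assumed)


def spectral_features(frame):
    """Normalized FFT magnitude spectrum of one audio frame."""
    windowed = frame * np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(windowed))
    norm = np.linalg.norm(spectrum)
    return spectrum / norm if norm > 0 else spectrum


def calibrate(frames_by_location):
    """Average the spectra of a few blows recorded at each known screen spot."""
    return {
        loc: np.mean([spectral_features(f) for f in frames], axis=0)
        for loc, frames in frames_by_location.items()
    }


def locate_blow(frame, templates):
    """Return the calibrated location whose template spectrum is closest to this frame."""
    feats = spectral_features(frame)
    return min(templates, key=lambda loc: np.linalg.norm(feats - templates[loc]))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-in calibration data; real frames would come from the microphone
    # while the user blows at marked spots on the screen.
    fake = {
        "top-left": [rng.normal(0.0, 1.0, FRAME_SIZE) for _ in range(3)],
        "bottom-right": [rng.normal(0.5, 1.2, FRAME_SIZE) for _ in range(3)],
    }
    templates = calibrate(fake)
    print(locate_blow(fake["top-left"][0], templates))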
Interesting to see this kind of interface; its playful nature opens up new possibilities for interaction between people and computers.
Is this actually working, or is it a mock-up? Reminds me of similar work by a colleague of mine (sorry for the plug): http://www.design-interactions.rca.ac.uk/people/alumni/04-06/eriko-…
You blow on the pipe, and all the icons float around to let you clear your desktop.
I can do the same on my Nintendo DS.
That’s the first thing I thought of too! Blowing out torches in Phantom Hourglass by blowing at the DS.
Blowable and Localized User Interaction
First Al Gore invented the internet, and now Bill Clinton has inspired a whole new computing experience!
I’m not trying to make a joke (no sexual connotations, etc.); I say this in all seriousness: blowing on stuff makes you really dizzy after a very short time. Try blowing rapidly for a minute. You won’t be able to.
You make me “dizzy” Miss Lizzy
connotation.sexual = True;
sorry….
There are a lot of people with disabilities out there who could very well make use of this technology. Some don’t have arms or can’t move them, so yes, it is something with a lot of potential.
… that this interface really blows. It’d go great with this wallpaper:
http://tinyurl.com/2p9yh3
🙂
This almost looks like something out of The Onion.
Why not lick and spit too? Now, a lick screen isn’t that bad an idea for adult video games, is it?
I wonder what would happen when you sneeze. BSOD?