Computers of the future could be controlled by eye movements, rather than a mouse or keyboard. Scientists at Imperial College, London, are working on eye-tracking technology that analyses the way we look at things. The team is trying to gain an insight into visual knowledge – the way we see objects and translate that information into actions.
I blink?
I am more interested in the science, health, and safety reasons stated, rather than web surfing.
Funny that RSI is already addressed by nature.
Our eyes (yes, we have two!) are our main navigators in our information environment: two lubricated balls, fully embedded in and supported by the skull, moved by short, supple muscles with very little friction.
We don’t need to cure RSI, we just have to pick the tools (that are given to us) to do the job.
Cies Breijs.
Pick the tools? Well, your hands are some of the tools given to you to do the job.
Eyes the main navigators? In a sense… no pun intended. But in doing a lot of things, touch is a major factor, and our so-called hand-eye coordination is what gets the job done. Working with one and not the other can be difficult.
RSI… I can “see” that monster of a migraine coming from here.
So is that left wink for left click, right wink for right click and double blink for the double click (since you blink naturally anyway)?
You look where you move the mouse pointer anyway. This may just reduce the number of things involved here. Combine this with voice recognition or even sub-vocalization, and you may not have to use your hands for doing a large portion of the things we do on computers today.
You look where you move the cursor, but that only approximately puts it near the center of your field of view.
The problem with past attempts at eye tracking is that they all end up requiring unnaturally precise eye movements, and they lead to various kinds of stress depending on what was attempted.
There’s a reason why mice are ubiquitous but eye movement (and hand gesture) systems aren’t: scale of motion.
Hurrah!!!! At last I will have my hands free when I browse porno sites 😉
– Click?
– Right-click?
– Middle-click?!!
Poke yourself in the eye with one, two, or three fingers.
Get a Mac (one mouse button, yeah, I know you can have more) and use space/enter/whatever as the mouse button.
Click = blink left eye.
Right-click = blink right eye.
Middle-click = blink both.
O_o
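A mapping like that is easy to sketch, assuming a tracker that reports, per sample, how long each eye has been closed. Everything here is invented for illustration (the event shape, the threshold name), but it shows how you might tell a deliberate wink from a natural blink by duration:

```python
# Hypothetical blink-to-click mapping. Assumes an eye tracker that
# reports, per sample, how long each eye has been closed (ms).
# The 300 ms threshold is made up: natural blinks are shorter than
# a deliberate wink, so anything briefer is ignored.

DELIBERATE_MS = 300

def blink_to_click(left_closed_ms, right_closed_ms):
    """Translate one blink sample into a click action, or None."""
    left = left_closed_ms >= DELIBERATE_MS
    right = right_closed_ms >= DELIBERATE_MS
    if left and right:
        # Both eyes held shut: middle click. Risky in practice, since
        # it looks just like a long natural blink, as noted above.
        return "middle-click"
    if left:
        return "left-click"
    if right:
        return "right-click"
    return None  # natural blink or eyes open: do nothing
```

A natural blink (both eyes, under the threshold) falls through to `None`, which is the whole point: you don't want every blink firing a click.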
Canon’s been making camcorders with eye-motion control for a while; it’s about time computers picked this up.
As a consumer, because of the time required for stability and especially accuracy, I would not give this another thought for 10 more years. Anything that is entirely dependent on a body movement requires a long time for development, if it ever gets there at all. I’m talking retail-market quality for something like this.
Take a look at http://www.cs.utk.edu/~shuford/terminal/engelbart_mouse_alternative…
which contains a copy of an email from Doug Engelbart, inventor of the mouse. In the brief note he comments on other methods they considered, including eye tracking.
Ever since then people have considered eye tracking as a pointer controller, but it’s mostly easier to use your hand.
Commercial eye-tracking mice have been available for some time, having been developed in various places to give people with disabilities easier use of a computer. Take a look at http://www.eyetechds.com/ for example.
As for the Canon stuff: they made SLRs with eye tracking for focus control some time ago, then gave up because it didn’t work. Their first attempt at a video camera with eye-driven focusing wasn’t much better either. To quote from http://www.diku.dk/~panic/eyegaze/node22.html: “During normal use the video camera can all of a sudden fail to track the eye properly, which forces the user to make sudden movements with the eyes to regain correct tracking. A white square is constantly displaying where the user is looking. We think this must be a very annoying feature, since you cannot avoid looking at this ‘fly’ hovering in front of your eyes all the time.”
I doubt that moving your mouse with your eyesight will catch on. There are too many things that can’t be done intuitively (clicking, dragging, etc.) with your eyes that are better served by the good old point-and-click mouse. Additionally, there are too many things we do with our eyes subconsciously that would produce unintended results. For example, say I’m reading a Dvorak article and roll my eyes at his audacious claims… suddenly I’m up to my eyeballs (excuse the pun) in new browser windows, or I’ve closed the window completely.
I don’t see this as a practical replacement for the mouse… although I agree with Bonus that there are probably some useful scientific and medical applications for such technology.
Just one step closer to the nirvana of “Focus Follows Brain”.
That’s what I get for posting with Opera instead of Konq, no spell check.
When will there be a force-feedback version that can be used to stab video game players in the eyes?
Just blink your left or right eye like you do with your mouse buttons. Middle click doesn’t work because the system may take it as a normal blink. Apparently they are going to implement middle click with a mouth action: just open your mouth wide and the gesture triggers a middle click.
Windows follow EYE focus plz ffs!
I hate it when I type something and it doesn’t go where I’m looking.
Everybody who knows the least bit about human perception knows that natural eye movement is extremely jumpy and all over the place most of the time.
Having to force yourself to make accurate, non-jumpy eye movements is worse than having to handle a 5 kg mouse while doing a handstand during an earthquake.
This will make you ill for sure.
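That jumpiness is why real eye-tracking systems don't feed raw gaze to the cursor; they first detect fixations, e.g. with a dispersion-threshold filter that only accepts a gaze window if the points stay close together. A minimal sketch, with the dispersion limit and pixel coordinates invented for illustration:

```python
# Minimal dispersion-threshold fixation filter (a simplified sketch of
# the common I-DT approach). The 30-pixel limit is an invented value.

def fixation_centroid(samples, max_dispersion=30.0):
    """Return the centroid of a gaze window if it is stable enough.

    samples: list of (x, y) gaze points in screen pixels.
    max_dispersion: (max_x - min_x) + (max_y - min_y) limit; if the
    window spreads wider than this, it was a saccade, not a fixation.
    """
    xs = [p[0] for p in samples]
    ys = [p[1] for p in samples]
    dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
    if dispersion > max_dispersion:
        return None  # eyes were jumping; don't move the cursor
    # Averaging the window also smooths out small tracker noise.
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

The cursor only moves when the eyes settle, so the user never has to force the unnaturally steady gaze complained about above.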
A teacher and a student here in Guarda, Portugal, have already implemented that; see http://www.magickey.ipg.pt/ (only in pt_PT).
The version isn’t open source (I’ve tried to talk them into it), but it’s used in cooperation with a child-help foundation; the only price paid is from the foundation for the service of adapting every copy of the program to the person’s eyes, so it’s kind of free as in free beer.
Anyone remember the old Level 9 text (!) adventure Snowball? Set on a huge great spaceship, and you interacted with the computer by typing in “look at 1”, “Blink”, etc.
Great fun!
“Anyone remember the old Level 9 Text (!) Adventure Snowball? Set on a huge great space ship, and interacting with the computer by typing in “look at 1”, “Blink” etc etc.. ”
Good grief! Yes I remember that one – I never finished it though. Something like 7,000 locations iirc. Bit of a big L9 fan generally though. Red Moon!
In terms of the technology, I think somebody in a previous post got it spot on – a combination of eye and voice control would work best:
“Pick Up”, “Drop”, “Open”, “Delete”, etc, together with a cursor linked to what you are looking at at the time. At least I think so, it’s hard to say unless you try it.
I thought this kind of technology was already available to fighter plane pilots though? Is this the same kind of thing?
Eyes move way too fast from one object to another, so focusing on one thing is really hard; the nice thing about having a mouse is that it doesn’t move all the time. Although it is nice to see what they’re capable of, it won’t be useful anytime soon.