Peter Hutterer, the man behind the multiple pointer X server, has released an update to MPX server, adding support for multitouch displays. “MPX already supported multiple input devices. Which blows pretty much all assumptions in user interfaces (input) out of the water. Now I’ve gone one step further and added support for multi-touch displays.”
He said he’s going to focus on cleaning it up; when is this functionality supposed to be merged into mainline X?
This is what I’ve been thinking of for years now…
Yet, when I bring this up, people always tend to miss the point and rather see the problems, or fail to understand what this could possibly be good for. I bet lots of people said the same thing about the computer mouse back in the day.
What I would really like to know is how this is handled by the GUI toolkit/framework (two hands in one app) and/or window manager (two hands in two apps)?
I’ve often used this analogy to point out the possible benefit of a multi-touch user interface:
Think of when you’re playing “Age of Empires” (or any game that involves a lot of “point and click”, really) and you could utilize both your hands instead of just the mouse and some keys on your keyboard. Yeah, that would be really slick. But what if you transfer that scenario to a real-world (job, military, or whatever) application that involved a lot of input/action? That would make your job a lot easier.
fltk was the first toolkit that came to mind. It hard codes a single static input device.
The most fundamental issue I wonder about is what net effect this has on focus policy. I guess it still works well with “focus follows mouse”; of course, that just means that more than one application can have focus simultaneously for some things.
Very cool technology.
———————————————————
What I would really like to know is how this is handled by the GUI toolkit/framework (two hands in one app) and/or window manager (two hands in two apps)?
———————————————————
As far as the applications go, they don’t know the difference in devices. MPX emulates a “CorePointer”, i.e. a generic mouse, for each input device (be it a mouse or your hand).
Two hands in different apps is easy… it just means two different mouse pointers, each independent in its respective application.
Two hands in the same application would require support for multiple events in the GUI at the same time (another use for multiple cores!). I wonder how many applications already allow two pointers to interact? Eventually we’ll need applications with menus that can be opened independently and at the same time.
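To make the idea concrete, here is a minimal, purely illustrative sketch (not MPX or any real toolkit API) of what “two pointers in one application” implies for a toolkit: each event carries a device id, and the application keeps independent state per device instead of one global cursor. All names here are invented for illustration.

```python
# Illustrative sketch: per-device pointer state, so two hands (two
# CorePointers) can interact with the same application simultaneously.
# Device ids, event names, and classes are assumptions, not a real API.

class PointerState:
    def __init__(self):
        self.x = 0
        self.y = 0
        self.pressed = False

class MultiPointerApp:
    def __init__(self):
        # device_id -> PointerState; a classic single-pointer toolkit
        # effectively hard-codes one global instance of this.
        self.pointers = {}

    def on_motion(self, device_id, x, y):
        state = self.pointers.setdefault(device_id, PointerState())
        state.x, state.y = x, y

    def on_button(self, device_id, pressed):
        state = self.pointers.setdefault(device_id, PointerState())
        state.pressed = pressed

app = MultiPointerApp()
app.on_motion(1, 100, 200)   # first hand/mouse
app.on_motion(2, 300, 50)    # second hand, independent position
app.on_button(2, True)       # second pointer drags while first is free
print(len(app.pointers))     # -> 2 independent pointers
```

The point of the sketch is the dictionary: widgets and menus would need to consult per-device state (grabs, hover, drag) rather than a single application-wide cursor, which is exactly the assumption most toolkits bake in.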
Multi-core CPUs will help out with lag time, if we allow each CorePointer and its respective action to operate on a separate core. Which also means we need a re-architecture of programs that lock up when one part of them is in use. Think: how do two users use GIMP at the same time? What if one person is running a CPU-heavy filter and another wants to do something else?
The joys of layered development. Adding this to the x system is just the first step. Now lets see someone really use it. And even better, it will work over a network from day one!
First time seeing this. Watched the video and was impressed. I think that’s where the PC desktop is going: touch screens with voice-controlled apps.
Thanks to Google and SoC, a student is working on integrating GNOME’s Metacity window manager with MPX.
http://code.google.com/soc/2007/gnome/appinfo.html?csaid=DB13B8A574…
http://live.gnome.org/Metacity/Mpx
This will be interesting to play with once it is merged into X.
Update:
For Ubuntu users who want to play with MPX, take a look here for a repo that contains it for you:
http://wearables.unisa.edu.au/mpx/?q=node/85
I think 10 years from now the mouse will be history.
I doubt very much that the mouse will be history in 10 years. If you put the display vertical, like a current screen, your arms will be quite tired at the end of the day.
If you put the display horizontal, as shown in the video, it’s easier on the arms, but:
1) it’s still more tiring than using a mouse,
2) it takes up quite a lot of space, and
3) your neck may not appreciate looking down all day.
Not the old “tired arms” argument. We don’t all have to ham it up like Tom Cruise! I say try it, and work out the usability bugs rather than pooh-pooh it.
Imagine your keyboard/mouse combo, currently on your desk, replaced by a touchscreen designed specifically for the desktop. Now, a lot of people wouldn’t like typing that way, without the feedback. But I’d venture to suggest that most people don’t actually do a great deal of typing at home, and when you need to, a keyboard is like $5, so why not have one lying around?
1) So with this set-up, why would it be more tiring than a mouse? You don’t usually drag things around the desktop, so mostly you’ll just be touching a control, then your arms are yours again, to rest as you like.
2) I don’t think the space a production version, designed for the desktop, would take up would be a great problem. My PDA has a touchscreen, after all.
3) No-one told you to dump your monitor. Think Nintendo DS on a large scale. Heck, your old monitor could be a touchscreen as well, for total control.
And this is just stuff off the top of my head. You have to assume that most, if not all the problems with this proof-of-concept system would be ironed-out by the developers. Mice are cheap and convenient, but limiting. If we could all have touchscreens in our homes, we would.
Any reason we couldn’t be using both a vertical and horizontal screen at once?
I have to agree with you. The only way I see it working is if you use the computer while standing, with some type of air display like in the sci-fi movies: the picture displayed in the air, projected from a cube… (the computer).
This air display would have to be interactive, accepting touches etc. Otherwise, imagine today’s ordinary computers with special touch panels where you have to use your hands all the time, touching them to do everything. Your hands would be in the air all day and you would get tired… Of course, this exists today…
Is there a difference between a multi-touch display and a touchscreen in terms of hardware, or can any touchscreen work with MPX?
Normally a touch screen can only register one point of contact, or, if multiple contacts are made, the averaged center between them.
But I have a feeling that as the iPhone meme gets moving, the default will be multi-touch screens. Hell, it may well be that any touch screen can do multi-touch but is limited in hardware or software because the creator didn’t see the need (had a stylus in mind, and so on).
I’ve seen the video (and many others), but I am not convinced multitouch displays are the way to go. For starters, you need a keyboard to do serious work (because computer commands are still input by the keyboard!), and the demo did not show any truly innovative uses of multitouch. The drawing application is nice for 5-year-olds like my nephew, but it is nowhere near appropriate for detailed work.
Computer displays are different beasts from mobile phones. A multitouch screen may work nicely for the iPhone, but the iPhone’s screen is very small. It’s not the same as a 17″ monitor.
Congratulations though to the MPX guy (independently of my views of multitouch, the guy deserves some respect).