The Interactive Computing Lab team at ENAC, Toulouse, working with Linux developers, has brought native multi-touch support to Linux. While Multi-Pointer X is already in the mainline X.Org server (to be released with X.Org 7.5/X Server 1.7), this work adds multi-touch support capable of handling gestures and other actions, and it requires the Linux 2.6.30 kernel. It currently works by reading the kernel's input events, translating them into multi-touch events using simple gesture recognition, and then sending D-Bus messages over to Compiz to produce multi-touch effects. The code is still deemed a proof-of-concept, and a better implementation is in the works.
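The pipeline described above can be sketched against the kernel's multi-touch event interface ("protocol A") that arrived with 2.6.30. The C snippet below is a minimal illustration, not the project's actual code: it reads ABS_MT_* events from an evdev node and counts contacts per frame, marking where gesture recognition and the D-Bus message to Compiz would go. The device path and the two-contact threshold are assumptions.

    /* Minimal sketch (not the ENAC code): read multi-touch protocol A
     * events from an evdev node and count contacts per frame. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <linux/input.h>

    int main(void)
    {
        /* Device path is an assumption; find the real touch device
         * under /dev/input/ (e.g. via /proc/bus/input/devices). */
        int fd = open("/dev/input/event5", O_RDONLY);
        if (fd < 0) { perror("open"); return 1; }

        struct input_event ev;
        int contacts = 0, x = 0, y = 0;

        while (read(fd, &ev, sizeof ev) == sizeof ev) {
            if (ev.type == EV_ABS && ev.code == ABS_MT_POSITION_X)
                x = ev.value;
            else if (ev.type == EV_ABS && ev.code == ABS_MT_POSITION_Y)
                y = ev.value;
            else if (ev.type == EV_SYN && ev.code == SYN_MT_REPORT)
                /* Protocol A emits one SYN_MT_REPORT per finger. */
                contacts++;
            else if (ev.type == EV_SYN && ev.code == SYN_REPORT) {
                /* Full frame delivered: 'contacts' fingers are down.
                 * This is where gesture recognition would run and a
                 * D-Bus message (e.g. to a Compiz plugin) be sent. */
                if (contacts >= 2)
                    printf("%d contacts, last at (%d,%d)\n",
                           contacts, x, y);
                contacts = 0;
            }
        }
        close(fd);
        return 0;
    }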
Definitely a first step, but maybe it won’t take too long to start appearing in distros and being considered when user interfaces are designed… and by then it may also be worthwhile performance-wise.
I also hope this project gives us a nice, clean, hardware-agnostic solution (not tied to some driver or hardware vendor) that can actually be used by the community without forks and parallel projects trying to accomplish the same thing…
I remember reading something about a crapload of patents Apple had for multi-touch when they first released the iPhone. Yet it seems like everyone is implementing it. Microsoft has their “table” or whatever they call it. Google, however, seems to have stayed away from it.
Is this a problem? Is Apple just waiting for the right time to sue?
Not too sure if it is the same thing, but I am using Fedora 11 on my netbook, which has a multi-touch pad – it seems to detect two-finger scrolling and so forth. I assume this report describes further development that is a lot more advanced than what I am able to do now.
Yes, multi-touch already works quite decently under Linux on many notebooks with Synaptics touchpads. Often it just has to be enabled in the driver’s configuration in xorg.conf.
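For example, two-finger scrolling can typically be switched on with options like the following (exact option names vary with the synaptics driver version):

    Section "InputDevice"
        Identifier "Touchpad"
        Driver     "synaptics"
        Option     "VertTwoFingerScroll"  "true"
        Option     "HorizTwoFingerScroll" "true"
    EndSection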
The thing still missing, however, was an interface exposing that to user-space software. While the Synaptics driver happily consumed the multi-touch data and translated it into scroll events for X, X itself had no way to directly access and interpret the raw touch data.
X.Org now has multi-pointer support (MPX), which is an important step toward understanding several input coordinates at once. However, this is still only in the development version of X.
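With MPX and XInput 2 (the development X Server 1.7 tree mentioned above), a client can at least enumerate the multiple pointer devices. A rough C sketch, assuming an XInput2-capable server and headers (compile with -lX11 -lXi):

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/XInput2.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

        /* Check that the server actually speaks XInput 2. */
        int opcode, event, error;
        if (!XQueryExtension(dpy, "XInputExtension",
                             &opcode, &event, &error)) {
            fprintf(stderr, "XInput extension missing\n");
            return 1;
        }
        int major = 2, minor = 0;
        if (XIQueryVersion(dpy, &major, &minor) != Success) {
            fprintf(stderr, "server does not support XInput 2\n");
            return 1;
        }

        /* List all devices; under MPX there can be several
         * independent master pointers. */
        int ndevices;
        XIDeviceInfo *info = XIQueryDevice(dpy, XIAllDevices, &ndevices);
        for (int i = 0; i < ndevices; i++) {
            const char *use =
                info[i].use == XIMasterPointer  ? "master pointer"  :
                info[i].use == XIMasterKeyboard ? "master keyboard" :
                info[i].use == XISlavePointer   ? "slave pointer"   :
                info[i].use == XISlaveKeyboard  ? "slave keyboard"  :
                                                  "floating slave";
            printf("%3d %-15s %s\n", info[i].deviceid, use, info[i].name);
        }
        XIFreeDeviceInfo(info);
        XCloseDisplay(dpy);
        return 0;
    }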
I don’t really know how Intel’s work relates to this. Interestingly, they also work at a yet higher level of processing: estimating the gestures.