In answer to someone asking whether Unity will require a working OpenGL stack to operate in Ubuntu 11.04 “Natty Narwhal”, Mark Shuttleworth announced that Canonical will offer an optional, Qt-based “2D” implementation of Unity. Here is a video, too.
It’s exactly as unattractive as the 3D version. And it’ll have the bonus of pulling Qt into RAM for a primarily GTK-based application set, so future Ubuntu users will get to enjoy default memory bloat regardless of whether they have 3D support or not.
Cool.
No, those who have 3D support won’t bother using this; they’ll be using Unity 3D instead without even knowing that Unity 2D is there.
Well, the 2D version is already way faster. And 2D still profits from 3D hardware thanks to QML and QGraphicsView, plus QML allows way greater customization. The only thing missing in 2D are the Compiz-like 3D effects, but then you can still add Compiz on top of 2D Unity, or maybe even switch the WM to e.g. KWin. Such a combination would rock the house.
First, I am actually using only a very small number of GTK-based applications.
Second, even if I weren’t, those additional 100 KB are so important to you because…?
Am I the only one who thinks that Ubuntu’s UI team should be forbidden from using their Macs and forced to use an Amiga instead when they’re doing their design work?
I mean… first the window buttons, then this dock and the strange menubar… This just looks like a very cheap OS X clone to me. Can’t they use a little more imagination?
Well, OS X design is great and all, but I agree with you. Ubuntu doesn’t need to be a cheap OS X clone. I really believe that with some good time spent on design research you can achieve something equally good, if not better, without copying.
Yeah, but then, people will reproach Ubuntu for having an uncommon interface that you have to relearn…
Well, if Ubuntu’s UI team had that in mind, they could hardly mess up more…
* Windows users must definitely re-learn the interface
* OS X users must re-learn the interface as well, and as a bonus are deceived by the look (because no matter how much they want to make them look alike, GNOME does not work like OS X’s UI)
* Even GNOME and KDE users must re-learn the interface to some extent, since no single distro’s UI works like Unity does.
Probably not. This is Ubuntu after all. The sun and moon rise and set on the shoulders of Mark Shuttleworth. Just ask any Ubuntu fan. When KDE shakes up the desktop with something new, it’s deemed a disaster and a travesty. When Canonical does it, it’s brilliant and innovative. c.c C.C
Are there any left?
What I think is more stupid is that they rely on Mac OS X to get the work done for Linux – that should be setting off alarm bells for the developers: maybe they should address the lack of tools that forces people to use Mac OS X to do graphics work for Linux instead of doing it natively on Linux.
Reminds me of Sun trying to get people to use their software while all the work at Sun was being done on Windows and Mac OS X computers. It really destroys your credibility to tell the market to do one thing while all the functions of your business use your competitors’ products.
I’m for function above form, so how it looks is really less important to me than how well and how fast it navigates/works.
It looks like Apple’s UI works pretty well, so it can’t be all that bad.
So this is for environments that don’t have 3D, like ARM notebooks and so forth. These envs also don’t have a great deal of RAM.
So am I the only one who thinks having a dependency on the Qt monster AND freaking Qt JavaScript is a good idea for a GNOME/GTK desktop?
Perhaps Ubuntu won’t be a “GNOME/GTK desktop” forever? Just saying.
It is a good idea if it helps Ubuntu make a compelling experience with less work. People who object to it are free to run other stuff. Or get a job and buy a computer with more than 256 MB of RAM.
That’s KDE fans’ wet dream, but I don’t think it will happen. Just saying.
Well, it already started. We will see how the situation evolves.
I doubt very seriously it’ll end up moving to KDE wholesale, although I wouldn’t be at all surprised to see them doing more new things with a Qt base as time goes on. It really is an attractive option for development purposes compared to having to deal with GTK.
And considering who is behind Qt, you get a heck of a lot of contribution from big names, which will hopefully lead to improvements on the desktop. If it means that Canonical can focus on the desktop while others pick up the heavy lifting behind the scenes, it’ll be interesting to see where it leads in the future.
I hesitate to say this, but I could see a future movement toward creating a new Qt-based desktop by Ubuntu developers, simply because I have a feeling that what they want to do would require just as much work if they were to customise KDE – that it would be easier to start with a clean slate.
That’s essentially what they’re starting to do here. Can’t say I blame them. I like what the widget set is capable of and the licensing options but I absolutely hate the KDE way of doing things.
😉
http://farm1.static.flickr.com/48/141521711_9d1842ccc7_o.jpg
That pic is so old and irrelevant these days.
Who’s the first patron of KDE now?
http://ev.kde.org/supporting-members.php
This is not a fake 🙂
http://www.cyberciti.biz/tips/mark-shuttleworth-financially-support…
http://www.easylinux.de/Artikel/ausgabe/2006/07/013-news/
http://blogubuntero.wordpress.com/2007/07/09/entrevista-al-genio-de…
http://www.infos-du-net.com/actualite/13329-ubuntu-windows.html
It already happened. Ubuntu won’t go GNOME 3 but will go Unity, which happens to be based on OpenGL and Qt. GNOME is irrelevant now.
It’s already happening. Ubuntu at one time would never have gone anywhere near Qt, and yet here they are, replicating their entire interface in it… for 2D.
They will either end up building a whole desktop in Qt or reusing what’s already on offer (KDE), but they’ll be dragged kicking and screaming into it, and Canonical might well have gone bust by then.
“Or get a job and buy a computer with more than 256 MB of RAM.”
That’s a very arrogant and ignorant thing to say, and it quite clearly shows how much thought you put behind the whole thing.
Care to elaborate how it’s either arrogant or ignorant?
It’s a fact of life that having ~20 MB more libraries mapped in virtual memory only hurts people who have way too little RAM. It is not a concern in the desktop/laptop/netbook form factor, and even the crummiest Ubuntu-capable tablets will probably meet the requirement as well. Catering to that crowd is not in the best interest of Canonical; those users are well served by lightweight window managers like Fluxbox.
First of all, telling the other person to get a job is the arrogant part; you don’t know whether or not he has one or whether or not he could afford a computer.
Secondly, just telling someone to buy a newer, better computer is the ignorant part; it’s much wiser to keep using old hardware as long as it works and fits the bill. Just buying new all the time creates unnecessary stress on the environment.
Obviously I was not referring to parent poster. Everyone on osnews can afford a computer (otherwise they would not be on the internet).
If you can afford internet, you can afford a computer with >= 1 gig of ram.
You can acquire a fast enough computer second-hand for under 100 EUR, without stressing the environment. I buy almost all my components (not monitors) second-hand, since that’s the most rational way to go about it anyway.
Libraries? Internet cafés? Etc.
I doubt that it’s common, but you can’t just dismiss the possibility.
“Just buying new all the time creates unnecessary stress on the environment.”
Even when new hardware is faster and more capable than the old while drawing much less power?
It’s been a long time – close on to ten years, I would think – since the mass-market desktop PC was sold with a bare 256 MB of RAM.
One reason for the rapid demise of the Linux netbook was the potent combination of an Atom CPU, Win XP and 1 GB RAM.
The demise of the Linux netbook market was IMHO down to the strongarm (no pun intended) tactics of bully-boy Microsoft.
“Mr OEM, I see that you are selling PCs without Windows preloaded. You are using that commie thing called Linux.”
“So?”
“Well, the next time your OEM Windows deal comes up for negotiation, I think each copy is going to cost you $50 instead of $5.”
“WTF?”
“Unless of course you stop this Linux foolishness this instant?”
“Yes your majesty. Three bags full your majesty”
That was a big part of it, but by no means the only reason. Anyone ever used that piece of shite (aka Xandros) that Asus preloaded? Buggy, unstable, bloated, artificially crippled, outdated, and hard as hell for anyone not familiar with Linux to update. Acer and Linpus suffered from pretty much the same problems, though it wasn’t quite as buggy as Asus’s offerings. Dell was the only one of the big names that even got close to offering something that worked and, even though it worked, it was still horribly outdated and couldn’t easily be updated unless you knew Linux fairly well. You can blame Microsoft, and you’d be about half right, but the blame also falls on the OEMs that seriously f**ked it up.
I recall this same guy, sometime mid last year, basically saying you should just kiss the major graphics card manufacturers’ asses and use *their* official drivers, even if you prefer to use and support open source/free software and value your freedom. Even if you want to run a fully open-source OS. If not? Then apparently you deserve to have to put up with shitty drivers, or should just switch to Windows to get “proper” ones, or just shut up. A quote of his:
Never mind the fact that it’s these manufacturers in the first place that make writing open source drivers so difficult, with their refusal to publish specifications. And according to this guy, there is such a thing as “proper” drivers – namely, those proprietary ones blessed by the respective hardware manufacturers, who have and give themselves full and exclusive access to the specs of their own hardware. 😐
He even agreed with my understanding of him that if your graphics card is no longer “supported” by its manufacturer and the latest drivers… you should just go out and buy a new graphics card. Yeah… sure.
Arrogant? That word nails this guy.
By the way, the above quote was taken from the article called “Ubuntu Nearing X Server Not Running as Root” if you want to read the rest of the conversation.
Duh! Expecting Ubuntu to always be suitable for old, outdated hardware is a good example of daydreaming.
Canonical wants to make Ubuntu a modern OS, so it’s perfectly reasonable for them to add features that up the requirements. Folks who need to run Linux on ancient hardware with very limited resources have plenty of alternatives (DSL, Puppy, Knoppix, etc.).
Ignorant of what? Do you realize how cheaply old hardware goes on eBay? I found a replacement Pentium M CPU for a laptop from 2005 for 5 dollars + 2 for shipping.
256 MB does not cut it anymore; have a look at how much a modern browser uses with just a couple of tabs open.
Modern operating systems should require at least 512 MB for installation.
Should? Why? Actually, I think the less RAM an operating system uses the better (while obviously providing the necessary functionality), since that leaves more RAM for applications, which actually IS the reason I launch an operating system in the first place.
If you care about performance, you buy adequate hardware, or make appropriate software selection (like using something apart from Ubuntu). You don’t complain to people that write software and give it to you for free.
You misunderstood. I see nothing wrong in an operating system saying ‘you need 512 MB to run this’; I was arguing against the statement that a ‘modern operating system should require at least 512 MB of RAM’.
I run Arch Linux, which is a ‘modern’ operating system where I choose the components I want myself. The system uses ~130 MB of RAM (Openbox, Cairo-Dock, Cairo Composite Manager, Conky), which is way below 512 MB. I run it on a 4 GB RAM machine, but that doesn’t mean I want the OS to use up all that RAM (except for aggressive caching). I want the RAM for my applications, many of which are really RAM-hungry (like 3D stuff, for instance), and of course for being able to run many programs simultaneously (I often run Blender, GIMP, Firefox (several instances), HandBrake, Inkscape etc. all at once). The leaner the OS is while still providing me with the functionality I need, the better, in my opinion.
Again, this doesn’t mean that I think there’s anything wrong with an operating system demanding 512 MB of RAM – it’s up to the user to decide whether the requirements are matched by what it provides. I was just arguing against the blanket statement.
These open source and free software developers would be nowhere if it wasn’t for their users. Same with all of these proprietary software companies: if they lose their users/customers, they lose their business to someone else – whether it’s another free alternative or some other company. They wouldn’t last very long with an attitude like that, never taking the wishes of their *users* into consideration. And I don’t think it’s too much to ask for developers to write more optimized code – it should be right up there with fixing bugs. Apparently a lot of people want it – just look at Windows 7 for an extreme, OS-level example.
People want faster applications; they don’t care if that requires 10 MB more RAM.
Because the browser alone can take up 256 MB. Any OS with a desktop and a browser should require at least 512 MB.
Minimal standards should be set to ensure a consistent user experience.
Ah, well, it seems I misunderstood you. I thought you were claiming that the actual operating system would have to use 512 MB of RAM to be considered ‘modern’. Yes, I can certainly agree that with general application usage, 512 MB of RAM would be considered a minimum for a desktop OS.
He’s right. As soon as people do something that moves software on and makes it do more, you get some ignoramus complaining how it won’t work on his machine from over ten years ago with 64 MB of memory.
And pay the Microsoft tax and get a Windows 7 PC.
You can only get Windows PCs or Macs in Sweden if you go to the big and mid-size stores.
Online stores, and roll-your-own PCs.
Buying prebuilt is just plain stupid, IMO.
If I want a laptop, I have to choose between a Mac and a Windows 7 laptop.
Well, considering how absurdly easy it is to create an interface like this in QML, and how hard it would be to do it in GTK, and also considering you are not paying these guys to write the software, I agree with the developers’ decision.
Just saying…
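Just to illustrate what “absurdly easy” means: a hover-highlighting launcher strip is a couple of dozen lines of QML. A minimal sketch (labels, colors and sizes are invented for illustration – this is not Canonical’s actual code):

    import QtQuick 1.0

    // A bare-bones launcher strip: a column of "icons" that highlight
    // on hover and print a message when clicked.
    Rectangle {
        width: 64; height: 280; color: "#2b2b2b"

        Column {
            anchors.fill: parent
            anchors.margins: 8
            spacing: 8

            Repeater {
                model: ["Files", "Browser", "Music", "Term"] // stand-in labels
                Rectangle {
                    width: 48; height: 48; radius: 6
                    color: hover.containsMouse ? "#5a5a5a" : "#3c3c3c"
                    Text { anchors.centerIn: parent; color: "white"; text: modelData }
                    MouseArea {
                        id: hover
                        anchors.fill: parent
                        hoverEnabled: true
                        onClicked: console.log("launch " + modelData)
                    }
                }
            }
        }
    }

Doing the equivalent hover behaviour and layout in GTK 2 means custom widgets and hand-written signal handlers.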
You are getting it wrong.
Why? Please enlighten us.
So… Qt is heavy and slow, right?
Sorry to disagree: http://www.osnews.com/story/24261/1_Second_Linux_Boot_to_QT_
Qt ain’t that heavy and is not that slow.
But people may get that perception because the main apps that use Qt are heavy and slow. For example, they perceive KDE as heavy, bloated and slow, hence they assume Qt is the same. But not really.
Maybe I’m wrong, but I think KDE gives a wrong impression of Qt. It’s such an elegant and well-designed framework, used in such a messy and bloated desktop environment.
PS: Sorry for the trolling, KDE fans, but I became very disappointed with what KDE has turned into these days, after so much promise at the beginning of the KDE 4 project…
I’m not your average KDE fan (I use KDE 4.6 RC/4.5.x on my workstations, GNOME on desktops/HTPCs, Xfce on my netbook/low-end machines and IceWM on VMs), but I must admit that I’m somewhat baffled by your claim.
In my experience, as long as you have a decent graphics driver, KDE is actually faster than GNOME, at the expense of somewhat (~50 MB) higher memory usage.
However, if you can’t spare the memory, Xfce is far better suited to low-end machines than either KDE or GNOME…
You may or may not like the KDE 4 features*, but KDE 4.6 is just as bloated as GNOME 2.x is (or vice versa).
– Gilboa
* I personally depend on Plasma’s different widgets per desktop, KWin’s per-window/application/class configuration/grouping, and Konsole’s profiles.
Maybe because you still think that Qt is and will remain a resource hog. Which is exactly why it has been adopted by Nokia, y’know…
That was my first thought too… machines that need the 2D fallback are the machines that will struggle most to run both GTK and Qt-based software stacks at once.
Not really.
Machines that need 2D fallback are the machines that don’t have proper GPU drivers (as is often the case with ARM). They may well have good amounts of RAM available.
I think it might actually be nice to have the base toolkits for running the most popular free software installed by default.
Especially since the software center doesn’t tell novice users that one application depends on GTK+ and another on Qt.
Yesterday my wife, who uses the GIMP so sporadically that she has to re-learn it every time, thought she might install another (simpler) graphics program. She found KolourPaint in the software center… but there on our home network it was taking too long to download, so she gave up.
Probably the software center should have some way of recommending software based on the smallest download size / dependency list, but all the same, had Qt already been there, it would have been a much smaller problem.
/rambling
My guess is that this implementation will be the only implementation of Unity as time goes on. Once QSceneGraph goes mainstream, QML will benefit as much from GPU acceleration as Compiz does, making the “normal” 3D Unity redundant.
By the time Ubuntu goes Wayland, at the latest.
They probably picked Qt here because it is faster to develop with. This is the fifth time they’ve rewritten the netbook shell: twice in GTK, once in EFL because the GTK one was too heavy, then Unity-GTK, and now Unity-Qt. The EFL version existed because GTK was a massive fail for ARM (GNOME Mobile, anyone?), and the Qt version exists because GTK was a massive fail for highly interactive and attractive interfaces without additional bloat. I think there is a pattern here, and Canonical is starting to understand that they are wasting effort with GTK.
Qt is far from perfect: it works well on lower-power devices, but it takes a while to load unless you work hard on speeding it up. The GTK Unity is unusable on most first- and second-generation netbooks (900 MHz Celeron and 1.6 GHz Atom), and so is KDE Plasma (with both the raster and X11 backends – one is too slow and the other maxes out the CPU, and I won’t even mention the DataEngines taking the rest of the CPU for themselves). This Unity 2D comes to fill that hole in the lineup. It is in Qt simply because Clutter would not work, and GTK 2 is not very good for interactive interfaces until it gets something like QML (CSS and HTML UIs help a lot too, but GTK is working on that – let’s give them credit).
Btw, before voting me down: the points I made above have been made in Canonical blog posts/press releases.
I’d be interested to know why they ditched the EFL, though. They seemed to be the right libraries for the job.
Well, maybe it’s because the EFL are nearly abandoned compared with QML, and will never be stable software?
I don’t know what other developers think, but I’ve already had a bad time working with unsupported libraries…
Well, they are finishing version 1.0; it looks like it will actually be released after all, and soon. But yeah, nobody uses it apart from some obscure Japanese phones and E17.
“Well, maybe it’s because the EFL are nearly abandoned compared with QML, and will never be stable software?
I don’t know what other developers think, but I’ve already had a bad time working with unsupported libraries…”
We are releasing the EFL next week or in two weeks.
Comparison with QML (on a set-top box): QML couldn’t even draw and scroll a list with 6 items (with no graphical options – that is, roughly just a list of strings). No problem with the EFL, even with a bunch of graphics effects.
So, QML is just way slower than the EFL.
The EFL are used in around 2 million set-top boxes and are supported by several companies – not a lot of them, though, unfortunately.
They are quite stable and not at all abandoned. Thanks for the troll.
Citation needed, please. What hardware? QML can scroll huge lists with graphics just fine (even on older phones). Misconfiguration, perhaps?
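For reference, here is roughly what such a trivial list looks like in QML – a minimal sketch of mine, not the actual test code being described:

    import QtQuick 1.0

    // Six plain string rows in a scrollable ListView – about as
    // trivial as a QML list gets.
    ListView {
        width: 240; height: 120
        model: ["one", "two", "three", "four", "five", "six"]
        delegate: Text {
            height: 24
            verticalAlignment: Text.AlignVCenter
            text: modelData
        }
    }

If something that small maxes out a 1 GHz CPU, I’d suspect the setup before the toolkit.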
Hardware: x86 @ 1 GHz + SGX 530 + 1 GB of RAM.
The company that tried it used a list of 5 items with animations: 100% of the CPU was used, and the fill rate was huge. They removed all the animations (leaving the simplest possible list); the scrolling was still not smooth at all, and still used a huge amount of CPU.
The 1st design problem in QML is that there is one JavaScript context per object (and creating one costs a lot). You can’t cache and reuse an object (you have to destroy it, then recreate it, and creating an object is slow). So you can’t even cache the items of a scrolling list to get good speed on slow devices.
The 2nd design problem (not related to speed or memory consumption) is that they embed the UI and the code in one object, which means code that is not easy to maintain (it would have been a better choice to separate the UI from the code).
After one year of development trying to improve the QML code, the company gave up and used something else.
Is there a video of a scrolling list on old phones that shows the smoothness of QML?
N900 uses hardware inferior to that, and QML is pretty smooth there.
Here’s an example I saw just today:
http://www.youtube.com/watch?v=ak-Py3cf_Ac
Not true. You can create a global JS context by putting “.pragma library” at the top of the .js file.
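In case that pragma is unfamiliar, a minimal sketch (file names made up): the library file gets a single shared JS context, so its state is reused by every importer instead of being recreated per object:

    // logic.js – ".pragma library" gives this file one shared context
    .pragma library

    var callCount = 0

    function nextLabel() {
        callCount += 1
        return "item " + callCount
    }

    // Consumer.qml – every component importing logic.js sees the same state
    import QtQuick 1.0
    import "logic.js" as Logic

    Text { text: Logic.nextLabel() }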
QML actually can cache list elements:
http://doc.qt.nokia.com/4.7-snapshot/qml-listview.html#cacheBuffer-…
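In practice it’s a single property on the view – a sketch, with an arbitrary pixel value:

    import QtQuick 1.0

    ListView {
        width: 240; height: 180
        // Keep delegates instantiated for ~one extra screenful above and
        // below the visible area instead of destroying them on scroll-out.
        cacheBuffer: 180
        model: 100
        delegate: Text { height: 24; text: "row " + index }
    }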
I don’t know what you mean by this. You can call out to JavaScript and C++ components from your QML components.
One year, and they didn’t contact Nokia at all for help? It seems to me that any problems they encountered should be pretty simple to solve.
Most of the videos seem to feature the N900, but I’ve been running my “QmlReddit” application OK on an N97 mini. It’s not 60 fps, but it does okay anyway. Perhaps I could make a video someday.
OTOH, N900 is old tech today as well ;-).
Incidentally, once we switch to QSceneGraph the performance will be much greater still:
http://www.youtube.com/watch?v=n3W5O2biSPU
> After one year of development, trying to improve QML
> code, the company gave up and used something else.
Are you aware that QML was introduced with Qt 4.7, and that 4.7 has only been around for a few months?
To be fair, QML has been available for a long time as a preview:
http://labs.qt.nokia.com/2009/05/13/qt-declarative-ui/
Though Qt 4.7 is where it became an officially supported technology. Before that it was “use at your own risk”.
Answer here:
http://bfiller.wordpress.com/2011/01/13/unity-2d/#comment-31
I haven’t seen any drag-and-drop operations to and from the sidebar where space is automatically made for you. Is this available?
Blog post here:
http://bfiller.wordpress.com/2011/01/13/unity-2d/
Check out this interesting comment:
http://bfiller.wordpress.com/2011/01/13/unity-2d/#comment-16
Who still sees a grand future for “classic” Unity?
BTW, on the topic of Unity – I could consider using it if they got rid of that stupid icon bar on the left that I don’t use; it’s a waste of space.
It’s bad… It depends on most of GTK, GNOME and Qt at once… It uses GConf, glib 2.26 (and not 3) and many GTK libs, but is built in the KDE fashion with CMake and Qt, plus 15 wrappers for GObject-based libraries.
Well, it’s the first iteration. Give it a few more and it will become Unity 🙂
Dude, I tried, I tried really hard to use Unity… We’re almost in February, and it DOES still look a lot like the same thing from back when people started previewing it and saying it would “change and turn into something completely different”…
Oh I doubt, I doubt…
And lemmetellya, I won’t use this crap for anything in life! I would rather go back to Windows 7 than keep Ubuntu and use this. It may look like the Windows 7 taskbar, but the whole Mac OS inheritance sends shivers down my spine.
Looks like Linux Mint will actually be going #1 on DistroWatch.com a few months after Ubuntu 11.04 is released!
Ubuntu, please… Shuttlef–kup!
You are aware, I trust, that while Unity will be the default desktop, you can always use classic GNOME, GNOME Shell, KDE (with Kubuntu), LXDE (with Lubuntu), Xfce (with Xubuntu), or literally dozens of unofficial derivatives such as Mint (see one list at https://wiki.ubuntu.com/DerivativeTeam/Derivatives ).
Nobody’s twisting your arm here like certain competitors. It’s open source. We have *choices*! Use whatever you like. 😉
It would be great if someone created a website specifically for designing a new, easy-to-use, good-looking desktop. Everyone in the world could help create the best desktop ever!!
Everyone would be able to contribute new ideas and comment on others’.
So who is going to set up such a website?
One thing I’ve been wondering about as I’ve gone desktop environment/window manager hopping is why, in the GTK world, we have a number of DEs and WMs, but in the Qt world, up until recently, there’s been only KDE. Now, with MeeGo and Unity, we have options. This is a good thing, even though I have not seen anything interesting enough to convince me to move away from my GTK-based desktop(s).
There are a bunch of them, just not many popular ones. GTK has two; “many” is hardly the right word.
I personally use Lubuntu, so this whole Unity debate feels irrelevant to me. However, when Ubuntu 11.04 finally comes out, I WILL give it a test drive, if for no other reason than to see what all the angst is about. Still, I’m more than likely to stick with Lubuntu thanks to its fast performance and light memory requirements. I consider it the best of the *buntus.
But feel free to use whichever WM you like. That’s one of the great things about Linux – we do have this kind of choice. Windows and OS X users are not so blessed.
Great. You just convinced me to use Lubuntu.
Sweet! Ubuntu is bringing my open source Linux distribution (2D version) to Ubuntu!!
Yay!
Thanks Mark!
http://unity-linux.org
Yes, they are currently the top distro, but they are quickly being supplanted by Linux Mint. The reasons are plentiful – community, speed – but I have to assume the #1 reason is Ubuntu’s continuous and unflappable ability to screw up the design scheme beyond all usability. Half-assed cloning of the Mac’s design, which is already frustrating to the 90% of the world used to Windows. Hideous browns, tans and oranges, only to be supplanted by perhaps even uglier oranges and purples. Overbearing security. Lack of inclusion of nearly all the useful codecs. A continuing decision to cut useful apps (GIMP) in favor of cruft. No VLC by default.
Linux Mint is literally 1000x better, and with the move to a Debian base instead of Ubuntu, it looks like Ubuntu will be fully irrelevant soon enough. And I say, “ABOUT TIME!”
“design scheme”
…doesn’t exist in open source software. Correction: open-development software. Correction: open-ecosystem software. You know what I mean. Too many cooks in the kitchen. Too many spoons in the pot. Too many hands in the pants.
Point is, you need one guy with style to make something look good. Any more or less means a putrid abortion.
It would be interesting to know which path Mint will take: GNOME Shell or Unity.
Even on Planet KDE, some are puzzled by this port:
http://kamikazow.wordpress.com/2011/01/14/so-canonical-ported-unity…
Porting the shell is one thing, but it means that the shell will use a different toolkit than all the applications, given that Ubuntu has standardised on GTK. IMHO this is a useless increase in memory usage.
Even though I prefer Qt over GTK, I wonder why they did this port…
Apparently, choosing a GUI library and *sticking to it* is really difficult: SUSE went from KDE to GNOME (and then back on some versions), Nokia switched from GTK to Qt, and Canonical apparently uses both at the same time. *sigh*
SUSE WAS a KDE distro, and then along came Ximian and Novell and tried to make SUSE a GNOME distro. openSUSE remained KDE-centric (at least in number of users); SLES was almost completely GNOME-based.
I believe Nokia switched from GTK due to the slow pace of development and the inability to make GTK behave in mobile environments.
Canonical, however, is a ship without a sail, plagued by indecision, and now they are being hammered by the same problems that led to Nokia abandoning GTK…
It’s got nothing to do with memory usage. What’s worse is divided and wasted effort.
Because some people with issues didn’t want to accept that Qt, and then probably KDE, was the right technical direction they should have been going in in the first place.