“The Ubuntu Technical Board has made two technical decisions of which we would like to inform the Ubuntu community. Both of these decisions concern the upcoming 7.04 release of Ubuntu, scheduled for mid-April.” Ubuntu 7.04 will not activate binary video drivers by default, essentially meaning nothing will change from the previous releases. The second change is a major blow to the PowerPC architecture and thus owners of Apple PPC hardware: “The PowerPC edition of Ubuntu will be reclassified as unofficial. The PowerPC software itself and supporting infrastructure will continue to be available, and supported by a community team.” Translation: Ubuntu PPC can shake hands with the dodo.
I blogged about it here earlier:
http://eugenia.blogsome.com/2007/02/13/a-sad-day-for-the-ppc-archit…
It really makes me a bit sad…
This is indeed sad. One of these days Apple will axe support for the G4 as well, but that was never a problem, since you knew a well-supported and well-tested PowerPC distro (Ubuntu really shines on PPC compared to its rivals) would be available to replace Mac OS X in the future.
Sad, this.
It is indeed sad. The PPC chip is a marvel of engineering, but the truth is that people are not optimizing heavily for it because the userbase is still low. With no Flash or plugin support, no binary support from vendors, and no virtualisation support for Windows (the hot topic right now), it seems that the PPC arch is no longer practical to maintain.
BELETED: (If you don’t have something nice to say…)
Beleted?
Must have meant Belated.
Brief introduction – my name’s Matthew Garrett, and I’m a member of the Ubuntu technical board. I’m a community member, not employed by Canonical. I was involved in making both of these decisions, and while neither has been resolved in entirely the way I would have preferred, I wholeheartedly agree with the basic conclusions in both cases.
One thing that’s important to understand about the PPC version of Ubuntu is that almost nobody was using it. Download figures were tiny for dapper, and even smaller for edgy. The PS3 is an obvious market, except that with only 256MB of RAM (and the currently entirely unaccelerated 2D graphics, let alone the lack of hardware accelerated 3D) it’s not really a good fit for Ubuntu[1].
In reality, developers aren’t going to simply start ignoring PPC bugs. We’ve got too much pride for that. Bugs get fixed for IA64 even though it’s never been a release architecture, and Sparc spent a long time as an unofficial version before we made a release for Niagara. Several Ubuntu developers (me not included) still use PPC systems as their primary development environments, and there’s an obvious incentive for them to ensure that it still works.
Realistically, the single biggest obstacle to high-quality PPC support has been hardware support – Apple hardware is generally a good deal weirder than a lot of x86 stuff, and that’s really saying something. Now that there’s no more of it appearing, support is likely to stabilise. I’ve always felt that the PPC version of Ubuntu was rough around the edges compared to the x86 release, with edgy finally getting to the point where I didn’t feel faintly embarrassed about the entire thing. If PPC support genuinely degrades to the point where it’s significantly worse than it is now, you’re free to swear at me at length. I’ll apologise profusely and make sure that something’s done about it.
[1] Yes, I agree that it’s insane that Ubuntu doesn’t work well in 256MB of RAM. I’m really, wholeheartedly sorry.
Thanks for joining the discussion. Although I agree that the decline of PPC as a desktop architecture forces the hand of the Ubuntu project as a primarily desktop-oriented distribution, what about the server releases? My understanding is that Canonical hopes to expand Ubuntu’s prevalence in the server market, and PPC-based hardware is very much alive and well in this space.
For example, this year IBM will be standardizing on POWER6 across three of their four lines of server systems, including the System Z mainframe (formerly based on the S/390 architecture). The mainstream System I is targeted at the Linux market, and supports both Red Hat and Novell. Doesn’t Canonical want to challenge the big guys in this market?
[Note I’m an IBM employee, but I do not represent IBM in any way. These opinions are purely my own.]
At the moment, we really haven’t made a decision on the future of long-term supported PPC releases. It’s likely to be heavily influenced by demand, and if the POWER-based hardware is popular, then there’s likely to be demand. The Niagara release of 6.06 was based off the community SPARC release from earlier versions, and I think we were still the first distribution to be offering a commercially supported release on that hardware[1]. So, if someone wants it, we can provide full support in a very short space of time. Right now, as far as I know, nobody[2] is really telling us that they want it.
[1] To an extent because the release cycle just worked out that way, but still.
[2] Well, I’m not privy to Canonical’s commercial interactions. It’s quite possible that there are negotiations surrounding this sort of thing, and I’m just not being told about them. But there’s no sense in refusing to supply what people want.
Well, right now the server space is not the focus for Ubuntu, though they definitely want into it. So let’s look at the IBM POWER-based lineup:
P-Series: This is the attractive one of the bunch from an Ubuntu perspective. The blade servers in particular are supportable for the most part. However, from what I have seen, the rest of the P lineup comes with RHEL/SLES-only drivers not yet in the mainline kernel tree. IBM seems to be in no rush to add another target Linux distro, though I dearly wish they would. The other side of the coin is that if you are getting a P, chances are you are doing DB-intensive work, and for that AIX is still better.
I-Series: Can you run Linux (any flavor) in a supported fashion directly on an I? I think that is an LPAR-only install (though I could be wrong), which would need support and assistance from IBM to get working.
Z-Series: Profitable for IBM, but honestly, why would it be a target for Ubuntu unless they charged for support on a per-LPAR basis? Even then, the manpower it would take to get it running and certified for Z would probably outweigh the profit, though I suppose it would be good press. Existing Linux on Z customers are not going to shift from RHEL, and new Linux on Z customers are far more likely to go with the conservative distro. Put simply, there is too much on the line to go with an unknown risk.
So really you have one of three lines with a somewhat strong case for Ubuntu, but even there the target audience might be better off with AIX. Not to mention the need for support from IBM to make it worth the effort for Canonical. IBM is not showing any indication of willingness to target anything but RHEL and SLES, though.
On the other hand, by putting PPC on the back burner, Canonical can concentrate on getting into the x86 server space, where they are much more likely to have success. Only if/when Ubuntu starts to get traction in the server space and IBM shows an interest in making Canonical a strategic partner would it make sense for Canonical to start targeting the PPC server space.
Matthew, thank you for your candor. Knowing makes understanding easier.
FWIW, I use a fair number of machines, but one is fast becoming an all-time favorite: a loaded PPC G4 Mac mini running 6.10. Super quiet in a home office, fast enough for most uses, and I regard being non-Intel as an advantage when exposed to the net (not anti-Intel, I just like making worms and viruses even harder and less frequent).
Count me in for as long as I can hang on… 😉
“””
I blogged about it here earlier:
“””
Does anyone care?
Thanks for the blog link, Eugenia, it was an interesting take on this.
I think what this goes to show is that Ubuntu was really supporting the Mac rather than the PPC…it’s sad, but in a way it was inevitable. If Apple had kept the PPC for its Macs, no doubt Ubuntu would still support it today.
With a PowerMac G5 and a G4 Mac Mini I too find this a bit disturbing. I had some hope of using Ubuntu on them the day Apple drops support for the PPC architecture. My guesstimate is that that will happen around the release of 10.7 as I believe 10.5 to be the last OS X that will run on PPC Macs. In other words, this should happen in about 3-4 years time.
On the one hand, I probably won’t be using any of those computers by then; on the other hand, they could still come in handy for some tasks. I guess it’ll have to be Debian that gets installed around 2011…
Not the decision I like, but I respect it.
Some people will not agree with me, but from a purely pragmatic POV, I hope this decision was made because of time limits and will be reconsidered for 7.10.
The decision was expressly not made because of time limits. It was made because there’s no compelling reason to supply this functionality by default. If there is by 7.10, it’ll be reconsidered.
Ubuntu must feel they need to focus on the largest market share. When I look at some of the other steps they have taken (Linspire’s click-n-run comes to mind) it seems they have a goal in mind. I would not be surprised if they came out with some major announcement before the end of the year.
Debian as usual supports PPC
http://cdimage.debian.org/cdimage/daily-builds/daily/arch-latest/po…
This may not make much difference to PowerPC Ubuntu users. There will still be ISOs and repositories for PowerPC. Most bugs in PowerPC Ubuntu exist in x86 Ubuntu as well, so they will still get fixed, and the PowerPC-specific issues will still get fixed upstream or by the community.
Of course, you won’t be able to buy PowerPC support from Canonical (but if a significant number of people actually were buying it, this would not have happened).
It is still a shame for PowerPC users, as there will be less testing, but I don’t think it is worth leaving Ubuntu over.
I’m glad business decisions are being made. One of the most common “knocks” I hear against open-source projects is that nobody takes leadership, and things linger on without decisiveness. Here, two very clear, albeit unpopular, decisions were made and communicated. I, too, am sorry to see PPC go as a supported platform for Ubuntu, but I applaud their willingness to make tough decisions.
There are no major PPC desktop vendors anymore so I’m not surprised at this decision.
It sucks for the folks relying on Ubuntu to keep their PPC hardware up to date software-wise, though.
I expect Apple themselves to shelve PPC support here soon. It is bound to happen for most operating systems on PPC.
I expect Apple themselves to shelve PPC support here soon. It is bound to happen for most operating systems on PPC.
Guess I better hold off on that G4-powered Mac mini on eBay, then.
Dapper has support for PPC for another two years, and Debian isn’t going to drop PPC support until you pry it from their cold, dead hands.
Actually, Debian will drop it soon or at least move it to second-class citizen status. They adopted a policy for Etch of only supporting hardware that is commercially available, so there may not be an official release for PPC after Etch.
Well, PPC is commercially available (IBM servers, Pegasos, many embedded devices, the Xbox 360, the PS3), so I don’t see a problem here.
I am sure they meant MacPPC, since no one really supports the Pegasos’ ilk, and few try to support the Xbox 360 or the PS3, making it hard to turn those ports into second-tier developments.
That’s not true. See the Debian Release Manager’s clarification at http://lists.debian.org/debian-devel/2005/03/msg01167.html – the requirement to which you refer actually means “must be able to replace machines that fail”. That’s not very onerous to satisfy – heck, even the m68k architecture meets that particular criterion.
The sarcasm was really more directed at Apple dropping support for PPC than Ubuntu.
So I guess Canonical is leaving the PPC ports to the community then? Probably not that much of a bad thing.
A linux distribution using low market share as a reason to not support something is either incredibly ironic or a sign of how far linux has come. Probably both.
Full ack. Open source should be something different.
Linux on PPC is the alternative OS for the alternative platform.
When Apple switched focus to x86 we all knew PPC for the home computer would die a slow lingering death.
It’s just a matter of time unless there is reversal from Apple or an unexpected shakeup.
While Apple has a long history of orphaning relatively recent hardware (look at Mac -> Mac II -> Mac III, for example), Linux, on the other hand, has its roots in alternate platforms and cast-off junk. Not disagreeing with you at all, it’s just surprising coming from a Linux distro.
Another reason to stick with Gentoo: wide cross-platform support. It helps keep your environment homogeneous.
It seems to me that Ubuntu went into “shark jump danger mode” from Breezy onwards, with various quality issues surfacing in almost every new release. Perhaps if the developers spent less time trying to emulate pointless desktop special effects shown off by Mac users (or Sun engineers) with no real work to be getting on with, they’d realise that binary-only drivers aren’t necessary in making a genuinely decent/innovative desktop experience. (In fact, binary-only drivers often have serious technical issues which would impede user acceptance, even in environments where money is changing hands to have the offending graphics hardware directly supported.)
It might also help if they stopped introducing half-thought-through breakage in the name of usability, presumably led by some “experts” who think removing most of the features, emptying most of the screen, and swapping the buttons makes for usability. And actually fixing reported bugs would probably improve the user experience more than having spinning desktop cubes and the like.
But I guess the pundits and the punters just love “shiny”, so we’ll be seeing more of it. Let’s hope the installer still works by the time it arrives in force, eh?
Another sign that Ubuntu has “arrived” is the number of people who slam it. Success of any kind brings this out.
Why do you use Ubuntu again? Or, if you don’t use it, then why are you responding here? Ubuntu is spending considerable time on “bling”, as you put it, and on “usability” features (again, as you put it) because that’s what gets Linux onto desktops. And obviously whatever they’re doing is working, because Ubuntu has been ranked number one on DistroWatch for over two years, almost since its inaugural release. Having to install and maintain a workstation via the command line or by editing cryptic text files may be your cup of tea, but most people want their computer to be functional AND fun to use.
I distinguish between bling (lots of fading in/out, wobbling, spinning things) and usability. You don’t need the former to get lots of the latter. And things like decent control panels are important, and the Ubuntu people have done quite well with them, so I’d like to see more work in that area, but without them doing things like badly patching the printing system (a major source of criticism) or playing “hide the filesystem”.
Perhaps if the developers spent less time trying to emulate pointless desktop special effects shown off by Mac users (or Sun engineers) with no real work to be getting on with, they’d realise that binary-only drivers aren’t necessary in making a genuinely decent/innovative desktop experience.
I disagree. A hardware-accelerated desktop is *not* pointless. Some of the effects, such as the Expose clone, are *very* useful to a certain type of user (such as me).
As far as the “spinning cube” goes, it’s a great way to visualize four different desktops…and it really impresses non-Linux users (I have yet to see anyone being indifferent to it).
Beryl/Compiz have been a great boost for Linux awareness. You might not feel it’s important to you, but that doesn’t mean it doesn’t help Linux as a whole gain more mindshare.
Also, you should realize that the Ubuntu devs are not “spending time” on Beryl/Compiz…the work is done by the Beryl/Compiz devs, Ubuntu is simply benefitting from it. That’s how things work in open-source – there’s absolutely no guarantee that Beryl/Compiz devs would work on other, less blingy Linux stuff if they weren’t working on the hardware-accelerated desktop.
Finally, I’d like to say that it’s misleading to suggest that the quality of Ubuntu releases has been going down. In fact, I have to say that in my experience, it’s been the other way around (apart from a few problems at the release of Dapper). Edgy is an excellent release, and I’m looking forward to installing Feisty once it’s ready!
While I don’t think that a 3D accelerated desktop is necessarily “pointless”, I have yet to see anything that is useful in one.
Expose clone? Maybe. But does it really require 3D?
It always comes back to the spinning cube… and how it makes virtual desktops clear to stupid people. I think that is questionable.
As it happens, I have about 60 Linux desktop users that I support. Some of them pick up virtual desktops instantly. Others never will… no matter how many spinning cubes you put in front of them.
The spinning cube, more than anything, reminds me of the weapon reload animation in Quake and Doom. Its main function is to artificially slow you down.
The best argument I have ever heard in favor of 3D desktops under Linux is the argument that if Windows goes completely 3D, chip makers will drop the 2D API, leaving us in the lurch, as it were.
Which is a very bad thing for Linux. Because our OSS coverage of 3D chipsets is crap compared to the competition. Even in the case that the manufacturer makes the specs available, as with the Radeon 9200 and below.
And the situation is far worse when they don’t.
I, for one, do not welcome our new 3D overlords.
While I don’t think that a 3D accelerated desktop is necessarily “pointless”, I have yet to see anything that is useful in one.
Expose clone? Maybe. But does it really require 3D?
It requires *hardware* acceleration, which is done by cards that also have 3D capabilities. A lot of the Beryl/Compiz effects have nothing to do with the spinning cube or 3D. The new Desktop Wall effect is pretty cool (and is a 2D way of dealing with desktops).
You have to understand that Beryl/Compiz is *not* a 3D desktop. It is a super-charged compositing manager that *allows* for 3D effects. However, the windows and the desktop remain 2D objects.
It always comes back to the spinning cube… and how it makes virtual desktops clear to stupid people. I think that is questionable.
Hey, I *like* the 3D desktop, and I understood the concept of virtual desktops a long time ago. I just like it, but I understand that others might not like it as much. The cool thing with Beryl is that if you don’t like the 3D cube, you can opt for other desktop-management options, such as the aforementioned desktop wall effect. The expose effect is great for people who – for some reason or another – don’t like multiple desktops. I find myself using it a lot. To each his own.
Hey, I often used alt-tab anyway, and I understand those who prefer this. Well, lo and behold, Beryl has eye-candy there too, with a “Ring” window switcher (reminiscent of the original Tomb Raider item switcher, for those who remember).
It’s *not* just the spinning 3D cube. It’s also true transparency, for those who want it, and hardware-rendered shadows for windows and menus. It’s the “Group and Tab Windows” plugin…it’s kinda hard to explain, you have to see it to get it (I’m beginning to get it, and find it quite useful!). It’s the Snow plugin. (Yes, the snow plugin. I found that with the right background, it can be very relaxing and soothing when I need to think about something.)
It’s the Interact Zoom plugin, which is *amazing* when you do photo retouching. It’s the Widget Layer plugin, which allows Dashboard-like applets to be put on their own layer, which can be toggled on and off (I use the bottom left corner, since my panel is at the top). It’s the screen annotations… OK, I haven’t found much use for that yet, but it’s kind of an interesting concept. Anyway, you get the idea.
The spinning cube, more than anything, reminds me of the weapon reload animation in Quake and Doom. Its main function is to artificially slow you down.
Gee, do you switch desktop *so* often that a 0.25 second delay for the animation to play really slows you down?
Obviously you don’t like the cube. That’s okay, just don’t use it. There are *lots* of other useful things in Beryl/Compiz, and lots more to come, too.
The best argument I have ever heard in favor of 3D desktops under Linux is the argument that if Windows goes completely 3D, chip makers will drop the 2D API, leaving us in the lurch, as it were.
I personally don’t think that’s a very strong argument, as it’s certainly unlikely. Though in that case, yes, it would be a very good thing.
Which is a very bad thing for Linux. Because our OSS coverage of 3D chipsets is crap compared to the competition. Even in the case that the manufacturer makes the specs available, as with the Radeon 9200 and below.
It’s not “crap”, it’s just a little behind. Nvidia drivers are quite good, and the ATI driver, as subpar as it is, does an honest job (and hopefully will get better). You have to remember that hardware-accelerated desktops don’t require heavy 3D performance (at least not in poly counts – texture memory is obviously more important, but even then it’s not *that* demanding). I’m getting excellent performance on my Compaq Laptop with the Radeon Xpress 200 chipset – not exactly a powerhouse!
I disagree that this is bad for Linux. This is *excellent* news for Linux, as it generates a lot of interest in the platform, in addition to providing us with a more advanced desktop. That can’t be bad!
What matters the most for me, as a Linux user, is that *I* like it, and find it useful. Seeing the excitement Beryl/Compiz generates, I know the feeling is generalized, and that also can’t be bad for Linux as a whole.
I, for one, do not welcome our new 3D overlords.
Don’t see it as 3D, because it’s not, really. See it as a supercharged, hardware-accelerated desktop.
There are also purely technical advantages. With a compositing window manager, each window is drawn into its own off-screen buffer, so programs that don’t double-buffer very well are effectively “fixed”: when one window is moved on top of another, the area that would previously have been damaged is simply part of the underlying window’s buffer. People who said the performance of moving windows around sucked because of the tearing in the background if you had Firefox there will see a huge improvement. It also offloads work from the CPU onto the GPU, and that has a noticeable positive effect on the battery life and speed of my laptop (moving a window around previously caused the CPU to scale up; now that the work is on the GPU, it no longer does).
In addition, there really is a lot more polish on the Linux desktop when you’re using a compositing manager. Instead of the stupid black bars when minimizing a window, the window actually animates into place, and instead of 20 identical icons distinguished only by their titles, you can see the contents of each window as its icon (with the regular icon there as well) and ALSO see where the window is when it’s selected, a HUGE gain for usability.
Also, for accessibility, the gnome-mag-based magnifier feels a LOT better and more natural than the previous pixmap-based solution. Other accessibility improvements have also been made possible thanks to the use of compositing. Other than the fact that some drivers have shit 3D support in Linux right now, what are the real disadvantages of going to a 3D composited desktop? If the drivers become used for the basic desktop instead of just 3D rendering and games, they’re going to get a lot better, faster; and for the hardware that can’t support it (my GeForce 2 Go can, and that’s pretty old, so we’re talking about really ancient hardware here), there is still the option of disabling compositing, which would work on everyone’s computer (with many usability improvements disabled).
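(A minimal illustrative sketch, not from the thread: an EWMH-compliant compositor such as Compiz or Beryl advertises itself by taking ownership of the _NET_WM_CM_Sn selection for screen n, so any client can check for one with plain Xlib. The file name and build line below are just examples.)

    /* Minimal sketch: ask X whether a compositing manager owns the
     * _NET_WM_CM_S0 selection (screen 0). Build with: gcc cmcheck.c -lX11 */
    #include <stdio.h>
    #include <X11/Xlib.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);   /* connect to the default display */
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }
        /* EWMH compositors take ownership of _NET_WM_CM_Sn for screen n. */
        Atom cm = XInternAtom(dpy, "_NET_WM_CM_S0", False);
        Window owner = XGetSelectionOwner(dpy, cm);
        printf("compositing manager is %s\n",
               owner != None ? "running" : "not running");
        XCloseDisplay(dpy);
        return 0;
    }

Under Beryl or Compiz this should report the compositor as running; under a plain, non-compositing window manager it should not.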
Time to load my ceremonial rifle and fire off a one-gun salute.
I use Ubuntu, but every time I hear of these big goals and plans for the next release I can’t help but think, “Yeah… right.” I’ve come to expect ‘deferred’.
Edgy was supposed to be edgy, Feisty was supposed to be feisty. Edgy wasn’t so edgy and Feisty certainly doesn’t look like it’s going to be feisty.
To be fair, though, there was Upstart for Edgy. And I’m certainly not complaining about the small evolutionary updates, but each time you hear about the next one they make it sound like the previous version will be rendered completely obsolete.
I guess one of the ways to judge how successful something is, is by listening to the amount of wind being blown?
Dropping PPC was actually a really smart move even though it sucks for people with PPC hardware…
Simple things like graphical GRUB splash screens are not enabled by default in Ubuntu because they are blank on certain PPC hardware. Ubuntu has to go for the lowest common denominator. If one specific architecture (PPC) makes things difficult for the various other ports, it brings down all of Ubuntu.
Again, this sucks, but Debian should still run fine on PPC hardware.
Simple things like graphical GRUB splash screens are not enabled by default in Ubuntu because they are blank on certain PPC hardware.
We don’t support grub on any PPC hardware, so that’s not why we made that decision. In fact, I’m not aware of a single feature that’s been dropped because of PPC support.
OK, I was wrong. I read something once on #ubuntu-devel about splash screens on PPC not working and assumed it was grub.
You would know.
Speaking of grub splashes, is there any reason we don’t use a nicer-looking GRUB like in SuSE/SLED?
Seems too early to abandon PPC — the body isn’t cold yet.
While I respect Ubuntu’s decision, I’ll move my Linux boxes (PPC and x86) over to Debian. Don’t like mixing my machines.
I’m disappointed. I really like Ubuntu, and was looking forward to the new media-based spinoff. I’m typing this on a G5 PowerMac, which I hoped would run Ubuntu in its dotage.
Apparently not. Maybe I should look into BSD.
Well, NetBSD and OpenBSD both support multiple PPC platforms, as does the Linux kernel itself. For MacPPC support, OpenBSD and NetBSD work, with OpenBSD not properly supporting the Old World ROM boxes. And on the Linux front there are Yellow Dog Linux and others still.
FreeBSD, on the other hand, has been progressively dropping efforts on older hardware platforms, specifically any that are no longer being sold – macppc support is toast, along with Alpha and a few other platforms that were once usable with FreeBSD.
So, NetBSD or OpenBSD could work for you, but you’ve got to pick based on your needs: NetBSD’s performance is higher but its stability lower, while OpenBSD is the opposite; the security shtick hurts performance while ensuring stability.
I have used the PPC version on an old G4 for a few years, running MOL when I had to interact with a Mac app. But using Linux on Apple hardware always had some problems. I never got the CD burner to work properly, and X liked to default to 1024×768 even when it was not the preferred mode; I had to remove all other modes to get anything else.
The performance for a dual 800 was nice, but the really annoying thing is the lack of plugins; even getting Java installed was a pain. But the joy I get from running Linux on Apple hardware and annoying rabid fanboys is priceless.
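(For anyone hitting the same mode problem: the usual workaround of that era was to pin the wanted resolution in the xorg.conf “Screen” section. A minimal sketch follows; the Identifier, Device and Monitor names here are hypothetical and need to match your own config.)

    Section "Screen"
        Identifier   "Default Screen"
        Device       "ATI Radeon"            # hypothetical device identifier
        Monitor      "Apple Studio Display"  # hypothetical monitor identifier
        DefaultDepth 24
        SubSection "Display"
            Depth 24
            Modes "1280x1024"   # list only the mode(s) you actually want;
                                # the first entry is what X picks at startup
        EndSubSection
    EndSection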
it’s going to die like BeOS PPC died
Shows Sun is a better Open Source partner than IBM
Shows Sun is a better Open Source partner than IBM
I’m curious about the logic behind this statement…what are you trying to say, exactly?
I’m endlessly pleased to see the madness that was the proposal to default to using binary drivers die; it so deserved it.
But now can we please get rid of the wifi drivers as well and maybe, you know, focus on building a proper wifi stack on Linux with drivers that don’t fall over if you look at them wrong? We are at least a year behind OpenBSD on wifi; they got there with hard work and a dedication to freedom. What have we got to show for the compromise we made on wifi?
I’m getting rather tired of supporting people who installed them, because of the reply one gets when such drivers break: “it’s Linux’s fault” or “Linux sucks because my wifi is flaky”. I honestly think people would accept non-existent wifi better than the current state, wherein we have no proper wifi stack and all the drivers implement everything themselves, and badly. To support most drivers you need to jump through hoops, and then it still doesn’t work due to odd bugs.
Can we please get back to the tried and true method of doing things right, even if it takes a bit longer? I promise it will be worth it in the long run and I’m in this for the long run not just a goal of converting as many people as possible while leaving an equal or greater amount of people with hatred of Linux because of profoundly bad experiences.
Well, you do seem to have a point, but let’s think a bit further than this.
Abandoning working solutions is a major setback for Linux. Our tendency as a company is to prevent the support hassle, so we try to get supported hardware from the start, and that makes sense. Our wifi works; we don’t need to support broken installations, as they are not borked anyway.
The same holds for video stuff. If people like bling (and so far I’ve seen that they do: they see Vista, and we go further than Aero with Compiz, Beryl, Metisse, LG3D), we should have it. Period.
If you really want to be pure, you will have to understand that it’s a major setback in functionality, and people will easily abandon Linux.
That is also the reason why some distributions are not too well accepted: usability is delayed too much because of the “purity” the developers want.
It’s fine, all OK, but if we want to do better, we’ll have to accept a few things: things like some closed-source drivers, ndiswrapper, and deals between companies that some don’t like.
The real world says: hey, I/we want a heterogeneous network, or else we’ll have to ditch Linux. Not the right route.
So yes, sometimes things are being considered that you and I dislike, but please, don’t try to stall Linux, like the FSF’s work on GPLv3 tries to.
Abandoning working solutions is a major setback for Linux.
You seem to totally miss his point. These “compromise” solutions don’t work, so let’s concentrate on the ones that do work (like OpenBSD did) and scrap those half-working solutions.
Now feel of course free to agree or disagree with that.
P.S.: I don’t think we need any more anti-FSF FUD here.
I am afraid that you missed *my* point.
I said that _we_ don’t have this support hassle because we get supported hardware. There *is* no need to use half-baked solutions.
Everyone seems to understand that running Linux or Windows on a ZX80 doesn’t work; still, people just buy hardware, only to find out that it’s not really supported on their OS.
What is normally done:
1) select the OS
2) select hardware that fits 1).
If you don’t, you indeed have support hassles.
Regarding the FSF stuff: I wrote that because his statement and the FSF’s statement are approximately the same: progress is not wanted. And I think that’s a mistake, a big one. So far, the current incarnation of GPLv3 clearly shows that it _will_ set us back.
How can that be FUD? It’s the exact problem we have right now. Two parties:
1) we don’t want Linux to succeed, so let’s try to stop good things in their tracks
2) we want working stuff, so if that means some less desirable things happen, so be it.
It seems that both the $OP and you are in the first camp. Nothing wrong with that; it’s all about freedom to choose.
I think it’s bad. No FUD here. (Think about why we as a company don’t have support hassles while others do; we mostly see the problems in camp 1. It’s just what _we_ see; your reality may be different.)
Regarding the FSF stuff: I wrote that because his statement and the FSF’s statement are approximately the same: progress is not wanted.
Sigh. If this isn’t FUD, what is?
What’s especially funny however is that he wrote about how to get real progress, so to act as if he doesn’t want progress is simply insulting.
While the idea of doing things the OpenBSD way is nice, it’s not going to happen with Linux – there are too many special interests that get their say and can affect the direction of the kernel’s development. Just look at the mess of firewall options for Linux: where OpenBSD took one and streamlined it, cleaned it up, and made it a thing to be feared, Linux allows more than four different systems to do its packet filtering.
Linux serves as the hydra because it’s got all these minds taking it in all these different directions; it’s unlikely that a sudden change will come to get everyone to look toward the same goal.
It is Linux’s fault; the way it is developed encourages this. You know, the whole “it is my blessing, and it is my curse” bit.
Yes. Long ago I bought into the idea that it would be better if we focused all our efforts upon one desktop, etc.
Eventually, I realized that it wasn’t ever going to happen. Later on, I realized that it was not even desirable.
Look at one of the few pieces of the stack that *has* been effectively single-sourced in the past: XFree86.
Remember what happened to us there? How many years were wasted in effective stagnation thanks to David Dawes’ stranglehold on the project (as the rest of us cowered in fear at the project’s monolithic complexity)?
We still haven’t recovered.
Sometimes things *do* get out of hand and we end up with *too many* competing projects in a specific area.
But we should always have at least two realistic and popular alternatives as checks and balances to each other.
It may not be “optimal”. But, pragmatically speaking, it is best.
I disagree; I find the BSD idea of “one problem, one tool, one way to solve it” better than “one problem, a dozen tools, and as many ways to solve it.” It wastes a great deal of time to develop all the varied solutions when just having one works fine.
XFree86 was mismanaged until someone finally said enough; oddly enough, OpenBSD was one of the first to say it, forking XFree86 until X.Org was announced. There need not be two projects for X, or for any other single task; a project just needs to be run with a very clear guideline of what its goals are and how to get there. Dawes never really had that; he had the glory of being in charge of a vital project.
Having the GTK/Qt nonsense is bad enough (there should be a single toolkit), but the number of desktop environments is too high, and it’s even worse with window managers. There are too many projects all doing the same thing differently; most are bloated, slow, and do not work on hardware without 256 MB, or at least 128 MB, of RAM these days.
There should be one project doing its job well, with people keeping an eye on it to make sure it continues to. X’s fork is a good example: the fork should have happened earlier, but when the last straw snapped, it was done and everyone left XFree86 behind.
People use the term pragmatic too much; it’s not a part of pragmatism to be wasteful, that’s bureaucracy. Pragmatism is about approaching things logically, in a straightforward, matter-of-fact, direct manner. No bouncing around with multiple options, but taking the right path with things going properly. Dropping minor needs for greater goals, taking the bull by the horns and leading it. It is about getting things done right, rather than just getting things done.
I would point out that “*BSD and the BSD way” vs “Linux and the Linux way” is, in itself, a free choice.
While acknowledging that there are many more factors involved than the topic we are discussing, I will point out that Linux has solidly trounced *BSD for popularity.
I suspect that someone might point out that, by that standard, Microsoft has the best methodology of all, so I will address that now.
Linux and *BSD started out from very similar positions. If anything, Linux came from behind. As BSD fans are eager to point out, BSD has been around longer. And from a technical standpoint, *BSD was still ahead at the time that its legal troubles ended.
Neither had the kind of marketing muscle that has always ensured Windows’ success in the marketplace.
Maybe a lot of other factors are involved. But Linux can’t be doing things too terribly unpragmatically and still enjoy the continued lead that it has over *BSD.
But remember that in my philosophy “*BSD and the BSD way” is just another choice. And one that is no doubt a good fit for many.
I can understand your position because I used to hold it. But not anymore.
Certainly it’s a free choice, but it’s not the pragmatic choice; pragmatism dictates the clean, direct path to getting the job done right. Anarchism doesn’t tie into that.
The popularity of the various Linux-based operating systems versus the various BSD-derived systems isn’t in question. But “Linux vs. *BSD” is not the right way to look at things; saying that Ubuntu has stomped FreeBSD, or that Fedora Core has clobbered OpenBSD, or even that openSUSE has decimated NetBSD works better.
There is no uniform entity known as Linux (that’s what we’ve all just agreed), and we know that the three major BSDs are their own systems. So let’s deal with these as individual operating systems when talking popularity; after all, Windows 2000 is still more popular than Windows XP.
Microsoft’s methodology of system development is that of a mostly uniform system; its tools are almost exactly the same between different editions of the same operating system. What does that have to do with pragmatism, or the lack thereof, in the way that Linux-based systems develop?
Going off on your unrelated tangent, who cares what market share anyone has? The popularity of a system does not reflect its pragmatism, only its popularity.
I have no idea why you associate popularity with directness of system development. American Idol is popular; does that mean it’s pragmatic? The connection makes no sense, right? That’s what you’re saying.
Certainly it’s a free choice, but it’s not the pragmatic choice; pragmatism dictates the clean, direct path to getting the job done right. Anarchism doesn’t tie into that.
I’m not sure “pragmatism” means what you think it means…
I disagree; I find the BSD idea of “one problem, one tool, one way to solve it” better than “one problem, a dozen tools, and as many ways to solve it.” […] Having the GTK/Qt nonsense is bad enough (there should be a single toolkit), but the number of desktop environments is too high, and it’s even worse with window managers.
I’m sorry, but as far as I know the BSDs use both Gnome and KDE as well.
As for having a single toolkit, no major OS does: not Linux, not Windows, not OS X, not the BSDs.
Efficiency is *not* always desirable. Yes, there is less waste, but it is also less creative. For things to evolve, it doesn’t hurt to have a bit of chaos.
The problem is that you *can’t* prevent people from making their own (sometimes redundant) projects, and sometimes the competition actually produces *better* results. So instead of criticizing something that *won’t* change, I think it’s more constructive to see the good sides of it…
Yeah. If a major new feature shows up in KDE, you can bet that it’ll be released in Gnome sooner or later. Likewise, Gnome features will be ported to KDE. So the two projects will feed off each other and increase the development rate compared to having only one project. Of course, there’s always a counter-argument.
The issue is summed up rather well at:
http://en.wikipedia.org/wiki/Competition#Consequences_of_competitio…
No, they don’t. The three major (and thus, to date, the only ones that matter) BSD-derived operating systems all use no desktop environment or window manager. They have ports, and ports let people install whatever random crap the user wants.
The closest thing to a BSD using GNOME or KDE is PC-BSD, a distribution of FreeBSD that tries to be as Linux-like as possible, running KDE and using a shitty package system it made on its own. Or, I suppose, OpenBSD’s usage of FVWM if someone installs X.org.
I couldn’t care less what people do on their own, but calling the anarcho-development system pragmatic is silly; they’re very much polar opposites. It’s people’s choice to develop what they will, but that doesn’t mean it’s a good thing. Competition does drive production, but so does drive itself. If something simply has focus and goals, it develops just fine; there is no need for competition if there is an actual focus on improvement.
If efficiency isn’t desired, that’s fine for you, but in the end efficiency is one of the things at the core of any programmer’s goals. There are efficiency, cleanness, security, portability, and a few others; throwing away one of the core precepts that are drilled into a programmer’s head just doesn’t work for me.
The three major (and thus, to date, the only ones that matter) BSD-derived operating systems all use no desktop environment or window manager. They have ports, and ports let people install whatever random crap the user wants.
Whether you have to install them separately or not is irrelevant – the fact that you can means that, if you want to use BSD as a desktop, you still don’t have a default DE/WM. That was my point.
I couldn’t care less what people do on their own, but calling the anarcho-development system pragmatic is silly; they’re very much polar opposites.
I disagree. Pragmatism is very much a part of it. The main definition of pragmatic is “dealing or concerned with facts or actual occurrences; practical.” I don’t see how that is incompatible with the chaotic development model.
If efficiency isn’t desired, that’s fine for you, but in the end efficiency is one of the things at the core of any programmer’s goals. There are efficiency, cleanness, security, portability, and a few others; throwing away one of the core precepts that are drilled into a programmer’s head just doesn’t work for me.
Efficiency doesn’t prevent project duplication, simply because people will not always agree. I don’t use Enlightenment, but I’m happy it exists, because those who develop it are trying something new.
Persuade Sony to release a PPC Linux driver for the nVidia RSX graphics chip found in the PS3.
Which would require paying nVidia to make such a driver for them.
”
But now can we please get rid of the wifi drivers as well and maybe, you know, focus on building a proper wifi stack on Linux with drivers that don’t fall over if you look at them wrong? We are at least a year behind OpenBSD on wifi; they got there with hard work and a dedication to freedom. What have we got to show for the compromise we made on wifi?
”
Lovechild, have you seen this? http://liquidat.wordpress.com/2007/02/06/kernel-2620-still-no-new-w…
http://www.devicescape.com/news/releases/release_05-01-06_opensourc…
Apparently it’s supposed to be ready for 2.6.22, but it has been out since 2006 so I don’t really know what to make of this new stack.
linux-it: do you work for Novell?
For what it’s worth, it’s likely that we’ll be shipping the devicescape stack in Feisty and moving as many drivers as possible over to it. It’ll take some time before all drivers are ported, especially out of tree ones which have seen little development for some time like rtl8180.
For what it’s worth, it’s likely that we’ll be shipping the devicescape stack in Feisty and moving as many drivers as possible over to it.
Really? That’s great news! I don’t have a problem with my Wifi card using Linuxant’s driverloader, but a native solution is always preferable (if it works well, that is…)
Keep up the good work, guys!
For what it’s worth, it’s likely that we’ll be shipping the devicescape stack in Feisty and moving as many drivers as possible over to it. It’ll take some time before all drivers are ported, especially out of tree ones which have seen little development for some time like rtl8180.
Just curious, I haven’t followed development too closely, but aren’t there blockers still preventing dscape from migrating to mainline? I thought it was still a work in flux. Won’t that cause the potential for problems with a “supported” kernel, or will it be packaged as an unsupported add-on?
I played with 2.6-wireless-dev for a while as part of my futile effort in getting my broadcom working with the bcm43xx drivers. No doubt easier accessibility to dscape will help a lot of wireless sub-projects depending on it get more testers involved.
Does a distro exist that will always remain dedicated to supporting PPC, no matter what?
I’m asking because, this might be an opportunity to buy a G3 or G4 PPC laptop cheap. Or am I just being overly hopeful now?
I think debian would be a good bet here.
Oh, it has PPC, has it? What about Yellow Dog?
Yes, Yellow Dog Linux, which is based on Fedora. They do excellent work supporting PPC; I hope to see them work more closely with Fedora now that we are opening up Core. It would be great to have less duplicated work and draw on everyone’s strengths at once.
What do you mean when you say you are ‘opening up’ “[Fedora] Core”?
Fedora is currently split into Core and Extras; for release 7 we are merging those two repos and giving more people access to the Core packages, which were previously maintained in the open but purely by Red Hat employees. This is good stuff; it means a complete review of all the spec files to ensure they live up to the strict quality rules for Extras. (We need help to finish this: people who know how to read spec files and can learn the Extras guidelines, please, please help out, we’d hug you ever so much.)
It’s a move for more openness for the contributors and a better product for the end users.
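(For anyone wondering what such a review involves, here is a minimal, purely hypothetical spec skeleton; the package name, URLs and file list are made up. It simply shows the tags and sections the Extras guidelines of that era expect a reviewer to check.)

    # Hypothetical example package, for illustration only.
    Name:           hello-example
    Version:        1.0
    Release:        1%{?dist}
    Summary:        Trivial example used to illustrate a spec review
    Group:          Applications/Text
    License:        GPL
    URL:            http://example.org/hello-example
    Source0:        http://example.org/hello-example-%{version}.tar.gz
    BuildRoot:      %{_tmppath}/%{name}-%{version}-%{release}-root

    %description
    A trivial example showing the sections a Fedora Extras review walks through.

    %prep
    %setup -q

    %build
    make %{?_smp_mflags}

    %install
    rm -rf $RPM_BUILD_ROOT
    make install DESTDIR=$RPM_BUILD_ROOT

    %clean
    rm -rf $RPM_BUILD_ROOT

    %files
    %defattr(-,root,root,-)
    %doc README
    %{_bindir}/hello-example

    %changelog
    * Wed Feb 14 2007 Example Packager <packager@example.org> - 1.0-1
    - Initial package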
Yes, Yellow Dog Linux. They are PPC-only, and even though they are a port of Fedora, in my experience YDL works better than Fedora on Apple PowerBooks. See http://www.TerraSoftSolutions.com for more details.
openSUSE should be a good bet.
PPC has one advantage over Intel in that some attacks cannot work because of the different instruction set (x86-specific shellcode simply won’t execute on a PPC processor).
Sad yes, but so long as there is good support for PPC, official or otherwise, I won’t be upset.
Don’t forget you have LTS with 6.06. Soon these apps are going to get too heavy for your G4 anyway.
Says it all…and I only downloaded “Feisty Fawn” this week, darn it.