From Gamespot.News: “… Kutaragi (Sony Computer Entertainment president) disclosed that he plans to install the Linux operating system on the PS3’s hard disc drive (HDD) so it will be recognized as a computer, rather than a mere console.”
I wonder if Sony is going to charge to be able to use Linux again, like they did with the PS2? Cell Linux will be amazing if they manage to supply 3D drivers and a decent IDE for development (for fun).
Likely as poor / expensive an implementation as Sony’s past effort.
What if the recent Apple switch made Sony think of adding some cool things to the PS3…
Then that + KDE and I’m in heaven. OMG, I need tissue, cuz… umm… I made a mess. *drool*
Assuming this were possible, you could use it as a whitebox to play emulators on your TV: everything from pretty much any arcade game up to at least the mid-90’s (MAME), most every console up to and including the N64, and it’s already compatible with PS1 & PS2 games. That would be the ultimate HD emulation station. Of course, that is too good to be true, so I’m sure Sony would never allow it.
This guy has no clue *whatsoever* what he is talking about. He is changing his mind about whether or not the PS3 will feature a HD three times in one sentence alone.. check hardocp for that one…
heh….. maybe I should read the article before i comment….. /me goes to read teh article
I’ve never owned a PS2, nor have I seen one up close, but there doesn’t seem to be a way to add a keyboard and mouse without modding the thing.
It would be nice to see PS/2 or USB ports on the PS3, then if it runs Linux it can double as a real computer instead of a game console, and people looking for cheap machines to make compile farms or servers out of won’t have to look far. I wouldn’t mind turning one into a network storage server or something else of the sort, Sony could expand their market a little by making it easy to apply mods of this sort.
So Linux will be ported to the Cell chip???? Just cannot wait, great news.
Will the hardware drivers be open sourced too, or will they be proprietary?
Stupid idea, since you can’t connect a mouse/keyboard to it. Who gives a damn?
OK, I read the article… I am not a Cell fanboy per se, but I must admit that in its element it could revolutionize the graphic arts industry (this is not just from this article). In addition to that, if Gentoo were ported to the Cell, I think it would change the way people look at Linux… it would cut compile times down to nothing, hence eliminating many of the arguments against Gentoo (takes too long to install software), and it would make Gentoo a viable alternative for bleeding-edge Linux development…
This could truly change the graphic arts and video editing industries.
you can connect a mouse and keyboard
could be connected via usb, or any other type of port
At least they do not try to fence Linux off, like Microsoft does with its Xbox.
“Linux is legacy”
ROFLMAO
that says it all right there.
nick: you can connect a mouse and keyboard
I’m not sure about what the PS3 will be like (I haven’t read much about it yet), but yep, the PS2 had keyboards available for it. There are also a couple of keyboards integrated with a standard controller.
Also… (For any doubters) Just look at one. It has USB ports in front. (BTW… It also has Firewire. In fact there’s other peripherals you can use with a PS2, like a Zip drive.) And even if it didn’t, you don’t need a USB port for a keyboard. The Dreamcast and Sega Saturn also had keyboards and they didn’t have any “standard computer ports”.
120 GB is currently the maximum size of a 2.5″ HD.
I think Seagate is working on a 160 GB model, but it’s not for sale yet.
So that’s why they mention 120 GB…
True dat, deletomn… I am confident that we will have keyboards and mice. In addition, for those of you who do not know: Sony is all about open standards. I’m not saying Sony stuff is OSS, but when Sony sets a standard, they make it open to the rest of the market (most of the time), and most of the time everyone jumps on board.
Funny thing is… Sony consistently beats Microsoft in every market they both enter (I have friends who work for Microsoft who also say this, and it pisses them off), and Sony is way more open with their technologies (except CSS, lol).
No-one can really emulate PS2 games in a usable state at the moment. ps2emu can play little bits of one or two games, very…very…very…slowly…
I fail to see why anyone would think that Linux on a PS3 is a good idea. From what I’ve seen from all the info floating around about the cell it blows as a general purpose cpu. This means compile times are going to be awful, most perl scripts would be almost unworkable and grep will take 5 minutes to run (well, not that bad but it would feel like it did). This is good news for people who use apps like Povray, PD or Csound (assuming they all get a major rewrite).
While the article is completely devoid of any real info, from what I’ve seen of Sony, they’re really paranoid. Sure, it runs Linux in some fashion, but I’m willing to bet money that it won’t run user-created code out of the box without some major hacking of the hardware.
Ahh, I see… that I can understand. The way he said it made it sound like an excuse.
yuuuummmmmmmmm!!!!!
From what I have seen, the Cell CPU does not blow as a general purpose desktop processor. If you had done any research whatsoever you would know that IBM in fact has a Cell-based blade server in the works, and in addition various benchmarks have shown the Cell to be very comparable to other x86 and PPC CPUs in some tasks, while it utterly annihilates them in others.
Of course it runs NetBSD as well… 😉
http://www.netbsd.org/Ports/playstation2/
Don’t buy into the hype (yet).
I remember back in 1999, Sony said the same “supercomputer” thing about the EE. But from what I have seen, a year later the Xbox’s 800MHz PIII system just wiped the EE.
supercomputer? hahahahahahahahahahahahaha
Can it play games too?
If anything, there is an internal hard drive for the OS; that would be the smart way to go.
If you look at the size of it, they could get away with a 3.5 inch hard drive inside.
Uh… under what rock have you been living?
IBM is promoting (and selling, I think) Cell-based Linux workstations, and has been doing so for months now.
I wonder why Sony would even want to allow the user a custom OS on the PS3, since they’re losing money on each unit they sell. People who would use the PS3 as a computer most likely aren’t desired customers, since they don’t buy any games…
So even if it runs Linux, its only purpose would be to simplify development for 3rd party developers. I am sure they will try to secure the console from ‘misuse’ (perhaps in the same way it’s done by Microsoft with the Xbox).
Don’t hyperventilate, dood.
Sony appears to be ‘getting there’ with their support of open standards, and a recognition that what matters is what YOU want to do with your machine.
They haven’t made the transition yet (witness the lock on playing Memory Stick movies at full res), but it’s a start.
congrats to sony
The Cell processor does not do out-of-order execution; its benefit comes from thread-level parallelism and the ability to process individual in-order threads quickly.
As for generalized computing: sorry, a blade server by IBM isn’t “general computing”; it’s usually for a specific and mostly serialized task. This is not going to be a good performer.
“Linux is legacy”
My interpretation of this is that Linux can take advantage of its legacy and so run a huge number of applications out there.
On top of the stripped-down PPC, one of the things the 8 on-CPU co-processors were touted as being capable of doing is code-morphing. So the system could load some code-morphing software into the main CPU and effectively execute x86 and x86-64 opcodes rather quickly. So then run Linux on the system (using the PPC part of the main CPU), and you can run x86 binaries… Or furthermore, run WINE on that, and voila!
Exactly. Finally hl2 will be able to get above 32fps!
I don’t understand how the cell is supposed to sit in the middle and run the OS in a sandbox, it’s just another architecture, I’ve seen the code that Arnd has been adding to the Linux kernel. Surely Linux is still in charge of the processor, not the other way around.
Stupid idea, since you can’t connect a mouse/keyboard to it. Who gives a damn?
Even if you couldn’t connect a “mouse/keyboard” to it, why would that make it useless? I guess you’re a big Microsoft fan, so you probably don’t realise computers without “mouse/keyboard” aren’t useless.
Quite possibly the biggest users of Linux on PS3 would have neither keyboard nor mouse – servers and loosely coupled distributed memory clusters.
I don’t understand how the cell is supposed to sit in the middle and run the OS in a sandbox, it’s just another architecture, I’ve seen the code that Arnd has been adding to the Linux kernel. Surely Linux is still in charge of the processor, not the other way around.
The processor doesn’t; he said other OSes run as applications, and he also mentioned a kernel controlling it. Therefore Sony has their own custom kernel controlling it, which runs other OSes as if they were in virtual machines.
> Of course it runs NetBSD as well… 😉
good joke….
from the NetBSD-PS2 site:
Not Supported Peripherals
* game controller
* Audio
* i.LINK
* Memory Card
* DVD/CD-ROM drive
What is NetBSD useful for on a PS2, then?
Existing PPC64 chips have hypervisor extensions already. Perhaps the cell includes these. This would explain the assertions that Linux is legacy and that they’ll run other things at the same time.
lol You guys are a piece of work
Cell is a COMPLETELY different beast to most current CPUs, so throw out EVERYthing you know, or think you know, and start again.
If out-of-order execution were such a big issue for Cell, don’t you think IBM/Sony/Toshiba might have noticed? This is a NEW design (a new paradigm even), so it’s not like the usual Intel approach of adding rockets to skateboards to make them go faster!
My speculative guess is that they found in-order to work better, and no, not just “coz its fer gamez”.
So chill out guys, let’s just wait and see. The technical documents will be released soon, along with the first desktop to feature Cell, the one Arnd will be revealing in Germany at the end of the month.
I’m digging the irony, if it really does turn out there’s no better phrase for Cell than “Think Different”.
(and remember, Sony’s PS3 Cell is just one instance of Cell; the blade server mentioned features 2x Cell processors! …nobody knows if they are the SAME spec as the PS3’s)
For whatever reason, computers in Europe pay a lower import duty than consoles do. Sony used this tactic back with the PS2, and a court accepted its claim to pay the lower rate. Europe is just as big a market for Sony as the US is (and far, far larger than Japan), so Sony is simply trying to save loads of bucks (and if that means a cheaper PS3 for us, I welcome it).
> Sony is all about Open standards, I’m not saying Sony stuff is OSS, but when Sony sets a standard, they make it open to the rest of the market (most of the time), and most of the time everyone jumps on board
Really? Remember MiniDiscs and what killed them? And what was the purpose of reinventing the wheel with Memory Stick(TM)? C’mon… 1) MS 2) MS Duo 3) MS Pro 4) MS Pro Duo? Who’s using that #@%$#% besides Sony?
Cell is geared around executing a lot of threads at once. It’s an in-order processor with no branch prediction, IIRC.
OoOE adds about a 30% performance increase, mileage varying based on the aggressiveness of the logic; diminishing returns and all that apply. Why was this route not taken? Simple: because the cost in transistor logic is high and it complicates the design. Cell would perform much better otherwise.
The thing is, in a serial task, like interpreting a script, it will run like complete crap. Unless of course your interpreter is uber smart. Not very likely; more on this later.
All that aside, the cool thing about Cell is the new way it looks at computing. Thus far, we’ve had various semantic and syntax models, but our computational model has remained fairly static in terms of its performance characteristics.
In terms of it getting adopted for general computing, I think the most important thing for Cell will be a runtime, an IR; I’m going to point to LLVM as a candidate. If one could target everything to that, plus some likely extensions, then we could do some very interesting things. This would mean one could target it as a runtime for scripts, and with some simple tuning, it could have decent performance.
Basically, what it comes down to, the more tasks you do, the faster it’ll be, via aggregation.
Now you’re cookin!
Actually, Ars has a neat article which goes part-way toward explaining the ins and outs of Cell.
http://arstechnica.com/articles/paedia/cpu/cell-1.ars
Personally I feel we need to take care when applying legacy diminishing returns and cpu characteristics to something so new. I love the concepts of Cell and that IBM have attempted to turn things on their head somewhat, so I’m willing to give them a chance to prove they are right.
As for the Kutaragi article, it was brief at best, and probably poorly translated.
Compared to Cell and PS3, Linux IS legacy, and Windows is a dusty old relic
Maybe he simply meant that Linux is legacy because it wasn’t designed with the Cell in mind. It will work, and while it won’t be the optimal solution, it will be the best until/unless someone writes a new operating system?
has anyone managed to get a source cd from linspire yet? nah i didnt think so
They mention the OSes are applications; is the Cell virtualized? So the PS3 would have a hypervisor in ROM and then run the guest OSes.
What speed is the network connection? I’d hope 1Gb, since they are cheap. Can the video drive high-resolution DVI? How much system RAM?
Now plug the 1Gb net into your desktop PC and the PS3 into your monitor. Turn it on and let it remote-boot the OS without a disk. The remotely booted OS would run a super X terminal program.
This will run a lot faster than you think with something like Xegl and composite. All of the alpha blending to compose the screen will occur on the PS3, not over the net.
Surely if the OS is virtualized, there would be no point (for a home user) in running yet another operating system (kernel) inside the cell’s own kernel from the point of view of managing threads and processes.
(I’m not talking about business server virtualization and logical partitioning in this case)
Why not then, just have one OS underneath that runs on the ppc core, whether it be Cell OS, Linux or whatever kernel you like, that has access to the SPUs itself, and either makes use of the SPUs individually at a user-space level, or allocates threads to SPUs itself. (obviously that would require some kind of translation but the speed would still be there)
It seems to me that any processor can run multiple OSes concurrently; I can fire up vmware on my Sempron and have Server2003 running [on|with] Slackware.
To conclude, I wonder if it’s possible to skip all the virtualisation and run Linux directly on the iron itself?
I’m sure it’s technically possible to run on the Cell (whether the PS3 platform is set up to allow this is another matter). Virtualisation could have some neat benefits, though, e.g. running Linux and PS3 games simultaneously.
Whether this’ll be practical in the PS3 is doubtful but if one day all your “computers” can be virtual machines on a PlayStation, you’d see a lot of smiling Sony execs…
Also, although any CPU can run multiple virtual machines, they do vary in efficiency: current x86 OSes require quite a bit of overhead to virtualise (unless you modify the guest OS, as in Xen). PPC64 is easier to virtualise. Future Intel and AMD CPUs will be integrating tech to make them easier to virtualise.
Surely if the OS is virtualized, there would be no point (for a home user) in running yet another operating system (kernel) inside the cell’s own kernel from the point of view of managing threads and processes.
That’s not how hypervisor virtualization works. With a hypervisor, Linux would think that it is running on the bare metal. It would still need to implement memory management, threads, etc.
The hypervisor provides virtualized hardware, not threads and processes. A Linux device driver would then use this virtualized hardware. Since the hardware is virtualized, the hypervisor can intercept Linux’s use of it. Now the hypervisor can look at what Linux is trying to do and decide whether to let it proceed. For example, the hypervisor could block reading a DVD protected with DRM and return an error code to the Linux device driver.
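As a toy model of that interception (every class and method name here is hypothetical, not any real hypervisor API), the guest driver talks to what it thinks is hardware, while the layer underneath gets to veto individual operations:

```python
# Toy model of hypervisor-mediated device access (all names made up).
# The guest kernel's driver only ever sees the virtualized interface;
# the hypervisor underneath can inspect and refuse each operation.

class Disc:
    def __init__(self, title, drm_protected):
        self.title = title
        self.drm_protected = drm_protected

class Hypervisor:
    """Owns the real drive; exposes a virtualized one to the guest."""
    def read_sector(self, disc, sector):
        if disc.drm_protected:
            return None  # policy decision: block the read
        return f"data from {disc.title}, sector {sector}"

class GuestLinuxDriver:
    """Guest-side driver: surfaces the hypervisor's refusal as an error."""
    def __init__(self, hypervisor):
        self.hv = hypervisor
    def read(self, disc, sector):
        data = self.hv.read_sector(disc, sector)
        return "-EACCES" if data is None else data

driver = GuestLinuxDriver(Hypervisor())
print(driver.read(Disc("homebrew", drm_protected=False), 0))
print(driver.read(Disc("protected movie", drm_protected=True), 0))
```

The point is only the shape of the control flow: Linux never sees the real device, so policy lives below the kernel.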
“No-one can really emulate PS2 games in a usable state at the moment. ps2emu can play little bits of one or two games, very…very…very…slowly…”
Right, but what idiot would run a PlayStation emulator on a PlayStation?
Give it a year and there will be a Linux kernel optimized for a ridiculous number of processors… Oh wait, there already is.
I will never understand *why* you people want Linux to run on such a thing like this so badly… are the drivers “open source” or not etc.
You think that Sony and nVidia are going to open source their brand new high end hardware? Please, Sony is the most controlling company around next to MS or do we have to remind you of their desire to do everything on their media formats.
And seriously, what the hell are you going to do, write software on your couch with a controller? Type up some documents in OpenOffice?
I mean, it’s a goddamn game console! Over and over I see all the effort put into putting Linux on the Xbox or Linux on the PS2 etc… well, that’s just fantastic, now do something. Oh wow, so you did all that so you could get some kind of server with an 8 GB hard drive and 64 megs of RAM. Congratulations!
It would be cool if it were used for some kind of homebrew apps, but again… stop mixing the PC and the console! Anyone remember how much WebTV sucked? It’s the same principle.
If only all the effort and futile excitement for this kind of stuff went into something useful…
First, you should know that it was Sony itself who put out the Linux kit for the PS2. And, if you’d RTFA, you’d know that this again is a Sony offering: Linux will be pre-installed on the official hard drives sold for the PS3.
As far as “typing with the controller” goes, you should know that the PS3, like some versions of the PS2 before it, has USB ports into which you can plug a keyboard, a mouse, etc.
Finally, as to “what to do” with a Linux-running PS3, with the power of the Cell processor you could easily use it for non-linear video editing, 3D modeling/animation, and game development. You could conceivably install MythTV (or similar app suites) to turn your PS3 into a TiVO.
Don’t blame others for your own lack of imagination.
You people fretting over keyboards and mice, do your homework. The PS2 has two standard USB ports in the front, and the PS3 has at least 4.
Would love to know if there will be any RAM expansion for the PS3? Media apps need lots of memory today, so it would be great if one could upgrade the 256 megs of the PS3.
Otherwise it all sounds very kewl 🙂
rSl
Finally, as to “what to do” with a Linux-running PS3, with the power of the Cell processor you could easily use it for non-linear video editing, 3D modeling/animation, and game development. You could conceivably install MythTV (or similar app suites) to turn your PS3 into a TiVO.
You’d need some serious hard drive space to turn it into a DVR. However, if someone wanted to go through the pain of writing an Xbox emulator, it could save people the money of buying multiple consoles. Unfortunately, there are a lot of great games available on the Xbox that aren’t available on the PS.
It’s a GAME console. It’s not supposed to be used for 3D image work or video editing! So it probably won’t allow for RAM upgrades… but who knows; the thing is not even in stores yet, for crying out loud.
http://ps3.ign.com/articles/614/614682p1.html
With dual 1080p support (1920×1080) you have to wonder if 256MB of GPU memory will be enough for textures given the trend for high visual quality games.
It has Gbit Ethernet.
You also have to factor in though that a lot of those textures will be procedurally-generated using the shaders, to a degree that you don’t see in current PC games. This is something we saw when the PSX and Sega Saturn came out in 1994. It was in the console industry that full texture-mapping of games became popular — it wasn’t until the Voodoo 1 came out in 1996 that PC games went the same route. The same thing will probably happen with shaders in the next generation of consoles.
Why not then, just have one OS underneath that runs on the ppc core, whether it be Cell OS, Linux or whatever kernel you like, that has access to the SPUs itself, and either makes use of the SPUs individually at a user-space level, or allocates threads to SPUs itself
The Cell isn’t set up like that. SPEs don’t do threading; they’re batch processors. Heck, you wouldn’t even want to run a regular thread on an SPE. They’ve got 256KB of local memory with no VM. You have to take that programming model into account, or performance will blow.
Overall, all the talk of Linux not being appropriate for the Cell architecture is a bit silly. A single Cell is a uniprocessor machine! A 4-Cell machine would have 32 SPEs, but those aren’t general-purpose processors, so it would still be a 4-processor machine to the OS. The SMP model is a horrible abstraction for the Cell architecture, because of the limitations of the SPEs. A far better abstraction is to treat the SPEs the same way Apple’s CoreImage/CoreVideo treats the GPU’s vertex/pixel shaders.
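The coprocessor abstraction argued for here can be sketched as a job queue that self-contained batch kernels are submitted to, rather than as extra SMP threads. This is a toy Python model (the `SPE` class, queue layout, and job format are all invented for illustration; Python threads stand in for the hardware units):

```python
# Sketch of "SPEs as coprocessors": the main core submits batch jobs
# to a pool of workers instead of scheduling regular threads on them.
from queue import Queue
from threading import Thread

class SPE:
    """One synergistic unit: pulls self-contained jobs off a queue."""
    def __init__(self, jobs, results):
        self.jobs, self.results = jobs, results
        Thread(target=self._run, daemon=True).start()
    def _run(self):
        while True:
            fn, args = self.jobs.get()      # fetch a batch kernel
            self.results.put(fn(*args))     # run it to completion
            self.jobs.task_done()

jobs, results = Queue(), Queue()
spes = [SPE(jobs, results) for _ in range(8)]   # 8 SPEs per Cell

# PPE-side code hands off batch kernels instead of spawning threads:
for chunk in range(8):
    jobs.put((sum, (range(chunk * 1000, (chunk + 1) * 1000),)))
jobs.join()                                 # wait for all jobs to finish
total = sum(results.get() for _ in range(8))
print(total)
```

Each job carries everything it needs, mirroring the fact that an SPE works out of its own local store rather than sharing the OS’s view of memory.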
I fail to see why anyone would think that Linux on a PS3 is a good idea. From what I’ve seen from all the info floating around about the cell it blows as a general purpose cpu
———————————-
It’s just marketing hype by Sony to make the PS3 seem more powerful than it really is. They already did a great job showing off some big numbers to impress consumers; now they are capping it off by making it look like you can buy a game console and end up with another computer as a bonus.
I will never understand *why* you people want Linux to run on such a thing like this so badly… are the drivers “open source” or not etc.
Because I could get rid of a lot of electronic garbage in my living room??? My home computer needs are fairly simple, and this would be one option to throw out my PS2, TV, and computer. Now I have one 24″ LCD, one TV, a computer, and a PS2 (cross-connected). When the PS3 arrives, I can get rid of my Linux computer because the PS3 runs Linux.
I don’t need the PS2 anymore (I *will* have a PS3), and because of this I don’t need the TV anymore (I needed it mainly for the PS2 and movies).
So if I sell my TV and 24″ monitor, well, I can get a bigger LCD for the same money or even less.
So all that junk replaced by a PS3 and a 30″ LCD??? Sounds fine to me.
You think that Sony and nVidia are going to open source their brand new high end hardware? Please, Sony is the most controlling company around next to MS or do we have to remind you of their desire to do everything on their media formats.
As long as I can compile software on that thing, do you think I care? But then again, the Cell specs are completely open.
And seriously, what the hell are you going to do write software on your couch with a controller? Type up some documents in Open Office?
Please, stay away from drugs. Then you might realize that even the PS2 has a USB mouse and keyboard. You just have to buy them separately.
I mean, it’s a goddamn game console! Over and over I see all the effort put into putting Linux on the Xbox or Linux on the PS2 etc… well, that’s just fantastic, now do something. Oh wow, so you did all that so you could get some kind of server with an 8 GB hard drive and 64 megs of RAM. Congratulations!
Look at my first answer. The PS3 has 256MB and an HDD up to 120GB. I would call that more than enough to exist in my living room and replace most of my current devices.
It would be cool if it were used for some kind of homebrew apps, but again… stop mixing the PC and the console! Anyone remember how much WebTV sucked? It’s the same principle.
Nope, completely different.
If only all the effort and futile excitement for this kind of stuff went into something useful…
This is one of the most useful things for my living room. And I can’t tell you how much I long for March to be here.
To sum it up:
Computer has the software that I need, and it is not affected by game installs? Perfect.
Computer does not need a reinstall because some game install broke it? Perfect.
Can play games? Perfect.
Can watch movies? Yes.
Can listen to music? Yes.
Preinstalled Linux? Perfect.
No Windows yet? Still perfect.
Got rid of more than half of the electronic junk? Perfect.
I would call that more than useful.
1920×1080 32bit color is less than 8M of space. Triple buffer (for the heck of it) plus a z-buffer for just under 32M. That leaves 224M for textures, which is far larger than most video cards today.
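The arithmetic above can be checked in a few lines:

```python
# Verify the framebuffer budget claimed above: 1080p at 32-bit color,
# triple-buffered plus one 32-bit z-buffer, out of 256MB of VRAM.
MB = 1024 * 1024
frame = 1920 * 1080 * 4        # bytes per 32-bit 1080p buffer
print(frame / MB)              # ~7.91 MB, i.e. "less than 8M"
total = 3 * frame + frame      # triple buffer + one z-buffer
print(total / MB)              # ~31.6 MB, "just under 32M"
print(256 - total / MB)        # ~224 MB left over for textures
```

(Dual-head output, extra buffer types, and GPU programs eat into this, as a reply below points out, but the basic numbers hold.)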
Today’s processors run out-of-order execution because Microsoft can’t make a decent compiler. Remember when people hand-optimized Pentium code because it ran in-order? In-order execution shifts the burden of scheduling from the CPU to the compiler. IBM has MUCH better compilers than MS. Even if you didn’t optimize the code for in-order execution, it won’t affect the speed as much as some people think.
Drivers for Linux don’t need to be open source. I use nVidia’s closed-source video driver on my Opteron box in 64-bit Linux (Fedora Core 3). So what if ALL the drivers are closed source? You’ll be able to use all the hardware at its best in Linux, which is more important to most people than open-source drivers.
ALL PS2s have had two USB ports. Most models have FireWire as well, but the new slimline PS2s dropped FireWire since only one game ever used it. The PS3 has USB as well as wireless. You’ll be able to use wireless mice and keyboards with the PS3 (and Xbox 360 for that matter).
Yes, many people won’t need Linux on their game console. Some of us (ME ME ME!!!) prefer programming to playing games. I play games, but I spend probably four times as much time programming as gaming.
Having Linux on the PS3 means that Sony doesn’t have to worry about making a player program for anything other than DVD and HD-DVD, and can let Linux handle AVIs and other media files.
@Rayiner :
“The Cell isn’t set up like that. SPE’s don’t do threading — they’re batch processors. Heck, you wouldn’t even want to run a regular thread on an SPE. They’ve got a 256KB local memory with no VM. You have to take that programming model into account, or performance will blow. ”
Check out some of the changes Arnd is making to the Linux kernel, and see his comments. Also, see the excellent article on Ars about the Cell architecture and what it might mean to computing in future.
@Sombody: right on!! …as long as Sony don’t charge too much for that Linux distribution and drivers. (It should be free, of course.)
@J.F.: Exactly! Your paragraph about in-order hit the nail on the head. Cell shifts the burden; it’s a different strategy for dealing with throughput, much like the old days of unrolling loops and managing registers on RISC CPUs… oh how I yearn for those days, hehe.
Wow! Another reason to get the PS3. Does that really mean I would be able to install and run Gentoo on it?
1920×1080 32bit color is less than 8M of space. Triple buffer (for the heck of it) plus a z-buffer for just under 32M. That leaves 224M for textures, which is far larger than most video cards today.
Divide the 256MB in half; it is dual-headed. There may be front, back, depth, stencil, accum, and aux buffers. We are talking high-end graphics. 4MB * 6 = 24MB. That leaves 104MB per head for textures, vertex arrays, etc. There are other uses for the RAM, like GPU programs.
You are also leaving out a windowing system. Each app in a composited windowing system can use 24MB. Now you can only run four apps on each head before spilling. 24MB is not common for an app, but it is possible with graphics-intensive ones like games.
That is still a lot, but we are talking a five-year lifetime. 256MB of VRAM will probably be considered tiny in 2010.
There are apps that use all of the available 512MB on cards in the market today.
I’m fully aware that the Cell’s main core is just a normal CPU and that the SPEs are not just extra cores, but SIMD vector processors with their own local memory and a connecting SPU filesystem. I just thought that if the Cell is so good at running code for other processors, those SPEs wouldn’t mind vectorizing a little PPC machine code on the fly and getting a few threads processed at the same time. I’m sure this is possible, having leafed through a PDF about ways the Cell can be used. After all, integer arithmetic could be done in floating-point hardware (possible loss of precision might be an issue there actually, thinking about it :oops:).
However, what if you don’t /want/ the OS running under a hypervisor, what if you want to use those cycles for actual work? Maybe Sony won’t let us turn it off. The Xbox 360 might be quite good for standard processing. I wonder if that will become “Linux Ready” somehow?!!
@JF: Today’s processors run out-of-order execution because Microsoft can’t make a decent compiler.
Which is just false. Visual C++’s x86 code generation is excellent. Better compiler technology can generate better code for in-order architecture, but there are limitations to the improvements that can be had. Consider for example the problem of scheduling instructions after a load from memory. The latency of the load can range from a few cycles (L1 cache hit) to hundreds of cycles (L2 cache miss). The compiler doesn’t have enough information to reorder those instructions, but the CPU does at runtime.
Now, some of these problems are eased for the Cell SPEs. They have no virtual memory and no cache, so the compiler knows exactly how long each memory instruction will take. But those tricks aren’t generally applicable, they’re the result of the limitations of the Cell SPE architecture.
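That scheduling argument can be illustrated with a toy single-issue pipeline model (an entirely hypothetical machine with made-up latencies): the in-order pipeline stalls behind a long load even though independent work is sitting right there, while the out-of-order one uses that work to hide the latency.

```python
# Toy single-issue pipeline: one instruction may issue per cycle once
# its dependencies' results are ready. The load's latency (L1 hit vs
# L2 miss) is exactly the information the compiler doesn't have.

def run(program, in_order):
    done = {}                  # instr name -> cycle its result is ready
    pending = list(program)
    cycle = 0
    while pending:
        ready = [i for i in pending
                 if all(done.get(d, float("inf")) <= cycle for d in i["deps"])]
        if in_order:
            # may only issue the oldest instruction, and only if ready
            ready = ready[:1] if ready and ready[0] is pending[0] else []
        if ready:
            instr = ready[0]
            done[instr["name"]] = cycle + instr["lat"]
            pending.remove(instr)
        cycle += 1
    return max(done.values())  # cycle the last result becomes ready

prog = [
    {"name": "load", "deps": [],       "lat": 100},  # cache miss
    {"name": "add",  "deps": ["load"], "lat": 1},    # needs the load
    {"name": "mul1", "deps": [],       "lat": 3},    # independent work
    {"name": "mul2", "deps": [],       "lat": 3},
]
print(run(prog, in_order=True))   # stalls behind the load
print(run(prog, in_order=False))  # hides some latency with the muls
```

The gap grows with how much independent work follows the load, which is exactly the dynamic information an OOO core exploits at runtime.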
@Mr Contraire: Check out some of the changes Arnd is making to the Linux kernel, and see his comments.
As far as I can see, Arnd’s changes have nothing to do with treating the SPEs as general purpose processors. He’s got an “SPE filesystem”, to allow submitting jobs to the SPEs. They’re being treated as coprocessors, not unlike how CoreImage/CoreVideo treats GPUs.
As for the Ars article — I’ve read it. I still don’t see what your point is wrt my original comment.
It’s buried in Beyond3D’s console forum. There are actually a lot of experts who frequent the board, and there is a huge amount of information in the discussions that have taken place. At least it’s layman-accessible. Ars isn’t always all that accurate. Another resource might be RealWorldTech; they have a ridiculous number of experts, though it’s a tougher read and they don’t entertain noobs; you’re expected to educate yourself. I think they’ve covered Cell as well.
In any case, the thing with Linux and Cell is that it can be fairly aggressively attacked and rearchitected. In fact, IBM/Sony/Toshiba would have a budget to do this, including providing the middleware necessary to target the platform in a more general fashion (targeting the machine directly is daft these days). And guess what, IBM has made large contributions in such middleware. *gasp* Expect an assload of libraries building generic high-performance constructs that allow one to exploit Cell’s advantages in computing.
Now if I can just get a meta-programming system on there, I’ll be happy.
The basic problem with using the SPEs for processing regular threads isn’t the SPE itself (it handles integer instructions just fine — it has dedicated fixed-point units). The problem is the memory model. Very little existing PPC code can fit in 256KB of RAM, and its DMA controller is optimized for streaming access. Treating the LS as a cache and the DMAC as a regular memory controller is liable to work, but will be pretty slow indeed.
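A rough sketch of that streaming model (pure simulation: the chunk sizes mirror the 256KB local store, but there is no real DMA or SPE API here; plain list slicing stands in for the DMA controller). The classic pattern is double-buffering: fetch the next chunk while computing on the current one.

```python
# Simulated SPE-style streaming: a dataset far larger than the local
# store is processed in chunks, double-buffered so the next "DMA"
# transfer overlaps (conceptually) with compute on the current chunk.

LOCAL_STORE = 256 * 1024          # bytes of SPE local memory
CHUNK = LOCAL_STORE // 2 // 4     # double-buffered, 4-byte elements

def stream_process(data, fn):
    """Apply fn to every element, touching only CHUNK items at a time."""
    out = []
    buffers = [None, None]        # the two halves of the local store
    buffers[0] = data[:CHUNK]     # prime the first buffer ("DMA in")
    i, cur = CHUNK, 0
    while buffers[cur] is not None:
        nxt = 1 - cur
        # kick off the next "transfer" before computing (the overlap
        # that real double-buffered DMA gives you for free)
        buffers[nxt] = data[i:i + CHUNK] if i < len(data) else None
        i += CHUNK
        out.extend(fn(x) for x in buffers[cur])  # compute on this half
        buffers[cur] = None
        cur = nxt
    return out

data = list(range(100_000))       # far larger than the local store
result = stream_process(data, lambda x: x * 2)
print(result == [x * 2 for x in data])  # True
```

Code written against this model streams; code that expects a flat address space with demand paging is exactly the case the post says will be slow.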
@Mr Contraire: Used to have charts of instructions for the 68060 and Pentium showing the pipe usage, to make sure you got the maximum throughput… those were the days. Programming was considered a skilled job, instead of one step above flipping burgers like today.
@Jon Smirl: You can expect that most PS3 games will take the whole memory for themselves, so what I said applies, aside from your remark on vertex buffers and pixel/texture shader programs; but those are items current cards also have to budget for. In the end it's larger than what the majority of cards today have (128M), but it will definitely be seen as small in five years. That is par for the course with consoles.
@Rayiner Hashem: Matter of opinion. Given I used to hand-schedule code for the 68K, P1, and P2, I personally consider MS code to be mediocre at best. ICC does a much better job on Intel Pentiums – it should, it's from Intel after all. Many programs still use hand-tuned assembly language, since compilers still don't do that good a job on scheduling and register usage. Look at any video encoder or decoder – all the heavy code is pure assembly language, with different routines for different CPUs.
Does anyone know if the Cell has virtualization hardware to support a hypervisor? If it does, I'm sure Sony won't let us change the one in the PS3.
I shouldn't forget that the main CPU in Cell is a Power core, and Power supports a hypervisor. That probably explains why Sony will let the Cell run Linux: the hypervisor will be in ROM, and it will block Linux from touching DRM-protected software.
… that Linux was available for the PS2? That did not go very far, because apparently some parts of the hardware were locked away out of sight, not to mention the price.
> I personally consider MS code to be mediocre at best.
Relative to hand-scheduled code? Maybe. Relative to other compilers? No. In any case, Microsoft's supposedly poor compilers aren't the reason CPUs are OOO today. Scheduling for in-order architectures is inherently limited by the compiler's lack of knowledge about runtime data dependencies. OOO architectures allow better scheduling for the simple reason that the CPU has more information at runtime than the compiler does at compile time.
A $499 white box with USB 2.0 ports and a hard disk, without mouse and keyboard, running a Unix OS… hmm, Mac mini??
I will admit that MSVS produces better code than many other compilers, and it's certainly gotten better over the years since it first came out. It's good enough for most projects. Usually it's only avoided when you have something you REALLY want to run as fast as it can, like the inner loop of a video codec.
By the way, a good programmer has even more knowledge about runtime dependencies than the CPU. So I guess the reason for OOO execution is more probably bad programmers than bad compilers.
I worked at MS a long time ago, back when the OS was written in ASM. While I was there, the MSVC group ran an ad showing how great their optimization of the sieve program was compared to other compilers: they were 30% better than their peers. We ASM programmers just laughed, and sent them back a version in ASM that was 12x faster than their C code.
Of course, we knew to do the inner loop of the sieve program (where it searches for the next non-zero element) with a repe scasb instruction.
> By the way, a good programmer has even more knowledge about runtime dependencies than the CPU.
Which isn’t true. Again, consider how to schedule code after a memory reference. How many clock cycles of latency should you account for? You don’t know! Even if you know the cache-management algorithms of the CPU perfectly, you have no idea what the other processes on the same CPU have done to the cache. The CPU knows this, and can account for this.
Yes, an ASM coder can extract more performance from the CPU on any inner loop. However, the ASM coder takes longer to do it than a compiler, and ultimately his code isn't sustainable. A given piece of ASM code might be optimal for, say, the P4, but when Intel transitions to Dothan, well, it's time to rewrite your code. Meanwhile, the C code just needs to be recompiled with an appropriate compiler. Most code tends to live longer than expected. In the long run, the 10-20% you'll lose from not writing something in ASM is overshadowed by the fact that on future processors, your formerly optimal code is no longer optimal.
Any programmer worth his salt can do the same job over the whole program as for inner loops. The thing is, it’s not worth the effort. As you point out, you’d need to account for most of the different CPUs. It’s better use of time to do most of the program in C and then the parts taking the most time in assembly.
Back when there weren’t that many CPUs to deal with, 100% assembly language programs were common. Now it’s just portions that are asm any more. But I still maintain a good programmer can do a better job over the entire program than any compiler. It’s just not worth the bother any more.
It's all very well announcing this; however, Linux on the PlayStation lacked the ability to access the CD drive, and parts were not open-sourced due to "proprietary" rubbish.
That, coupled with the crappy amount of memory, made it a terrible console to run Linux on. I'm hoping, however, that if Sony makes the specs more accessible, volume could be increased by those wanting a compact computer to run Linux on as a desktop solution.
He must’ve downloaded Abrash’s black book and now considers himself an expert assembly language programmer.
Anybody that says “worth his salt can do the same job over the whole program as for inner loops” is obviously a liar.
Michael Abrash would disagree with this guy, and this liar is obviously no Michael Abrash.
He isn't lying, but his views are a bit outdated. As Rayiner points out, it's no longer easy, or even worthwhile, to write assembler code that outperforms a compiler.
You can do a great job on one processor and a seriously bad one on another. The more you hand-optimise, the less the processor can do for you, and you will most probably lose performance. Back when only in-order execution existed, that was no problem, but now, with multiple pipelines…
Well, my view on hand optimisation is:
– Reduce the amount of unnecessary calculations
– Try to take advantage of locality
– Leave the rest to the compiler and processor
All this can be done in higher-level programming languages.
“No-one can really emulate PS2 games in a usable state at the moment. ps2emu can play little bits of one or two games, very…very…very…slowly…”
Right, but what idiot would run a PlayStation emulator on a PlayStation?
A Windows-using PC Gamer, who else?
The ASM sieve code is from about 1988 when Windows 2.0 and MS Lan Manager were being written. It was on the 80286 but we may have had some sample 80386’s.
I haven’t written more than 100 lines of ASM in the last ten years, now it is all compilers. An ASM programmer can always beat the compiler for under 10K of code, but the compiler always wins above 10K.
> Has it been forgotten that Linux was available for the PS2? That did not go very far.
I will repeat it, just in case someone is reading:
Linux on PS3, like Linux on PS2 before it, is not about hype, about “supercomputerness”, about Cell’s questionable suitability, or about any useful or stupid thing you might do with the box.
It’s about money. Sony pays much less in the EU (which is as big a first market as the US, with tied sales figures, and Japan falling tens of millions of consoles behind) if the PS3 is considered a computer than if it is considered a console.
Linux on PS2 did go very far. Far enough for a judge to force the EU commission to pay back to Sony a figure in the “low tens of millions” (according to Sony) that had been paid as console taxes. (That was appealed by the EC, but no media covered the outcome, or whether they appealed at all.)
Kutaragi is just laying the groundwork so that when the time comes, he can tell the judge his product was always marketed as a computer, not a console, and save a big bunch of bucks again.
To those who forgot about it, here’s the story about “PS2 is a PC”: http://www.theregister.co.uk/2003/10/02/eu_rejects_sony_ps2ispc_cla…
Now you can go on discussing how cool Linux on PS3 is or is not. Not that Sony gives a damn, though…
>as much 1st market as the US is -tied sales figures-, with Japan falling tens of million consoles behind
Just so no one accuses me of exaggerating: that figure considers both PS and PS2. If we consider only the PS2, Japan is “only” ten million or so consoles behind (Japan = 21 million, EU = 32, US = 36). And that’s taking into account that the console was launched nine months later in Europe than in Japan, and that it took Europe two and a half years to catch up with Japan’s head start.
Source:
http://www.scei.co.jp/corporate/data/bizdataps_e.html
http://www.scei.co.jp/corporate/data/bizdataps2_e.html
Being that PS3 has USB/WiFi/Bluetooth as well as Memorystick/SD/compactflash it sounds perfect for a computer. Sounds pretty open-standards for Sony, esp since it has an SD slot next to its Memorystick slot.
Also, the PS3 has 512MB of RAM (256MB of 700MHz GDDR3 + 256MB of very fast 3.2GHz XDR).
The PPC AmigaOS4 is nearing completion, and the devs have demonstrated that they are excited at the prospect of a Cell-based version. AmigaOS would surely be less of a drain on the PPC core than Linux. I’m hopeful that Sony may be more supportive of a closed-source proprietary OS that they can make more bucks on. Although, I guess nothing will stop Linux getting there first, if only for the goodwill.
Linux already runs on Cell. See the Linux patches for PPC64 “Broadband Architecture”. IBM ported Linux to Cell for the simple reason that the source was available, and also because IBM is looking at servers/workstations based on Cell technology.
Your age (or lack thereof) is showing. Back in the DOS/Win3x days, EVERYTHING was 100% assembly. Even many Win9x programs were 100% assembly. Most Amiga programs were 100% assembly programming. I never even touched a compiler before Windows 98, with the exception of some Pascal on the old Mac.
By the way, I purchased Abrash’s book some time ago. It was interesting, but I had already been programming professionally in assembly language for about a decade by that time. All my programs, from 8bit to 32bit, were all 100% assembly until my first PowerMac program. That was about 20% C and 80% assembly.
Go to Programmer’s Heaven sometime and check out all the DOS stuff. Most of it is all assembly. Go to AmiNet and check out the Amiga stuff. Much of it is pure assembly.
>>at least, they do not try to fence linux off, like microsoft does with it’s x-box
to be fair, MSFT does that as a way of protecting players from cheaters; it’s an Xbox Live thing. too bad Xbox Live is a joke of a system anyway, as it’s so laggy it reminds me of gaming in the 28.8 modem days.
> to be fair MSFT does that as a way of protecting players from cheaters, its an xbox live thing. too bad xbox live is a joke of a system anyway as its so laggy it reminds me of gaming in the 28.8 modem days.
You need to learn to not believe everything you read. MS doesn’t really care about cheating at all, it’s to stop you from copying games. The cheating story is marketing cover.
> I’m hopeful that Sony may be more supportive of a closed-source proprietary OS that they can make more bucks on.
I’m curious, why would you care? Do you have Sony stock? It makes more sense for them, as a hardware vendor, to make their platform open.
bingo.
was it on engadget.com that they showed images of a prototype blade server using 1-2 cell cpus, with linux running on top?
and i recall a news item on slashdot about ibm going to talk about linux on cell in the near future.
think about it, cell is perfect for heavy duty calculations. throw them into a blade rack, then stack multiple blade racks into a hangar or similar and hook them up with low latency networking, and you’re starting to get a heavy duty supercomputer.
and was there not talk about using cell in everything from microwaves to cellphones?
and isn’t it claimed that the cell cpu is designed for clustering from the start?
heh, i recall a news item about iraq ordering 2000 ps2’s. the rumor was that they were to gut them and use the parts for a supercomputer. while it sounded far out with the ps2, a similar news item about the ps3 may well get me to sit up and take notice.
> You need to learn to not believe everything you read. MS doesn’t really care about cheating at all, it’s to stop you from copying games. The cheating story is marketing cover.
—————————
That, and for the sake of their business interests, they really can’t be supporting Linux. Even the most anti-MS person should understand all the problems with them saying “sure, go ahead and install Linux, our chief OS competitor, on our own hardware!”.
I remember the FUD that Nintendo spread about the Game Genie several years ago when it was released. After a failed attempt in court to get it taken off the market, they spread lies saying it would ruin your games: that it would pull on a certain chip (either in your game or your system, I don’t remember) and end up ruining things. This was all in the name of fighting cheating. It seems to be a good excuse in the video game industry, like how the RIAA uses piracy as a reason why people are less interested in music these days.
Good thing I’ve got my Game Genie! It actually seems to make my trusty old NES easier to get working (any NES owner knows of the blinking light syndrome…) even if I don’t use any cheat codes.
Simple: computers face a lower import tax when brought into the EU. Sony tried with the PS2 to convince the EU that it was a computer rather than a toy/console. I think this equates to $50 a unit in reduced tax… or at least it did. There were a couple of articles about this whole thing on the BBC News web site (news.bbc.co.uk).
Does that really mean I would be able to install and run Gentoo on it?
Yes, should be possible.
But if you’re hoping that it’s gonna cut down compilation time, you’re in for a huge disappointment.
Compilers have no need for vector processing. They almost exclusively use integer instructions, and they’re about as branchy as it’s gonna get, which is poison for an in-order design like the Cell’s PPE.
Distributing jobs onto the SPEs won’t help either, because they have no level 1 cache and they all share a single 512kB level 2 cache. In actual fact, the horrendous cache thrashing that you’d get probably means that not using the SPEs would be faster. (If someone adapted gcc to take advantage of the SPEs’ local memory, this might be another matter. But that’s a big and difficult job, so don’t hold your breath.)
All in all, compiling is among the worst possible jobs for a Cell. Expect compile times to be a lot longer than on today’s single core G5 or Pentium 4.
If you want Gentoo on a console, get an Xbox 360. At least it has three symmetric PPE cores to make up for their individual slowness.
> Scheduling for in-order architectures is inherently limited by the compiler’s lack of knowledge about runtime data dependencies. OOO architectures allow better scheduling for the simple reason that the CPU has more information at runtime than the compiler does at compile time.
Agreed.
Another problem with in-order designs is that optimised code is specific to a particular version of the processor, because instruction scheduling depends on the number of pipeline stages, number of execution units, and instruction latencies. Change any of that, and you have to change your compiler and recompile your applications to get optimum performance.
Out-of-order designs largely avoid that dependency. That’s why code generated by the Intel compiler for the Pentium 4 runs really well on Athlons too.