Dozens of PlayStation 3s sit in a refrigerated shipping container on the University of Massachusetts Dartmouth’s campus, sucking up energy and investigating astrophysics. It’s a popular stop for tours trying to sell the school to prospective first-year students and their parents, and it’s one of the few living legacies of a weird science chapter in PlayStation’s history.
Those squat boxes, hulking on entertainment systems or dust-covered in the back of a closet, were once coveted by researchers who used the consoles to build supercomputers. With the racks of machines, the scientists were suddenly capable of contemplating the physics of black holes, processing drone footage, or winning cryptography contests. It only lasted a few years before tech moved on, becoming smaller and more efficient. But for that short moment, some of the most powerful computers in the world could be hacked together with code, wire, and gaming consoles.
The PlayStation 3 and its Linux compatibility were going to change everything. Back in those days, it was pretty much guaranteed that on every thread about some small, alternative operating system, someone would demand PS3 support, since the PS3 was going to be the saviour of every small operating system project.
Good memories.
The circumstances of the PS3’s entry into scientific computing are less glorious than they might look.
1. The PS3 hardware was sold at a loss. It’s the same dreaded vendor lock-in concept as with inkjet printer cartridges: charge less for the device, earn much more money on the supplies. In this case, games. Only this made the PS3 a good deal for scientific computing; the gamers paid for it.
2. Linux on the PS3 was a customs-evasion scheme and a marketing ploy. Sony could avoid tariffs, e.g. when importing consoles into the European Union, because generic computing devices were classified differently from entertainment devices. They also made various marketing claims to the effect that a PS3 could (theoretically) replace a household computer, while at the same time crippling Linux support from the start with limited access to the graphics hardware.
3. At the end of the day, Sony didn’t give jack shit about their Linux support. It was ultimately removed with a firmware update(!) pushed to unwitting users. A class-action suit and settlement followed in the U.S.
Universities back then saved a few bucks and also had the opportunity to work with an exotic hardware concept (that did not prevail in future systems, so a lot of knowledge gained went to waste). It was fun, yes.
Accurate. Apple wasn’t too different with the G5-based supercomputers. They weren’t sold at a loss, but they were an absolute dead end.
How so? Power chips are all over the HPC TOP500. Granted, it’s probably because of Nvidia, and they’re listed as IBM parts instead of Apple.
https://www.top500.org/lists/2019/11/
The G5 was a very inefficient processor. Basically the P4 of the PPC world.
For large supercomputers the cost of operation (mainly the electricity to power the machines AND the air cooling) is not trivial. So by that time it made no sense to build large-scale systems around the G5, when the x86 world was offering better performance-per-watt ratios.
Sure, Power chips were still useful in HPC, but actual Apple hardware? Not as much. I was referring to the craziness that existed around this time where you just bought existing prebuilt hardware off the shelf and repurposed it into part of a supercomputer.
I wouldn’t say the knowledge gained went to waste… the main difference is that, compared to modern console CPUs, memory bandwidth was far more limited back then; modern consoles just make the cores more generic and less constrained, whereas the Cell’s constrained design was required at the time because of those limitations.
Developers still had to learn how to break their software up into threads (see the sketch below for the general idea).
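For anyone who never had to do that, here is a minimal sketch of what “breaking software up into threads” means in practice, written in plain modern C++ rather than the Sony/IBM SPE libraries actual Cell developers used, so treat it as the general idea, not PS3 code; the array size and the summing workload are made up for illustration:

// Minimal sketch: splitting a big array sum across worker threads.
// Generic C++11 illustration of the idea, not actual Cell/SPE code.
#include <algorithm>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    const std::size_t n = 1000000;
    std::vector<double> data(n, 1.0);          // stand-in for real work

    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<double> partial(workers, 0.0); // one result slot per thread
    std::vector<std::thread> pool;

    // Each thread sums its own contiguous slice into its own slot,
    // so no locking is needed until the final combine step.
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&, w] {
            const std::size_t begin = w * n / workers;
            const std::size_t end   = (w + 1) * n / workers;
            partial[w] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0.0);
        });
    }
    for (auto& t : pool) t.join();             // wait for all workers

    const double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::cout << "sum = " << total << "\n";    // expect 1000000
}

On the Cell you additionally had to hand-manage DMA transfers into each SPE’s small local store, which is the part most developers never warmed to and the part that went away once consoles moved to ordinary multicore x86.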
cb88,
The architecture was way ahead of its time, and the software industry is notoriously slow to embrace change. There’s kind of a chicken-and-egg problem: few developers really took advantage of the potential of the advanced features in the Cell processor because conventional scalar architectures are more popular, and scalar architectures are far more popular because most developers keep targeting them. I still see technical advantages in the Cell architecture, but it’s very hard to move the market. Still, it is neat that niche supercomputers were built around them.
The CPU+GPGPU models we have today are strongly influenced by popularity as much as technical merit. For better or worse, legacy architectures are very popular. :-/
Unintended consequences. 🙂 Sure Sony was trying to exploit a loophole, but it worked out in people’s favor for a time.
Researchers do lots of things that don’t have any immediate merit or application. That’s how blue sky research works. People investigate random rabbit holes, and maybe somewhere down the line, their findings are useful. It’s all about the long tail.
Plus, researchers got to publish papers while playing with weird things, and that’s all they really want out of life.
Makes you wonder if Sony removed the “Other OS” feature from PS3s in order to pre-empt a future where organizations would buy PS3s in large quantities for computing projects without ever buying a single game, with Sony losing money on every one of those PS3 units sold. After the removal of the “Other OS” feature, new PS3s preloaded with the latest firmware shipped with no “Other OS” support out of the box, making them useless purchases for supercomputing projects. The security excuse Sony touted for removing the “Other OS” feature was pretty much BS anyway, so it makes sense IMO.
It’s the same reason some “all-in-one” printers require non-empty ink cartridges to scan documents. People are buying the things as cheap but capable 40-dollar scanners and never print a single page on them. So, all the manufacturer has to do is make the printer waste a bit of ink on “startup head-cleaning” every time and make the printer require ink to scan, and the printer manufacturer has fixed the problem.
kurkosdr,
IMHO that’s exactly right. Obviously having a business model that sells below cost is problematic.
Yeah, they really do scam people. We had an Epson printer where we printed nothing but black and white, but A) the printer would waste color ink on its own and B) when any of the colors ran out, B&W printing was artificially disabled.
We played that game once, paying $70 for color ink (that we didn’t even use; that’s shameful, Epson!), but after that we just threw the printer away. I would say “I’ll never buy Epson again” because of this scam, but something tells me they all do it. Now we use a B&W HP laser printer. I’m not sure how much toner gets wasted, but it does last a long time and the cartridges are far cheaper than ink ones if you don’t need color.
Are you sure you were printing B&W and not greyscale? Most inkjets use the colour inks to make at least some of the greyscale tones, and the drivers aren’t always clear about the distinction between B&W and greyscale.
Also, Inkjet Printer Buying Basics 101: go to a shop where they sell refills and remanufactured ink cartridges and ask them which printers take refills and remanufactured ink cartridges. That capability, and automatic dual-sided printing (if you want it), are the only specs that matter. As a home user, you never need more than 1200dpi or fast page speeds.
kurkosdr,
Well, I didn’t merely print black and white content, I specifically checked the “black and white” option when printing, but who knows whether Epson even respected the setting. I wouldn’t be surprised if Epson deliberately wasted ink that didn’t need to be consumed during B&W prints to boost sales of color ink to customers, including myself, who didn’t need or want it.
I’m more than happy with the laser printer. It’s true, I don’t need the speed and resolution, but I won’t say no to them and the operational cost savings have been huge. After going laser, at this point there’s no way I’d even consider going back to ink.
Just found this: apparently being forced to use/buy unwanted color cartridges is extremely common with Epson printers. This thread is 16 years old and people are still having the issue.
https://fixyourownprinter.com/posts/10760
Epson has had so long to fix this that it’s obvious that they’re screwing their customers intentionally.
Definitely. Of course it’s not just Epson.
I have an HP OfficeJet 6700. I flipped out one day. We were buying a house, and lots of documents needed to be scanned, signed, and printed. And this freaking thing starts saying it can’t print a normal black and white document because Magenta or something is out of ink. I literally drove to Staples (office store) that day and just got a basic black and white laser printer (Brother). It works fine. Sure, the toner cartridge is pricey, but it never dries out and will print when it needs to.
I still have the HP OfficeJet 6700 because it has the most useful scanner, one I haven’t been able to find in other products. You don’t need to install drivers or anything. They have a web interface where you can start a scan through the browser (using the ADF or flatbed). You can choose PDF as an option. Then when it’s done, you can download the PDF via the browser like a normal download.
Probably not secure for an office environment, but for a home one, it is brilliant. Every time I look to replace it with some kind of color laser MFC or something, I just can’t find anything comparable. The sad part is, I don’t even mind paying for overpriced ink. It’s just that I rarely print in color, so this printer is always going to give me this problem: the colors are always going to get used up and dry out, and it’s not going to print when needed. But the scanner is so good, I keep it just for scanning.
Yeah, it’s hilarious that they removed OtherOS for “security” but it was GameOS that wound up being broken, enabling a flood of homebrew and piracy. Seriously, I went ahead and updated my PS3 to remove OtherOS so I could put a custom firmware on it for full homebrew support. Unlike OtherOS, which never had accelerated video, GameOS with custom firmware allows full use of the GPU.
There are probably multiple angles. The Cell processor was available in IBM servers at the time, they even built the Roadrunner supercomputer around it, and they were probably not happy about people buying PS3s instead of IBM servers. They spent the money to build the chip expecting to earn dollars from hardware and services, and then people bought low-margin PS3s instead.
I doubt Sony cared that much since it was good publicity, and a stick in the eye to MS and Nintendo.
It’s weird to think there might be an alternate universe where Sony leaned into the computing aspect of the PS3, and they are selling Power based desktops and servers.
The “other OS” turned out to be a really bad idea for Sony, since it opened a lot of issues regarding customer support. Although it was a neat idea from a geek point of view.
“It’s the same reason some “all-in-one” printers require non-empty ink cartridges to scan documents. People are buying the things as cheap but capable 40-dollar scanners and never print a single page on them. So, all the manufacturer has to do is make the printer waste a bit of ink on “startup head-cleaning” every time and make the printer require ink to scan, and the printer manufacturer has fixed the problem.”
Interesting. What “all-in-one” printers are these and who makes them?