In retrospect, it might be a bit tough to put a circle around what constituted a workstation. Is a PERQ a workstation? Probably. Xerox Alto and Star? Definitely. Symbolics Lisp machines? Not sure. Probably? The real success stories came out of Apollo, Sun, HP, IBM, NeXT, DEC, and Silicon Graphics. For a time it was a hot market, especially in what was known then as technical computing: research, manufacturing, CAD, graphics, simulations.
If you had a job where you were issued a Sun or an Apollo (back in the day) or an SGI, you were elevated. You were no longer some pleb coding in BASIC on a C64 or a tie-wearing IBM clone user. You had entered a rarified sphere with limitless power at your fingertips. An Amiga was a grubby kid’s toy by comparison, and the IBM PC was slow to move to graphical applications. The workstation manufacturers had fancy graphics, 32-bit processors, and scarily huge margins. The designs of the boxes could be wild: the SGI Indy didn’t look like anything Bob from accounting had on his desk, and you couldn’t buy anything like that at K-Mart.
UNIX workstations from the ’90s and early 2000s are definitely my favourite genre of computers. My personal white whale is the SGI Tezro, the last hurrah of SGI before it went all in on Intel, closely followed by Sun’s Ultra 45, its last SPARC workstation. These machines are only getting more expensive by the month, and people are charging insane amounts of money for these effectively useless, dead-end machines.
That’s why I ordered all the parts to build my own dual-Xeon workstation.
When I was young, the dividing line I was taught was “PCs cost thousands of dollars, workstations and minicomputers cost tens of thousands, mainframes cost hundreds of thousands, and supercomputers cost millions”.
ssokolow,
Sounds kind of arbitrary 🙂
Thom Holwerda,
Thom, I wish you were closer, I’d have helped you with donor parts. Anyways, what are you going to build? I think it would be neat to see osnews become self-hosting, but there never seems to be interest when I bring it up, so it could just be me. You could put up an official osnews IRC server or silly avatar playground 🙂
arbitrary, but broadly right 🙂
They were wonderful things, and I’m sure some people needed that level of oomph to function well at their job. Beyond this, I suspect they were mostly ego-fuelled empire-building penis extensions. I sort of gawk at Thom’s hobbies, having been around at the time. Memories of the 1970s and 1980s were still a thing before everything segued into the land that time forgot of the 1990s. Opportunities for women still weren’t that hot when workstations were a thing. Mind you, with austerity economics most young people are in the crapper today.
I’m not into building my own classic workstations from parts. I’ve been researching Dior as I want to make a “New Look” inspired skirt suit. Most people don’t know that with haute couture the structure is in the garments. Dior will make you one if you have enough money. That is, likewise, too expensive for me so I will make my own from “parts”. Plus it’s something to do.
If the focus in earlier decades was on modern design, I feel things have gone downhill since. One thing I would like to see is the death of Silicon Valley grunge culture. I hated it during the 1980s and hate it now. Can the herd of slobby future billionaires smarten up?
Excellent analogy. The workstation manufacturers of old were much like the couture houses of the past. Much like the current couture houses, there are vendors who will sell you custom $60K workstations that do not share parts with what they sell in the stores. Dell’s purchase of Alienware and HP’s purchase of a boutique house are analogous to what many of the large fashion houses have done.
Grunge culture’s going nowhere. However, some of us have spouses with couture and sections of our closets dedicated to it.
If you want the meetings with board members and execs that fund you and your programs, you need to speak the languages of $60K workstations and couture if you want to be thought of as anything other than a techy.
That was a fair comment. I’ve read similar from different sources, although not rolled up in one place. Publishers do the same thing by buying up niche brands or even establishing new niche brands. I think one thing people miss is portable skills and cross-fertilisation between different domains. Monocultures and monomindsets tend to be problematic.
I made a typo. I hated 1990s grunge fashion. It was as if a wall of featureless grey had descended. Clothes policing was positively fascistic. I also never got the obsession with Portishead. They were supposed to be in tune with the zeitgeist, alleged to produce high-end art rather than mere music. That may just have been the UK, which is traditionally insular.
Now I do keep a partial eye on world fashion. Like Thom tinkers with OSNews, I like exploring fashion elements from all over. There’s really quite a lot in this topic: culture, art, power and influence, and wealth distribution. It really depends where you are coming from and how you look at it.
Workstations were systems designed to accomplish a particular task in the most efficient way possible. This often meant chips and systems designed with only that task in mind.
For example, they didn’t include gaming graphics card features if they were intended to perform 2D rendering. It was that specialism that made them valuable. With the commoditisation of computing parts, that line has blurred into being inconsequential, as a Xeon and GeForce combo will just as capably run a database as a web server as a video game. I’d argue that what you are building is not a workstation, but simply a capable PC.
For years now, “server” has meant RAS features (ECC RAM, RAID, etc.) but poor support for a local user, because typically there isn’t a local user; “desktop” means good support for a local user (video, sound, USB sockets) but hardware that lacks most RAS features to keep the price down; and “workstation” mixes RAS features and local-user features.
Brendan,
I agree with the classification. In general workstations are similar to desktops but with more server grade components and specs like redundancy, enterprise grade storage, and enough cooling to eliminate throttling and improve long term reliability. Sometimes there are niche features like remote access. Since there’s a heavy focus on enterprise reliability, they often don’t provide latency tweaking and overclocking like a high end gaming PC would.
Alfman,
Old servers make good workstations. (But until you replace the fans, they are as loud as jet engines.)
I agree that the main difference is reliability. Better silicon, cooler-running components, ECC everywhere (even on GPUs), more I/O ports, and redundant parts.
They will run at slightly slower clocks, but can run 24/7 without any hiccups.
The problem is those motherboards do not (usually) fit regular chassis, and a lot of software will refuse to run, asking you to buy “enterprise” editions. (Even on Linux, Nvidia drivers can be problematic.)
sukru,
Personally I’ve had good results buying workstations used. This way you can get the high quality without the inflated prices. I would never buy a Quadro card new, but with used workstations they’ll just throw one in because that’s what the machine came with originally.
I don’t follow what you mean about software blocking itself. Can you give examples?
Yes, I’ve had driver problems with nvidia GPUs on linux. The lack of an ABI + proprietary drivers will do that 🙁
It’s also very problematic that nvidia’s linux drivers have inadequate monitoring given how much heat their cards put out. The monitoring is only available on windows, which is infuriating as a linux user. We want to optimize performance and set safety margins on nvidia cards as much as windows users do. Nvidia dismisses our needs with no plans to fix it.
https://forums.developer.nvidia.com/t/request-gpu-memory-junction-temperature-via-nvidia-smi-or-nvml-api/168346/285
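For what it’s worth, here is a minimal sketch of what NVML does expose on linux through the pynvml bindings (pip install nvidia-ml-py). It reads the core GPU temperature, which works fine; the point of the thread linked above is that the memory junction sensor has no equivalent query, so this is roughly all linux users get:

import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
    name = pynvml.nvmlDeviceGetName(handle)         # str in recent nvidia-ml-py releases
    # The core temperature sensor is exposed...
    core_temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
    print(name, "core temperature:", core_temp, "C")
    # ...but there is no NVML_TEMPERATURE_* constant for the memory
    # junction sensor, so neither nvidia-smi nor NVML-based tools can
    # report it on linux.
finally:
    pynvml.nvmlShutdown()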
It’s just disappointing that nvidia still considers linux a second-class platform.
Alfman,
To be honest it has been a while, but I remember some issues with needing to use the Server version of Windows instead of Windows 8, and some interaction with nvidia drivers. (Actually, nvidia might have been the main source of pain.)
Another slightly related point was trying to pass nvidia GPUs through to virtual machines, but I think we had discussed that before.
Anyway, up until very recently, old server chips were offering really good performance for the money and electricity spent. Things have changed a bit with 10th-gen Intel, Zen 2, and M1 chips. Now a bit of math is necessary on a case-by-case basis.
For example: E5 v4 Xeon vs recent AMD: https://www.cpubenchmark.net/compare/AMD-Ryzen-9-5950X-vs-Intel-Xeon-E5-2680-v4/3862vs2779
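A rough sketch of that case-by-case math, assuming a fixed electricity price and duty cycle. All the numbers below are hypothetical placeholders, not real prices or benchmark scores; swap in figures for the chips you are actually comparing:

KWH_PRICE = 0.15        # $ per kWh, assumed
HOURS_PER_YEAR = 2000   # hours powered on per year, assumed
YEARS = 3               # planned service life

def total_cost(purchase_usd, watts):
    # purchase price plus electricity over the machine's service life
    energy_usd = watts / 1000 * HOURS_PER_YEAR * YEARS * KWH_PRICE
    return purchase_usd + energy_usd

def score_per_dollar(score, purchase_usd, watts):
    # benchmark points per total dollar spent
    return score / total_cost(purchase_usd, watts)

# hypothetical: cheap used dual Xeon vs. a new high-end desktop chip
print("used Xeon:", round(score_per_dollar(17000, 300, 240), 1))
print("new Ryzen:", round(score_per_dollar(46000, 550, 105), 1))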
Arguably, the “workstation” has been made from pretty generic parts since the late 80s. The last lot of proper “old school” workstations was probably the SGI machines, and they were soon superseded by commodity graphics cards.
Yes, they used to be custom SGI or Sun machines, and the hardware had no “off the shelf” PC counterparts.
I remember being awed when I was able to remotely connect to SGI machines and render virtual reality systems over an X11 connection.
That lasted until GPUs and Ethernet networking became mainstream. Then they lost their main advantage, and regular folk could build networked “workstations” from commodity hardware.
Today a workstation isn’t even hardware, it’s software. DAW == Digital Audio Workstation, lol.
Is audio even work?
I pretty much started the CAD/graphics era of my working life on SGI or Sun workstations, so I’d question the premise that some sort of uber promotion was required to get onboard such a system. Maybe in software developer circles access was more limited, but on the graphics side of things, prior to Apple becoming dominant, a Sun or SGI workstation was pretty much ubiquitous. For a while I worked at News Ltd, in the era during which they moved from hot lead to digital, and they had rooms full of Sun and SGI machines preparing, processing, and proofing the printed copy.
Around the same time, basic control theory was moving out of silicon and into software.
I’d say the closest modern equivalent to the workstation would be something like a decked-out “cheese grater” Mac Pro. It’s big, overly expensive, and has some rendering oomph to it. If you could slap IRIX or Solaris on it, you’d have a perfectly good Unix workstation on your hands. The $50k+ price of a maxed-out configuration also gives you that “workstation” feel.
You have a perfectly good Unix Workstation with macOS.
Eh, there’s a lot of hardware that is “workstation” grade without dropping a year’s salary on a metal box full of holes with a shiny black fruit on the side.
Both Lenovo and HP make some great workstation hardware. I tote a dual Xeon HP Z600 as my main gaming rig. Solid as a rock.
The123king,
I was going to say this as well. I don’t think a $6k workstation is unreasonable, but the Mac Pro’s specs were very underwhelming at $6k: weak CPU, weak GPU, so-so RAM, very little storage. I think the chassis was nice, but it’s a lot of money for a chassis bundled with low-end parts. You know what though, if I were loaded, maybe I wouldn’t care, haha.
OS X is certified Unix. So the Mac Pro may be the last remaining Unix workstation on the market.
What was computing like before the creation of the “workstation”? Mainframes. Time-share systems. Minicomputers. Whatever. The machines sat in air-conditioned “machine rooms” and you shared your use of them with others. You would have to schedule appointments for when you could use the machine. Been there, done that. For me, the first “workstation” was a VAXstation. You can have the power of a VAX sitting under your desk, and it’s all yours! Really? No more hassling with the system administrators to get my memory working set increased so I can actually process my 780 KB images? No more having to deal with 40 users logged in at the same time so no one can accomplish anything useful?
A very essential feature of the VAXstation was that it had the same CPU architecture and ran the same operating system software as the predecessor VAX time-share systems. And it did not need its own machine room. Oh yes, networking: Ethernet. Hugely important. (Thickwire. Thinwire.)
So I guess you could say that a “workstation” was a personal computing machine that evolved down from equivalent high-end machines. By contrast, a “home computer” (be it PC or Mac or Osborne or any of a plethora of others) was more of a “ground up” type of computing system, since there was no equivalent high-end machine from which it could evolve.
Computing hardware and software are infinitely malleable. The distinction between “client” and “server” and “workstation” is largely marketing. My cheapo wireless router has a web server. Server?
This.
A workstation was basically a terminal or a local machine that provided a minicomputer performance level to a single user, thanks to the jump from LSI to VLSI, when microprocessors started to match minis in FLOPS.
I think the reason the old workstations faded into history was not just that they were expensive but that they were incompatible and proprietary, much the same as with the Amiga, ST and Archimedes of the early 90s, and got more so as workstation manufacturers dumped M68K for their own processors. Who’d develop for several incompatible Unix (or not even Unix) workstations when they could develop for the PC and it could run on all of them? Even OS/2 failed to gain momentum (despite IBM and Microsoft backing) when 32-bit PCs started to become affordable, at least for business; same for PC-based Unices. Apple succeeded really because they had gained a major market share in the publishing industry, and then got themselves back into the public consciousness with the iPod.
My university had a room full of DEC Alpha machines in the mid-90s which were used for graphics, but others used them for web browsing, programming and, I guess, Unix hacking. Very limited apart from the graphics stuff; not even CDE, just bare window managers (our uni didn’t install Win95; it put NT4 on all the PCs in 1997 or thereabouts). They also had two rooms full of Sun machines that only computer science students had access to. Pretty sure they’re all gone now, and in fact had been within a few years of me leaving in 1998 (though they kept names like “Solarium” and “Sun Lounge”). PCs were just set up for the average user and workstations weren’t; it was only when Linux came along and people wanted a desktop environment offering similar functionality to Windows that this changed.
Idk, I chalk a lot of Apple’s Mac comeback in the oughts up to Microsoft’s struggle to make a stable OS during that time period. XP and Vista couldn’t hold a candle to Mac OS X. At the time I knew a lot of publishing people who were pissed at the transition to OS X, because a lot of their software had issues on OS X and the publishers of those apps were very slow to update. QuarkXPress, I’m talking about you. A lot of diehard Mac users were upset that many of the UX niceties weren’t in OS X either. I think the switchers from the Windows platform kept the OS X platform alive.
Doesn’t the Mac Pro’s price tag put it firmly in the workstation domain?
dsmogor,
To me that assertion sounds weird because the high price is more a side effect rather than a good classification. It would be like saying “Doesn’t the Tesla’s high price tag put it firmly in the electric car domain?” Sure it’s an electric car, and sure it’s got a high price tag, but that’s not what qualifies it to be an electric car. I know it’s not a perfect analogy, but hopefully it still makes sense.
The UNIX workstations from the beginning of the 90s were my favourite genre of computers too. I have a great interest in industrial and ergonomic design, and as such was focused on the whole HCI/MHI/MMI/GUI/DE area, which I believe peaked around the mid-90s.
The different computer architectures (the whole CISC vs. RISC debate) were a quite interesting topic in the beginning of the 90s, since we were in a time when hardware performance improved continuously. Naturally, hardware manufacturers engineered their hardware/software platforms in a way that ensured vendor lock-in. However, around the mid-90s, proprietary hardware became detrimental to both corporate consumers and manufacturers, since the growth of consumer markets driven by the dominance of Microsoft Windows improved the performance and availability of generic hardware and caused prices to plummet, thereby making “workstation performance” attainable to all.
I have owned a few NeXTstation Turbos, a NeXTcube, a couple of SGI Indys, a purple SGI Indigo2, and an SGI O2. These workstations were the forerunners of optimistic paradigm shifts: aesthetically pleasing, but very noisy compared to today’s “workstation” desktops. Although unusual and colourful, the quality of the plastic covers of the SGI workstations designed by IDEO and Lunar Design (ironically, using Vellum CAD on Macs) does not hold up quite as well as the NeXT products designed by Frog Design’s Hartmut Esslinger. Interesting conversation pieces…
I would say that assembling a workstation today is boring if you look at it from the perspective of assembling generic parts in a bland case. However, if you look at it from a form-versus-function approach, e.g. enclosure, cooling, price/performance per watt, interface refinement, etc., then it is quite interesting.
It is too bad that the immediate future is vendor-locked. Just think of the TFLOPS performance of products such as the Xbox Series X or the Apple M1 Max-equipped MacBook Pro.