For a lot of organizations that buy servers and build systems out of them, the overall throughput of each machine is the performance metric that matters most. But for a lot of IBM i shops, and indeed even System z mainframe shops, the performance of a single core is the most important metric because most IBM i customers do not have many cores at all. Some have only one; others have two, three, or four; and most do not have more than that, although there are some very large Power Systems machines running IBM i. But those are on the order of thousands of customers against a base of 120,000 unique customers.
We are, therefore, particularly interested in how the performance of the future Power10 processors will stack up against prior generations of Power processors at the single-core level. It is hard to figure this out with any precision, but in its presentation at the Hot Chips conference in August, Big Blue gave us some clues that help us make a pretty good estimate of where Power10 socket performance will land. From there we can work backwards to get a sense of where Power10 cores could end up in terms of the Commercial Performance Workload (CPW) benchmark ratings that IBM uses to gauge the relative performance of IBM i systems.
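To make that "working backwards" arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. Every number in it is a hypothetical placeholder, not an IBM figure; the point is only the shape of the calculation, dividing an assumed socket-level rating by the core count with a fudge factor for less-than-linear scaling.

```python
# A back-of-the-envelope sketch of working backwards from a socket-level
# CPW rating to a per-core estimate. All numbers here are hypothetical
# placeholders, not IBM figures.

def per_core_cpw(socket_cpw: float, cores: int, scaling: float = 0.9) -> float:
    """Estimate per-core CPW from a socket rating.

    `scaling` approximates how far below perfect linear scaling a
    multi-core socket lands (0.9 means 90 percent of linear).
    """
    return socket_cpw / (cores * scaling)

# Hypothetical example: a 15-core socket rated at 300,000 CPW.
print(f"{per_core_cpw(300_000, 15):,.0f} CPW per core")  # ~22,222
```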
ARM, RISC-V, POWERx – there’s definitely renewed interest in non-x86 architectures, and that makes me very, very happy.
It definitely helps that there hasn’t been an x86 phone worth the name in a decade, and now the default targets for most published applications are mobile (ARM first) or the web (architecture agnostic aside from the few remaining DRM plugins).
It’s sort of true, and certainly could be true going forward as IoT becomes ubiquitous.
But I wonder whether, at this point, the billions of worthless, empty-purpose phone apps are really a consideration for hardware designers.
What happens to the various architectures the moment quantum computing becomes usable? Potentially, at that point all end-user hardware is reduced to nothing more than the browser.
True quantum computing goes so far beyond client/server that software development almost becomes irrelevant, replaced by imagination and art, and a lot of end-user hardware evaporates along with it.
It seems ripe for another light bulb conspiracy!
For two decades now, quantum computing has always been “coming soon”. I’m not sure I’ll live long enough to see that “soon” happen, though.
Except that they have operational quantum computers now, and they’re even available as a service:
https://cloudblogs.microsoft.com/quantum/2019/11/04/announcing-microsoft-azure-quantum/
We probably won’t have them on our desktops for a while, but it’s only a matter of time until the services integrate like a co-processor in Windows, Linux, and OS X (which you pay a subscription for) that normal apps can offload to, if available and online.
You must have been smoking some pretty good stuff when you wrote this.
Development for quantum computing is probably the most hardcore mathematical computing area there is.
Quantum computing will also almost entirely be MANY clients / FEW servers, because every quantum machine so far is a huge supercomputer that needs to be operated at near 0 kelvin.
Your post reminded me of how they just added the word quantum in front of everything in movies like Ant-Man 🙁
avgalen,
I disagree. While quantum computing is extremely different from what we’re used to, it’s not particularly difficult or “hardcore” mathematically.
I agree. I think one of the most sellable aspects of quantum technology is providing stronger encryption, since most people and companies can see the value in that, but beyond this it is harder to see the value for end users.
In principle, quantum computing provides us with more efficient ways to solve problems of NP-complete difficulty without needing exponential resources. In practice, you’d need a lot of qubits to even justify a quantum computer, because a conventional computer can already solve low-qubit problems at a minuscule fraction of the quantum computer’s cost. Most people wouldn’t have much use for a quantum computer even if they had one; I see it as being more valuable to researchers. The most obvious use case is code-breaking. It might also help protein folding simulations.
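For a rough sense of why low-qubit problems don’t justify quantum hardware, here is a small Python sketch (my own illustration, not from the comment above): a classical state-vector simulation of n qubits stores 2^n complex amplitudes, so its memory cost doubles with every added qubit.

```python
# Why low-qubit problems are cheap classically: a full state-vector
# simulation of n qubits stores 2**n complex amplitudes (16 bytes
# each as complex128), so memory doubles with every qubit added.

def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16  # complex128 = 16 bytes per amplitude

for n in (10, 20, 30, 40, 50):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.3f} GiB")

# 30 qubits fit in ~16 GiB of RAM on an ordinary workstation, while
# ~50 qubits would need petabytes of memory -- roughly where a real
# quantum machine starts to earn its keep.
```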
I suppose if we can bring the price down to consumer levels, then why not! It would be neat to walk around with a quantum computer, but I still think the applications would be too niche for most people and companies.
The consensus is that quantum computing isn’t going to be viable for general-purpose computing anytime soon.
So quantum computing will be very useful for certain very specific calculations, which could maybe take a cut out of the HPC space (think Top500).
What did I just read? Technical writing is HARD, I get that, but a lot of that was just a bunch of random assertions, unanalyzed data, and charts with no meaningful scales.
“Which brings us to the history of the Power family of chips running OS/400 and IBM i since 2001, when the dual-core Power4 chip first shipped and put IBM on track to take over the proprietary and Unix server business as it shrank mightily at the same time. (Did IBM win and the other guys, namely Hewlett Packard and Sun Microsystems, lose? I think both happened.) Take a look at this table:”
Nothing in there shows anything about IBM winning over HP or Sun, or any historical comparison. It’s just “Here are some facts about the processor over the years, see, it gets faster!”
“ARM, RISC-V, POWERx – there’s definitely renewed interest in non-x86 architectures, and that makes me very, very happy.”
Unfortunately, Power is on its way out.