Remember x86S, Intel’s initiative to create a 64-bit-only x86 instruction set, with the goal of removing some of the bloat the venerable architecture has accumulated over the decades? Well, that initiative is now dead, more or less replaced by the x86 Ecosystem Advisory Group, a collection of companies with a stake in keeping x86 going. Most notably, this includes Intel and AMD, but also other tech giants like Google.
In the first sign of changes to come after the formation of a new industry group, Intel has confirmed to Tom’s Hardware that it is no longer working on the x86S specification. The decision comes after Intel announced the formation of the x86 Ecosystem Advisory Group, which brings together Intel, AMD, Google, and numerous other industry stalwarts to define the future of the x86 instruction set.
Intel originally announced its intentions to de-bloat the x86 instruction set by developing a simplified 64-bit mode-only x86S version, publishing a draft specification in May 2023, and then updating it to a 1.2 revision in June of this year. Now, the company says it has officially ended that initiative.
↫ Paul Alcorn
This seems like an acknowledgement of the reality that Intel is no longer in the position it once was when it comes to steering the direction of x86. It’s AMD that’s doing most of the heavy lifting for the architecture at the moment, and it’s been doing that for a while now, with little sign that’s going to change. I doubt Intel had enough clout left to push something as relatively drastic as x86S, and now has to rely on building consensus with other companies invested in x86.
It may seem like a small thing, and I doubt many larger tech outlets will care, but this story is definitely the biggest sign yet that Intel is in a lot more trouble than people already seem to think based on Intel’s products and market performance. What we have here is a full admission by Intel that they no longer control the direction of x86, and have to rely on the rest of the industry to help them. That’s absolutely wild.
Well, given AMD already did the 64-bit x86-64 extension over 20 years ago, which Intel only had to copy later because its proprietary IA-64 failed so EPICly, I guess AMD has basically controlled all the state-of-the-art x86 ISA development for a long, looong time 😉
Exactly, Intel hasn’t been controlling the direction of x86 ever since AMD64 aka x86-64 won the race to bring 64-bit computing to PCs (it seems weird now, but Intel expected Itanium to replace the x86 architecture in PCs, with 32-bit x86 software being executed using some kind of emulation, which btw would have been horrible because Itanium was a monopoly fully owned by Intel).
IA-64’s x86 emulation was only meant as a stopgap for a year or two, until of course everyone would have migrated to native IA-64 code … 😉 Hardware x86 silicon was also removed with Itanium 2, so only qemu-like software would emulate it.
rene,
I was someone who actually wanted an IA-64 machine for myself. But it was financially out of reach for ordinary consumers to run IA-64, and it’s crystal clear that Intel never intended it for us. Rather, they intended to split the market between enterprise and consumers. In hindsight it was a bad strategy that left their enterprise IA-64 CPUs without any software. It was simply priced out of reach for the hackers and students who were best suited to exploit the novel aspects of the architecture in order to create new games/demos/software and drive up interest. If the hardware had been more accessible to us, there would have been more innovation on the software front around Intel’s VLIW architecture. Instead, there was no ecosystem and the platform suffered for it. Without native software, Itanium could never be more than a very expensive and inefficient x86 emulator.
I don’t know whether it would have been feasible for Intel to price these affordably for consumers, but a killer game, at a time when parallel computation via hardware acceleration was on the rise, would have offered Intel a good market opportunity. Obviously, they didn’t take it.
There is nothing in the IA-64 instruction set that makes it unimplementable in consumer-level CPUs costing around what a high-end Pentium 4 cost at the time (even if we assume performance wouldn’t be as good, it would still be possible). You see, Intel’s plan wasn’t to compete on technical superiority (not after Itanium failed to live up to expectations). Intel’s plan was to wait until the 4GB limit became more and more restrictive for PC users and then present Itanium as the saviour of the PC ecosystem, despite its inadequate performance at running 32-bit x86 code (remember: back then, the PC ecosystem competed with PowerPC Macs and RISC workstations, which were all 64-bit). After all, Itanium ran 32-bit x86 code better than those other RISC CPUs, even with software emulation. And this would also conveniently eliminate that pesky competitor AMD. Intel never expected AMD to develop x86-64, and more importantly, to market it successfully. Itanium finding a niche in HP-UX (and Intel focusing Itanium designs exclusively on the enterprise) was more of a consolation prize; they expected IA-64 to become the standard for both enterprise and consumer.
Also see the comment below by jgfenix, the link is an interesting read.
kurkosdr,
It’s not just the instruction set. Going by the listings here, itanium dies were roughly 4X bigger in size and transistor count than contemporary x86 CPUs of the day.
https://en.wikipedia.org/wiki/Transistor_count
I guess you might be suggesting lower-transistor-count models for consumers? Maybe, but there are also R&D costs to consider, and it’s not clear to me whether Intel could have sold them for drastically less. Theoretically it might have been possible, but in any case we know Intel wasn’t interested in marketing IA-64 to non-enterprise demographics. Still, it would have been interesting to see what could have been done with the architecture in the hands of demo-scene hackers and devs, who are known for pushing cutting-edge software techniques.
It was too cost-prohibitive even for those who were interested. I don’t believe Intel intended it for consumers. As for leaving customers on 32-bit CPUs, PAE on x86-32 still offered something like 64GB total memory capacity, which is still more than most computers are sold with decades later. The 32-bit limitation meant that processes could only have 4GB mapped in at a time, which is a technical limitation, but not a particularly arduous one, especially for ~25 years ago. So Intel probably thought they had more time. But I agree with you that AMD caught Intel off guard.
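The arithmetic behind those two limits is worth spelling out: PAE widens physical addresses on x86-32 from 32 to 36 bits, while each process’s virtual address space stays 32-bit. A quick sketch:

```python
# PAE (Physical Address Extension) widens x86-32 physical addresses
# from 32 to 36 bits; per-process virtual addresses remain 32-bit.
PAE_PHYSICAL_BITS = 36
VIRTUAL_BITS = 32
GiB = 2 ** 30

physical_limit = 2 ** PAE_PHYSICAL_BITS // GiB  # total addressable RAM
virtual_limit = 2 ** VIRTUAL_BITS // GiB        # per-process mapping window

print(f"{physical_limit} GiB physical, {virtual_limit} GiB per process")
# → 64 GiB physical, 4 GiB per process
```

So the machine as a whole could address 64GB, but any single process still had to work within a 4GB window at a time.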
That is an interesting read. If AMD hadn’t been around to bring AMD64 to market, Intel’s attempts to segment the markets could have succeeded. But the existence of AMD64 changed everything. It satisfied the needs of backwards compatibility and future proofing at the same time. Whether you were a business or individual there weren’t many downsides… and on top of this AMD’s CPUs were priced very competitively. It was a huge win for fans of x86.
According to an ex-Intel engineer, they had developed 64-bit-capable x86 chips before that.
https://logicstechnology.com/blogs/news/ex-intel-cpu-engineer-reveals-how-intels-internal-x86-64-developments-were-stifled-before-amd64-thrived
It’s not necessarily a bad thing that the future of x86_64 is in the hands of a consortium. We all remember the stagnation in the x86 landscape while AMD struggled with the Bulldozer architecture and Intel mostly tick-tocked the price of their “new chips”. Also, we don’t need stunts like Itanium, where Intel tried to get rid of its last competitor. I would say that a few more licensees of the architecture could bring back some much-needed competition.
AMD and Intel, sure, but why Google? Their business is mainly advertising and low-quality online services. Why do they need to have a say in x86?
Same reason Dell, HP, and Lenovo are on the list. Chromebooks.
Also probably the same reason Meta is on there. They design their own server hardware for their datacenters.
ssokolow (Hey, OSNews, U2F/WebAuthn is broken on Firefox!),
Probably none of them “need” to be on the list, but as tech giants they don’t want to be left off it.
I always wondered: How much chip area does the x86 “cruft” really occupy? If it’s minor, why not keep it and not completely lose the embedded market to Vortex86? And yes, the embedded market still uses DOS OSes that require 16-bit real-mode:
https://www.youtube.com/watch?v=BdSJgoP2a88
Because nobody uses the 16-bit code anymore: not the EFI BIOS, and no OS. It just costs millions in design and testing and lowers clocks by some 100 MHz here and there. Long overdue to finally axe this. There also is no 16-bit x86 embedded market.
rene,
I agree there may not be enough justification to support them, but I also agree with kurkosdr on the fact that people and companies are still using these modes even to this day. I don’t think we should deny their existence; the debate probably should be about whether such niche users are worth supporting, though.
I think a good compromise would be to implement and boot into Long Mode exclusively, but officially support software emulation of legacy modes and instructions. This way the hardware doesn’t have to implement them, and software emulation can still be orders of magnitude faster than the original hardware ever was. In a modern microarchitecture it probably doesn’t take that much work to support legacy x86: the front end needs to understand those modes, not the underlying cores. Maybe an optional software-defined front end could pull this off for legacy modes.
I literally provided an example above of an embedded device that uses DOS (hint: it’s the YouTube link).
Also, how does 16-bit real-mode code “lower clocks by some 100 MHz here and there”? 64-bit mode doesn’t even support the execution of 16-bit real-mode code.
I think this also misses that Intel doesn’t necessarily want to run x86 on its own.
Setting the direction means massive R&D costs as well as a high risk of failure.
For example, if Intel designs x86S, they spend billions on it. And, like IA-64, if they lose, they lose all that investment and have to switch to a competitor’s solution to continue their model of commodity hardware.
They are better off sharing the R&D cost, reducing the risk of ending up in a dead end alone, and then leveraging their fabrication and business agreements to their advantage.
The legacy x86 stuff is less than 1%, mostly noise/error.
8086/286 support is basically emulated, and the 32-bit 386 stuff is not that much.
People on the internet with little background in architecture tend to overestimate how much decoding takes up in a modern core, in terms of area and complexity, compared to the major structures in a modern out-of-order core.
I think history tells us that corporately / commercially the right decisions were made for Intel and AMD, decisions that basically secured a duopoly at a time when the market was facing fragmentation. Good for Intel and AMD, not so good for the consumer.
FWIW, I see signs of similar things happening right now, as the major players appear to compete and “we the general public” bang on about which is the best and which one wins, we are witnessing a quiet lockout. I suppose you could argue it’s all a sham designed to avoid legislative intervention.
We should not be surprised, big organisations are almost exclusively run by lawyers not technicians or engineers!
cpcf,
RIP Sun Microsystems (at least this is who I was picturing)
I do not think it is fair to call Jonathan Schwartz a lawyer. He was more of a software guy, though I think his degree was in mathematics.
I guess my post wasn’t clear but I was picturing Sun Microsystems as an organization favoring engineers over lawyers. I meant it as a contrast to other big companies that are “run by lawyers”.
Alfman, bullseye!
The difference is that, last time, x86 won against everybody and the duopoly was able to take over the market. This time around, it seems very much that x86 has been relegated to a subset of the market (not the fastest-growing or most profitable parts) and that other platforms (ARM, RISC-V, and GPUs) are leaving x86 in the dust. In that scenario, the duopoly is not so threatening to us and is not nearly as beneficial to the duopoly itself. Intel almost entirely took over the computer market at one point. They are very, very far from doing that now. Never mind AMD; just look at the market cap of NVIDIA. Even Qualcomm is worth twice as much as Intel.
RISC-V is not leaving anyone in the dust LOL.
DC is still the most profitable segment, and x86 still has a big presence there.
We’re just witnessing the usual consolidation of a mature market into duopolies. For anything other than ultra embedded and ASIC, we’re going to see an x86/ARM world for the foreseeable future.
Xanady Asem, I think a lot of observations other than the duopoly are really just people wishing for something better or different. The big end of town will eventually go quantum-something; that’s a niche market, but I bet companies like NVIDIA are all in on it. The rest of us are going to get a new version of the same old same old for the foreseeable future.
Am I the only one who remembers that infamous x86S document, where Intel portrayed itself as the sole inventor of the x86_64 architecture? I mean, the concept was sound, but trying to rewrite history…