It’s been a bit of an odd few days for Intel. Mere days after the board ousted CEO Pat Gelsinger, once heralded as the chip giant’s messiah, the company is launching two brand new desktop graphics cards. They’re aimed at more budget-oriented consumers, and might very well be the last discrete graphics cards Intel makes, since this is one of the product lines on the chopping block.
Intel’s next — and possibly last — desktop graphics cards will begin arriving in just 10 days. Right on cue, the company has announced the budget $249 Arc B580 and $219 Arc B570, shipping December 13th and January 16th, respectively, as the “best-in-class performance per dollar” options in the GPU market.
They’re based on the same Xe2 “Battlemage” GPU architecture you’ll find in Intel’s Lunar Lake laptop chips but with more than double the graphics cores, up to 12GB of dedicated video memory, and up to 190W of power compared to their limited laptop forms — enough power to see the B580 slightly beat Nvidia’s $299 RTX 4060 and AMD’s $269 RX 7600, according to Intel’s benchmarks, but sometimes still trading blows.
↫ Sean Hollister at The Verge
As for Gelsinger’s dismissal, it seems the board forced him out after growing frustrated with the slow progress of the company’s turnaround. The fact that a finance person and a marketing person will together serve as interim CEOs seems to indicate the board is more interested in quick profit than in a long-term turnaround. With companies like Qualcomm reportedly interested in acquiring Intel, the board’s short-term mentality might be winning out, and ousting Gelsinger may just be paving the way for selling off parts of Intel until there’s nothing left.
Who knows, I might be reading way too much into all of this, but expecting an organisation as complex as a high-end processor maker to turn itself around in just a few years is incredibly shortsighted, and you’d think board members at Intel would understand that. If the goal is to maintain Intel as a separate, profitable entity making some of the world’s fastest processors, you’re going to need to give a CEO and leadership team more than just a few years to turn the ship around.
Within a few years we’ll know the board’s true intentions, but I wouldn’t be surprised to see Intel sold off for parts.
Intel has been missing wave after wave: GPUs, cloud chips, IoT, mobile, and now AI.
Sometimes you see companies jumping on every bandwagon and failing as a result. But Intel is the opposite: they were in a perfect position to dominate each of these and just watched as others took those markets from them.
I’d suggest a big restructuring is coming (job losses), which also tends to spawn more competitive startups. We’ll see if Intel hitches itself to the next bandwagon or falls further behind.
That’s exactly the observation I’d make about why Intel has been losing ground to ARM (and now AMD). They focused hard on the market as they defined it in the 80s: commodity, high-performance CPU hardware, mostly for terrestrial use in desktops and servers. They lost in basically every other category, most especially GPUs, and that one is indeed baffling. It’s easy enough to understand how a financially bloated executive surrounded by yes men (and women) might not see that the performance-commodity play is a dead end, but how could they have missed the entire GPU market? That golden parachute probably makes up for the lack of talent.
It’s the two sides of that which bit them. Focusing on performance at the expense of all other concerns like efficiency (at least until the current generation) has killed them in the mobile space, including laptops. And their focus on commodity hardware has caused a rift between them and anyone who isn’t also trying to make commodity hardware like PCs and servers. Think Steam Decks, consoles, and all the various mobile form factors. Even some top-end servers are starting to demand bespoke hardware configurations for AI and the like. They really lost on all fronts, and that business is not one with natural quick pivots.
All of that is on the market-conditions side. They also steadfastly held on to the fabrication part of their business, and maybe that led to some of these blinders; they had to justify that decision, after all. They do seem to have a longer-term play in the works: their current generation has sacrificed top-end performance in favor of power efficiency by doing things such as ditching SMT. That’s a pretty good play, but it probably hasn’t gone over well with the board (who almost certainly share a groupthink set of opinions about all of what I described above), especially after some bad press about the lack of top-tier performance from the largely know-nothing YouTubers who cover this stuff (I miss Anand Lal Shimpi). What Intel did with their current generation proves that x86 can compete with ARM on ARM’s strength, power efficiency. It also proves that Intel really can’t communicate… I have no idea what that thing is even called. The Intel Core not-7? No idea.
Their current play doesn’t solve their commodity problem, though. ARM will continue to dominate, no matter how efficient Intel and AMD make their commodity hardware, because when someone needs a bespoke design (like Apple or Amazon), they can get it with ARM. If I were Intel or AMD, I’d be focusing on that specific problem. AMD has been addressing it to some extent through their custom-design business, but it’s not the same as a company being able to just license the IP and go design its own silicon. They also have the innovator’s-dilemma problem: they have a chip design, and they want to sell it. But what if AMD’s design team were free to just make solid designs without having to push their x86 engine? What if they could have designed an ARM SoC for Nintendo instead of a Zen SoC? It really feels like these companies get caught up in their own myths, rather than understanding the market conditions dispassionately.
I don’t expect Intel to survive any of this. My guess is the board is just as clueless as any given CEO, all of whom seem to come from somewhere else, but I do think it’s at least possible for them to survive.
AMD has done a fairly good job of missing the GPU wave too. NVIDIA is the only GPU company making money off of this.
This goes in cycles. Procs hit Pentium 4 levels of waste, then everyone starts over with efficient procs. Specs slowly creep back up to get easy marketing wins, and then we get Pentium 4s again.
The Pentium M was very efficient back in its day, and so were the original Atoms. Intel just couldn’t help themselves and kept loosening the specs.
It wasn’t a bad play since there are only two other competitors in the high end fab space, TSMC and sort of Samsung, and fabs require lots of money to build. It’s a great advantage when it works, but it will cut deep if it doesn’t.
Intel botched several nodes, and they couldn’t get it turned around in time.
Hypothetically, AMD could replace the x86 decoder with an Arm decoder, and go on without much sweat. I think they still have an Arm Infrastructure license.
The x86 server space is too lucrative though. Desktops and laptops get leftovers as a marketing exercise.
This doesn’t look good, and it really doesn’t look good for US fab capabilities. I would say nationalization of the fabs should be on the table, but the incoming administration will probably find the worst possible outcome, like giving them to the Saudis or TSMC and then getting stuck with no capacity when China invades Taiwan.
The DoD won’t let that happen unless TSMC builds and maintains a fab on American soil with a node competitive with Intel’s.
The DoD will shut up and do what they are told by their corporate masters through the two parties. You have the power structure backwards. Also, the G7 is the domain of the DoD, not just US soil. Putting the factory in any of those territories (or territory they control) would probably look like a great idea to the DoD’s corporate masters.
Nvidia has an interesting problem of its own, which is that it doesn’t have anything else to sell. So of course they pump most of their effort into their GPU business, constantly trying to find new markets to sell into, and they have generally done well riding the crypto and AI bubbles. We’ll see what happens when the AI bubble bursts.
I’d say they should look at diversifying, but Silicon Valley companies seem to prove time and time again that they aren’t any good at that. Intel has fabs, sure, but they only use them to bolster their own products (they probably have thousands of slides about “synergy”). Imagine if they ran that division like TSMC and just manufactured things for the highest-paying bidder. Imagine if they manufactured the M3 for Apple. Imagine if Intel’s fabs were producing AMD-designed ARM chips for Nintendo. I just can’t see that ever happening (well, most especially now that they are hopelessly behind).
The last time Trump was in, we got 100% of the wrong policies – an acceleration of the wealth pump, with a lot of chaos, and I fully expect we’ll see the same this time.
They did buy Mellanox a while back and now have networking, and they wanted to buy Arm. They’re trying. They have had pretty good success with their Tegra procs, even if they aren’t the most thrilling thing.
However, CUDA is the thing driving the profits for NVIDIA. They’re lucky everyone else has been clueless about building a competing GPU compute standard.
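For anyone who hasn’t touched GPU compute, here’s a minimal, hypothetical sketch of what the CUDA programming model looks like (the kernel name, buffer names, and sizes are made up for illustration), just to show the kind of kernel-plus-runtime API any competing standard would have to match:

#include <cuda_runtime.h>
#include <cstdio>
#include <cstdlib>

// Each thread adds one element; the grid of blocks covers the whole array.
__global__ void vec_add(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                  // one million floats
    const size_t bytes = n * sizeof(float);

    // Host (CPU) buffers
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device (GPU) buffers, plus explicit host-to-device copies
    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch: 256 threads per block, enough blocks to cover n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vec_add<<<blocks, threads>>>(da, db, dc, n);

    // Copy the result back and spot-check it.
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %.1f (expected 3.0)\n", hc[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

The lock-in isn’t this toy kernel, of course; it’s the years of libraries (cuBLAS, cuDNN), tooling, and tuned application code layered on top of it, which is exactly the ecosystem nobody else managed to replicate.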
They should buy Qualcomm for the patents and Arm procs.
Intel forgot they were a fab company, first and foremost, which happened to produce chips to keep the fabs busy.
They really should have opened the foundries up earlier. They’ve started to, but they’re not like TSMC, which will fab whatever as long as people have the cash. It really would have made sense, especially for the older nodes that are paid off or close to it.
The Arm stuff they did would have been perfect for this. High volume on older nodes.
Exactly. They’re going to pump the well dry.
There are rumors the Intel board passed on an NVIDIA acquisition years ago because Jensen becoming CEO was a stipulation.
GPU was never Intel’s thing, so not having a discrete GPU made sense. AMD can barely compete with NVIDIA, and they bought an entire graphics company. It’s not clear if Intel could have built a GPU or if an acquired NVIDIA would have had the same impact.
Intel really made a mistake by selling their Arm line to Marvell back in the day. The XScale chips were pretty good from what I’ve heard.
Flatland_Spider,
They actually had GPUs in the past.
Both on the low end (the mobile GMA, which later became HD Graphics and then moved to Iris and Xe)
and on the high end (Xeon Phi).
https://en.wikipedia.org/wiki/Xeon_Phi
This was a bit unique, more like Sony’s Cell processor: specialized, cut-down x86 cores with lots of additional SIMD units (basically AVX).
They also had some other GPUs in the past for their ARM offerings.
What they lacked was a desktop GPU, a market they finally entered with Arc. I have the A770 in my NUC, and it is actually quite good for the price and the form factor. However, they did not anticipate the software requirements that came with it.
The “game optimized drivers” from nvidia are basically hot patches for incorrectly coded games. nvidia would take recent releases and optimize their engines to run better on their hardware. It is not “we just run the OpenGL/Vulkan/DirectX APIs”, but more like “we will make sure this particular game runs as well as possible”.
Yes, of course, this responsibility should be on the game developers. But “shoulda and woulda” would not fix Intel’s (and AMD’s) problems.
They are finally catching up. But they could have done much more, and apparently the previous CEO’s heart was not in it. (I don’t think it is a coincidence that they announced two new GPUs immediately after kicking out the CEO, rather than at some point in the two years since the previous release.)
>>>The “game optimized drivers” from nvidia are basically hot patches for incorrectly coded games. nvidia would take recent releases and optimize their engines to run better on their hardware. It is not “we just run the OpenGL/Vulkan/DirectX APIs”, but more like “we will make sure this particular game runs as well as possible”.
Omg it’s Windows 95 all over again.
NVIDIA never stopped believing in Win95. LOL
Valve essentially does the same thing in their Linux runtime, but they do it at the Proton layer, not in the drivers, at least from what I can tell. This seems more sustainable, and probably an easier target for hardware companies that don’t want to be in the game-optimization business. I really would love to see Intel’s GPUs become the top-tier Linux GPU. They are well positioned for it. Both AMD and nvidia have significant challenges on Linux (mostly self-imposed: AMD with their seriously problematic HDMI support, and nvidia just being nvidia about their drivers) that Intel could probably obliterate. I’m sure they don’t want to build a market, though, and would rather just chew away at the edges of an existing one. Corporate America is the culture of small ambitions.
CaptainN-,
I’d also like to see that. Consumers need more competition in the GPU market. That said, it may be too late even for Intel. They not only have the challenge of convincing the market to adopt Intel GPUs, but of doing so while their own fabs are having trouble competing. Maybe they could outsource production to a rival. The potential for an upcoming trade war adds more uncertainty.
Alfman,
They did. 🙂
https://www.tomshardware.com/news/intel-will-spend-14-billion-on-manufacturing-chips-at-tsmc-report
Flatland_Spider,
I heard that, but I thought it was a temporary arrangement while they retooled. It didn’t seem like they intended it to be a permanent thing.
If you buy new Intel CPUs today, are they from their own fabs, or are their new products being outsourced now?
Xeon Phi was much more Intel’s lane than GPUs. They had something rather interesting there.
I remember that. The NVIDIA drivers are giant hack repos.
That doesn’t really change the fact that CUDA is driving the current wave. There hasn’t been a competing standard which has gained traction.
The Intel iGPUs have worked rather well for many years, and focusing on something other than games would have been a good idea. Video encoding, where they have an advantage, would have been a good market to focus on, or fixing their GPU compute. Video encode and decode work well with Intel GPUs, and cheap GPU compute that is good enough would have been a hit.
It all goes back to high-performance x86 chips being Intel’s crown jewel. Other products can’t eclipse the x86 business, and that is a problem.
What Intel has really missed out on, big time, is the non-von-Neumann computing revolution that has happened over the past 15 years. If you look at where the money flows nowadays (also at the micro level, in your own PC expenditure), it’s the GPU that is really the central computing unit, with the CPU serving an auxiliary role, kind of like the I/O processors used to in mainframe times.
The AI revolution simply made it brutally clear that the emperor has no clothes.
And taking into account that Intel used to dominate the HPC arena, which is where this revolution started, it’s really astonishing they missed it.
The desktop GPU was an important driver (competitive force) in all this, but in the grand scheme of things it’s just a sideshow. And I believe the desktop-GPU-focused mindset will ultimately be the Achilles heel that precipitates Nvidia’s downfall.
FWIW, Intel has done non-von-Neumann stuff with neuromorphic architectures (Loihi) and FPGAs (Altera).
But these efforts were half-hearted at best. It’s Intel that should have defined the CUDA standard back in the day; they should have been pouring billions into having the best parallelizing compiler set on the market (including support for third-party GPUs, yes, even NVidia’s).
Instead they have just sat and watched as the value in supercomputer clusters gradually shifted from them to NVidia.
They should have, but couldn’t. Companies’ cultures are usually their undoing, and Intel is no different.
Major players are usually in the right position to take on the next wave, only to end up missing it almost completely.
It’s a weird phenomenon. Once a tech company reaches a certain size, it almost invariably sets up an R&D department with the specific mission of either creating the next wave or not missing it. And almost universally, they end up missing it.
A lot of it has to do with the fact that it is very hard to have a management team which is well rounded and balanced across the whole pipeline: research, execution, marketing, sales, finances, etc.
Jensen has shown a great grasp of that pipeline, and as a result NVIDIA has been tremendously successful.
Intel, by contrast, has ping-ponged between CEOs who were either marketing-heavy or execution-heavy, which is why they have consistently missed out on massive opportunities.
It’s easy to second-guess and point fingers when the ship is in the process of sinking. There are many things one can point to as missed opportunities or just bad decisions. I am certainly not qualified to say which of the many issues has been their downfall. Maybe a little of each, maybe one in particular. Maybe nothing anyone external to the company knows about.
My WAG is that the mobile chip miss was a bad one. They invested just enough to make what everyone knew would be a worse product, then gave up.
With Gelsinger gone, will that mean even more delays to the 18A production node? And what will it mean for the promised Intel fabs that the EU and US governments already paid Intel to build?
Shareholders? Shortsighted?! NOooo…
Truly, so many issues could be resolved if shareholders weren’t squeezing rocks and expecting endless blood to come out of them, instantly.
Honest question for those in the know: are Intel GPUs a reasonable choice for a Linux desktop in terms of support and compatibility?
I’m tired of dealing with Nvidia, and gaming isn’t a priority for me.
AMD is still the best option for FOSS. And if you are not into gaming at all, you can get a brand new low-end RX 5300 card with DisplayPort that can do 4K@120Hz for 19 US dollars. It has about the same gaming performance as an Nvidia 780 Ti, but excellent drivers on Linux without binary blobs.
I thought Nvidia moved toward open drivers years ago, but I’m not a gamer and don’t have any Nvidia hardware in any of my systems, since I don’t care about any video game released in the past 20 years. But just out of curiosity, was there more to this case? Is it really just loading a firmware blob somewhere?
https://forums.developer.nvidia.com/t/unix-graphics-feature-deprecation-schedule/60588
Nope, not even close and still no documentation either.
Yes, as I understand things, the “open” GPU drivers load a firmware blob which does most of the work.
The kernel interface is open; the binary blobs for the GPU firmware, mostly, are not.
FWIW, if you have a standard distro with the necessary driver repository infrastructure, there are few overall issues with NVIDIA.
We’ve actually had more issues with some bugs in the “open” AMD GPU drivers (weird fan ramp-ups and crashes during heavy ROCm workloads) than with the closed NVIDIA ones. We do mostly compute with our GPUs, FWIW.
As others are saying, the real drivers are proprietary. The nvidia “driver” that got mainlined is an interface that connects their blobs to the kernel. I downloaded the source only to learn it was just boring stubs.
Even so, it did actually solve a contentious problem for nvidia Linux users installing drivers from nvidia: the unstable ABI. I haven’t experienced a single nvidia driver breakage since then. It does nothing for FOSS, but these new “drivers” effectively solve the unstable ABI problem for nvidia by mainlining a stable ABI for themselves.
It’s weird that you would in the first place.
Let me actually answer the question:
Yes, Intel video is an extremely reasonable choice here. It has been rock solid and problem-free on my last couple of Linux computers, and it’s a reason I buy Intel with integrated graphics.
If you’re not doing any gaming or compute, the iGPU of any modern Intel/AMD SKU should work just fine with Linux.
No need to invest in a dGPU.
That’s what I’d been doing until a monitor upgrade, but the iGPU setup on my mobo doesn’t support 4K@60Hz, which was a deal breaker for me and forced me to look at third-party cards.
*checks if they still have a toxic work culture created by stack ranking (AKA making your employees re-interview for their jobs every year and laying off the bottom-performing percentage)* Don’t know about the stack ranking, but there are lots of confirmations out there that a toxic work culture persists.
Intel’s corpo culture was shitty back in the day, even without stack ranking. From what I’ve been told, it got much worse.
Check their stock performance history. I was very surprised to see how much they have been struggling for a long time.
Yeah, intel’s stock has been stagnant forever.
And it hasn’t really improved under this last CEO, so it was no surprise he was canned.