Today may be Halloween, but what Intel is up to is no trick. Almost a year after showing off their alpha silicon, Intel’s first discrete GPU in over two decades has been released and is now shipping in OEM laptops. The first of several planned products using the DG1 GPU, Intel’s initial outing in their new era of discrete graphics is in the laptop space, where today they are launching their Iris Xe MAX graphics solution. Designed to complement Intel’s Xe-LP integrated graphics in their new Tiger Lake CPUs, Xe MAX will be showing up in thin-and-light laptops as an upgraded graphics option, and with a focus on mobile creation.
With AMD stepping up to the plate with their latest high-end cards, it’s very welcome to see Intel attacking the lower end of the market. They have a roadmap to move up, though, so we might very well end up with three graphics card makers to choose from – a luxury we haven’t seen in about twenty years.
I must be stating the obvious, but they're clearly going the AMD route.
Considering what absolute crap Optimus was (switching between the Intel HD iGPU and the Nvidia GPU), even under Windows (many times the Nvidia GPU would stop showing up in the device manager entirely), I just hope they'll get things right using their own GPU. Under Linux too.
At least that's where AMD has always shone: their APU+GPU combo has always worked flawlessly, even improving performance a bit.
https://www.youtube.com/watch?v=Fpf8_L16I-M
Apparently they copied and pasted their GPU from their CPU package, and are now offering it as a discrete solution, to be used with the same CPU, which already has that GPU integrated.
Am I missing something here?
Nope. It’s about as exciting as finding a slug in your boots.
So this piece isn’t about a new competitor in the market, but about a mess that was a VP’s pet project, which will quietly be withdrawn in 6 months?
I doubt it.
I’d expect that these chips will be used in “middle-end” laptops to get a little extra graphics performance (thanks to dedicated higher-bandwidth VRAM, higher clocks, and a higher TDP), and then also in workstations & servers (where it’s hard to find a Xeon with integrated graphics, and where a lot of motherboards, like ASRock’s, come with significantly worse graphics built into the motherboard).
Then, possibly in as little as 6 months, they’ll release a much more powerful version (mostly with wider SIMD and/or more cores, to make use of the die area that isn’t being used by CPUs). I also wouldn’t be surprised to see a special version aimed at GPGPU (and maybe even HPC) that isn’t intended for graphics at all.
Of course, I also wouldn’t be surprised if Intel screws it all up by trying to encourage adoption while using “high profit margin” pricing, and then wonders why it can’t gain much market share.
From what I’m seeing so far, it will be Intel partnering with Adobe and the like to add custom optimization code for their GPUs so they get better benchmarks against Ryzen 4000 mobile processors. It reeks of a last-minute attempt to fight off Ryzen’s core count and GPU advantage over the holiday season with something new and shiny.
hdjhfds,
The Iris Xe GPU alone performs better than any integrated GPU (although obviously still targeting the low end). Supposedly a driver may allow the discrete GPU to work in tandem with the integrated GPU.
https://www.tomshardware.com/news/intel-gen12-xe-dual-gpus-benchmark
If true, you may be able to get a doubling or so of performance by using both GPUs concurrently. Now I’m not sure if Intel would support this out of the box or if it would require devs to support some sort of SLI configuration. If it’s a form of SLI then I doubt many titles will ever support it. But if it works out of the box with no software changes, which sounds feasible given the architectural similarities, then it could prove beneficial for the low end market.
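For what it’s worth, the “requires devs to support it” case would look roughly like the sketch below: the application enumerates every GPU that OpenCL exposes (on a Tiger Lake + Xe MAX machine that should be the iGPU and the dGPU) and splits the work between them itself. This is just an illustration against the standard OpenCL C API, not anything Intel has announced, and error handling is omitted:

```cpp
// Sketch: enumerate all OpenCL GPUs and create a queue per device so the
// application can divide a workload explicitly. Nothing Intel-specific.
#define CL_TARGET_OPENCL_VERSION 300
#include <CL/cl.h>
#include <cstdio>
#include <vector>

int main() {
    cl_uint num_platforms = 0;
    clGetPlatformIDs(0, nullptr, &num_platforms);
    std::vector<cl_platform_id> platforms(num_platforms);
    clGetPlatformIDs(num_platforms, platforms.data(), nullptr);

    for (cl_platform_id platform : platforms) {
        cl_uint num_devices = 0;
        if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 0, nullptr, &num_devices) != CL_SUCCESS)
            continue;  // this platform has no GPUs
        std::vector<cl_device_id> devices(num_devices);
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, num_devices, devices.data(), nullptr);

        for (cl_device_id dev : devices) {
            char name[256] = {0};
            clGetDeviceInfo(dev, CL_DEVICE_NAME, sizeof(name), name, nullptr);
            printf("Found GPU: %s\n", name);

            // One context + queue per device; the app would enqueue roughly
            // half of its kernels on each queue and merge the results itself.
            cl_int err = CL_SUCCESS;
            cl_context ctx = clCreateContext(nullptr, 1, &dev, nullptr, nullptr, &err);
            cl_command_queue q = clCreateCommandQueueWithProperties(ctx, dev, nullptr, &err);

            // ... enqueue this device's share of the work here ...

            clReleaseCommandQueue(q);
            clReleaseContext(ctx);
        }
    }
    return 0;
}
```

The “out of the box” alternative would be the driver doing that splitting transparently behind a single exposed device, and that’s exactly the part that remains unconfirmed.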
As primarily a desktop user, I don’t find laptop GPUs all that interesting. The laptops I own don’t have fast GPUs. Honestly I wouldn’t pay a premium for such a product, but I wouldn’t avoid it either. I hope they are eventually able to up their game with competition for higher end markets, but I’m still glad they’re bringing new competition even if it’s only low end for now. Consolidation has been harmful.
From the linked article:
hdjhfds,
Unfortunately they didn’t cite a source for that claim. And while I wouldn’t necessarily object to taking their claim at face value, the existence of leaked evidence to the contrary suggests Intel may in fact be working on this exact feature. In short, I don’t know. We should wait for more confirmation.
The Xe MAX has 4GB of LPDDR4X all to itself… that’s the only difference. The rest of the lower SKUs are using shared memory just like iGPUs.
It’s still decidedly unimpressive.
I never got the point of low-end discrete graphics cards in laptops: they pull more power than the integrated GPU, yet they are not good enough to play games, because most PC titles have system requirements significantly higher than the average spec (you can thank the PC Mustard Race for squealing like the entitled neckbeards they are every time a new game doesn’t have intensive graphics, and the websites that hire them as PC game reviewers, obviously).
Even Fortnite recommends a GTX 660, which is far more than what most low-end discrete graphics cards for laptops can offer (so much for enabling casual gaming).
Let’s see how this new chip performs inside a laptop and how it will rank relative to the GTX 660.
The main blocker with laptops is power, not the components. It is simple physics.
For example, even though I have a 1650 in my laptop, it can’t practically play any games. The reason: the chassis cannot pull out enough heat, which means the GPU cannot pull in enough power.
Gaming laptops have very large exhaust fans. So do desktops, or even the Intel NUC NUC8i7HVK. The CPU and GPU are very efficient heating elements, and all the watts that come in need to go somewhere.
That’s something I wanted to put in my original comment but didn’t want to make it any bigger. The “GeForce GTX 1050” your laptop has is not the same thing as the real GeForce GTX 1050 a desktop will have. So discrete low-end graphics cards are even sillier than they sound, because they take an already heavily fused-off chip and underclock it. If laptop makers don’t have the thermals, they shouldn’t put these chips in, because these discrete GPUs can’t play games and the iGPU is good enough for most other things. I once had a laptop with an 840M, and it was simply not powerful enough to run games. The only benefit I got out of it was running PPSSPP at 1080p (the iGPU struggled with it for some reason).
kurkosdr,
It depends a lot on the system builder’s implementation. High-performance parts are wasted if they’re going to be thermally throttled. For example, the MacBook Pros have been notorious for throttling issues and often cannot reach and sustain their specced performance. I think this speaks to sukru’s point: given thermal/power limits, it may not be possible to unlock the full potential of a high-end GPU/CPU if it’s going to throttle. I don’t have the data, but hypothetically the Iris Xe could be more competitive on power/heat/battery life than its competitors, and in laptop form factors this can be important. It might be the case that the Xe has a better trade-off between graphics/gaming performance and power. Obviously I’d need to see hard data before drawing any conclusions, but considering that even laptops with discrete GTX GPUs switch back to the iGPU to save power, better energy efficiency could be a benefit for the Xe too.
I’ve got a laptop with a GT 525M, and for a while it was plenty powerful enough to play newer games.
Hell, Fortnite looks pretty good on it. Most of my gaming is on consoles these days, but there are plenty of laptops with discrete graphics that are more than adequate for gaming.
Yes, the GPU isn’t running at 100%, and it would be a waste to try to run games on it.
Yet there are real benefits to that 1650: much better desktop performance.
I don’t play games on that machine. But having access to multiple 4K displays, hardware-accelerated decoding/rendering of the desktop, and freeing up CPU resources are very useful.
Yes, the card is underclocked, and at best runs at about 90% of the desktop card’s performance:
https://www.notebookcheck.net/GeForce-GTX-1650-Desktop-vs-GeForce-GTX-1650-Mobile-vs-GeForce-GTX-1650-Max-Q_9850_9828_9834.247598.0.html
But compared to that, my other laptop with only an iGPU has real issues, struggling even to browse the web.
One of them has to run the CPU+iGPU at max performance just to keep up.
The other needs only a whiff of dGPU performance, and the occasional CPU peak, to offer genuinely smooth operation.
There is a wide range of performance between an on-chip GPU and a desktop GPU that can play games. Even with the same design, Intel can improve performance by moving the GPU off the CPU die, because that allows the CPU to consume more power and put out more heat doing just CPU things, while the discrete GPU can put out even more heat on its own. Keep in mind GPUs are also used for creative productivity tasks; video rendering is offloaded to them. It may also help battery life if the CPU itself can sleep at lower power and clocks while the discrete GPU more efficiently handles video decoding.
Fortnite might recommend a GTX 660, but the minimum specs are a 2.6GHz processor and Intel HD 4000 integrated graphics. The HD 4000 was released 8 years ago.
And the game still looks pretty good and runs well enough at those specs, making it accessible to nearly anybody who owns a computer.
Drumhellar,
I agree. I don’t really play games, but I donated a computer with an HD 4600 iGPU to a nephew who does. The computer is older, but Fortnite ran fine on that 2013-era iGPU. Today’s iGPUs are approximately 150% faster than that, and the Xe can beat those, so I am quite confident Fortnite will run with no problem. Maybe not with all the bells and whistles, but I wouldn’t expect it to be a showstopper for buyers of low-end laptops. Mid/high-end consumers have no business buying this GPU though, because surely they will be disappointed. It’s clearly not the GPU for us, but I do hope Intel will eventually add more competition for the mid/high-end market down the line.
GPUs are not only used for gaming nowadays. Having a mid-range GPU in a laptop is useful on a daily basis when you use CAD, cartographic software, or Photoshop-like applications.
The Intel Xe also seems to have a nice bump in GPU compute performance.
Moreover, having a discrete GPU saves RAM and RAM bandwidth for computing (ever heard how much memory some AMD integrated GPUs reserve? Like 1GB, sometimes nearly 2 out of 4, for abysmal performance).
Considering Intel is pushing its “oneAPI” to program the CPU and GPU seamlessly, I think they have good products to offer as an alternative to Nvidia and CUDA for professional laptops. But they need more software support to come…
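To give an idea of what programming against oneAPI looks like, here is a minimal SYCL/DPC++ vector-add sketch (assuming the oneAPI toolkit is installed and the code is built with its SYCL compiler; the default selector just picks whatever GPU or CPU the runtime finds, so nothing here is specific to the Xe MAX):

```cpp
// Minimal SYCL (oneAPI DPC++) vector add. The same source runs on an Intel
// iGPU, a discrete Xe part, or falls back to the CPU, depending on what the
// default device selector finds at runtime.
#include <sycl/sycl.hpp>
#include <iostream>
#include <vector>

int main() {
    constexpr size_t N = 1024;
    std::vector<float> a(N, 1.0f), b(N, 2.0f), c(N, 0.0f);

    sycl::queue q;  // default selector: prefers a GPU if one is available
    std::cout << "Running on: "
              << q.get_device().get_info<sycl::info::device::name>() << "\n";

    {
        // Buffers hand the vectors to the runtime; results are copied back to
        // the host vectors when the buffers go out of scope below.
        sycl::buffer bufA(a), bufB(b), bufC(c);
        q.submit([&](sycl::handler& h) {
            sycl::accessor A(bufA, h, sycl::read_only);
            sycl::accessor B(bufB, h, sycl::read_only);
            sycl::accessor C(bufC, h, sycl::write_only);
            h.parallel_for(sycl::range<1>(N), [=](sycl::id<1> i) {
                C[i] = A[i] + B[i];
            });
        });
    }

    std::cout << "c[0] = " << c[0] << "\n";  // expect 3
    return 0;
}
```

Whether that is enough to pull professional software away from CUDA is the open question, but the programming model itself is already there.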
Nevertheless, since I occasionally play games (on an Intel integrated GPU), and since AMD/Nvidia do not provide convincing new products in that segment (some low-end discrete GPUs on the market are less performant than an HD 620 in dual-channel mode), I really welcome Intel back to this part of the market.
Pretty sure the Xe MAX is still using shared memory plus the 4GB of LPDDR4X… 40GB/s really isn’t enough for a GPU even if it’s dedicated entirely to it. It’s possible that it is dedicated, though… still, that’s a drop in the bucket compared to even the 64-bit version of the RX 550, which has 56GB/s, and the very next notch up, the 550X, has 112GB/s…
Also… you would have to be talking about vintage hardware for an Intel HD 620 to keep up with it… bear in mind that on such computers you are usually running at lower resolutions to hit acceptable FPS with an HD 620… an RX 550 is nearly 200% faster.
cb88,
Look at the “Intel GPU Specification Comparison” table from the article here.
https://www.anandtech.com/show/16210/intels-discrete-gpu-era-begins-intel-launches-xe-max-for-entrylevel-laptops
It shows that the 4GB is dedicated, and that in all other respects it looks like the Tiger Lake iGPU apart from clock bumps.
It’s likely that these can both run concurrently under applications targeting this setup, like OpenCL. There’s conflicting information about whether official graphics drivers will be able to exploit both GPUs.
4GB is about right for a low-end GPU. I mean, the highly sought-after RTX 3070, which significantly outclasses it, only has 8GB. The Xe MAX is not going to be very interesting to anybody who wants a powerful GPU, but we just have to remember that the Xe is a low-end part and it will probably be sufficient for its intended demographic. I would have liked to see something better too, but hopefully future versions will become more interesting.