Taking NVIDIA into the next generation of server GPUs is the Hopper architecture. Named after computer science pioneer Grace Hopper, the Hopper architecture is a very significant, but also very NVIDIA update to the company’s ongoing family of GPU architectures. With the company’s efforts now solidly bifurcated into server and consumer GPU configurations, Hopper is NVIDIA doubling down on everything the company does well, and then building it even bigger than ever before.
The kinds of toys we mere mortals rarely get to play with.
To get an idea of what kind of performance one of those Nvidia server GPUs delivers compared to a consumer GPU:
https://www.youtube.com/watch?v=zBAxiQi2nPc (NVIDIA REFUSED To Send Us This)
Nvidia A100 = $9895
There’s a theory that any time the title is in all-caps or uses bombastic language, you should just ignore whatever it has to say, lol.
Yeah, Linus is a bit clickbaity, yet his vids are on point and rather harsh on the products.
While I get that both AMD and Nvidia are pushing for ever more performance, Jesus Tap Dancing Christ on a cracker, have you seen the leaks about what the power draw on the next gen of cards is gonna be? If you thought the FX-9 CPUs were batshit, or the worst Intel 10th-gen space heaters, oh boy are you in for a shock!
The AMD card is rumored to be pulling over 500W! And if that isn’t bad enough, the GeForce 4090 Ti? Yeah, you will practically have to have your house rewired if you live in the USA, as it’s reported to be over 800 fricking watts! Considering the H100 is a card designed for heavy lifting with 80GB of HBM, its pulling 700W in a server room is understandable, but those kinds of power draws on a fricking desktop?
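For perspective on the "rewire your house" worry, here's a rough back-of-the-envelope sketch. The wattages are the rumored figures from above, the 250W "rest of system" and 90% PSU efficiency are illustrative assumptions, and the 80% continuous-load factor is the usual US electrical-code rule of thumb:

```python
# Rough sanity check: does a rumored 800 W GPU blow past a standard
# US 15 A / 120 V household branch circuit? (Figures are rumors/illustrative.)

CIRCUIT_VOLTS = 120          # standard US outlet voltage
CIRCUIT_AMPS = 15            # common US branch-circuit rating
CONTINUOUS_FACTOR = 0.8      # rule of thumb: derate to 80% for continuous loads

def circuit_budget_watts(volts=CIRCUIT_VOLTS, amps=CIRCUIT_AMPS,
                         factor=CONTINUOUS_FACTOR):
    """Usable continuous wattage on one branch circuit."""
    return volts * amps * factor

def system_draw_watts(gpu_w, rest_of_system_w=250, psu_efficiency=0.90):
    """Approximate wall draw: DC load divided by PSU efficiency."""
    return (gpu_w + rest_of_system_w) / psu_efficiency

budget = circuit_budget_watts()       # 120 * 15 * 0.8 = 1440 W usable
draw = system_draw_watts(gpu_w=800)   # (800 + 250) / 0.9 ≈ 1167 W at the wall

print(f"circuit budget: {budget:.0f} W, system draw: {draw:.0f} W")
print("fits on one circuit" if draw <= budget else "needs a dedicated circuit")
```

So under these assumptions an 800W card plus the rest of the system still squeezes onto a single 15A circuit, but with little headroom left for a monitor or anything else on that breaker.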
I just hope there is a sea change in GPUs the way we went from the MHz war to the core war in CPUs, because I’d say it’s pretty obvious we are reaching the point where the heat and power usage is just spiraling out of control.
I don’t care, I’m all in for laptops; they limit power to an acceptable TDP, so hardly more than 250W.
Need more for gaming? First world problem.
We have reached the end of the road for power-efficiency scaling; the only way we get more performance is by using more energy (or more efficient/specialized designs). Sure, there is a bit more room to increase density for a few years, but this will not help with power usage per transistor, it just makes cooling more challenging.
The new PCIe power connector spec caps out at 600W, I think.
Yes, the power draw is huge, but so is the performance. Not that anyone is going to be able to get any of those SKUs given the current state of supply/demand.
Most consumers will get the “sane” power tiers, which should be just fine for 4K and below anyway.
It might be very important tech, but I’m just glad it’s a reference to Grace Hopper; she was awesome.