In the tests that matter, most notably the 3D rendering tests, we’re seeing a 3% speed-up on the Threadripper Pro compared to the regular Threadripper at the same memory frequency and sub-timings. The core frequencies favored the 3990X, but the memory bandwidth of the 3995WX is clearly helping to a small degree, enough to pull ahead in our testing, along with the benefit of eight times the memory capacity as well as Pro features for proper enterprise-level administration.
The downside of this comparison is the cost: the SEP difference is +$1500, or another 50%, for the Threadripper Pro 3995WX over the regular Threadripper 3990X. With this price increase, you’re not really paying +50% for the performance difference (ECC memory also costs a good amount) but for the feature set. Threadripper Pro is aimed at the visual effects and rendering market, where holding 3D models in main memory is a key aspect of workflow speed as well as full-scene production. Alongside the memory capacity difference, having double the PCIe 4.0 lanes means more access to offload hardware or additional fast storage, also important tools in the visual effects space. Threadripper Pro falls very much into the bucket of ‘if you need it, this is the option to go for’.
AMD is entirely in a league of its own with these processors. I keep repeating it, but AMD’s comeback is one of the most remarkable stories in the history of technology.
This is at least the third comeback. First was the Athlon (unless you count that as continuing the superior 486 replacements), then leading the way with 64-bit, and now Zen (which is mostly a lucky break taking advantage of the unexpected stall in Intel’s manufacturing technology and AMD’s earlier separation into/from GlobalFoundries).
I would argue that this most recent comeback had just as much to do with AMD acknowledging that regular consumers do, in fact, have practical use for large core-count CPUs, and not charging an absurd markup for such parts like Intel historically has. The combination of decent performance and high parallelism at a reasonable price (compared to equivalent Intel Xeon chips) was a major factor driving early adoption of Zen hardware.
The market which can use these chips is also incredibly lucrative. People making fun of the $56,000 Mac Pro don’t understand that the workstation niche is bonkers, and the next step is a compute cluster.
Actually, they acknowledged it more than a decade ago, first with the Thuban hexacores and then the AMD FX octocores, but it’s taken this long for software to catch up.
As someone who owns both a Thuban hexacore and an AMD FX octocore (as well as a Ryzen 3600), I can say both older chips are still quite usable even in 2021 despite their age, thanks to the extra cores; certainly much more so than the C2Q and Core i5 they were priced to compete against when they were released. When the Thuban came out, unless you were rendering video (which I was), pretty much nothing used more than a single thread, and when the FX octocores came out we were just starting to see programs use more than one thread, but pretty much nothing used more than four.
But there was a reason so many of the streamers on Twitch ran AMD FX, and before that Thuban: if you had jobs that could use those cores, like streaming, content creation, code compiling, and music editing and creation, all those cores made AMD the chip to use. I’m just glad the software has finally caught up with the hardware, so most people can actually use all those cores and threads today.
Just to be pedantic, the FX series were not “real” octo-cores; each was really a quad-core with four extra integer pipes. The FX cores were awful in comparison to the contemporary Intel parts, and the main reason people used them was, in all honesty, price. That series almost killed AMD.
Sigh… nope, sorry. Each module has two 128-bit FP units that can be combined into a single 256-bit unit, and guess what? Those “awful” benchmarks were made with… drumroll… the Intel Cripple Compiler. Gee, what a shock, a company with a history of rigging rigged; whoda thunk it?
If you look at Phoronix benchmarks, which use GCC instead of ICC, guess what happens? The FX trades blows with the i5 and i7 released at the same time, depending on how multithreaded the app is. This is why streamers stuck with FX for so long: because streaming apps are multiplatform, they were built with GCC rather than ICC.
https://www.phoronix.com/scan.php?page=article&item=amd_fx8350_visherabdver2&num=8
Zen is a good attack on Intel all around. Good single-thread performance, chiplets which can scale to high core counts versus monolithic dies which require ultimate precision to scale to high core counts, and thermal envelopes which aren’t pipe dreams.
Intel bungling AVX2 helped too. AMD didn’t have full-width AVX2 execution until Zen 2, but Intel’s implementation left a lot to be desired, which resulted in few adopting it.
Flatland_Spider,
AMD really is doing a great job with their CPU specs. I don’t think their progress was strictly about beating Intel at CPU design, though. A lot of it came down to the fact that Intel’s fabs were stuck at 14nm for such a long time and fell so far behind TSMC. So while AMD makes the best-performing CPUs today, IMHO TSMC deserves a lot of the credit.
While I am thankful to TSMC for pushing boundaries, at the same time I think this consolidation is seriously hurting the world’s supply chains. The CPU/GPU markets are experiencing unprecedented product shortages. I’ve been on a waiting list for graphics cards since November, and at this rate I may not even be able to get the card I want in 2021 🙁
Some of it can be attributed to external factors like the pandemic, but the lack of suppliers isn’t helping. Also, these new cryptocurrencies that are designed to be ASIC-proof have been extremely detrimental to would-be consumer applications for these GPUs. I’ve really soured on crypto mining; the cons are so much greater than the pros IMHO. The energy consumption is mind-boggling, with estimates like 186 kWh per transaction!
https://energy.umich.edu/news-events/energy-economics-weekly-briefings/story/cryptocurrency-energy-consumption/
Anyways, I’m getting O/T.
Indeed. Some stock Intel CPUs actually crash under AVX2 “torture tests” like the AVX2 versions of Prime95. Those that do need to be underclocked using the “AVX Ratio Offset”. It’s disappointing to say the least. I need to set the offset to 3 for AVX2 to remain perfectly stable, but then you can’t get the full boost speeds (a quick sketch of the arithmetic is below the links).
http://www.reddit.com/r/intel/comments/9udomk/9900k_cant_pass_prime95/
https://linustechtips.com/topic/1099725-i9-9900k-overclocking-stress-test-questions/
While I can live with a CPU that doesn’t pass the AVX2 stress tests, this is really disappointing.
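For anyone unfamiliar with the setting, here is a minimal sketch of that arithmetic. It assumes a 100 MHz base clock and that each offset step removes one multiplier bin; the 47x all-core multiplier is just an illustrative value, not a measured one:

```python
# Hypothetical illustration of what an "AVX Ratio Offset" does to effective clocks.
BCLK_MHZ = 100             # assumed base clock
all_core_multiplier = 47   # illustrative all-core turbo multiplier
avx_offset = 3             # the offset mentioned above

normal_clock_mhz = BCLK_MHZ * all_core_multiplier               # 4700 MHz
avx2_clock_mhz = BCLK_MHZ * (all_core_multiplier - avx_offset)  # 4400 MHz

print(f"Non-AVX load: {normal_clock_mhz} MHz")
print(f"AVX2 load:    {avx2_clock_mhz} MHz")
```

In other words, stability under AVX2 comes at the cost of roughly 300 MHz whenever vectorized code is running.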
Theoretically, Intel should have an edge, since they can tailor chips for their own process nodes while TSMC’s nodes have to be more generic. Even at 14nm they should be able to optimize the designs.
TSMC absolutely does deserve credit. They’ve been able to deliver new process nodes with good yields while Intel has not.
Chiplets are a smart design. Being able to mix and match process nodes helps.
There is also Samsung, which is supposed to have some nice fabs.
I’m waiting to see China start fabbing their own stuff, and Europe has an initiative to start building fabs.
I was working with a supercomputing cluster, and AVX2 was a much-anticipated extension. Then it was revealed the chip could get downclocked to ~1.8 GHz whenever AVX2 was in use, and you could tell people were recoiling in horror.
China already fabs their own stuff. They’re just a few nodes behind.
And why do people keep thinking Europe is devoid of any fab capacity? AMD’s old fabs were in Germany, for example.
There are a few fabs in Europe, one of the main fab-equipment manufacturers is from the EU, and one of the world’s most important semiconductor research institutions is in Belgium.
TSMC and Samsung are clearly in the lead in process tech and capacity, but it’s not like the EU is completely devoid of semiconductor vendors and tech.
Yeah. Blockchains are a solution looking for a problem, and they haven’t found it, aside from wasting resources or serving as a giant password-cracking operation sponsored by some nation-state.
I was talking with someone about blockchains, and he mentioned a blockchain DB which was having performance problems. That is exactly what you’d expect, since the password-hashing algorithms used as the basis for many blockchains are designed to be resource-intensive and slow in order to keep people from brute-forcing the hashes.
https://medium.com/analytics-vidhya/password-hashing-pbkdf2-scrypt-bcrypt-and-argon2-e25aaf41598e
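To make the slowness concrete, here is a minimal sketch, assuming Python’s standard hashlib; the 600,000-iteration count is just an illustrative value. It compares one fast SHA-256 hash against a deliberately expensive key-derivation function of the kind that article describes:

```python
import hashlib
import os
import time

password = b"correct horse battery staple"
salt = os.urandom(16)

# A single fast hash: cheap, so an attacker can try huge numbers of guesses per second.
start = time.perf_counter()
hashlib.sha256(salt + password).hexdigest()
print(f"sha256: {(time.perf_counter() - start) * 1000:.3f} ms")

# PBKDF2 with a high iteration count: deliberately expensive, which is the
# point for password storage but a liability for anything throughput-sensitive.
start = time.perf_counter()
hashlib.pbkdf2_hmac("sha256", password, salt, 600_000)
print(f"pbkdf2: {(time.perf_counter() - start) * 1000:.1f} ms")
```

The difference is several orders of magnitude per call, which is exactly the property you do not want sitting under a database workload.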
It’s a tough problem to address. The capital and technological investments needed to compete in the fab space are astronomical. Just getting a production facility going costs multiple billions of US dollars, and that’s not counting the R&D to reach a competitive process node to actually run in said factory. I don’t see how it’s feasible to break into that market as a smaller operation.
I expect, however, that what we’ll likely see in the next few decades is a fundamentally different chip/processing technology not designed around silicon-wafer photolithography. At that point, the production landscape may become more diversified.
I remember when an old desktop went pop with a puff of smoke out the back, and all the money I had been saving for some new sofas suddenly had to find another use. I bought an AMD Athlon 64 X2, an ATI 2600XT, and 8GB of RAM, and it went like the clappers. It slayed everything I threw at it for two years before new games began to hammer the GPU. The CPU was and still is good enough for basic work, and the GPU was still good for games up to 2012. Anyway, that went pop too, so I bought a laptop good enough for my needs. The fact it’s Intel does irritate me. It’s also irritating that Intel play socket wars, so I can’t drop in a next-generation CPU with a lower TDP. AMD have always been better with socket support.
How do you make a PC go pop? I’ve never had a PC fail on me. Last PC I used for 8 years (2012-2020), one before that for 4 years (2008-2012), the one before that for 5 years (2003-2008), still working perfectly when I replaced them. I’ve had the occasional hard disk failure but that’s it.
@Luke McCarthy
Don’t ask. lol
Stop buying Lucas electronics! 🙂
Mostly by using crappy capacitors.
Things that have blown up on me:
* Nvidia card. The capacitors turned themselves inside out and sounded like a firecracker.
* Various low cost PSUs when I worked for a place which built their own PCs. 1 out of 100 would abort itself on first power up and release the magic smoke.
* Various mid-2000s Dell desktops. They didn’t really blow up, but they would quit working or be flaky.
Luke McCarthy,
I’ve seen misc breakages over the years, but for me it’s been the exception rather than the norm.
I did have a brand new power supply fry everything in the computer. The warranty was technically supposed to cover damages, but the manufacturer didn’t want to honor it because the computer I bought it for was an old build (a condition they added on the fly to justify denying my warranty claim). That was a bit of a nightmare, and I had to start from scratch.
Also, since having kids, the laptops have incurred a lot more physical damage, haha.