In 2021, at the height of cryptocurrency mining, Nvidia released the Nvidia CMP 170HX. Designed as a compute-only card to accelerate Ethereum’s memory-hard Ethash Proof-of-Work mining algorithm with its 1500 GB/s HBM2e memory bus, the card was built on the GA100 silicon from Nvidia’s Ampere architecture. Thus, the CMP 170HX is essentially a variant of the almighty Nvidia A100, Nvidia’s top-performing datacenter GPU at that time.
Naturally, the existence of the CMP 170HX raised many questions, including its potential in applications beyond mining. Today, following the discontinuation of Ethash, these formerly $5000 GPUs from shuttered mining farms are sold on second-hand markets in China for $400–$500. It’s time to answer these questions.
This article contains a basic performance overview, a hardware teardown, a watercooling installation guide, and a repair log.
I’m glad smart people are at least trying to turn otherwise useless hardware designed for one of the most brazenly useless applications in human history into something potentially useful.
Wow, a lot of dedication went into this! A lot of work to use a restricted GPU board hampered by so many limitations. And if that weren’t bad enough, his device broke within an hour and needed to be fixed by tracing wires and consulting leaked proprietary documents.
It’s a shame such powerful hardware is locked down in the first place. It’s wasteful not to make the most of Earth’s rather limited resources, and especially wasteful to do this deliberately.
Alfman,
I am not sure it is powerful by any means, especially in the modern context. Even the article has significant reservations:
I would assume these are “binned” chips which failed the factory tests for standard server cards, but were heavily underclocked rather than thrown in the trash and sold to “miners” for their (back then) fast memory access. Yes, memory bandwidth is useful, and at over 1000 GB/s these sit in a nice league with top-end cards. But without being able to do any real calculations with them, they become impractical.
Hence they go where they were first destined to: trash bins (or collectors).
Sure, all hardware becomes dated. But this is not a good reason to design hardware to be thrown away.
Yes I read the article, it’s exactly the reason I’m critical of artificial hardware restrictions, just as discussed by the author.
Well, IMHO hardware shouldn’t be designed this way. It should be designed to maximize utility. The problem is that capitalism has financially rewarded deliberate under-utilization, which I have a very hard time accepting as ideal. Of course I don’t have a problem with products being naturally binned, but all too often manufacturers go further than this and intentionally fuse off working functionality, which is unfortunate and wasteful. As with so many of the world’s problems, the incentives are just plain regressive, and that’s why things are the way they are.