Last night, out of the blue, we received an email from AMD sharing some of the specifications for the forthcoming Ryzen Threadripper CPUs to be announced today. Up until this point, we knew a few things – Threadripper would consist of two Zeppelin dies featuring AMD’s latest Zen core and microarchitecture, and would essentially double up on the HEDT Ryzen launch. Two dies means double pretty much everything: Threadripper would support up to 16 cores, up to 32 MB of L3 cache, and quad-channel memory, and would require a new socket/motherboard platform called X399, sporting a massive 4,094-pin socket (also marking a move to an LGA socket for AMD). By virtue of having sixteen cores, AMD is seemingly carving out a new consumer category above HEDT/High-End Desktop, which we’ve coined the ‘Super High-End Desktop’, or SHED for short.
AMD is listing the top-of-the-line Threadripper 1950X at $999, which gives you 16 cores and 32 threads, with a base frequency of 3.4 GHz (and a turbo frequency of 4.0 GHz) at a TDP of 180W (nothing to sneeze at). These are quite amazing processors, and later next year the pricing should definitely come down a bit, so they become more affordable for regular computer use as well.
Well done, AMD. Sure, we need to await the benchmarks for more information, but this is looking real good. I’m hoping this will finally start forcing developers – specifically of games – to start making more and better use of multicore.
Thom Holwerda,
With graphics cards, developers can target very high-end graphics and low-end graphics simultaneously. The worst that happens on lower-end systems is a lower frame rate, lower resolution, or lower detail. Crucially, the game is still fundamentally playable on lower graphics settings.
Because of the way they’re used, CPUs are different. More CPUs can benefit back-end systems: smarter bosses and AI companions, more sophisticated physics, more characters operating concurrently, intelligent speech, far larger crowds, etc. However, unlike with graphics cards, it may not be so easy for a studio to support cheaper CPUs without fundamentally changing the gameplay experience.
For example, a game might have AI teammates that learn from and work with you to get you through game obstacles, but turning those teammates off could break the game.
Don’t get me wrong, I’d be interested in seeing all the cool things they could do, but I suspect game studios will continue to target lower CPU specs so they can sell more games. IMHO 32 thread CPUs will remain very niche in the medium term, at least for consumer applications.
Couldn’t the studio hardcode a requirement for more threads to be available in order to enable the more sophisticated features as replacements or supersets, though?
They’ll still have the R&D performed for their current title ready for their next ones.
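Something like that is easy to sketch – purely illustrative, with a made-up threshold and feature names – by querying the reported hardware thread count and choosing an AI profile accordingly:

```python
import os

# Hypothetical threshold: only enable the heavyweight AI systems when the
# machine reports plenty of hardware threads.
ADVANCED_AI_MIN_THREADS = 16

def pick_ai_profile():
    threads = os.cpu_count() or 1  # cpu_count() may return None on some platforms
    if threads >= ADVANCED_AI_MIN_THREADS:
        return "learning_teammates"   # the richer, CPU-hungry behaviour
    return "scripted_teammates"       # classic scripted fallback

print(pick_ai_profile())
```

The open question from above still stands, of course: whether the scripted fallback delivers the same gameplay or fundamentally changes the experience.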
I think it is not hardware resources that hold AI back; it is rather the low priority assigned to AI development, and the fact that it needs lots of iterations to get right. The great FPS AIs of the past (Half-Life, Halo, FEAR) used rather smart scripting and audiovisual cues to provide convincing and engaging encounters. Even in strategy games, smart scripting can work wonders, e.g. DarthMod for the Total War series.
Other areas, like your examples, scale rather well in my opinion, and they are all related to graphical detail: the number of agents and the number of small entities all need heavy preparation on the CPU side before the GPU can render them, but their number does not break the game.
feamatar,
I was thinking of something far more sophisticated than those “script” examples to be honest, and with far more actors in play. The games we see today are rather limited due to the lack of CPU power.
I concede games can be fun without more sophisticated AI, but having enemies and artifacts that react to their environments with preprogrammed scripts (and animations) is extremely limiting too. In the real world there is potentially an infinite number of solutions, and everyone’s path is naturally different. But in video games, rather than forcing players to follow highly scripted actions, we could use lots of CPUs to calculate what is physically possible. We’ve all encountered obstacles in games where something should work, but doesn’t, because the programmers didn’t anticipate it (or didn’t have time to write scripts for it); more CPUs running physical simulations could fix that. Most game environments are still quite rigid for the same reason.
Most AI opponents are quite predictable because they’re following their scripts, but it’s not very realistic. Real humans can develop plans; in-game characters should also be able to use their environments in intelligent ways that haven’t been scripted. The same intelligence should apply to in-game AI teammates. IMHO AI could really revolutionize gaming.
With enough processing power in the long term, AI could actually automate game design itself: world design, plot twists, music, voice acting, etc. That’ll be close to the day machines take over in real life, haha.
Scripting in AI programming does not mean that you script the scene (think of Call of Duty, or the daily patterns in open-world RPGs); instead you script your AI system (this is easy to understand for strategy games, but the same happens with FPS AI). That is: you define behaviors and priorities that a particular type of AI should follow – this AI is reckless, that one prefers close combat, another teams up with others. This is all scripting; a rough sketch of what such data-driven profiles could look like is below.
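Purely illustrative – all names and numbers here are invented, not taken from any particular engine – behaviour profiles defined as data, consumed by one shared decision loop:

```python
# Illustrative only: behaviour profiles as data, read by a generic AI decision loop.
PROFILES = {
    "reckless":     {"aggression": 0.9, "preferred_range": "any",   "teams_up": False},
    "close_combat": {"aggression": 0.7, "preferred_range": "melee", "teams_up": False},
    "pack_hunter":  {"aggression": 0.5, "preferred_range": "mid",   "teams_up": True},
}

def choose_action(profile_name, distance_to_player, allies_nearby):
    p = PROFILES[profile_name]
    if p["teams_up"] and allies_nearby < 2:
        return "regroup"                 # wait for the pack before engaging
    if p["preferred_range"] == "melee" and distance_to_player > 20:
        return "advance"                 # close the gap before attacking
    return "attack" if p["aggression"] > 0.6 else "hold_position"

print(choose_action("reckless", distance_to_player=25, allies_nearby=0))
```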
But what counts as a good AI system and a good scripting environment is hard to define. For example, a strong AI is not necessarily a good AI, because an AI should be able to fail, and fail in a human way, and that is hard to prepare for.
If the AI is too efficient, too good, the player will say it is cheating.
If the AI does not waste enough time, the player will say the game is too hard.
If the AI fails, it should fail the way a human does; but humans often fail in ridiculous ways, and the AI will be blamed for being too dumb.
Or imagine that the AI in the game decides that you are the least efficient in the team and the game plays itself without you.
So you want something game-like, not something realistic (which is what almost all of our games are; imagine a racing game where you follow the racing line for 60 laps without a single overtake, or a shooting game where you guard a warehouse for 16 hours and then get killed by a single Taliban fighter from 400 meters).
feamatar,
If it’s too realistic, it’s more of a simulation than a game, but there’s still a wide gradient between the two, and I’d personally like to see creative use of advanced intelligence in games. You could be right too: there might not be that much demand for this sort of thing among gamers. In any case, we’re not likely to see studios invest in this until massive core counts become much more commonplace.
Well, sorry, but there were AI-based open-world games in the 90s that did just fine with the CPUs of the time – think of Half-Life (scripted), Black & White, or Outcast, to name a few. When you compare the CPU power of current chips with those of the 90s or the 00s, stating that today’s chips are underpowered is a bold statement.
Read up on how Crash Bandicoot for the PlayStation was coded, using a variant of the LISP language.
Kochise,
I’m not familiar with it. I looked up videos but it isn’t immediately evident where sophisticated AI comes into play? Do you have a link?
If we could get a Hackintosh to use them…
That would be nice.
{wishful thinking indeed}
That isn’t wishful thinking at all. It might take a few months for the necessary tweaks, but this will surely happen in the hobby-sphere.
However, most people who spend a thousand dollars/euros on just the CPU are graphics professionals, and Hackintoshes just aren’t on their radar.
The other people who spend this much on just the CPU are gamers, and they are not interested in running macOS.
(Wishful thinking would be “if only Apple would sell macOS separately and allow it to work on non-Apple hardware”.)
If a future Apple CEO made the leap again to sell macOS to all comers / PC builders – what is the most (in future-equivalent terms) they could sell it for?
Would Hackintosh builders fork out, say, $999 for an unrestricted macOS (with e.g. built-in AMD plus Intel optimisations, but support for only a subset of GPUs and sound cards/chips)?
Less than $700 unit price and I don’t think the board would let it happen.
And I should be in the market for a new HEDT at the end of the year and TR 1950X vs i9 7900X is an enticingly difficult choice…
44 PCIe lanes (Intel) vs 64 PCIe lanes (AMD).
Considering that two graphics cards (x16 × 2) plus one NVMe SSD (x4) is already 36 lanes, I think Intel may regret only having 44 lanes.
Initially, the TDP (power draw) of the AMD chips is a bit scary, but it’s “only” 11.25 watts/core, whereas Intel is at 14 watts/core.
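Spelling that arithmetic out (lane budget from the example above; per-core TDP assuming the 1950X’s 180 W / 16 cores and the i9-7900X’s 140 W / 10 cores):

```python
# PCIe lane budget: two x16 graphics cards plus one x4 NVMe SSD.
lanes_needed = 16 * 2 + 4
print(lanes_needed, "lanes needed; Intel offers 44, AMD offers 64")  # 36

# Per-core TDP for the two chips being compared.
print("Threadripper 1950X:", 180 / 16, "W/core")  # 11.25
print("Core i9-7900X:", 140 / 10, "W/core")       # 14.0
```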
I don’t know what an unrestricted macOS would be, but you have got to be kidding if you think a mass-market consumer/enthusiast OS is worth $700. $100–200 at most!
$250 is a reasonable price, I think – as long as it’s licensed per Apple ID, not per installation, and is non-expiring.
I’d love for it to transfer to newer versions of macOS, but I imagine they’d charge a $20/$50 upgrade fee on top of that.
If they did that… http://walkingdead.wikia.com/wiki/File:Shut-up-and-take-my-money.jp…
The problem is, while it’s obvious that Ryzen is the way to go (75% of the performance for less than half the price almost the entire way down the line? Isn’t competition grand?), the problem for both Intel and AMD – which PC hardware sales have been showing for a while now – is that software hasn’t kept up with hardware, and even ancient PCs are “good enough” for the majority.
Let’s use myself as an example. During the MHz wars I had a new gaming PC every other year and a major hardware upgrade during the off year; now? I am truly happy with my FX-8320e: I have 8 cores (that sit idle for a good portion of the day) with 16 GB of RAM and 5 TB of storage… why would I need a new PC? Even the games I play run at 85–95 FPS on my R9 280, and the board can run triple CrossFire if I wanted more, so what would be the point? Heck, the box I use at work is ancient – a Q6600 with 8 GB of RAM and a 1 TB drive – but all I’m doing at work is looking up parts for customers, ordering parts, downloading drivers and occasionally watching vids… what would a new PC give me that the Q6600 doesn’t already do?
If my FX-8320e system dies, sure, I’ll buy a Ryzen, but thanks to solid caps my late father’s Phenom I system is sitting in the shop as a backup for the Q6600. Today’s PCs last a VERY long time, and the core wars mean everything I own is quad-core or better, so until software actually starts using all this hardware power we are sitting on, there really just isn’t a point for many of us in upgrading ATM.
bassbeast,
Pretty much. My personal computer is an ancient thing from 2008. I did buy an i7-3770 for a development box, but even that was second-hand.
Consumers who want computers already have ones that are good enough. Sadly, the loss of economies of scale from lower production will likely result in higher prices in the future – I sense this is happening with laptops already. From a business perspective, I can definitely see why “planned obsolescence” is appealing, even though it’s bad for the planet.