“AMD described its forthcoming quad-core processor, codenamed Barcelona, in a session at today’s Microprocessor Forum. Details of the new microarchitecture on which the processor is based (codenamed K8L) have been known for some time now. Still, the event brought some new info, and here are some highlights that I’ve culled from some of the reporting on it.”
I read the article, summarized:
Intel’s core architecture is better, but AMD’s system architecture is better. Therefore, they should be about equivalent. Only time will tell.
Fun to read, but it’s all things we know already. Can’t wait for AMD’s next gen tech.
Whatever happens, the overall performance per watt per dollar of a fully configured system is now the most important measurement of success.
I’d like to add per watt of dissipated heat to that list. Noise is an important factor.
Although better manufacturing technologies can affect it, dissipated heat is very closely related to the total number of watts consumed, meaning it is already basically on that list. (Low) noise is nice, but the average guy buying a Dell probably figures all computers sound the same, so I’m not sure it would be an important factor for popular success.
Unless of course the desktop was really-really-really small, then to the “Joe Dell”, a ‘cooler chip’ is a better chip.
99% of the public doesn’t need two cores, let alone four. Hell, in 5 years, maybe 64bit will actually be utilized.
The only thing these dual-core and quad-core systems are/will be good for is acting as space heaters and running up electric bills.
Heck Norton needs 2 cores by itself.
But in reality, who knows what kind of software and ideas can come to the desktop because of dual/quad/… cores. It’s not that the public doesn’t need it, it’s that they don’t have the software to utilize it for something generally useful.
Has anybody read the article “Multiplied” Linux Desktop Migration Strategy (see the OSNews link http://www.omni-ts.com/linux-desktop/linux-desktop-migration.html)?
Many modern CPUs are already very, very idle, and that’s where Novell sees an opportunity for their “Multiplied Linux Desktop”. In other words, they are after better utilization of current hardware, which is good.
When Intel announced the end of the “fighting over CPU speed” era a year ago, some analysts raised the question of what the next battlefield for CPU manufacturers would be. And now, as we can see, it’s packing more CPU cores onto the die. And I don’t see where it’s going to end up.
However, I’m still happy with my fanless Mendocino CPU (Celeron 500 MHz) which is crunching PCLinuxOS code now just as good as seven years ago when it was introduced to consumer PC market.
————However, I’m still happy with my fanless Mendocino CPU (Celeron 500 MHz) which is crunching PCLinuxOS code now just as good as seven years ago when it was introduced to consumer PC market.————-
My pop keeps telling me “you need multicore”, “you need multicore, that 2 GHz you have is garbage”, which I can’t say I agree with. That would still leave me with the same fundamental problem: the biggest bottleneck out there is the hard drive, not the processor.
Man, I can’t wait for Samsung’s flash SSDs to drop in price. All I need is 16GB for the boot drive; that’s plenty big enough for SuSE.
I’m going to have this computer for a long time………….
Depends on what it is you are doing.
Something as trivial as burning a CD or encoding an MP3 can really suck down the CPU; being able to offload those kinds of operations onto another core and still do what you would normally do while they are running is a BIG plus.
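(A minimal sketch of the idea in Python, purely illustrative: hand the heavy work to a separate process so it can land on another core while the main program stays responsive. The encode_to_mp3 function here is a made-up stand-in for real burning/encoding work.)

    # Sketch: run a CPU-heavy job in another process while the main program stays responsive.
    # encode_to_mp3() is a hypothetical stand-in for real encoding work.
    from concurrent.futures import ProcessPoolExecutor
    import time

    def encode_to_mp3(wav_path):
        total = 0
        for i in range(10_000_000):   # pretend this loop is an expensive encode
            total += i % 7
        return wav_path.replace(".wav", ".mp3")

    if __name__ == "__main__":
        with ProcessPoolExecutor(max_workers=1) as pool:
            job = pool.submit(encode_to_mp3, "song.wav")   # runs in another process/core
            while not job.done():
                print("still free to do normal work here...")
                time.sleep(0.5)
            print("finished:", job.result())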
One can already do that today on a single 32-bit CPU running at 1.5 GHz. But of course an extra core would be most welcome here.
And for the few minutes it takes to burn a CD, it’s not really a problem that one may have to go a bit easy on the PC.
Ok, so a DVD.
I have to say, a system running at 1.5 GHz isn’t going to do a whole lot for these different tasks.
You won’t be able to do anything other than these individual things, except for maybe using your browser.
I can do a whole lot with a 1.5 GHz system. At least with 1024 MB of RAM, everything works reasonably. CPU power isn’t so important (the number of cores is more important). The barrier is still primarily the hard disk.
If an application is sluggish on a multi-GHz system, it should be replaced. It’s simply crappy software.
I can easily write documents or listen to music while burning a disc. But of course, opening applications and documents takes longer.
Good advice: Avoid sluggish applications.
Sounds like a mini “big iron” to me. Those machines have a CPU per storage controller, so that when the user wants a file moved, copied or whatever, the main CPU can just hand the job over to whichever storage CPU(s) own the media in question.
Hell, a GPU is basically a specialized CPU. If 3dfx and others had not started putting a specialized chip for 3D math onto graphics cards, we may well have seen an extra core on the die just to handle that load.
The big issue will become synchronization, so that the different cores get what they need when they need it.
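(To make the hand-off idea concrete, here is a toy Python sketch using only the standard library: a “main” thread queues copy jobs for a dedicated storage worker and only synchronizes when it actually needs the work to be finished. It models the hand-off and the synchronization point, not real dedicated storage silicon; the file paths are made up.)

    # Toy model: main thread hands file-copy jobs to a storage worker,
    # then synchronizes only when it needs the copies to be done.
    import queue
    import shutil
    import threading

    jobs = queue.Queue()

    def storage_worker():
        while True:
            src, dst = jobs.get()            # block until a job is handed over
            try:
                shutil.copy(src, dst)        # the "storage CPU" does the copy
            except OSError as err:           # e.g. the example paths don't exist
                print("copy failed:", err)
            finally:
                jobs.task_done()             # needed so join() below can return

    threading.Thread(target=storage_worker, daemon=True).start()

    jobs.put(("report.odt", "/backup/report.odt"))   # hand the job over...
    jobs.put(("photo.jpg", "/backup/photo.jpg"))
    # ...main thread is free to do other things here...
    jobs.join()                              # synchronization point: wait for the copies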
I guess the AMD/Intel quad-core processors are aimed at the server market, just like Power and Niagara.
But as always, I guess some twisted minds with far too much money on their hands will buy these things for their game box.
Personally I think I will hold on to my Barton for a little while :o)
Dual-core 32-bit has been out on desktops for, what, 3 years… 64-bit for like 2 years? I suppose the only great equalizer is video games. Even then, there’s not much that my aging Athlon XP 1700 won’t run, video-game-wise.
Still, it’s a great position to be in from a consumer standpoint. When the 4-core deals come out, the high-end 2-core chips are going to be really cheap. It’s pretty much guaranteed that a dual-core chip bought today will last you about 5 years, easy.
I think that quad-cores are great. And I would love to have one. Multi-Core CPUs will finally force software developers to write multithreaded programs.
And I can think of many applications that can max out these CPUs. Ever tried running Visual Studio 2005 on a single core CPU? Or Eclipse? It is not a very pleasant experience.
Programs like 3D Studio Max will also run much faster. Probabilistic voice recognition programs will have much better recognition rates.
And games will also run better. And I am not talking about stupid 3D shooters, even though I am looking forward to Crysis. What I want is a go game that can beat me.
I am not a good go player, but I still manage to beat every single go program out there. So just because you only use your computer to read email and browse the web does not mean that everybody else does!
Just play chess ;-P
Seriously, go is not the sort of game computers are really good at, because the board is too large, creating too many possibilities.
Very rough estimate:
chess-board: 8×8=64
go-board: 19×19=361
Now the number of possibilities increases approximately exponentially with time t, meaning:
possibilities(chess) ~ 64^t
possibilities(go) ~ 361^t
Now, chess opening theory covers about 10 turns (there’s more, but real chess programs stop relying on theory from that point on).
10 turns means:
possibilities(chess) ~ 64^10 ≈ 10^18
possibilities(go) ~ 361^10 ≈ 4×10^25
That’s about 7 orders of magnitude between them.
Moore’s law predicts sufficient computing power no sooner than 2040, if it remains true.
There are problems where brute force is a suitable strategy but go is probably not one of them.
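(For what it’s worth, the rough estimate above is easy to recompute; here is a quick Python check using the same crude assumption that the branching factor is just the number of board points.)

    # Crude check of the estimate: branching factor = number of board points,
    # game-tree size after 10 turns, and the gap between chess and go.
    import math

    chess_branching = 8 * 8      # 64 squares
    go_branching = 19 * 19       # 361 points
    turns = 10

    chess_tree = chess_branching ** turns    # ~1.2e18
    go_tree = go_branching ** turns          # ~3.8e25

    print(f"chess: {chess_tree:.1e}")
    print(f"go:    {go_tree:.1e}")
    print(f"gap:   ~{math.log10(go_tree / chess_tree):.1f} orders of magnitude")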
But you might hope for quantum computing, evaluating all possible choices at the same time.
Now you only need a way to make the machine let you win from time to time 😀
On a side note, I find it really strange to put more and more cores into one machine and at the same time virtualize the machine so that it can be used by several OSs and people at the same time.
I know it makes sense but it feels _really_ strange…
I think I’ll wait until quads are somehow mainstream and get a really cheap dual core then for ripping or compiling stuff in the background.
Some things to remember about virtualization:
Most operating systems need 20–40 GB at most for their OS drive. I’m hard pressed nowadays to find a drive below 250 GB, and really wouldn’t buy one smaller than 320 GB these days.
So what you get with virtualization is the ability to save space and power. Instead of 4 boxes with 4 cases, 4 power supplies, 4 RAID-1 sets, etc., just reduce it to a single box with 2–4 cores and a single RAID-1. Attach some amount of storage to it, logically split it, and give a slice to each virtual copy.
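(Back-of-the-envelope only, with made-up numbers, but it shows why the consolidation math is attractive.)

    # Made-up numbers to illustrate the consolidation argument:
    # four lightly loaded boxes vs. one multi-core host running four VMs.
    separate_boxes = 4
    watts_per_box = 300          # assumed draw per standalone server
    disks_per_box = 2            # assumed RAID-1 pair per server

    consolidated_watts = 450     # assumed draw for one beefier multi-core host
    consolidated_disks = 2       # one shared RAID-1, logically split per VM

    print("separate:    ", separate_boxes * watts_per_box, "W,",
          separate_boxes * disks_per_box, "disks")
    print("consolidated:", consolidated_watts, "W,", consolidated_disks, "disks")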
This is great for basic legacy Windows office infrastructure, which is riddled with server software that refuses to play nicely with other server software (Exchange, MS SQL, domain services, etc.). Basically, give each Windows server a virtual sandbox to play in; a sandbox backup is then simply a copy of the server image.
Not the way I would suggest running Unix-based infrastructure, though (software seems to play more nicely together there; virtualization is only needed for security purposes).
Your numbers aren’t all that accurate, because in chess certain pieces can only move in certain ways, lowering the total number of possibilities even further. However, the general idea is correct – for as many possible moves a computer has to evaluate in chess, the corresponding number of Go moves is much, much larger. I believe the best Go programs running on the fastest computers are about equivalent to an intermediate player at this time, and it will be a long time before they become as good as an expert.
RandomGuy: I know about the complexity of go. I am really a lousy go player, but even I can beat go programs when I concentrate. A friend of mine is a very good go player (dan).
A few years ago we tried writing our own go program. As you mentioned, using brute force is quite hopeless. So we tried having a very low search depth and a very advanced position evaluation.
We got it to play well strategically (e.g. making decent opening moves without an opening library, good liveness estimation). But we could never get it to play good combinations.
Most average Joes that I have seen really don’t need multiple cores.
They have so many crap processes running that their computers are barely running.
I just fixed a computer for a friend (no speed machine, mind you, just an 800 MHz P3), and when I got it, it couldn’t even play MP3s because of all the crap that was on it.
Ugh, 32-bit is a huge headache. In many cases going to 64-bit can allow simplification of operating systems and software. With 32-bit there’s always the possibility that a process *might* blow out the 4 GB virtual address space you are stuck with.
(One good example: how many digital camera photos do you have? What if I decided to write software that just mapped all your photos into virtual address space? Oh sorry, that doesn’t work on 32-bit!)
With more cores, the power is there when you need it. As someone mentioned, things could start to happen in the background without the user knowing it. If what’s in front of the user runs on one processor, then you can use the other one (or however many) to do other stuff.
(Or then again, what about that whole bunch of digital photos sitting on disk? Might be nice to thumbnail them a bit faster, since they’re directly addressed in virtual memory and now the I/O throughput is substantially higher…)
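(A minimal Python sketch of the photo example, just to make it concrete: memory-map every photo in a directory and keep the mappings around. On a 64-bit system the address space is effectively unlimited for this; on a 32-bit build a large enough collection will eventually fail once the roughly 2–3 GB of usable address space runs out. The directory path is made up.)

    # Sketch: map a whole photo collection into virtual address space.
    # Comfortable on 64-bit; on 32-bit the address space runs out quickly.
    import mmap
    from pathlib import Path

    photo_dir = Path("/home/me/photos")     # hypothetical location
    mappings = []
    total_bytes = 0

    for jpg in photo_dir.glob("*.jpg"):
        with open(jpg, "rb") as f:
            m = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
        mappings.append(m)                  # keep the mapping alive
        total_bytes += len(m)

    print(f"mapped {len(mappings)} photos, {total_bytes / 2**30:.2f} GiB of address space")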
As long as the new stuff is still cheap, I welcome it. Now it’s time to get rid of this old, trashy instruction set based on the old 8086 and get one that doesn’t need to suck up die space and power just to decode it into something usable! The space saved from the instruction decoders can go into more cores!
Disclaimer: I’ve been working with 64-bit Linux for almost 3 years now; at my previous job we had deployed it live on a fully production 24/7 website shortly after.