“$2,900, sans monitor and speakers, with 512MB of RAM, an 80GB hard drive, a 256MB Nvidia GeForce 6800 graphics card and a DVD-ROM drive.”
Seems a little expensive for the Dell Dimension XPS.
I bet it would be possible to DIY a smoking dual Opteron computer for the same price.
At $2,900, I would rather opt for a dual 2.5 GHz G5.
Great, so finally you have 2 cores: one running all your anti programs (anti-virus, anti-spyware, anti-god-knows-what) and one so you can actually work... so effectively you have good use of a single 3.2GHz P4… wow!! At a $3000 price tag, that’s a bit steep…
On a side note, if you can do Word, Excel, etc. with a PDA, or even with a Nokia Communicator (in my case), why on EARTH are Windows, Mac, Office, etc. etc. so “#%”#%!” bloated and huge?
Imagine the speed of your machine, or the responsiveness, if the total memory footprint of the GUI, the applications, etc. were 32MB…
Yeah, I know… drivers, DirectX, etc. So what? It would be perfect to make these modular and load them IF you need them, not at all times.
I know on Linux there is TinyX etc., but I don’t think it would work fine with a full desktop and accelerated drivers…
my 2 cents
Dual-core systems are mostly suitable for server applications. I do not think they will provide major gains in apps like games (app speed might even be lower due to the lower clock frequency). Power consumption is also ridiculous. So, this is more of a marketing thing. However, the dual-core Opteron is another story; I think this is a chance for AMD to gain some market share in the low-mid end server market, since Intel will not provide a contender until 2006 IMHO.
“The chips represent Intel’s latest thinking on advancing PC processors. Instead of driving rapid increases in speed, the chipmaker is now focusing on adding performance (…)”
Is this another way of saying that Intel ran out of gas and couldn’t produce a 4 GHz chip that didn’t fry the motherboard?
This dual-core stuff makes my head spin. However, I think most “gaming, multimedia or professional applications” will run just fine on a cheap system such as the following:
Dell Precision 530 with dual Xeon 1.5GHz, 512MB, 40GB, CD, ATI Radeon LE, Win 2K COA, $359 (from eBay).
Granted, it will be necessary to add a bigger hard drive and a beefier video card, but it certainly won’t reach $3,000.
Woohoo. I’m going to go out and get one of these for *SURE*!
There is absolutely no doubt multicore processors can provide major gains on desktops, as long as the apps are written to have multiple threads of execution. Server apps have long been written like that.
But here is something not everyone realizes: software complexity and the cost of software development rise considerably, because except in trivial applications it is considerably easier to develop and debug single-threaded programs than multi-threaded ones. So while having multicore CPUs is a definite advance, not having any increase in CPU clock speed is bad, because developers are forced to make their apps multi-threaded if their apps are going to see any speed increase on the new machines with those CPUs. In fact, this new Intel CPU, as well as the new Pentium D series, runs at a lower clock speed than the top-of-the-line single-core Pentium 4s, so an app which cannot take advantage of more than one CPU unit (which is most apps today) will actually run faster on the latter.
On the whole, this technological advance imposes a considerable burden on software development, and instead of touting multicore capabilities as if they are better than a pure MHz increase, as they do if you have noticed, they must admit it is the other way around: having pure MHz is considerably better. But marketing being the field where they study how to sell used toilet paper, no wonder Intel and AMD have turned on new reality distortion fields, and a lot of people actually believe them.
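To make that concrete, here is a minimal sketch (a toy example of my own, not anyone’s production code; the array size and the two-way split are arbitrary): the same summation done single-threaded and then split across two threads. Only the second version can put a second core to work.

// Build with something like: g++ -std=c++11 -O2 -pthread sum_demo.cpp
#include <chrono>
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

// Sum the half-open range [begin, end) of v.
static std::uint64_t sum_range(const std::vector<std::uint64_t>& v,
                               std::size_t begin, std::size_t end) {
    return std::accumulate(v.begin() + begin, v.begin() + end, std::uint64_t{0});
}

int main() {
    std::vector<std::uint64_t> data(100000000, 1);

    // Single-threaded: one core does all the work, any second core sits idle.
    auto t0 = std::chrono::steady_clock::now();
    std::uint64_t single = sum_range(data, 0, data.size());
    auto t1 = std::chrono::steady_clock::now();

    // Two threads: the work is split in half, each half can run on its own core.
    std::uint64_t part1 = 0, part2 = 0;
    std::size_t mid = data.size() / 2;
    std::thread worker([&] { part1 = sum_range(data, 0, mid); });
    part2 = sum_range(data, mid, data.size());
    worker.join();
    auto t2 = std::chrono::steady_clock::now();

    std::cout << "single-threaded: " << single << " in "
              << std::chrono::duration<double>(t1 - t0).count() << " s\n"
              << "two threads:     " << (part1 + part2) << " in "
              << std::chrono::duration<double>(t2 - t1).count() << " s\n";
}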
I find it logical that Intel and AMD get all the hype around dual-core (they are the market-leaders, in the end), but there’s a company that beat them both to it.
Sun beat them all to it with dual-core.
http://www.sun.com/processors/UltraSPARC-IV/index.xml
Expect octo-cores (that’s 8 cores) in Q1/Q2 2006 from Sun.
Took the words right out of my mouth!
How anyone could spend that much money on an architecture which is unreliable, and probably running an operating system which is unstable, is beyond belief. Of course, there is one plus: just think how fast it will reboot when required to, hourly or daily!
Although expensive, Apple’s current machines and operating system are one combination that’s hard to match in performance and reliability.
available in laptops.
Now that is a nice advance as well.
“Although expensive, Apple’s current machines and operating system are one combination that’s hard to match in performance and reliability.”
I couldn’t agree more!
Rant code, activated…
Well, OK… except for the “expensive” part. Relative to what Macs used to cost (I bought a 9500/120 for $5,000 when they first came out), G5s are downright CHEAP! $3,000 gets you a computer that is so blazingly fast compared to a top-of-the-line 604 processor back then, it ain’t even funny!
They’re only considered “expensive” when compared to crap PCs which are a dime-a-dozen, anywhere, from a billion different manufacturers, 365 days a week! And you can’t even figure out who has the best manufacturing quality control, because the exact same motherboard’s design quality can vary from board to board, even within one manufacturer!
Rant code, deactivated…
Luposian
How anyone could spend that much money on an architecture which is unreliable, and probably running an operating system which is unstable, is beyond belief. Of course, there is one plus: just think how fast it will reboot when required to, hourly or daily!
They aren’t that unreliable and unstable; my Dad’s WinXP-SP2 P4 has been up 20 days, which is better than my Linux box’s (a Sempron 2400+) current uptime of 8 days, though that’s still good enough for desktop use.
That said, there is no way I’d pay that sort of money for a desktop system. I’ll probably wait until dual-core Athlon 64s are a reasonable price before getting a dual-core desktop.
There’s no way in hell I would spend $3000 on a Wintel setup.
It says in the article that this is aimed at gamers/video editors.
If I were a gamer and had $3000 to blow, I’d buy a PlayStation 2, an Xbox, a GameCube, and a PSP. I’d still have over $2000 left to buy games for all four systems.
If I had $3000 for a video editing system, I’d buy the $1999 dual 1.8 G5 Power Mac and pay $1200 for the new Final Cut Studio production bundle.
Cutting edge x86 machines are expensive spyware/adware/virus magnets.
“But here is something not everyone realizes: software complexity and the cost of software development rise considerably, because except in trivial applications it is considerably easier to develop and debug single-threaded programs than multi-threaded ones.”
So what will that mean for F/OSS, when the only motivation is “an itch”?
I would not get a dual-core Prescott from Intel. No matter what. It is stupid. Dual Opterons or FXs are a different story, however.
All that matters is what Intel’s miracle marketing machine says; most people will swallow it. They still try to do as they’ve done in the past: “We have the latest technology, now pay!!!”. Well, let me put some light on the situation here:
First of all, Intel didn’t come up with dual core, but AMD did. Intel is so lame, it ain’t even funny. Here is what happened: years back, AMD realised that it would become harder and harder to raise clock speed and keep some efficiency in manufacturing process/cost/power consumption, so they were looking at dual core. Intel, on the other hand, wanted to raise GHz more and more so Joe Sixpack would buy their product, until they f###ed up with Prescott so badly. That was a step backwards from Northwood. AMD made their plans public years ago for x86-64; of course, Intel didn’t want to hear about it because it would hurt their Itanium. Of course AMD pulled it off, but the strength of the Athlon 64 doesn’t come from that 64-bit instruction set, but from a reworked K7 core. Keep it simple, that is. All that x86-64 is, it’s a way to handle huge amounts of memory, especially virtual memory, so in the future when we wanna play Unreal Tournament 2008 or Quake 7 we can actually use more than 2GB at one time, because the other 2 belong to the kernel… won’t get into that right now.
So back to square 1: AMD realized that it needs to increase parallelism to increase performance without increasing manufacturing cost too much, so they thought about dual core. Now that Intel has messed up, they have taken 2 technologies that AMD came up with: dual core and x86-64. Neither of these is revolutionary, as others have done it before, but they were in the immediate neighbourhood, easy for Intel to implement, and x86 compatible. So now they come out rushing and screaming about how revolutionary they are.
I had a P4 Prescott, a 530J, and it was junk. Went for an Athlon 64 3500+ (939) with nForce4 Ultra. My system with all the extras + monitor is $1200. Can’t beat that. It boots XP Pro SP2 in 3 seconds, plays all the latest games, needs little power, runs cool. I hate Intel with a passion, as they’ve turned into the MS of processors, aka Chipzilla. When the Athlon 64 dual core comes out, they will run cool and smoke the Prescott-based dual cores, and will be a lot cheaper. Plus they will be compatible with current mobos, which is a plus. And no, DDR2 doesn’t pay off. Intel is asking $55 for its 955X chipset; add to that manufacturing cost and all that, and see how much the 955X mobos will be: over $150.
Ranting off.
Wolf Man.
Intel’s CPU specs do not measure up to anything near AMD’s specs. I would rather build a dual-core AMD Opteron or Athlon 64 system.
…if it would run on one. It would go so fast that it would probably start going backwards in time!! You’d get your work done before you even started it.
Good points on bloat. I agree. Shoving the toilet into the kernel is crap.
Linux should be getting smaller and faster, not more bloated. KDE and Gnome suck donkey balls.
Let’s make X Windows smaller and faster.
Let’s make window managers smaller and faster.
Let’s make kernels smaller and faster.
First of all, Intel didn’t come up with dual core, but AMD did.
Did they really, or did you mean between Intel/AMD?
Multicore has been around in the server/embedded world for some time. Broadcom has a quad-core MIPS communications CPU out right now, for example (the BCM1480).
Intel have messed up, but have not messed up big time the way a lot of people make it sound. I have used one of the new Prescotts for several months and it is a very quiet machine most of the time, even under heavy loads (continuous 100% CPU use). The Prescotts have problems, but the problems are not great, and they are still quite powerful machines in comparison with the AMD64, and even have SSE3 support, which AMD will only introduce now. Granted, for the latest games the AMD64 is better, but not a lot of people play the latest FPS games, so people don’t care much about it and it isn’t a real marketing advantage for most people. So if AMD hoped to pull ahead based on gaming performance, they cannot do much unless there is a very significant gaming performance increase, which there isn’t. Regarding the 64-bit advantage, most people don’t even run a 64-bit OS, let alone 64-bit apps, so this too hasn’t been a big advantage until Intel’s introduction of their 64-bit Prescotts.
Intel also have one significant advantage over AMD, not directly in the processors but in the motherboards: they make their own chipsets, so if you are a clueless buyer and go for an Intel board from a random motherboard manufacturer, you are bound to get better system stability than with an AMD board from a random motherboard manufacturer.
Granted, NVidia’s chipsets for AMD are high quality and seem to be very stable, but do clueless buyers know enough to order only them?
For example, in recent years I’ve had 2 VIA motherboards with AMD processors, and with both I was bitten by heinous hardware incompatibility problems, namely VIA’s IDE channel problems. After I switched to a mid-range Intel board with their ICH IDE controller, there were absolutely no problems like that, nor did a second Intel system I bought at the time have any of those problems.
I am not trying to favour Intel, and I do think that the Prescotts have significant problems, but from experience I think the problems are not significant enough to give a considerable advantage to current AMD processors.
“It boots XP Pro SP2 in 3 seconds”
No, it doesn’t, not even close.
“So what will that mean for F/OSS, when the only motivation is “an itch”?”
Regardless of motivation, the cost of development rises considerably and equally for everyone.
Think of it like that: whether you are Intel’s CEO or the Free Software Foundation’s Richard Stallman doesn’t help you one little bit if you need to lift a certain weight with your body. The same holds for using your mind on programming tasks.
However, games are the only widely used software that has performance bottlenecks due to hardware. In other types of software, the performance of current CPUs isn’t needed (I myself use an old Palomino 1700+ and it’s faster than I need it to be).
So if speed isn’t important, the important thing becomes price…
And about the mobos: remember that VIA exhibits the same problems when coupled with Intel (and where I live price is important, so clueless people just go with an Intel CPU (just the CPU), nVidia graphics (no-name manufacturer) and a Creative sound card (the cheapest there is) “because they’re best”, and really don’t care about anything else).
Not completely true. Running BeOS, a number of my programs end up dual-threaded just from writing them. A smaller number become fully multi-threaded very easily, just from the nature of BeOS programming.
Where multi-threading is a real problem is when you are:
A) Trying to convert a single-threaded program to multi-threading. Too often the program is already optimized for single threading in the first place and will not convert as it stands, and you are back at square one, basically writing a new program from scratch but with the added burden that it must be compatible with the original program.
B) The program is single-threaded by nature. Multiple threads may be used to add features, but they do nothing for the original core use of the program. Pushing multi-threading for the sake of multi-threading is a waste of programmer time and effort.
C) You have to support both versions and you are trying to share as much code as possible. For some programs each and every thread is the same code, and whether you run a single thread or multiple threads makes no difference except for the speed the job gets done (see the sketch below). Other programs, however, need different threads to run to get the job done. Trying to maintain single- and multi-threaded versions while also keeping as much code in common as possible is a real headache.
Multi-threading works best when you have programs that respond well to having more than one thread of core code running, the problem can be broken down into parallel jobs, and you can concentrate on writing code that only needs to run as a multi-threaded program.
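Here is a minimal sketch of that easy case (my own toy illustration with made-up names, nothing BeOS-specific): one shared chunk-processing function used by both a single-threaded path and a multi-threaded path, so the thread count changes only how fast the job finishes and almost all the code stays common.

// Build with something like: g++ -std=c++11 -O2 -pthread shared_worker.cpp
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// The same per-chunk code, whether it runs on one thread or many.
static void process_chunk(std::vector<double>& data, std::size_t begin, std::size_t end) {
    for (std::size_t i = begin; i < end; ++i)
        data[i] *= 2.0;   // stand-in for the real per-element work
}

// Single-threaded path: one call over the whole range.
void run_single(std::vector<double>& data) {
    process_chunk(data, 0, data.size());
}

// Multi-threaded path: the same function, called once per non-overlapping slice.
// 'threads' is assumed to be at least 1.
void run_parallel(std::vector<double>& data, unsigned threads) {
    std::vector<std::thread> pool;
    std::size_t chunk = data.size() / threads;
    for (unsigned t = 0; t < threads; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = (t + 1 == threads) ? data.size() : begin + chunk;
        pool.emplace_back(process_chunk, std::ref(data), begin, end);
    }
    for (auto& th : pool) th.join();
}

int main() {
    std::vector<double> data(1000000, 1.0);
    run_single(data);        // same result either way,
    run_parallel(data, 2);   // only the elapsed time differs
}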
I used to be an AMD fan back when they made 486 CPUs; the 486DX-80 smoked anything Intel had at the time. But after believing the hype around the K5 (aka the Skillet), man, those things ran HOT! I lost interest. The K6s and on up were plagued by random weird crashes and lockups; no one made a decent chipset for these CPUs, and those that did had power problems and hardware incompatibilities. Not so much AMD’s fault, but still, they should have produced their own chipset to go with their CPUs. Recently I decided to try their AMD64, the 1.8GHz offering, mainly interested in the integrated memory controller (yeah, at least one part of the chipset is AMD made) and of course the 64-bitness to run Fedora 64-bit. Long story short, the server was retasked to be another game machine running XP, secondary to my P4 2.4GHz HT game machine. In basically the same configuration, the AMD64 machine totally and utterly smokes my Intel machine and is just as stable. My Intel fanboy days are pretty much over; my next game machine will most definitely be AMD64 based.
As it stands now, with the current software typically used by home users and in most businesses, you are correct. Where these dual-core processors will mainly benefit consumers is in the server market and in workstations for things such as film work, game design and research. Since the software I use for animation, rendering, etc. does benefit from multiple processors, the realization that dual-core technology is no longer a pipe dream is exciting.
Now it all depends on actual cost factors between competitors Intel and AMD. A few things that sway me more toward Intel: 1. First on the market to have dual-core processors; 2. These will be EM64T (64-bit) while supporting backwards compatibility with 32-bit applications; 3. The new P4 (or is it P5 now?) will be like having 4 processors, and a dual Xeon workstation/server will be like having 8 processors, due to dual core plus HyperThreading technology; 4. Rumors that Intel plans to release a mobile dual-core version for power-user laptops.
“Great, so finally you have 2 cores: one running all your anti programs (anti-virus, anti-spyware, anti-god-knows-what) and one so you can actually work...”
Blah blah blah. Seriously, it is not that hard to keep a Windows box clean. I don’t even run anti-virus on my Windows box, and if I do, it’s AVG, which is pretty light on resources.
Common sense makes a big difference, and if you are reading OSNews I’d hope you’d have that. I’ve never had a spyware or adware infestation on my machine.
//anywhere, from a billion different manufacturers, 365 days a week!//
Wow … what calendar do you use? I’ll bet you’re ooooold.
“Cutting edge x86 machines are expensive spyware/adware/virus magnets.”
The ignorance in here hurts sometimes… idiots, I tell ya.
It is Really Really Really EASY to keep a clean Windows system. I have several of them at home and at work.
I also have a Windows 2003 system running ColdFusion and MySQL that I use as a workstation every day, and it generally goes about 30 days between reboots (I shut it down when I’m gone for more than a day).
You just had the bad luck of choosing the most popular, but not the most stable and problem-free, chipsets… (I made this mistake myself with my first and second AMD mobos, simply because reviews are based only on performance and the info I needed was difficult to find; I did find it eventually, but most people probably didn’t care or went to Intel.)
Basically:
VIA – avoid
For K6: ALi
K7: AMD (PURE! without a VIA southbridge), SiS from the 735 onwards (but avoid ECS apart from their first, original 735 mobo), and nForce2
@Kev,
Of course it’s really, really easy to do something if you already know how.
Tell that to the 20+ people whose Wintel machines I’ve recently cleaned all the adware/spyware off of.
I guess we’re all idiots for wanting a computer to get real work done.
I worked at a PC repair shop last year for almost a year. VERY few of my customers came back with spyware/virus problems; the only ones that come to mind are the ones that had teenage kids who would look at porn all the time.
So yes, it is simple enough that I could instruct non-computer people how to prevent this kind of stuff within a matter of 5-10 minutes.
If they wanted to get real work done, why are they clicking on ads, visiting shady websites, accepting attachments from unknown senders, etc., etc., etc., etc.…
I keep wondering where all of the trolls are who are constantly crying that the G5 dual-processor Macintosh is grossly overpriced. After all, $3000 for a computer with no monitor, etc. should have triggered an outcry.
Oh well, it is nice to read a thread that is not bothered with such nonsense. Thank you to all.
@Anonymous
What you said about price used to be correct, but AMD’s newest offerings’ (AMD64) prices are on a par with the Intel Prescotts’ prices. It indeed used to be the case with older AMD and Intel CPUs that the former were considerably cheaper (amazingly, about 2x cheaper), and the Palomino that you quote is a good example of a CPU that was comparably fast and considerably cheaper than Intel’s offerings at the time. But with the recent processors from both companies, the price differential is negligible and your argument doesn’t hold.
I trust you are right about VIA’s problems with the P4’s. But you can check the following fact yourself — VIA’s chipsets (yes, they do Intel chipsets, some people think they do only mobos) for Intel CPUs do not hold much of the market in comparison with Intel’s own ones. Intel’s chipsets are far more numerous and known to be very stable. There are also a lot of machines with Intel CPUs AND Intel motherboards (Intel chipset + the rest chosen/designed/manufactured by Intel), that being the most stable combination for which there isn’t a match in AMD land.
So if one shoots in the dark and builds with a random AMD CPU and a random AMD-based motherboard (the key word here is *random*), it is significantly more probable to come across some stability problems than with a random Intel-based solution. And even if “some instability” means only a little instability, it is still a huge inconvenience, because our computing machines are so delicate.
@Earl Colby Pottinger
I wrote “…except in trivial applications”. Writing a BeOS GUI, which by design requires the use of multiple threads, is a trivial case.
Your points A, B and C are correct, as is your last paragraph, but how about the difficulty of making code thread-safe so it can be executed by multiple threads? Is that an easy task? No. Does this task need to be done when you are programming for a single thread of execution? No. How about avoiding race conditions, deadlocks, priority inversions and other thread-related problems? Are those problems easy to debug? Not by a long shot: they are so exceedingly difficult to reproduce that you can barely even figure out what the problem is.
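For anyone who hasn’t hit one of these, here is the classic textbook example (a toy sketch of my own, not code from any of the programs discussed here): two threads bumping a shared counter. Without a lock, increments get lost on some runs and not on others, which is exactly why these bugs are so hard to reproduce; the mutex version fixes it at the cost of extra code and extra care everywhere the data is shared.

// Build with something like: g++ -std=c++11 -O2 -pthread race_demo.cpp
#include <iostream>
#include <mutex>
#include <thread>

int main() {
    long counter = 0;

    // Racy: both threads do an unsynchronized read-modify-write on 'counter'.
    auto racy = [&counter] {
        for (int i = 0; i < 100000; ++i)
            ++counter;                      // data race: increments can be lost
    };
    std::thread a(racy), b(racy);
    a.join(); b.join();
    std::cout << "unsynchronized: " << counter << " (expected 200000)\n";

    // Fixed: a mutex serializes the increments, at the price of locking overhead.
    counter = 0;
    std::mutex m;
    auto safe = [&counter, &m] {
        for (int i = 0; i < 100000; ++i) {
            std::lock_guard<std::mutex> lock(m);
            ++counter;
        }
    };
    std::thread c(safe), d(safe);
    c.join(); d.join();
    std::cout << "with mutex:     " << counter << "\n";
}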
I am currently doing multithreaded programming, and I do know first-hand it requires considerably more effort. I cannot estimate well how much more effort, but it is definitely considerable. But another programmer working on a large codebase, Tim Sweeney of Unreal Engine fame, has dared to give an exact estimate. He said in a recent interview entitled “Multi-core and multi-threaded gaming”:
“Implementing a multithreaded system requires two to three times the development and testing effort of implementing a comparable non-multithreaded system”
He may be wrong, but if he is correct, and there’s good chance he is, that’s 2x-3x more effort for the same program!
Also, John Carmack has stated his negative opinion of the trend for multiple cores for the same complexity reasons.
What I’ve said doesn’t hold true only for the absolute top of the line. However, when comparing the middle to the low end of the line, I can see that prices of AMD CPUs fall much quicker; ergo, AMDs are much cheaper for the same performance.
And about VIA: don’t forget the context, I was talking about the situation where I live.
About VIA… those are mostly problems of the past. I definitely wouldn’t choose VIA, but their current chipsets are at least OK when not using some unusual PCI card (as for VIA on Intel, I didn’t follow how they’re doing there).
“There are also a lot of machines with Intel CPUs AND Intel motherboards (Intel chipset + the rest chosen/designed/manufactured by Intel), that being the most stable combination for which there isn’t a match in AMD land.”
There isn’t a match?
I have had my machine on 24/7 for 3 years (practically; I’m not paranoid about uptime). It’s simply uberstable.
Furthermore, my department recently bought an Intel i865 + P4 3GHz. Interestingly, the admin advised saving work often due to some minor stability issues… (I’m not saying that this is a rule, simply that pure Intel doesn’t guarantee stability.)
They aren’t that unreliable and unstable; my Dad’s WinXP-SP2 P4 has been up 20 days, which is better than my Linux box’s (a Sempron 2400+) current uptime of 8 days, though that’s still good enough for desktop use.
OK, here goes…
The old G3 with 10.2 I’m currently using to edit these comments has a current “uptime” of 58 days 14 hours 38 minutes… and my record to date is a cool 155 days. Had a lightning strike and lost power, otherwise, well, you know!
Next…
Please keep in mind that I also use the x86 platform, however only for FreeBSD.
PIII-500, custom cooling fans, 440BX, Corsair memory, IBM disc, Matrox G550, 3Com, Plextor 48X, and the “way cool” Wave Master all-aluminum case. This is the unit used for testing various operating systems. Although very old, it still works.
@zima
Checked the prices again; you are right. Intel is in more trouble with desktop processors than I thought. Well then, I’ll refrain from any more arguing about Intel vs. AMD, except to say that from direct experience with the Prescott processors, they are not at all the crap many people make them out to be, and they have SSE3, which AMD is introducing only now. In fact, I am glad that I’m wrong about this, because it means Intel will certainly be forced to lower prices eventually.
Regarding Intel chipsets + Intel CPU stability: OK, I shouldn’t have said “there isn’t a match”. That was certainly an overstatement on my part. My point was only that Intel, being the manufacturer of both their CPUs and their chipsets, is more likely to do a great job of fitting the two together, and a lot of people confirm that.
Regarding VIA: I am not sure. Google for “Radeon 9600 VIA problem” (remove the quotes); it is very fresh.
I’d certainly not go with VIA any time soon.
I wouldn’t agree that Intel’s in trouble… even if their CPUs were horrible (they aren’t, they’re just a little worse than AMD right now IMHO), Intel would still ride on the name itself. (And btw, is there any use of SSE3 in commonly used apps already?)
And you’re right, the chances of success with Intel are generally greater (not counting such specific markets as the one here, where everyone goes for the cheapest possible option)… too bad you have to know what you’re doing to get the same kind of stability with AMD (and reviews don’t help at all :/ )
And when I was talking about VIA being mostly nonproblematic currently, I meant their chipsets for the K8… the K7 is more and more dead, after all.
And don’t worry, I’m not planning to use VIA either, in any form, be it chipsets or specialized chips :> (with the possible exception of the Envy; too bad I can’t find a Chaintech card with it here :/ )