“Intel’s C++ Compilers for Linux and Windows smoke GNU C and MS Visual C++ in number crunching benchmarks” the article at Open-Magazine suggests. We are not surprised at all, as we already talked about this on OSNews, months ago.
GCC targets how many systems now?
Let’s see, NetBSD runs on these systems: http://www.netbsd.org/
acorn32, algor, alpha, amiga, amigappc,
arc, arm26, arm32, atari, bebox, cats,
cesfic, cobalt, dnard, dreamcast, evbsh3, hp300, hpcarm, hpcmips, hpcsh,
i386, luna68k, mac68k, macppc, mipsco,
mmeye, mvme68k, netwinder, news68k, newsmips, next68k, ofppc, pc532, playstation2, pmax, prep, sandpoint, sgimips, sparc, sparc64, sun2, sun3, vax, walnut, x68k, x86_64
GCC does more than C/C++; it compiles Fortran, Java, and my favorite, Ada95.
Besides, scientific computing uses Fortran and Ada, not C++.
Where is their fast Fortran compiler, where is their Ada compiler?
GCC is not supposed to be fast; it’s supposed to be cross-platform, free, and OSS.
Intel is just pushing their products by trying to get everybody to use their compiler.
>GCC does more than C/C++, it compiles Fortran, Java, and my favorite Ada95.
GCC’s primary languages are C/C++, and at the end of the day that is what we are examining today: effectiveness on C/C++, not whether a compiler can make good coffee.
If your application is a huge one (e.g. compiling a whole OS, which normally takes hours and hours), or your application requires the compiler to generate maximally optimized code (rendering apps, 3D games, OSes again, etc.), then the Intel Proton compiler can make a huge difference.
Sure, for you and me, and the rest of the gang, GCC does the job well, for free. But if you are a professional, or you like fast code or your time is money, Proton is the way to go. No question about it.
I am sure there is space for both compilers in this world. But trying to underestimate Intel, or to make it look like the big bad wolf in this specific case, is, at the least, moronic.
G++ is hardly the best choice if you use C++. I haven’t tried GCC 3.0 yet, but people say it doesn’t handle C++ well enough yet to replace 2.x.
But as mentioned, it’s good for being cross-platform.
The article seems to say that the Visual Studio .NET C++ compiler is as fast as the Intel compiler, and it’s out now (in fact I’m using it right now).
GCC targets how many systems now?
What % of developers will care about more than 2 or 3 of those systems you listed? Intel doesn’t go after the fringes, especially the dead or near-dead ones. VAX? Atari?? It isn’t worth it to them. Developers will use the best tool they know of available on their preferred platforms. If I want to compile C++ apps on Windows (which I don’t), I’ll use Intel if it is the best Win C++ compiler. If all I can get for my “evbsh3” (whatever the heck that is) is gcc, I guess I’ll have to use that.
I wonder how Intel’s compiler compares with Borland’s free one…
Where is their fast Fortran compiler,
From the *first* paragraph of the article: “And for serious old-line number crunching applications, Intel has a Fortran compiler which employs the same optimization techniques.”
Ok, it’s faster but how small is the generated binary?
In any case, fast is cool!
I wish Palm would offer both GCC and Intel compilers for Intel BeOS!
ciao
yc
Yeah right, Intel only goes after the fringes, except that Intel considers x86 and Itanium to be the fringes. Linux and BSD gain their strength by NOT being tied to a specific arch. GCC is what allows open source to work. Without GCC being freely distributable and targeted at all systems, open source development would end. Intel would love to hamstring Linux and BSD to Intel processors by getting everyone to use an Intel compiler.
Also, GCC’s primary languages are not C/C++. GCC started out only doing C, but it is now a suite of compiler tools. I use the GNAT compiler. GNAT is the version of GCC that compiles Ada; it is written mostly in Ada and uses the GCC back end.
The next version of GCC will have Ada support built right in.
Do they mention what versions of GCC or the VS compilers they used?
Yeah right Intel only goes after the fringes, except Intel considers x86 and Itanium to be the fringes.
What???
Linux, and BSD gain their strength by NOT being tied to a specific Arch.
You think they will be because of Intel? Not likely.
GCC is what allows opensource to work.
Whether or not open source is actually working is a matter of much debate. Besides, I always thought it was the people and philosophy that make it work, not the specific tool. Tools can (and will be) replaced.
Without GCC being able to be freely distributed, targeted at all systems, opensource development would end.
Not much confidence in open source, eh? Don’t you think the philosophy of oss can overcome the loss of ONE tool? And who said gcc is going anywhere?
Intel would love to hamstring Linux and BSD to Intel processors by getting everyone to use an Intel compiler.
This is called COMPETITION and it is good! The oss community had better get a basic education in free market economics and learn to deal with it… by improving their products and marketing them successfully, not by crying “that’s not fair!” when a competitor appears and dares to go after their market.
However, at $499 for the first license, Intel isn’t going to take over Linux software development in my lifetime. IMHO, Intel is after professional developers who program for profit and IT departments writing stuff for internal use. A lot of OSS people seem to have a hard time understanding that most software development is done to make money for somebody. The typical Linux user’s distaste for the idea of actually PAYING for software (into which one or many people poured a substantial amount of sweat, time, and maybe money) is a substantial barrier for Intel. If I start any projects for Linux, it will be with gcc (or a free Java clone) for that reason.
I have lots of confidence in OSS. My point was that Open source software lives by the source. You can’t expect the developers to maintain binaries for every system there is. If they do a good job with portability, then you download the source and compile the app yourself. BUT this requires a compiler that is totally free for everyone to distribute and use. The compiler is the KEY tool for open source; lose the compiler and everything just stops. If people start using and becoming reliant on an Intel or a Borland compiler that only works on x86, then we lose all portability.
If I have to buy a compiler from Intel to be able to get the latest version of fooapp to work then we have become reliant on Intel. Add to that the people running Linux on PPC or whatever would be out of luck.
I have no problem buying software. This same argument always seems to come up, those evil OSS people that won’t pay for some application. Oh for the good old days when everybody had to pay for every program they used.
Seems to me that you are the one that is not wanting the competition.
If companies want people to buy their software, they will have to make it better than what’s freely available. The market will have to adjust to lots of free software.
Also, be careful not to confuse free as in freedom with free as in no money.
If I had to pay the GNU foundation each year to keep GCC free(as in freedom) then I would.
For me freedom is more important than a small increase in speed.
I have lots of confidence in OSS. My point was that Open source software lives by the source. You can’t expect the developers to maintain binaries for every system there is.
This assumes that certain developers actually develop for every system there is. Is this so?
If they do a good job with portability then you download the source and compile the app yourself.
Great…in theory. Do a google search for the reality.
BUT this requires a compiler that is totally free for everyone to distribute and use. The compiler is the KEY tool for open source; lose the compiler and everything just stops. If people start using and becoming reliant on an Intel or a Borland compiler that only works on x86, then we lose all portability.
Good point, but I still say it just won’t happen, and for the very reason you just said.
If I have to buy a compiler from Intel to be able to get the latest version of fooapp to work then we have become reliant on Intel. Add to that the people running Linux on PPC or whatever would be out of luck.
I still say Intel’s compiler isn’t aimed at the people who program oss apps. If they are, they will fail. The oss developer community won’t go there. Linux users already have enough of a problem with dependencies, library versions, etc.
I have no problem buying software. This same argument always seems to come up,
…and maybe there is a reason for that, no?
…those evil OSS people that won’t pay for some application.
Evil oss people? Your words, not mine. What I usually hear is from the oss people about those evil profit-seeking closed-source developers, and the ‘moral superiority’ of oss!
Oh for the good old days when everybody had to pay for every program they used.
Those days never existed. Shareware has been around for a long time, as has freeware. Copy protection was a big thing then, of course, but systems were developed to break it.
Seems to me that you are the one that is not wanting the competition.
LOL! Sounds like I touched a nerve! And you base that statement on what? Am I selling Intel compilers?? This is reality; if the people who develop gcc (and the oss community as a whole) actually begin to feel heat from Intel (which I don’t think will happen to any serious degree), then they will have to improve gcc to meet the challenge. But, if they were to sit on their hands and worry and complain (and they are not…they are always improving), they will lose.
If companies want people to buy their software, they will have to make it better than what’s freely available.
This is a legitimate point for debate. I think it is wrong to assume that “freely available” is better. From my viewpoint, if oss developers want people to use their software, they will have to make it better than what people pay for today. It may be in some cases, and not so in others. In many cases there is no “freely available” alternative to do the job. Name me one such program that can replace CorelDRAW, for example.
The market will have to adjust to lots of free software.
It did that years ago!
Also be careful not to confuse free as in freedom, to free as in no money.
I know the difference, but in practice, most “free software” (open source) is given away. Are there any Linux desktop apps of any significance that are both OSS and sold for money, either as shareware or shrinkwrapped? I’ll bet it is a very small percentage.
If I had to pay the GNU foundation each year to keep GCC free(as in freedom) then I would.
Do they have a way of accepting donations? If so, would you, as an oss supporter, voluntarily contribute?
For me freedom is more important than a small increase in speed.
From the viewpoint of a developer or a customer? People who get software at no charge generally shouldn’t complain about a 30% speed difference, but in some cases, as Eugenia said, it can make a huge difference.
Why is this “freedom” more important than application performance to you? This is not flame bait; I really want to know. Your opinion probably represents others.
Sorry about the confusion caused by my unintentional formatting goof in my last post.
Buzz,
Uh, you make it seem like Intel was evil for releasing a product that makes their processor more efficient. If Honda released a new chip for their Accord that makes it run 50 miles to the gallon, most people would say that’s a good thing. So what if it doesn’t work on a Ford Taurus? Honda shouldn’t be held responsible for that, it’s Ford’s job. Should Intel be chided for refining their products?
BUT this requires a compiler that is totaly free for everyone to distribute and use.
Yes, and that’s why gcc is around. Intel’s compiler won’t make gcc go away.
Nobody is forcing you or anybody else to use it. Intel’s business is selling processors (big market), so I can assure you that they won’t force you to use their compiler (relatively small market). Like I said above, Intel’s compiler won’t make gcc go away. The more compilers available for their product, the better off they are, because the Intel Compiler is aimed at a different market segment than gcc.
Why the “Editor-in-Chief” is calling her readers moronic is beyond me. Looks more like “Editor-is-Weak” to me. H E L L O ?
Because I am unlike any editor-in-chief you have ever seen, Pete. You have known me for more than 2 years now, come on. If I have to say something, I will say it, no matter if I am the president of the EEC, not just the e-i-c at OSNews. If you want me to be the person who keeps a distance, or stays away from the forums in order to play it ‘safe’, or shuts up so as not to create controversy of any kind, you may want to visit CNet, not OSNews. This web site is free for everyone to speak their minds, including the editors.
Cool, Eugenia, and I have grown to love this site. I think people dislike the Intel one because it is not open source? They don’t know what it contains?
Amen!
🙂
The cc in Caldera (SCO)’s UDK, Watcom’s cc, and some other compilers are “easy to get”. I’d like to know which compiler has which features.
gcc is good for my purposes; icc is just a pain to install if I don’t have RH.
I’ll stick with gcc.
If icc is faster now, gcc will be faster next time.
BTW, who wants Intel? AMD and Motorola are more interesting ^^;
I’m all for a faster compiler, but it would be good if the software industry worked a bit more on the source itself. Why gain just 5-10% from the compiler when you can sometimes get a 1000% speed increase with clever code design? A good example is the lookup table; nobody seems to use those these days anymore.
“I have lots of confidence in OSS. My point was that Open source software lives by the source. You can’t expect the developers to maintain binaries for every system there is.”
“This assumes that certain developers actually develop for every system there is. Is this so?”
“If they do a good job with portability then you download the source and compile the app yourself.”
“Great…in theory. Do a google search for the reality.”
I found myself writing a linker for my own OS project. I originally wrote it in C with only two or three architectures in mind, and it worked great until one of my developers bought a system from eBay: a nice Alpha box, 54 lbs of pure pleasure. It turned out that my code wasn’t portable to this architecture. GCC _is_ the tool that allowed me to port it over and let this developer continue helping us out using his latest acquisition.
Sure people develop mostly for one or two systems, but there will always be someone somewhere that will try the app on his exotic system.
Without GCC and its broad portability, anybody owning such a system would be restricted to whatever apps were developed for it alone. GCC allows those people to actually use the latest developed apps.
Quality and portability are the key elements to OSS success.
A good exemple is lookup table, nobody seem to use that this days anymore
Be reassured, it’s still around and widely used. Every time you connect to an IRC server running ircd you are using call tables. I’m pretty sure every time you call a system function in either Linux or FreeBSD you use one also.
The problem with call tables is that you need a reason to use them, like dispatching control to a function based on a specific id. Most software has no use for that. The only place where I could see this used is something like the message types received after an X Window event, where you could use the message id to redirect control to the right handler. The problem with that is simple: to guarantee portability, you must not assume the specific value of a named constant will stay the same over the course of 10 years, nor be the same from one system to another. So yeah, you use a switch() statement and a bunch of case labels.
If either gcc or Intel’s C compiler is decent, then code should be pretty much interchangeable between them and other decent C compilers. That’s what standards like ANSI are for.
It seems Intel has removed the link to the version of their compiler for Linux that was free but came with no support. Luckily I already have it.
Good on Intel. GCC may be free, but it needs some serious work on the quality of the code it produces.
Intel/AMD chips are pushing 2GHz… but need this huge fan/heatsink. And I need to BUY a new motherboard and new memory…
The Intel C/C++ compiler gives a 50-100% speed improvement by optimising with SIMD… this is great for games and multimedia… not much else… and it doesn’t help me when what I really want to do is slightly modify a proprietary application… or let one application talk to another… if only the compiler could make up for the hours/days/millennia STOLEN by that.
Open source lets me make those changes… if I can program in C/C++ (which I can, but most ’puter users can’t), and if I have the application developer standing beside me explaining how things work (which I do NOT). It still takes hours/days/millennia. It’s not very FREE (beer-wise) if I have to SPEND that much time trying to understand it.
I can’t believe that after 25 years (of C and slightly less of the x86) we haven’t come up with some more imaginative and successful ways of using computers to give people MORE FREE TIME.
If we are still using the successors to C and the 8086 in another 25 years, then it will be a defeat for Free Software. Well… at least a defeat for this prophetic rant.
Still, boys need their toys. What would I even do with that free time?? And all that waste must be keeping lots of people in jobs. Scratch all the above.
Faster for its processors, yes, but only for Intel processors. Run the same application on an AMD processor and performance drops. It’s easy to optimize compilation for your own processor, especially when you know everything about it.
Speaking as a professional application developer on one of the largest codebases known to man, performance is a major issue. In the multi-billion-dollar industry of software development, it is easier than you can ever imagine to piss off an end user with lackluster performance. End users don’t care how many lines of code are running in the background. If an operation consumes an unhealthy amount of time (unhealthy being more than the user anticipates), the user becomes extremely irate. This is usually signalled by bald-faced yelling at the computer monitor, accompanied by aggravated mouse movement and clicking. Sometimes users even hurl pieces of computer paraphernalia from their office or cubicle, causing serious physical injury to their neighbors!
In any case, to each his/her own. As already mentioned above, GCC is ubiquitous from a platform perspective. The number one rule of engineering is to use the right tool for the job; am I wrong? So, use VC++.NET or icc on Windows, because it is faster; use gcc on Linux, *BSD, BeOS, or any other platform where you know the packages you’re compiling were designed to be built with it (or icc, if you can get it to work); or use Ada if you feel you just can’t stomach all this modern computing crap. Either way, it is your choice.
If I wanted a “Write Once Run Anyplace” (note the carefully averted copyright infringement) app, I’d probably write it in C assuming GCC as a compiler. If however I wanted to make a bunch of money selling a targeted application to a bunch of Windows users, I’d probably like to stick to a more sure thing. I mean really, if VC++ or icc compiles in an optimization error, you can bet your ass you can call an 800 number and get a hot fix (or workaround) in a matter of days if not hours. If you’re working away with some XXX.YYY.ZZZ rev of GCC and you run into an optimization error which causes an application fault… get ready for news-group-hell, where you’ll undoubtedly be told to upgrade to XZX.YYZ.ZXY.3.1415926… version which behaves completely differently with regard to optimization, and may or may not resolve the issue.
So writing computer programs isn’t an exact science, who said it was? In fact, it isn’t even close! We don’t have to take the FE, nor do we get to engage in OE rituals, and we certainly aren’t eligible for PE accreditation.
For my job, I target a very specific customer. Business employees running Windows 2000/XP are my target, and I want to use the best tools to provide those users with the most performance bang for the buck.
I’m not a C/C++ programmer yet, but a funny idea I had was that since Intel has a C/C++ compiler for its processor, AMD would probably come up with a compiler, and then Motorola, and then more third-party compilers would come into play. Then it becomes a compiler war where thousands of programmers flame each other over their compiler preference, and it becomes as big as the OS war.
Just a funny thought, nothing to be serious about. I’m not saying that would ever happen; for all I know it’s probably already happening. lol.