Performance enthusiasts Jack Shirazi & Kirk Pepperdine, Director & CTO of JavaPerformanceTuning, follow performance discussions all over the Internet to see what’s troubling Java developers. While browsing the Usenet newsgroup comp.lang.java, they came across some interesting low-level performance tuning questions. In this installment of Eye on performance, they dive into some bytecode analysis to try to answer some of these questions.
Is this person retarded? How did they get a job at IBM? This person knows nothing about compilers or VMs or JITs. The speed of pre-increment versus post-increment? Huh? How about advanced compiler optimizations like partial redundancy elimination, array bounds check elimination, inlining, dataflow analysis, threaded code, dead code elimination, constant propagation, instruction scheduling, object restructuring, garbage collection, stack allocation!?
What useful program ever spent more than one hundredth of one percent of its time performing pre-increment operations??
http://www.bagley.org/~doug/shootout/index2.shtml
Check out OCaml. Fast, lean, and fewer lines of code.
Does everything Java does with fewer resources (including lines of code).
Check out Java’s memory usage. Not even funny.
Objective Caml does everything Java does.
pre is faster.
int a = array[++i]*80;
==> load i
==> inc i
==> store i
==> use i in indexing array (i can be trashed)
==> load array value
==> * by 80
==> store answer in a
int a = array[i++]*80;
==> load i
==> use i in indexing array (don’t trash i)
==> load array value (don’t trash i)
==> inc i
==> store i
==> * by 80
==> store answer in a
The (don’t trash i) parts are very important. If the program gets too complex, you may run out of registers and need to store it on the stack for safekeeping until you’re ready to INC it later.
The operations needing to be done are the same; the main difference is that with pre the compiler can change the value and then carry on. With post, the compiler needs to keep the old value around until it gets to a point where the old value isn’t used any more, and then do the INC and store.
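To make the semantic difference behind those op lists concrete, here is a small Java sketch (the array contents are made up for illustration; the constant 80 is from the thread’s example):

```java
// Illustrative sketch: pre- vs post-increment used as an array index.
public class IncrementDemo {
    public static void main(String[] args) {
        int[] array = {10, 20, 30};

        int i = 0;
        int a = array[++i] * 80;   // i is bumped to 1 first, then array[1] is read
        System.out.println(a);     // 1600, and i == 1

        i = 0;
        int b = array[i++] * 80;   // array[0] is read with the old i, then i becomes 1
        System.out.println(b);     // 800, and i == 1
    }
}
```

Disassembling this with `javap -c` shows javac emitting the same number of bytecodes for both forms; only the position of the `iinc` relative to the index load differs, since `iinc` updates the local variable slot in place. So any “don’t trash i” register cost can only appear later, once the JIT allocates machine registers for a complex expression.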
The only thing he’s proven is that you can’t benchmark a bunch of i++ and ++i. Remember, if they’re on a line by themselves, then the compiler will optimize them both to the ++i code. Any performance difference won’t show up until you run out of registers while using them in complex statements.
That’s pretty… uh… interesting, but useless. As the first poster mentioned, how much time is actually spent doing pre/post increment operations?
What are you trying to say? By that logic, OCaml should be better than C#, VB.NET, Objective C, Eiffel, and Smalltalk too. So why isn’t it as popular as those languages? The answer is clear: it’s not ready for wide-scale enterprise requirements…
Somewhere else I heard someone saying that REBOL is far better than Java! Bullshit! It’s much slower than Java! The problem is that because it does not have a user base as large as Java’s, it’s not tested in all aspects, and its user base does not demand much from it…
Oh my god!
OCaml’s typing system is as weak as JavaScript’s! Sorry, how can such a language be productive? As you know, safe refactoring (class renames, method renames, …) cannot be done in such languages, since the types of the variables are not known before runtime…
This article was very interesting. It answered a trivial question sure, but there are other questions that are not so trivial where the approach would be beneficial.
I’ve never done any bytecode analysis, and I probably wouldn’t have thought to do so either (just like I don’t do machine code analysis for C or C++). Is there a reference for bytecode anywhere?
As for xxx language vs yyy language, just grow up. There is room in the sandpit for everyone.
You are absolutely correct!
I’ve been programming for 20+ years now, and I’ve managed to pick up quite a few languages. For the past 2 1/2 years, I’ve actually been working at a contract site where I’ve had to put all of them (yes, all of them, including 80×86 assembler) to work.
I suppose we all have our favorite languages. My absolute favorite is C. Yes, even though it is a difficult language to master and tends to blow up in your face (at the absolute worst times, like when the client is looking over your shoulder ;-), it is still my favorite.
My next favorite is Perl. I use it for pretty much all my text parsing, and quite a bit of database access as well. Like Larry said, Perl makes easy tasks easy, and hard tasks possible.
I suppose the third favorite language on my list would have to be Java. It may not be the fastest language around (but it is getting better with each JRE release), and it may be somewhat verbose (but not so much, when compared to Cobol or Pascal), but I find it fun to work with.
> OCaml’s typing system is as weak as JavaScript’s! Sorry, how can such a language be productive? As you know, safe refactoring (class renames, method renames, …) cannot be done in such languages, since the types of the variables are not known before runtime…
Ever tried entering ‘let square x = x *. x;;’ into a Caml interpreter? You’ll be surprised what you get: ‘val square : float -> float = <fun>’ — a function signature.
God, I can barely contemplate the mind-job the commercial software industry must have done on you. Argh!
1) Whether OCaml is ready for the enterprise is debatable. The language implementation is top-notch, but it doesn’t have the application server support or the sheer number of third-party libraries that Java does. In the end, the biggest problem is that not enough people know it, because the universities are too busy pumping out Java programmers.
2) Just because there are no type declarations doesn’t mean that the language is untyped. The language is actually very strictly typed. It uses the Hindley-Milner type system, which is a mathematically precise algebraic type system. Java’s type system is static but ad hoc, which means that it is fundamentally flawed. The so-called statically typed Java cannot achieve type safety without runtime type checks. Read the paper here:
http://matrix.research.att.com/vj/bug.html
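The type hole at issue is Java’s covariant arrays: a `String[]` may be used wherever an `Object[]` is expected, so a statically well-typed store can still fail at runtime. A minimal sketch of the runtime check this forces:

```java
// Sketch of the covariant-array type hole: this program compiles
// cleanly, but the store must be rejected at runtime, so the JVM
// checks every array store and throws ArrayStoreException.
public class ArrayHole {
    public static void main(String[] args) {
        Object[] objs = new String[1];     // legal: arrays are covariant
        try {
            objs[0] = Integer.valueOf(42); // type-checks, fails at runtime
        } catch (ArrayStoreException e) {
            System.out.println("caught: " + e);
        }
    }
}
```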
3) Like many modern languages, OCaml uses something called type inference. The compiler sets up a constraint system by inferring the types of variables. For example, doing something like “i = 1 + 1” will cause the type of variable ‘i’ to be defined as integer, because 1 is an integer, and the ‘+’ operation is specified as mapping integers to integers. The compiler will then attempt to solve the constraint equations. If the constraint system cannot be solved, then the program is type-unsafe and must be rejected.
As for popularity, it is well known that the commercial software industry has a habit of rejecting powerful languages for stupid reasons. They rejected Lisp because it was too slow. Now, we’ve got Lisp achieving 90% of the performance of C, and Java hovering at 60%. They rejected a number of languages because they had garbage collection. They rejected Smalltalk because it was easier to retrain C programmers in C++ than in Smalltalk. To this day, the commercial software industry is reinventing stupid XML-based inference engines because they don’t know that languages like Haskell and Clean have those features built in. Occasionally, there are companies that see the light. Google makes extensive use of Python. IBM put a lot of resources into training programmers in Smalltalk. Apple got a whole bunch of Lisp people together and invented Dylan. Apple/NeXT has brought Objective C to the mainstream via Cocoa. Even Microsoft has put some resources into a .NET version of Ocaml (F#). But at the end of the day, hype and backwards compatibility concerns win out, and pioneering companies need to follow the mainstream.
I forgot one more thing: the Europeans seem to be quite a bit ahead of the Americans in this regard. The French university system teaches Ocaml to all its beginning programming students. INRIA, the French research organization, extensively uses Ocaml and Scheme. Ericsson, the telecommunications company, uses a concurrent, dynamically typed language called Erlang to build mission-critical systems. Even in the United States, many universities expose students to a number of different language paradigms. By their second year, our CS students know Scheme, Java, and C. Lots of AI courses are taught in Prolog and Lisp. At Caltech, the big first-year language is Eiffel (I think it depends on which prof you get), and at MIT, it’s Scheme.