This is the first in a series of articles about the history of Java on the Desktop, from my perspective as a developer who started working with Java in the late '90s. I'm writing this partly as background for why I created jDeploy, a developer-friendly desktop deployment tool for Java. Despite the ominous tone of this article's title, I believe that Java is a compelling platform for modern desktop applications. Stick around for the whole series to find out why.
Isn’t Java still one of the languages every aspiring programmer learns in school?
I can't stress enough how much runtimes were hated by us users back then (Java and .NET). Not only did you have to download the runtime (which was a big deal back in the dial-up days), but you also had to re-download it every now and then in order to get those "updates". All that just to run a single app.
Which made every user scream in exasperation: "What makes your app so special that it requires me to install a special tool to run it? Why can't you take the time to learn whatever it takes to ship a real app? Is your time so precious that hundreds, if not thousands, of users have to install a special tool over dial-up so you don't have to learn whatever it takes to ship a real app?"
At least, these were my thoughts back then.
It’s the reason Paint.NET is so little known despite being an excellent application.
For .NET, Microsoft "fixed" this problem by integrating .NET into Vista and above (which means you had it installed whether you wanted it or not), but Java didn't have such an advantage.
Also, all this was entirely avoidable. Sun could have made NetBeans export a Windows native binary, a Mac OS X native binary, and a JAR for everything else, but they just had to pretend all OSes in the world were equal so they could pretend Solaris workstations were relevant on the desktop (despite the fact that they weren't, because they cost as much as a car).
Another pet peeve of mine regarding Java on the desktop is how ugly the default theme was (and is). If Swing can't do real native, at least make a non-native theme that looks good. It's not like all applications back then used the native theme (see PowerDVD, WinAmp), but their custom themes weren't ugly.
That said, Java is big in the business world (client-side and server-side) because its libraries and frameworks (like Spring Boot) are very good, so you should learn it anyway; it opens many employment opportunities.
I always found the JVM to have terrible startup time, and in how it manages memory it acts more like VirtualBox than like Python, Node.js, etc.
For those reasons, I always used Python with py2exe when I wanted a self-contained Windows program.
I also agree with kurkosdr on the nativeness, which I solved in my Windows days by using wxPython for portable GUIs.
Once I switched to Linux, I definitely wasn't running Java, because, whether it's AWT, Swing, or SWT, I've never found a Java GUI that didn't feel at least a bit sluggish on X11, and that's before you throw in the configuration tweaks you need to make it work on a non-reparenting window manager.
(Hell, at one point, the JRE flat-out wouldn't work on multi-head systems unless you hex-edited the binary to change the string "XINERAMA" into something else… I'm pretty sure that was covered on the Gentoo Wiki back when it held the role the Arch Wiki does now. "Second-class citizen" is an understatement.)
…and that's before you consider the angle Eric S. Raymond blamed as one of the reasons non-applet client-side Java failed on Linux: Python makes it easy to call native POSIX APIs, while Java makes it awkward to step outside its 'lowest common denominator' take on portability.
As for applets, Flash was just a superior experience all around, if for no other reason than that it let you structure your applet so some code could be running and usable while the rest downloaded… even if most people just used that for custom loading screens.
Hubris-induced death of a thousand cuts is my verdict.
I actually agree with the 'lowest common denominator' take on portability, because once you allow "platform-specific" features in the source code, you lose source-code portability. So it should be made awkward, so that it's only used when absolutely needed.
The problem with Java was the idea that a single executable should run everywhere, which on the desktop was a solution looking for a problem to solve. But you see, Sun management wanted to pretend that Solaris was relevant on the desktop despite its tiny market share, and, in their utopian vision of the future, all apps would be JAR executables, so an OS with a 0.1% market share would be equal to Windows and Mac OS in terms of available software. The idea of Windows and Mac OS running native code and everything else running non-native JARs was downright offensive to them.
(also, sorry for the typos in the original comment, timer expired)
kurkosdr,
I agree with your other points and like you, I found Java Swing & AWT to be pretty bad as well.
But I disagree with you here. The ability to run class files is very useful. Being able to run software without regard to the local computer's OS and architecture was very nice for both users and developers! You start to appreciate this when you work in heterogeneous environments. It's what enables Android apps to be portable. Distributing applications as native binaries should be a thing of the past IMHO: it severely impedes portability and new architectures, solidifies the x86 monopoly, and even encourages developers to target suboptimal generic CPU profiles so that more users can still run the software. These binaries are extremely counterproductive for those with new CPUs, and nobody wants to have binaries for each sub-architecture.
Bytecodes are a very good solution to all these problems, and they have evolved to the point where they are JIT-compiled to native code and run extremely fast. It's thanks to these developments that modern JavaScript engines, PHP engines, etc. are so fast and portable. The intermediate bytecodes of Java and .NET are fast today too, but I'm not a fan of either of them due to the politics behind them.
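(To make that concrete, a minimal sketch: compile a class once, and the same .class file runs on any OS and architecture that has a JVM.)

```java
// Hello.java -- compile once with `javac Hello.java`; the resulting
// Hello.class runs unchanged on Windows/x86, macOS/ARM, Linux/RISC-V, etc.
public class Hello {
    public static void main(String[] args) {
        System.out.println("Same bytecode, running on "
                + System.getProperty("os.name") + "/"
                + System.getProperty("os.arch"));
    }
}
```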
Of course, Sun never effectively realized its goal. They lacked control over platforms, which made for a bad experience, and to make matters worse Microsoft invoked its embrace, extend, extinguish strategy, giving users a notoriously terrible runtime while coming up with their own .NET ripoff. Incidentally, I actually liked .NET, but it was still a blatant ripoff.
The problem there is that, from what I saw, much of Java’s API work leaned more toward AWT than Swing.
“It’s not supported on all platforms? Force the developers who consider it non-negotiable to either use something other than Java or resort to some third-party JNI shim rather than taking responsibility for providing a higher-level API abstraction for common core functionality like many parts of the Rust standard library now do… and a lot of Linux developers decided to just use something else.”
Do I love things that are cross-platform portable? Yes… but it has to be in a way that actually appeals to developers, rather than just relying on marketing to management to force the issue. That’s a big part of it. Not a lotta management forcing the issue in the Linux ecosystem… especially back then.
Nope. I stand by my original assertion that, in the consumer space, bytecode runtimes are a solution for a problem nobody is looking to solve. An OS typically consolidates on a specific ISA, and everyone buying or selling hardware for that OS makes sure it uses that architecture.
Android is a good example of this: Google tried to force a bytecode runtime so they could potentially move to another architecture in the future (for example MIPS64 or RISC-V), but developers ended up using native code for things like games and software codecs anyway (hint: performance is never enough, and bytecode will always be performance-hobbled compared to native). So any x86 or MIPS hardware running Android had to emulate those native apps, and the industry still had a reason to consolidate on ARM, because other ISAs offered less performance for the native apps.
Basically, with bytecodes you are resigned to always running something like Rosetta, and you always take that performance hit, just in case you need to move off your current ISA in the future (like Apple had to do with PowerPC). That would work in a utopian environment where people care a lot about lock-in, but not in the real world.
Basically, ISA lock-in sucks, but bytecode interpreters suck even more. A royalty-free ISA like RISC-V is the only realistic option for avoiding lock-in sometime in the future.
BTW Java has the additional problem that users have to maintain the runtime. That’s why I think not offering the option to compile Java to native was a missed opportunity. Instead of getting a nice middle ground of having native binaries for the two most popular OSes and JAR for everything else, most devs just stuck to using C++ with Qt (even for relatively simple apps) and only produced native binaries for the two most popular OSes.
BTW, before anyone starts talking about ahead-of-time compilation or JIT caches, I want to say that native binaries for things like games and software codecs can have assembly-written snippets and also make use of heavily-optimized compilation (the kind of potentially-breaking optimization options that can only be enabled by the developer).
So, even in the Android utopia where use of bytecode is encouraged for most apps, you still have those few apps that use native (games and media players with software codecs).
tl;dr: Only RISC-V can save you from ISA lock-ins (sometime in the future), not bytecodes, deal with it.
Here is a good example btw: https://www.theregister.com/2014/05/02/arm_test_results_attack_intel/
kurkosdr,
The assertion is wrong. I accept that bytecodes have cons as well, but they do in fact solve real problems that "binaries" cause, including those I outlined earlier.
I'll give you that early bytecode platforms used to perform badly. But their performance has improved to the point where that reputation is unfair. When you actually measure optimized code, the gap has mostly been closed.
You mention games and codecs, but it's typically the OS/GPU doing the heavy lifting for those rather than the application. (Not to get too far off track, but even the high-performance shader instructions sent to GPU drivers are their own kind of bytecode, because they need to be recompiled for different GPUs.)
There is a big difference. It's actually tricky to emulate machine code because it wasn't designed to run on other architectures. You're taking machine code that was already optimized for x86 and its weird quirks, and then you have to try to derive the intent of the underlying code without access to the source. You end up having to emulate the features as they were optimized for x86 rather than how they would be optimized for the new architecture, which can impede the use of native optimizations.
On the other hand, when you have a purpose-built intermediate bytecode, you have crystal-clear program instructions before any architecture quirks and optimizations are applied. This not only makes the task a lot easier, it also closes the gap with native performance. Furthermore, some native build processes are already generating binaries derived from bytecode anyway! Consider the LLVM compiler suite's IR (and .NET's equivalent), where all languages are compiled down to an intermediate bytecode representation before being compiled to native. If you distribute that bytecode, then you can effectively build native code that targets virtually every architecture. This bytecode isn't some gimmick or watered-down variation of what native gives us; the exact same native binary that you would have gotten can be derived from the bytecode itself!
This solves all of the problems with native binaries that I brought up in my comment earlier. The main con is the additional startup time to compile the code on the target, but there are many ways that JIT platforms have mitigated this.
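(To make the "crystal clear instructions" point concrete, this is roughly what the JDK's javap disassembler shows for a trivial println call; the constant-pool indices are illustrative and vary by compiler version.)

```
$ javap -c Hello
public static void main(java.lang.String[]);
  Code:
     0: getstatic     #2  // Field java/lang/System.out:Ljava/io/PrintStream;
     3: ldc           #3  // String Hello
     5: invokevirtual #4  // Method java/io/PrintStream.println:(Ljava/lang/String;)V
     8: return
```

Every instruction states its intent directly (push this constant, invoke this method); there is no register allocation or x86 quirk for a translator to reverse-engineer.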
Yes, and I concede that a bytecode not being supported as a first-class citizen on the platform can be frustrating. This was a problem for Sun, and Microsoft pushing a poor-quality JVM made things even worse, even when it wasn't Sun's fault.
kurkosdr,
But that has less to do with performance and more to do with the fact that developers want to reuse existing C code, C being the de facto language for systems programming (for better or worse). For developers it's easier to use production-ready, optimized C code than to rewrite something for Android.
I am pretty sure that Chrome's AV1 decoder on my 2014-era laptop does not run on the GPU. That's the kind of case where you need assembly: to get those hotspots that take 80% of the time (inverse DCT, macroblock calculations, sub-pixel motion compensation, loop filter) under control. Also, games make use of either assembly or potentially-breaking optimization levels (like -O3) for things like physics code.
That’s an issue I completely glossed over (because it supposedly doesn’t exist in our utopia): tons of C and C++ code in codecs and in game libraries.
So, you need assembly for those two kinds of programs (and that's without taking into account JavaScript compilers, spreadsheet macros that also need speed, or even just old code).
So, you do need to provide the ability to run native code after all (Android learned this the hard way; it's one of the reasons we all pretend Android 1.x never existed). And once you have provided the ability to run native code, a certain ISA will be at an advantage over the others.
So, bytecodes are more like insurance you hope you don’t have to cash out (I mean, that’s what Android does with its supposed ISA-agnostic-ness) than a real strategy to achieve ISA-agnostic-ness. Which means the market won’t care about them and will consolidate on the ISA that has the advantage of running those native apps directly. Which means bytecodes won’t save you from real-world ISA lock-in.
Did I mention RISC-V already?
kurkosdr,
Well, I read that, but it didn't really change my opinion. It reinforces how serious the cons of native binaries become when it comes to architectural competition…
Native binary translation is just barely passable as an interim measure, but for sustaining alternative architectures I think you’ll agree it’s just a no go. You need better performance than machine code translation can offer. And yet, as the article highlights, most app developers won’t even bother providing native binaries for non-dominant alternatives. 🙁
kurkosdr,
But we should not be glossing over that point because it is by and large the biggest hurdle…
I don’t object to the assertion that the industry is unlikely to change its ways. I was merely objecting to the assertion that bytecode doesn’t solve some of the real problems we have with native binaries. Your link highlights those problems.
Spot on. Add to that the fact that, especially as the years went by, that Java updater liked to bundle other things with it that you didn’t want (Ask Toolbar anyone?). Essentially you ended up in a situation where if you didn’t update the Java runtime, you were vulnerable to security issues in it. If you did, and you weren’t careful to download the right package manually, you ended up with adware and the security issues inherent in having that garbage on your machine. Most users didn’t know how to deal with it, didn’t want to deal with it, and just stopped using Java applications and applets on their machines as a result.
Well, it doesn't actually solve the problem. It's like when you ask a professor how to solve the issue of compatibility in documents and presentations (OOXML vs ODF) and he answers "everyone should use LaTeX". Just, no. It ain't gonna happen. No OS vendor (not even Google) can afford to do without the performance benefits of decoding AV1 using native code (including assembly bits) or of allowing the use of C/C++ code for games. Because if they do, they will lose the market (Google almost made this mistake with Android 1.x). And once you allow native code, your bytecode becomes an insurance policy you hope you don't have to cash out, which means you probably won't. As long as the products are kind-of-okay, you will keep using the established ISA.
This is why I said it's a "solution for a problem nobody is looking to solve". Nobody complains that their native-binary AV1 decoder is too fast or drains too little battery, or that their game's FPS is too high, wishing they could trade those for more ISA options.
The goal should be to have an established ISA that is royalty-free; then maybe you are onto something.
kurkosdr,
Using that logic you could argue that Rust can't solve memory issues, or more dramatically that seatbelts don't solve the people-flying-through-the-windshield problem… It's one thing to say we're not going to solve the problem because people aren't going to use the solution(s), but that's not the same as saying the solution doesn't work. The fact is they DO work; the bigger challenge is getting people to use them.
Incidentally, it almost seems silly that there was a time when many people did not wear seatbelts for their own safety before laws mandated that they do. We ought to agree that changing people's behavior can be a challenge even after we have solutions that work.
I disagree with this too, there are a lot of people who wish things would just work everywhere and I think they should, but native binaries have been an obstacle. I made this point earlier, but even when running native binaries, those binaries are often not tuned for your processor.
kurkosdr,
Speaking of solutions to problems that people have… Native binaries are a huge problem for the alternative ISAs that you'd like to use. With a portable bytecode you could switch architectures and keep on trucking. But because applications are being distributed as machine-specific code, you can't just switch, since you are dependent on all your vendors publishing new versions of their software for your specific architecture. You've created all the elements of a classic catch-22: application developers aren't going to bother building for unpopular architectures, yet architectures aren't going to become popular without those applications.
Ergo, the widespread use of bytecodes would benefit even you. By rejecting them as a solution, you are actually helping create the very problem that keeps alternative architectures from being viable.
Don’t get me wrong, I think your resistance to change is quite representative of the industry at large, but I also think it’s counterproductive.
First of all, things will not "just work" everywhere, because there are other incompatibilities higher up the stack, such as incompatibilities in OS, third-party services, and app stores. Take for example the Steam Deck, which has a run-of-the-mill x86 processor but has incompatibilities with Windows games due to a different OS; or take Amazon's Android devices, which offer the same basic APIs but have different location-services APIs and a different app store. These differences are enforced by marketing and strategy, so bytecode won't make things "just work" everywhere.
But even if we assume the same OS moves between architectures (let’s take MacOS for example), the fact Apple doesn’t want to solve this problem should tell you something. Apple prefers the here-and-now performance and battery life benefits offered by native AV1 decoding in Chrome and native execution of filters in Photoshop or Premiere over ISA independence.
I get your frustrations with ISAs, but I am of the opinion that there is never "enough" performance (our needs grow with the hardware). What we need is a royalty-free standard.
kurkosdr,
I'm well aware of that. It's very hard to change the path we are on, because there's a lot of energy invested in the way we have things. That's the one point we've consistently agreed on. However, I think if we were already using bytecodes as the norm, then no one would be demanding that vendors abandon portable bytecodes in favor of native binaries. It would be nonsensical: it buys us nothing and would just introduce the problems that we have today.
I don't buy your argument. Bytecodes that compile to native using modern, sophisticated compilation engines don't have a performance problem. I already pointed out earlier that you can even use the same bytecode that's being used to produce native binaries. You could even use the IR bytecode from the LLVM compiler that Apple uses.
To be clear, I can acknowledge that operating systems may have a reason to go beneath high-level languages, but for the vast amount of software we're talking about there just isn't a benefit in bypassing the compiler. Heck, even in Linux the proportion of assembly code is absolutely minuscule.
I say we deserve both and technically bytecodes are an important step to making us significantly less dependent on specific CPU architectures.
Well, the lowest common denominator would be fine if Java had kept up to speed with desktop developments.
It’s been a long time, but I remember trying to build a Java desktop app back in the day and I wanted to have a windows system tray icon that you could action on.
There wasn't a Java interface for this. Now you can of course say that a system tray icon is "Windows-specific". But it's a huge market that you can't really ignore, and it's not an obscure use case; the idea of a context menu for a minimized state isn't odd. I remember being really frustrated at that and having to go find some native libraries to do it. Why couldn't they just add a "supported" flag or something to the call, so an application could check for it? They did finally add it, but to me they just didn't keep up with the times quickly enough.
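(For the record, the API that eventually arrived in Java 6 works about like this; a minimal sketch, with "icon.png" as a placeholder image path.)

```java
import java.awt.AWTException;
import java.awt.MenuItem;
import java.awt.PopupMenu;
import java.awt.SystemTray;
import java.awt.Toolkit;
import java.awt.TrayIcon;

public class TrayDemo {
    public static void main(String[] args) throws AWTException {
        // The "supported flag" asked for above: since Java 6,
        // java.awt.SystemTray lets an app probe for tray support at runtime.
        if (!SystemTray.isSupported()) {
            System.err.println("No system tray here; fall back gracefully");
            return;
        }

        // Context menu shown when the tray icon is actioned.
        PopupMenu menu = new PopupMenu();
        MenuItem exit = new MenuItem("Exit");
        exit.addActionListener(e -> System.exit(0));
        menu.add(exit);

        // "icon.png" is a placeholder path for the tray image.
        TrayIcon icon = new TrayIcon(
                Toolkit.getDefaultToolkit().getImage("icon.png"), "My App", menu);
        icon.setImageAutoSize(true);
        SystemTray.getSystemTray().add(icon); // throws AWTException on failure
    }
}
```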
As a general note, they really suffered from poor OS integration. I'd say .NET also suffered from this, which is really weird because even in Windows, .NET was integrated poorly. If you tried to run a .NET application in Windows without .NET installed, you got some obscure error message. One might think Windows could detect it as a .NET application and suggest or automatically download the correct .NET version. Apparently not. But Java too could have been much better integrated in terms of downloading and supporting different versions and setting all those startup flags (memory, etc.). It really was a poor experience. They just didn't put in the platform-specific work to make it viable.
Yamin,
+1, I agree with all the points you are making. Bit of a shame that things turned out this way really.
In terms of Microsoft, I think it happened because Microsoft acts as a bunch of smaller entities competing with one another, and they don't function well as a whole. They've built tons of frameworks to replace Win32, but I've always felt this sense of "do as we say, not as we do". They were not dogfooding their own products.
The article's title is fundamentally correct, given the historical desktop specificity, and the wider premise is correct as well.
Java, when available on a modern platform, is very high-performance and often comfortably outperforms popular equivalents. I suppose that's because it is mature and evolved; of course some will disagree.
Now, even in my current, slightly myopic IoT-focussed world, I'm starting to see Java use pop up in many applications, particularly when gadgets gain enough grunt to crunch data using established Java libraries. Which is a bit ironic for me, because for the bulk of the last couple of years I've been working in C, Rust, or even ASM to do stuff I would have previously done in Java, simply because Java was too bloated for the hardware. All of a sudden, for many of those niche applications, I can just go back to using the established Java stuff. I should have spent the last couple of years on holiday and just waited for Moore's Law to bring IoT up to speed!
I would say that Java (and .NET too) is interesting on the server side because of its exceptional tooling. It has a high-quality choice of libraries for pretty much anything, and you can deeply observe in production what is happening to your application while still keeping adequate performance.
I'm curious for part 2 or 3. The current status is actually quite bright if you ask me. Of course Java is old and has its ugly aspects, but despite this there are still plenty of improvements going on. JavaFX has a steady following and is actually quite fun to work with. Using MaterialFX (https://github.com/palexdev/MaterialFX) it looks damn good as well.
And since in the last few years we've seen a rise of desktop applications built with web tech and Electron, I think JavaFX deserves a spot here as well. I would choose Java over JS any day.
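(For anyone who hasn't tried it, a JavaFX app takes very little ceremony; a minimal sketch, assuming the JavaFX modules are on the module path.)

```java
import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.control.Label;
import javafx.stage.Stage;

public class HelloFX extends Application {
    @Override
    public void start(Stage stage) {
        // A Scene holds the UI graph; a Stage is the top-level window.
        stage.setScene(new Scene(new Label("Hello, JavaFX"), 320, 240));
        stage.setTitle("HelloFX");
        stage.show();
    }

    public static void main(String[] args) {
        launch(args); // boots the JavaFX runtime and calls start()
    }
}
```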
jDeploy seems interesting, but I'm curious to know what the additional benefits are over jlink-built images. Since Java 9, building platform-specific runtime images is also possible, and thanks to the new module system (JPMS) they are quite small as well.
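(For context, the jlink route looks roughly like this; the module name com.example.app and the paths are illustrative.)

```java
// module-info.java -- declares what the app needs so jlink can strip the rest
module com.example.app {
    requires java.desktop; // Swing/AWT; a JavaFX app would require javafx.controls
}

// A trimmed, platform-specific runtime image is then built with:
//   jlink --module-path $JAVA_HOME/jmods:mods \
//         --add-modules com.example.app \
//         --launcher app=com.example.app/com.example.app.Main \
//         --output image
```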
And of course there is GraalVM, which can compile most Java code to native binaries, resulting in very small binaries and low runtime memory usage. Maybe this can be beneficial for desktop usage too.
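(And the GraalVM route, as a rough sketch: with the native-image tool installed, an ordinary class compiles ahead-of-time into a standalone executable.)

```java
// HelloNative.java -- compiled to a native binary with GraalVM:
//   javac HelloNative.java
//   native-image HelloNative     (emits a standalone ./hellonative executable)
public class HelloNative {
    public static void main(String[] args) {
        System.out.println("Hello from an AOT-compiled binary");
    }
}
```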
I remember that when Swing was released, I decided to try it and perhaps learn to program with it, so I tested the examples in the JDK. Fifteen minutes later I decided to forget it, because they were unusable on my computer, and it was above the minimum requirements.
Whenever I download one of those "runs everywhere" Java applications and it actually works, it is a eureka moment. I have exactly one Java app that I rely on: it digitally signs PDF files. This after 20+ years of trying to parse the Byzantine Java ecosystem. You need Java 2? That means you need to download J2SE 1.2. Or a web browser plugin. Or an SDK. Or a JRE. What once worked no longer does. Kind of like chasing that pot of gold at the end of the rainbow.
JetBrains is Java-based, no? I think a lot of his griping is due to some inexperience at the time. It was very, very normal to use something like InstallShield to build installers that would handle all of the complex requirements of applications for Windows. Maybe he was comparing it to OS X native apps at the time? But seriously, for any Microsoft-built application that didn't use pure Win32 C code, there was at least one runtime library you needed to install, likely many others, and you'd get errors if you had the wrong XML library installed. Oh man, what a fun time, explaining to ticked-off customers that they needed to update their XML library to get the software to work, because the installer wasn't smart enough to figure out which version they had installed.
Java was a world better than dealing with the myriad problems of C++. It was by far a better choice for most at the time. Microsoft's J++ (which was the subject of a lawsuit between Sun and MS) was great: it bypassed all of the problems he listed in the article, but it pissed off the purists and Sun by having Windows-only improvements, which led to C#'s birth.
Yes,
I remember reading the technical design decisions on why they came up with C#, and then it clicked: they more or less solved all the issues I had with Java at the time.
Sun was very persistent in not listening to feedback, and made some really questionable decisions that hindered Java on the desktop. They wanted to keep the basic virtual machine, and optimize for embedded systems, which never took off.
They could have embraced J++ and actually pivoted to a collaboration effort with the Microsoft dev team. It of course meant losing some pride, but Java would have been relevant on the desktop, and Visual Basic would have been retired much earlier. (Yes, the main goal of J++ was to replace Visual Basic with something modern).
Agreed on all, but did I then, or would I now, trust the Microsoft of that generation to keep J++ compatible with non-Microsoft platforms? Absolutely not. That's why they lost the lawsuit in the first place: they agreed to keep it cross-platform and not extend the language, then they immediately did just that. If you don't agree with the terms of a license, don't sign the licensing agreement.
History would have been different if they could have compromised.
The changes were not actually too major. They mainly focused on adding ActiveX support and event mechanisms. (All VB controls are ActiveX controls, even the basic buttons):
https://en.wikipedia.org/wiki/Visual_J%2B%2B#J++_compared_to_Sun's_Java_implementation
Of course, Microsoft's management at that time was very open-source-unfriendly. So your point holds, too.
The simple truth is this: any language or framework that becomes vastly popular is fundamentally good, no matter what its opponents claim! All serve a purpose, all solve a problem, all create problems; nothing is perfect.
Excessive complaints are just another excuse from those who feel the guilt of procrastination.
cpcf,
I assume this was meant sarcastically because we’ve seen some pretty bad languages become popular, haha. Fortunately though most of them don’t pass the test of time.
ASP was absolutely terrible but was instantly taken up as the standard for MS shops. PHP was literally built by inexperienced amateurs, and boy did it show, but it was there at the right time and was well positioned against ASP and Perl (which was too weird for people transitioning away from ASP). Thankfully PHP has since evolved past its roots and is much better today, but this transition happened after it became popular. There's still some legacy cruft which detracts from it, but it is what it is.
I'm so glad that JavaScript won over VBScript in the browser, but honestly I think this had more to do with Netscape's early market lead than anything to do with the language.
ActionScript became popular, but that was because Flash was popular, not because anyone liked ActionScript.
I think Java probably would have done better on desktops if it had been better optimized for GUIs at the beginning. It became popular, but only for server applications.
C is in a league of its own. We all still use it as the standard for nearly every low-level system despite there being wide consensus about its problems.
Enough rambling commentary for now 🙂
Alfman,
Apologies for not being more explicit; the long-term persistence of any language is implied in my post and in what I mean by "vastly popular". Lots of languages enjoy popularity as a trend, but very few endure. C is the perfect example.
As you mention, these languages are what they are.
I'm not a fan of the concept of a universal machine. I firmly believe that niches develop for very good reasons, and we should not try too hard to make any one solution fit all. But I often read that implied in posts discussing various languages: someone wants one language to fit all and touts it as "The Solution". Well, we probably need a language to become like English: a thief, a thief in the same way our DNA collects fragments of viruses over time!
Not every language is good for the desktop. Not every language has to be great for desktop. Java is great for servers. Go GUI programming did not even take off, but I love it for writing command line tools.
We might hate Electron for its memory consumption, but it really allows front-end folks to create great, portable desktop apps.