“Android is fairly unique in the ways it allows multiple applications to run at the same time. Developers coming from a different platform may find the way it operates surprising. Understanding its behavior is important for designing applications that will work well and integrate seamlessly with the rest of the Android platform. This article covers the reasons for Android’s multitasking design, its impact on how applications work, and how you can best take advantage of Android’s unique features.”
The linked page is a 404.
Just remove the \” from the link.
Nice and very informative article. I actually believe that the need for multitasking as we know it should be completely eradicated. When you think about it, opening applications is just a crutch designed to hide the memory limitations of today’s PC architecture. Apps are really just an abstraction, interfaces between the user and data. Ideally, the user shouldn’t even have to worry about opening an app to work with data. We can get there in the future, once there is no difference between short-term (RAM) and long-term (HDD) memory.
I think that’s where Apple (and others, of course) is headed with state-saving on the iPhone and the Dock in OS X, which with each release further blurs the distinction between running and not-running applications.
That’s just wrong.
Multitasking has very little to do with memory (although memory management may be required) and everything to do with concurrency of processes – i.e. faking concurrency by interrupting program execution every now and then to let another process use the CPU(s) for a while.
And yes – it does involve context switching – but that is not necessarily the same thing as memory management.
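For what it’s worth, that interleaving idea is easy to sketch. Below is a toy cooperative round-robin loop in Java – a sketch only, since a real kernel preempts via timer interrupts rather than waiting for tasks to yield, and Task, step() and the slice counts are invented for illustration:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Toy illustration of time-slicing: each Task does one small slice of
// work per step() call, and the loop interleaves slices so several tasks
// appear to run "at the same time" on a single CPU.
interface Task {
    boolean step(); // do one slice of work; return false when finished
}

public class RoundRobin {
    static void run(List<Task> tasks) {
        while (!tasks.isEmpty()) {
            Iterator<Task> it = tasks.iterator();
            while (it.hasNext()) {
                if (!it.next().step()) {
                    it.remove(); // task finished: stop scheduling it
                }
            }
        }
    }

    public static void main(String[] args) {
        List<Task> tasks = new ArrayList<Task>();
        for (final String name : new String[] { "A", "B" }) {
            tasks.add(new Task() {
                private int slicesLeft = 3;
                public boolean step() {
                    System.out.println(name + " runs a slice");
                    return --slicesLeft > 0;
                }
            });
        }
        run(tasks); // prints A, B, A, B, A, B: interleaved, never parallel
    }
}
```

Only one task is ever executing at a given instant, which is the point above: on a single CPU, “concurrency” is exactly this interleaving, and memory management is a separate concern.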
That’s exactly the point: mobile devices are low on RAM and don’t have storage suitable for swapping/paging the way larger computers do. So memory management IS key to serious multitasking on a phone.
By that logic, Amigas did not multitask.
In games and demos they didn’t, because there was no room in memory or CPU budget for both the OS and the game.
It seems rather odd that assigning an order of precedence to each process is now described as “faking” concurrency.
Concurrency of threads within a process – now that’s an even finer-grained scheduling model.
This concept of unified memory for both RAM and HDD is not new; in fact, it has been practiced for years in IBM’s minis: http://en.wikipedia.org/wiki/IBM_System_i#Features
What is the link between multitasking (having several apps running seemingly at the same time, which is crucial for any modern computing) and opening apps (which means, I suppose, loading them into main memory)?
And why would opening apps be bad? Do you want to keep everything running in the background permanently? That would require an insane amount of memory on a computer where, say, the whole Adobe Creative Suite plus Windows Vista plus Cubase are installed. And if you use swap space, you effectively go back to loading the app from the HDD (and hence experience the same lag all over again).
Do you think main memory will someday be large enough that we won’t need slow-but-large storage to hold all the apps on an average computer? I don’t see that happening at the moment, because so far, every time RAM has grown, software has bloated to fill it, in amoeba-like fashion… But who knows, those MRAM and memristor things could change the computing world as we know it…
No. It’s this illusion of a user–data interface, and the unified “data” model that comes with it, that is the abstraction. Apps are needed at the core of an OS in order to separate various pieces of work. That separation enables countless Good Things, including better security, bug containment, hardware allocation to one task at a time, and so on…
Yes, maybe the unified data model should be pushed even further. But it only works if you assume that all applications exist to work with user data, a model with real limitations in modern computing: what about games? What kind of user data do they operate on? Same for web browsers?
But whether or not it catches on, I think that apps (processes) will continue to exist at the core of an OS as a developer abstraction for a long time. It’s a rather simple abstraction, and it works damn well. It’s like threads: everyone seems to hate them, but I haven’t encountered an Occam-style replacement yet…
Well, consider computing a long time ago. HDDs were 20 MB and that seemed huge. If we still built applications the way we did in those days, the 2 GB of RAM in most modern computers would be enormous, enough to keep everything in RAM and only store saved data in non-volatile memory.
Sadly, it didn’t happen. Instead, every time RAM grew, software bloated to fill it, in amoeba-like fashion. DOS fit in a few dozen KB of memory; now Windows XP takes hundreds of MB and its successors even more, even though the purpose of an operating system (helping the user safely work with data and apps, and helping developers write those apps) hasn’t changed that much. And don’t get me started on Steinberg and Adobe software compared to their freeware counterparts (where those exist).
Even if something as fast as RAM and as large and nonvolatile as an HDD/SSD appeared someday, it would stay expensive and unreliable for ages. In that time, RAM would have grown larger and faster, software would have grown ten times bigger, and the old organization would take hold again. At least that’s how I see it.
Saving state cannot be applied to everything, mostly because…
-> Storage memory is incredibly slow. If you have 10 GB of apps resident in RAM + swap, saving it all when the machine turns off and reloading it when it turns on will take several minutes (at a typical ~50 MB/s of sequential disk throughput, 10 GB is over three minutes each way), which is unacceptable.
-> Endlessly saving state is not good for cleanliness. Of course, in an ideal environment where bugs and memory leaks didn’t exist, it would be fine. But the human species is not ready for bug-free coding. If there’s a memory leak somewhere, the OS’s memory usage will gradually increase, to the point where software fills the entire RAM and swapping kicks in, leading to a major performance loss. Sometimes it’s good to close everything, turn off the computer, and go back to a “clean” state, instead of always keeping some mess around in main memory…
-storage memory +nonvolatile memory
I must have fallen asleep for a second…
I know all that, Neolander. I was talking about user interaction paradigms as a whole.
Of course, what they don’t happen to mention is that Google’s method of multitasking also sucks down battery like a fat pig sucks down slop.
If I as a user say “close the email program” but the email program just sits there checking for email every few minutes for the next few hours and my battery ends up dead, there’s only one thing to blame: poor multitasking management.
Google may think software can do a better job, and maybe it can, but it certainly isn’t doing a very good job yet, even as of Android 2.1. Perhaps instead of taking only memory into account for the lifetime of a process/application, they should also take the battery situation into account and let me set a desired battery life. If I want maximum battery life and am willing to put up with slightly slower app startup times, that should be allowed.
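For what it’s worth, Android’s AlarmManager already points in that direction, if apps use it instead of keeping a service alive just to poll. A minimal sketch of battery-friendlier scheduling – MailCheckReceiver is a hypothetical BroadcastReceiver that performs one poll and exits, and the fifteen-minute interval is only an example:

```java
import android.app.AlarmManager;
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;
import android.os.SystemClock;

public class MailSync {
    // Schedule a periodic mail poll without any long-lived service:
    // an alarm fires a one-shot receiver every now and then instead.
    public static void schedule(Context context) {
        AlarmManager am =
                (AlarmManager) context.getSystemService(Context.ALARM_SERVICE);
        // MailCheckReceiver: hypothetical BroadcastReceiver that polls once.
        PendingIntent poll = PendingIntent.getBroadcast(
                context, 0, new Intent(context, MailCheckReceiver.class), 0);
        am.setInexactRepeating(
                AlarmManager.ELAPSED_REALTIME, // does not wake a sleeping device
                SystemClock.elapsedRealtime() + AlarmManager.INTERVAL_FIFTEEN_MINUTES,
                AlarmManager.INTERVAL_FIFTEEN_MINUTES, // OS may slide the exact time
                poll);
    }
}
```

An inexact alarm lets the OS batch wakeups from several apps together, and ELAPSED_REALTIME (as opposed to ELAPSED_REALTIME_WAKEUP) means a sleeping device stays asleep until something else wakes it.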
Agreed. And to me, that sounds like an argument in favour of the “old school” UNIX (and BeOS) way of dealing with mail. Rather than a single end-to-end “MUA” application, you’d have at least two: a viewer/reader application, and a daemon that handles the actual communication with the server(s). So instead of having a full-fledged MUA running 24/7, all you need is the mail_daemon (a rough Android equivalent is sketched below).
IMO, that’s one of the most interesting aspects of handheld devices/platforms: the hardware limitations are leading to the re-discovery of concepts that were previously abandoned as “old-fashioned”.
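To make that reader/daemon split concrete in Android terms, here is a sketch only: an IntentService spins up on demand, performs one fetch on a background worker thread, and stops itself, so the reader UI never has to stay resident. MailFetchService and fetchNewMessages() are made-up names:

```java
import android.app.IntentService;
import android.content.Intent;

// Hypothetical "mail daemon" half of the reader/daemon split: started on
// demand (e.g. by an alarm), it handles one fetch on a worker thread and
// then stops itself, leaving nothing resident between polls.
public class MailFetchService extends IntentService {
    public MailFetchService() {
        super("MailFetchService"); // names the background worker thread
    }

    @Override
    protected void onHandleIntent(Intent intent) {
        fetchNewMessages(); // placeholder for the real server round-trip
        // IntentService stops itself automatically once its queue is empty.
    }

    private void fetchNewMessages() {
        // talk to the mail server, store new messages for the reader UI
    }
}
```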
Pretty sure you have the option of uninstalling that app.
Why would you ever want a non-realtime OS to starve hardware resources? It should provide meaningful trace/report information so users know which app is causing the load, give users a way to terminate misbehaving apps, and provide an interface that lets apps release non-essential resources during scarcity (see the sketch below), but that’s about it.
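On Android, that last interface already exists as the onLowMemory() callback on ComponentCallbacks. A minimal sketch, with GalleryActivity and thumbnailCache as hypothetical names:

```java
import java.util.HashMap;
import java.util.Map;

import android.app.Activity;
import android.graphics.Bitmap;

// onLowMemory() is called when the system is running short, so apps can
// drop anything they can regenerate later while keeping essential state.
public class GalleryActivity extends Activity {
    // Hypothetical in-memory cache of decoded thumbnails.
    private final Map<String, Bitmap> thumbnailCache =
            new HashMap<String, Bitmap>();

    @Override
    public void onLowMemory() {
        super.onLowMemory();
        thumbnailCache.clear(); // regenerable cache goes first
    }
}
```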
Android and iPhone “multitasking” don’t deserve the term. It’s a marketing ploy by non-technical bimbos, applying a well-known technical term to something that clearly isn’t it.
WebOS, Maemo/MeeGo and BlackBerry all do a much better job of providing true multitasking.
I don’t really have time for Android, but as Dianne Hackborn wrote the article, it got read 🙂