The HTC HD2 is probably one of the most enduring mobile phones around. While it originally shipped with Windows Mobile back in 2009, it has become one of the most hacker-friendly devices out there, and hackers have ported virtually everything to it – various versions of Android, MeeGo, Ubuntu, and Windows Phone have all found their way to the HD2. Russian hacker Cotulla, responsible for many of these ports, has just announced the next big one: Windows RT is now running on the HD2.
Cotulla announced this rather bizarre achievement via Twitter, with a stream of photos to prove he got the job done. Both the Metro environment and the plain old desktop run on the HD2, which presumably includes Office RT and the like.
In order to get all this to work, I’ve been told Cotulla had to write his own EFI bootloader – no small feat, and it demonstrates his dedication and skill (as if that were needed, considering he already ported both Android and Windows Phone to the HD2). The fact that Windows RT runs on the HD2 at all is a small miracle – this is a phone from 2009, running a single-core 1 GHz Snapdragon processor with 576 MB of RAM.
We don’t have any video material (yet), and the port hasn’t been released to the public, so we don’t yet know how well it performs. The legality of it all is of course shaky as well, but then, back in the Windows Mobile days Microsoft simply looked the other way and silently approved the ROM community. It would be a nice gesture if they continued to do so today.
I’ve always wanted an HD2, and this news really isn’t helping.
Contrary to popular belief and the everlasting, senseless GHz-and-GB march, a 1 GHz CPU with half a gigabyte of RAM is plenty to run almost any modern OS. The problem isn’t that the system is too weak; it’s that software developers are so inundated with the embarrassment of riches that is modern hardware performance that they’ve become complacent – instead of writing software properly, they succumb to the “meh, just get more powerful hardware” mentality.
Examples of this abound all over the software landscape. Take Office, for instance, a suite whose core functionality has remained essentially unchanged for two decades – yet compare the system requirements for Office 2003 and Office 2013 (a ten-year difference):
Office 2003:
CPU: 233 MHz+
RAM: 128 MB
HDD: 400 MB
Office 2013:
CPU: 1 GHz+ with SSE2
RAM: 1 GB
HDD: 3 GB
Each of those specs has increased four- to eight-fold, and yet most of the changes in functionality were largely under the hood – nothing that would justify the absolute ballooning of the sysreq’s.
Samsung Galaxy Mini – 192 MB RAM, 600 MHz, Android 2.3 – no real problems.
Actually, Office is now very light on system resources – principally because its actual resource usage has NOT grown at anywhere near the rate that hardware has improved.
Ten years ago, I might’ve launched Character Map to grab some symbol that I didn’t know how to type. These days, I just launch Word. It’s just as fast — in fact, it’s faster, because I’ve probably got that character on my MRU list in Insert –> Symbol.
The thing you have to remember is that the stated “System requirements” are very different from actual resource consumption.
It may “require” 1 GB of RAM, but that’s with the OS loaded. Excel 2013 (x86) uses 18 MB at launch. Word 2013 (x86) uses 25 MB. PowerPoint 2013 (x86) uses 27 MB.
It may “require” 3 GB of disk space, but that’s only if you install the fully-loaded everything-bundle. Install just Word, Excel, and PowerPoint and you’ll use around 1 GB – and if you really want, you can delete other components that you don’t need.
It does require a 1 GHz CPU — but probably not because it needs 4x the clock speed. No, it’s because they’ve decided to turn on the SSE2 compiler optimizations. What would be the point of requiring a 233 MHz CPU with SSE2? They don’t exist.
The reason that Microsoft Office “requires” higher specs than it actually uses is simple: because it can. Because PC specs have gotten so good that you’d be very hard-pressed to find one that could not meet them. Because it’s better to overspecify the requirements than to underspecify them. Because it costs more to support someone running a system more ancient than that.
If you installed Office 2013 in a Windows 7 VM, turned off most system services, reduced the RAM to below 1 GB, and then restricted CPU utilization to, say, 10%, I bet you it’d still run.
Just like running Windows RT on an HTC HD2.
To expand on the previous comment a bit, the reason the specs call for a 1 GHz CPU and 1 GB of RAM is that the minimum OS requirement is Windows 7 – and the minimum requirements for Windows 7 call for a 1 GHz CPU and 1 GB of RAM…
Yes, it needs 3 GB of disk space, but that is mostly because of the way the installer works (it puts the entire install image on your hard drive – even if you never install any of the optional components). The “core” parts of all the Office apps probably only take up about 350 MB at most.
The requirements are simply a side-effect of the minimum OS required. They can’t state lower requirements than the OS itself calls for…
The only special thing is that they chose to require SSE2, which seems reasonable since it has been available in all AMD/Intel processors made since 2003. Windows 8 requires SSE2 as well…
I don’t want to sound like I’m piling on – I agree completely with your overall point. It’s just that Office is a bad example of software bloat; it’s actually quite lightweight relative to most office suites (OpenOffice, I’m talking about you).
http://www.hostcult.com/2012/08/libreoffice-36-vs-openoffice34-vs-m…
Agreed, Office might not have been the best example – just something off the top of my head. In any case, as for the RAM requirements, I think you’re way too optimistic there; the figures you gave most probably don’t include shared libraries and a host of other resources (OSes like to lie about such things). As for disk space, I see no reason for it to inflate 8x, even with added functionality – I mean, we’re talking nearly a full DVD’s worth of data, the equivalent of some 100,000 full-PAL JPEG images.
As for OOo/LibreOffice, again, fully agree, that thing needs an intense diet, real bad.
It’s not so much lying as it is being difficult to give an accurate representation. If the same shared library is loaded by multiple apps, it’s actually only loaded into memory once and then mapped into each corresponding app’s address space (at least the code section; the data and other modifiable sections are mapped copy-on-write). That consequently makes it difficult to accurately assess the true memory usage of a given app: if the library is being used by more than one, you can’t really lay the blame for that additional memory on any single app – hence the complex and often confusing numbers one gets as far as memory usage goes.
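Modern Linux actually exposes this ambiguity directly. A small sketch (Linux-only, reading `/proc/self/smaps_rollup`, available on recent kernels) compares RSS, which counts shared pages in full for every process mapping them, against PSS, which splits each shared page proportionally between those processes:

```python
# Sketch: on Linux, /proc/<pid>/smaps_rollup reports both RSS (all resident
# pages, shared ones counted in full) and PSS (shared pages divided by the
# number of processes mapping them). The gap between the two is exactly the
# shared-library accounting ambiguity described above.
def memory_summary(pid="self"):
    """Return (rss_kb, pss_kb) for a process, read from smaps_rollup."""
    fields = {}
    with open(f"/proc/{pid}/smaps_rollup") as f:
        for line in f:
            parts = line.split()
            if parts and parts[0].rstrip(":") in ("Rss", "Pss"):
                fields[parts[0].rstrip(":")] = int(parts[1])  # value in kB
    return fields["Rss"], fields["Pss"]

rss, pss = memory_summary()
print(f"RSS: {rss} kB, PSS: {pss} kB")
# PSS can never exceed RSS; the difference is the discounted shared portion.
```

For a process that maps many shared libraries (like an Office app with MSO.DLL), PSS will be noticeably lower than RSS, which is why no single naive number tells the whole story.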
I know, don’t worry, I do a little kernel development in my spare time. But I might not have expressed exactly what I meant when I said “lying” (of course everybody would like to know the exact value, including kernel developers, if such a value could be easily obtained).
That’s true, but again, it’s still not the whole story.
There’s an MSO.DLL and an MSORES.DLL that collectively take up 210 MB of RAM. But these are loaded once — even if you load all the Office programs simultaneously.
And even then, it doesn’t actually take up 210 MB of physical memory. Remember that Windows NT has a demand-paged virtual memory system. Those files are demand-paged into memory, 4 KB at a time. Until one of the Office programs actually calls a function located on a certain page, that page isn’t taking up any RAM.
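The same behavior is easy to demonstrate on any demand-paged OS. A sketch (Linux/Unix, Python): mapping a large file is nearly free, and the process’s resident set only grows as pages are actually touched. The exact numbers and `ru_maxrss` units are platform-dependent (kilobytes on Linux), so treat it as illustrative:

```python
# Sketch: demand paging in action. Mapping a ~78 MB file costs almost no RAM;
# only the pages we actually touch become resident. Growth is measured via the
# process high-water RSS (on Linux, ru_maxrss is reported in kilobytes).
import mmap, os, resource, tempfile

PAGE = 4096
NPAGES = 20000  # ~78 MB file, written in small chunks to keep RSS low

def max_rss_kb():
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

with tempfile.NamedTemporaryFile(delete=False) as f:
    chunk = b"\0" * PAGE
    for _ in range(NPAGES):
        f.write(chunk)
    path = f.name

with open(path, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, prot=mmap.PROT_READ)
    before = max_rss_kb()                               # mapping alone: cheap
    total = sum(mm[i * PAGE] for i in range(NPAGES))    # fault in every page
    after = max_rss_kb()
    mm.close()
os.unlink(path)

print(f"RSS grew by ~{after - before} kB after touching {NPAGES} pages")
```

The mapping itself only creates page-table bookkeeping; each read of a not-yet-resident page triggers a fault that brings in that page (plus whatever fault-around the kernel does), which is exactly why a 210 MB DLL doesn’t cost 210 MB of RAM up front.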
The Excel icon now takes 173 KB of disk space. One icon. (At 256×256, 32-bit color, as a high-quality PNG.) Now think about how many other icons there are in Office.
And not just icons. Fonts. The new equation editor. A new ligatures-aware layout engine. New PivotTables. More accurate statistics functions (the old ones are still there, to get bug-for-bug backward compatibility).
One man’s bloat is another man’s feature. You may not use a given feature, but other people do.
It’s nice to focus on DLLs, but I also said “and a host of other resources”. What I mean specifically under that heading is, for instance, garbage-collector cruft, i.e. uncollected objects. Even a small GC’d program can easily accumulate hundreds of megabytes of RSS before the garbage collector bothers doing something about it.
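A minimal illustration of that accumulation, using Python’s cycle collector as a stand-in for any tracing GC (the principle is the same for Boehm, the JVM, etc.):

```python
# Sketch: with the cycle collector disabled, unreachable reference cycles pile
# up until gc.collect() is called explicitly -- the "uncollected cruft" above.
import gc

class Node:
    def __init__(self):
        self.ref = self  # self-referencing cycle; refcounting alone can't free it

gc.disable()
gc.collect()  # start from a clean slate
for _ in range(10_000):
    Node()  # each instance immediately becomes unreachable garbage
collected = gc.collect()  # one explicit pass reclaims the whole backlog
gc.enable()
print(f"cycle collector reclaimed {collected} objects in one pass")
```

Between collections, all of those dead objects still count toward the process’s RSS, which is why a snapshot of a GC’d program’s memory usage can look much worse than its live data actually is.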
I know, that’s why I was talking about shared libraries.
I’m fairly certain they are paged in at larger increments than that; otherwise the initial execution would be so exceedingly slow that even startup would take minutes to complete. What most likely happens (I’m simply talking from *nix kernel experience, so feel free to correct me) is that the I/O subsystem pulls the files into the buffer cache in much larger chunks. Subsequent mapping of pages then happens much faster (via process page-table manipulation), but the blocks holding the DLL contents still take up space in the page cache – though much less than pulling the whole file in at once, and in an evictable manner, subject to other kinds of memory pressure.
And that is a positive point how exactly? Also, most UI icons, I would hope, are not 256x256x32bpp.
A good-sized compressed TTF font is a few MB in size, at most. Unless there are a couple thousand fonts hidden in Office somewhere, this contributes a few dozen MB at most.
Depends. If it’s a GUI with controls and some live code behind it, I would give it an extra 10 MB. Given some icons and help pages, perhaps 30-40 MB.
New code: 3-4 MB (if the layout engine is really, really big, like 500KLOCs+).
How much space do their definitions/code take on disk?
I’m sorry, but now you’re into silly territory. This is just a bunch of interpreted code. Perhaps – and I’m being very generous here – 10 MB of new code, 30 MB tops. It’s not like they re-implemented the entire office suite in it; they just added some utility functions.
So in summary, you’re still only 1/10th of the way there. I know where the other 9/10ths went – it’s the animated sequences, the uncompressed BMPs on disk, the tons of pre-made media, like music and video clips for shit business presentations that nobody likes anyway. If you took it all away, I doubt anybody would really complain, but then you wouldn’t have the latest and greatest incremental update to sell to your locked-in business customers. There’s money to be made!
Obviously, I wasn’t giving an exhaustive list of where every single MB went on an Office install. I was just pointing out some of the features that I use that weren’t present in Office 2003. In other words, I’m not getting nothing for my gigabytes, as you seem to think.
That having been said, you seem to be unaware that the templates and clip art have been mostly moved online in Office 2013. Only the most commonly-used templates have been left in the default install. (And it’s the non-core Office applications — Publisher and Access — that are the biggest offenders. Word, Excel, and PowerPoint have really been slimmed down.)
My Office folder is 1.7 GB. 1.0 GB is DLLs, 192 MB is EXEs, 112 MB is fonts. A whopping 2.5 MB consists of those “uncompressed BMPs” that you’re so worried about. Animations and music and videos? I can’t find them. Maybe you can.
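A folder breakdown like that takes only a few lines to reproduce. A sketch that walks a directory tree and tallies bytes per file extension (the path argument is whatever install folder you want to audit):

```python
# Sketch: tally on-disk bytes per file extension under a directory tree,
# so claims like "1.0 GB of DLLs, 2.5 MB of BMPs" can be checked directly.
import os, sys
from collections import Counter

def size_by_extension(root):
    totals = Counter()
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower() or "(none)"
            try:
                totals[ext] += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # skip files that vanish or are unreadable mid-walk
    return totals

if __name__ == "__main__":
    root = sys.argv[1] if len(sys.argv) > 1 else "."
    for ext, size in size_by_extension(root).most_common(10):
        print(f"{ext:10s} {size / 2**20:8.1f} MB")
```

Pointed at an Office (or LibreOffice) install directory, this makes the code-vs-resources split an empirical question rather than a guessing game.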
As for memory consumption, I don’t know why you’re going on about garbage collectors. The Office programs are unmanaged, written in a combination of C++, C, and (a little bit of) assembly.
—
You seem to be intent on being upset about something, for whatever reason.
Instead of throwing random stuff against the wall to see what will stick, perhaps it’s better to first look into it. Maybe you’ll discover that the situation is not as bad in reality as you seem to think it is.
I have written my fair share of application software and that is the split I was usually seeing. 1/10th code, 9/10ths resources. There’s one issue though, it’s easy to inflate the resources – simply save your app icon at higher res and you’re done. It’s much harder to inflate the code part.
How much space do these three core apps take up? I’m not an Office user; I’ve only got Microsoft’s own published claims about the sysreq’s to go on.
What. The. Fuck. 1.2 GB of executable code? In Office alone? What the hell is Microsoft bundling in that thing? For reference, all of the executable code – kernel, device drivers, and application software – on my desktop Ubuntu system clocks in at around 1.2 GB. For comparison, the Linux 3.2 kernel contains around 15 million lines of code (not all of it compiled, but a good deal) and compiles on x86 to around 120 MB; so unless Office contains 5-10x that much code, something else is at work here. My suspicion is that these EXEs and DLLs are packed full of auxiliary resources that you didn’t estimate, or there’s tons of static linking going on (or both).
Do you have access to the source code? If not, how can you say that? Also, merely being written in C/C++ does not mean you don’t use a garbage collector. A GC can be an implicit part of the language (Java, C#, etc.), or one can use it explicitly (Boehm GC, OpenStep’s refcount GC, GObject, etc.).
As I seem to have to endlessly elaborate, I was talking in my original post about the “meh, just get more powerful hardware” mentality. I don’t care two shits about how exactly Office is implemented; what I’m talking about is that a program of this type shouldn’t have the system requirements it does (I already agreed with galvanash earlier that Office’s sysreq’s most likely just reflect those of the OS around it). We had sophisticated desktop software 20 years ago that ran in 1/100th of the resource footprint. I used to run OPENSTEP 4.2 for Mach on a 200 MHz CPU with 64 MB of RAM – a full Unix-like OS with a microkernel (Mach), device drivers written in “managed” Objective-C, and application software talking to the display server via an interpreted language (Display PostScript), rendering TrueType fonts and all that jazz. Nothing has changed fundamentally.
Actually, given the above DLL+EXE size measurement, you confirm that it is much, much worse.
But you don’t need to get more powerful hardware. A $200 netbook will run Office 2013 just fine.
In the 1990s, we actually did need to buy new systems just to run the latest software. Now, we don’t. To insist that developers spend their time making their code as small as possible is a waste, when they could instead spend that time on performance optimization, or battery efficiency, or — heaven forbid — a new feature that greatly improves the software’s capabilities.
Do I care that Chrome is ten times bigger than Netscape Communicator 4.0, which was considered to be very bloated? No! I can do much more in Chrome. The extra 200 MB is well worth it to me.
What’s in Office 2013 that takes up so much space? I’m not going to list every feature, but the largest single feature appears to be PowerPivot (183 MB). It’s essentially a full OLAP database engine, bundled with Excel. Is it bloat? It is if you don’t use it! After all, Excel is a spreadsheet, not a database.
But the thing is, Excel is often used as a database. Or, it’s used to analyze data that came out of a database. To someone trying to analyze hundreds of thousands of rows of data in Excel, the feature is invaluable.
“Bloat” just means “all the features that I don’t use.” As I said previously, one man’s bloat is another man’s feature. One doesn’t have to list every single feature and account for every single MB to realize that.
Googling garbage collection RSS gave… inconclusive results :p – as a mostly layman, I wonder what you meant there?
And folks wondered why I have Office 2K7 on my desktop but run Office 2K on my netbook 😉
But I would say you’re right in some ways and wrong in others. Take browsers, for instance: IE and all the Chromium variants run in low-rights mode and run tabs and plugins in separate processes, and all that takes memory and cycles, but the trade-off gets you better security and fewer crashes. Compare this to FF, where I don’t know how many times I’ve managed to crash even the latest version – all it takes is one misbehaving web page to knock the whole thing down.
And I don’t know about everybody else, but personally, if it comes down to using some of my memory or using swap, please use the memory! One of the things I love about Win 7 is how its SuperFetch learns which programs I use and when, and has them all loaded and ready to go – so much nicer than launching and waiting for the program to load from disk.
Finally, when it comes to Office: frankly, they have added a LOT of features in that time – whether you use them or not is another matter, but they have added a lot. Personally, I just wish everyone wrote a “lite” edition for your tablet/cellphone/netbook and a “deluxe” edition for your laptops and desktops, so we’d have the right tool for the job. If you wanna use a little more CPU to give me more features on my hexacore desktop, hey, no problem – but when I’m on my AMD Bobcat netbook, I’d really prefer you didn’t suck my battery dry.
How dare you make so much sense? I completely agree! One of the reasons why I am still running office 2003!
Another Office 2003 user here. Bought a used box and have been happy since. I also have LibreOffice and SoftMaker Office, but use Office 2003 99% of the time. The only problem is Excel files with more than 64k rows, but LO Calc can handle those.
They were saying that, for technical reasons, phones that ran WP7 couldn’t run Windows RT. This was always viewed with some suspicion.
I fully expect that either
– the developer will be pursued in the courts
– they will put specific code in RT to stop this from happening again.
The game of cat and mouse begins.
You’re probably thinking of WP8, not RT.
I believe the lack of UEFI and TPM were the big reasons MS didn’t support WP8 on WP7 devices.
RT was never targeted as a phone OS.
That RT can run on an HD2 should not be a big surprise.
Microsoft’s initial testing of Windows on ARM was on such devices.
http://blogs.msdn.com/b/b8/archive/2012/02/09/building-windows-for-…
Well, you know, there is already code in there that’s supposed to make this impossible. The guy just circumvented it all, and if Microsoft added more such code it’d just get circumvented as well sooner or later.
It was a logistical issue. Ensuring a smooth update from WP7 to WP8 while handling data migration for two different OSes, and figuring out a way to provision and update millions of existing devices without a hitch, is no small feat. Likely the gain didn’t justify the resources required.
Not that I agree with the decision, I think it was poor planning on Microsoft’s part.
Not poor planning. A profitable decision.
I really don’t think profit was a motivator in the decision. More like incompetence and poor product planning.
If MS didn’t have the resources for their most important upgrade in a decade, what on earth are they spending on?
But Microsoft considers Windows 8 to be their most important upgrade in a decade.
Windows Phone 8 isn’t. One could argue that it ought to be, but the fact is that the Windows Phone team is tiny by Microsoft standards. For whatever reason, they haven’t dedicated that many resources to it.
The Windows Phone team is one of many within Microsoft. Even with a Company the size of Microsoft’s, there are still time constraints on engineering resources.
Think about all that the Windows Phone had to develop in-house prior to Windows Phone 8:
– Their own fork of the CE Kernel (hybrid CE6 and CE7)
– Their own fork of the .NET CF (3.7 vs 3.5 on CE6/7)
– Their own mobile XAML Stack
– Their own telephony and data stacks
– Their cloud infrastructure for Zune and the Application Store.
Now with the unification of Windows 8 and Windows Phone 8 a lot of that work is offloaded to the Windows team (Kernel, Cloud Services, .NET CoreCLR, Telephony+3G/4G stacks, and in the future hopefully XAML), so hopefully moving forward some of that engineering talent is freed up to work on other issues.
This internal alignment within Microsoft will likely take multiple product cycles, but in the end it should yield an organization that is a lot more nimble.
It still is a testament to the Windows Phone team that they managed so much internally for so long.
A port is when you rewrite all or part of an OS to run on new hardware. This is using a vulnerability in the OS and/or hardware to shoehorn an existing OS onto hardware that it supports, but does not officially run on.
Windows NT was ported to run on ARM. Microsoft did this work, helped by the fact that Windows NT is quite portable, and has run on MIPS, Alpha and PowerPC in the past.
This guy did a clever hack, sure, but it’s not porting. He didn’t port anything.
A UEFI bootloader, drivers, and much more likely had to be written to get any decent level of support. It most definitely is a porting job, albeit with the architectural port already done for him – for the most part.
This isn’t simply copying system files to the device and flipping a switch.
Judging by the comments on the page, he didn’t write any drivers – Windows 8 supports UEFI already – so all he would have had to do is get it onto the device and fool it into booting Windows.
I still don’t think it’s a port.
The article mentions he wrote his own EFI loader, no small feat.
Also, I’m puzzled by the assertion that no device drivers were written for a platform with no driver support for the host device.
Things don’t just magically start working.
Writing a bootloader is not porting an OS. I’m not trying to take away from what the guy did; I’m just saying the language is wrong. A port is something different from this – you need source code to do a port.
I agree with you that it really isn’t a port, but from what I can read, the Surface tablet uses a Tegra 3 (quad-core Cortex-A9) and the HD2 a Qualcomm Snapdragon S1 (its Scorpion core is roughly Cortex-A8 class). The device drivers between the two are immensely different.
What I did notice is that the screen is not oriented properly, probably because there are no drivers for the accelerometer, and we don’t know whether hardware acceleration for the video or the network devices is working. Even if he has managed to get this running on the hardware, we have no clue whether it is functionally useful.