“As early as mid-December, consumers will be able to go to retail stores in the United States and Australia to purchase a Surface with Windows RT. Additional availability will be added in a number of countries in the coming months.” Sales might indeed benefit from, you know, allowing the world to actually buy your halo product. We Dutch won’t be getting the Nexus 4 or the Nexus 10 either.
So the lack of sales is forcing them into wider distribution channels, which means they can no longer keep all the revenue from in-store sales. This is likely “plan B”, or an early “phase 2”, being activated to try to gain a foothold in the tablet market.
I would really hate to be tech support for the store that sells Surface RT. I actually have no problem with Surface RT itself, other than thinking it would have been better without including the desktop at all; what I do have a problem with is Windows 8 on anything over 13 inches.
The problem is that the customer will purchase a Surface RT and not understand why their applications won’t install or run. “But I installed Chrome on my desktop running Windows 8 and it runs just fine.” “Yes, but that isn’t the same Windows.” “But they look the same, and they both have a desktop.” “Yes, but this is different…”
I’ve heard people dish on Windows 8/RT for this reason, and they contrast it with Apple’s smooth transition from PPC to x86 back in 2005.
The difference is that Apple invested the time and money in building a universal binary system. They had done a platform jump before with 68k to PPC, and they knew what to do to keep their users happy.
Microsoft, on the other hand, has been building on the same monolithic base from day one. Instead of doing the universal binary thing, they decided to push two incompatible architectures in parallel. This causes fragmentation and confusion, and I think it was the worst thing they could have done.
Yes and no. The NT kernel was designed and built with portability in mind from the get-go. Microsoft never got their act together (with regard to portability) when it comes to the user/application space, however.
Originally, yes – but not since NT 4. NT 4 took the UI out of user space and put it in the kernel, and in so doing made portability much more involved. Up to NT 3.51, the system was designed so that every subsystem could be replaced easily – that is where a lot of the “NT is highly portable” reputation comes from. Notably, NT hasn’t run on PowerPC, Alpha (DEC), etc. *since* NT 4 was released. Any later port would have been a larger undertaking than “just recompiling the kernel and creating a new HAL”.
Though MS almost did release Windows 2000 for Alpha, IIRC – there was some RC. Afterwards, they still managed to release versions of Windows for the Itanic…
Indeed, every version of NT has been released for at least two architectures.
My guess is that Microsoft has a group specifically to maintain a non-x86 build of NT. By doing this, they make sure that platform dependency does not creep in, and that NT remains portable.
—
NT 3.1 and 3.5 were available on x86, MIPS, and Alpha.
NT 3.51 was available on x86, Alpha, MIPS, and PowerPC.
Windows NT 4.0 was released for x86, Alpha, MIPS, and PowerPC, although MIPS and PowerPC support was dropped early in its life.
NT 5.0 (Windows 2000) was released on x86 only. (The Alpha build made it all the way into the first Release Candidate, before being killed off.) A year later, though, Microsoft released an Itanium port of Windows 2000.
Itanium support was maintained through NT 5.1/5.2 (Windows XP 64-Bit Edition / Server 2003), NT 6.0 (Server 2008), and NT 6.1 (Server 2008 R2). By the end Itanium was clearly dead; there were more servers running PowerPC and SPARC than Itanium!
With NT 6.2 (8 / Server 2012), Itanium was finally dropped. Not to worry, though: ARM came in to take its place.
I don’t think that is fair at all… In both of Apple’s platform changes, the chosen path for backward compatibility was emulation. Emulating 68k on PPC (or PPC on x86) is a totally different proposition from emulating x86 on ARM. In the first case you are doing the emulation on hardware that is dramatically faster and more capable than what the original binaries were targeting. Other aspects of the systems had improved as well: memory had become cheaper, so you generally had much more of it to work with, and more memory bandwidth too. It is quite a different matter to go the other way around – ARM systems are both much slower and generally have much less memory than a typical x86 machine.
Point being, assuming Microsoft had given ARM equal footing with x86, the real problem is not allowing new software to target both platforms equally – that is what something like universal binaries solves… The problem is supporting existing x86 software on ARM.
How do you do that when a typical ARM system performs like an x86 system from 10 years ago? In many areas (floating point, for example) the performance delta can be as much as 2 or 3 orders of magnitude. There might be a handful of modern x86 apps for Windows that would perform adequately on ARM through emulation, but most would simply be unusable.
Apple never even tried to solve this problem. Do you see iPads running OS X apps? You’re giving Apple credit for something they never did, and criticizing Microsoft for doing exactly what Apple did – which is pursue two separate, parallel platforms.
The focus of my comment was on the PPC to x86 shift; I only referenced the 68k to PPC transition to show Apple’s prior experience with the process.
Yes, there was PPC emulation for legacy apps, but it was never intended to be the long-term solution, because it was simply untenable for most practical applications. Apple fully intended to use universal binaries (so-called “fat binaries” that contained executables compiled for both PPC and x86) from the start. They had been building and testing x86 versions of OS X and their core applications from the very first release of the new OS. There was extensive documentation and support in the ADP for using fat binaries, whereas Rosetta was de-emphasized and rightly considered a legacy path.
But you’re right, I should have clarified all of that.
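For what it’s worth, the developer-facing mechanics were almost trivial. Here’s a rough sketch from memory of the Xcode-era toolchain (the file name and exact flags are just illustrative):

    // hello.cpp: the program itself is ordinary; the “universal” part
    // happened entirely at build time. Apple’s gcc accepted multiple
    // -arch flags and emitted a single fat Mach-O, something like:
    //
    //   g++ -arch ppc -arch i386 hello.cpp -o hello
    //   lipo -info hello    # lists the slices, e.g. ppc and i386
    //
    // At launch, the loader simply picks the slice matching the CPU.
    #include <cstdio>

    int main()
    {
        std::printf("Hello from whichever slice the loader picked\n");
        return 0;
    }

Developers shipped one binary and never had to care which architecture the customer had – which is exactly the gap the Windows 8/RT split leaves open for desktop software.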
That’s a bit unfair. The difference is more nuanced… Apple’s PPC->x86 move was within the same kind of devices (PPC desktops & laptops -> x86 desktops & laptops). The universal binaries don’t encompass Apple’s mobile devices.
And that’s the jump MS is making: desktops & laptops -> mobile.
I’ve heard people around tech circles talk about this, but mostly they’re just repeating a talking point someone else has fed them.
Microsoft has managed this transition as well as, or better than, Apple did. Windows Store apps support x86 and ARM seamlessly. For C++, the package for the appropriate architecture is pushed down to you; for .NET, the IL is JIT’ed on the fly and then NGEN’ed at a later time by a service. For JavaScript it is AOT-compiled, IIRC.
So the fact that my Windows Store app works on x86 and ARM without me even thinking about it is a huge achievement. Plenty of Surface RT users are enjoying it right now.
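To make that concrete: for a C++ Store app, the same source is compiled once per target (x86, x64, ARM) and the Store hands each device the matching package. A minimal, purely illustrative sketch (a real app would obviously use the WinRT/XAML APIs; about the only place the target architecture even shows up to the programmer is in MSVC’s predefined macros):

    // arch_probe.cpp: hypothetical example – the same file builds
    // unchanged for every architecture the Store tooling targets.
    #include <cstdio>

    static const char* TargetArch()
    {
    #if defined(_M_ARM)
        return "ARM";
    #elif defined(_M_X64)
        return "x64";
    #elif defined(_M_IX86)
        return "x86";
    #else
        return "unknown";
    #endif
    }

    int main()
    {
        std::printf("This binary was compiled for %s\n", TargetArch());
        return 0;
    }

And as mentioned above, .NET apps don’t even need the per-architecture build: the IL is architecture-neutral, and the JIT/NGEN step on the device takes care of the rest.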
I don’t really think you’re in any position to know about the architectural characteristics of Windows 8. Windows RT in fact exhibits very comparable performance to Windows 8. I’ve seen no major performance differences between the two when it comes to writing Windows Store apps.
Now, the part you’re likely complaining about is desktop apps. That wasn’t so much a kernel or OS design limitation (the aforementioned Windows Store apps achieve compatibility with both architectures) as a conscious decision about how to move the platform forward.
Desktop support is there for legacy reasons, but it seems inevitable that Windows Store apps will eventually become the way to write Windows applications.
Sorry, I should have specified that.
I suspect this is the way OS X will end up too, if Apple continues on their current path.
I really don’t like this push towards a tightly controlled, closed userspace on the desktop. It works well on mobile devices, but in my opinion it doesn’t scale well to business machines and power-user workstations. But then, I’m from an older generation; I grew up with machines that were largely open to experimentation. Today’s young engineers are brought up in a consumer-oriented culture and are probably more accepting of an OS they can’t hack around on.
I don’t think Windows 8/Windows RT is closed (as opposed to Windows PHONE 8 which IS closed) for the following reasons:
– You get access to the full file system
– You can develop personal applications and even sideload applications, provided you have a Microsoft Account. There is no developer license needed to sideload: just run the simple PowerShell script that Visual Studio generates, and share it along with the .appx package it creates.
The one caveat is that to distribute the app you must sign it with a Windows Store-registered account, but others can then install your app via a simple PS script.