The Verge published a video demonstrating how desktop mode and Office 2013 – a desktop application – work on Windows RT, the ARM version of Windows 8. The video showed a desktop mode that clearly didn’t work well for touch, and even Office 2013, which has a rudimentary touch mode built-in, didn’t work properly either. It looked and felt clunky, often didn’t respond properly, and even showed touch lag.
From the video, it becomes clear as day that Office 2013 is a "we have to do something" release. The touch mode and many of the interface changes are cosmetic, and do nothing to address the core issue: Office is simply horrible to operate by touch. Nothing has been done to truly design a touch-first, Metro experience for Windows 8, and considering Office is Microsoft's other big cash cow, it's unbelievable the company didn't put more effort into this.
The Windows and Office departments within Microsoft are separate from one another, and I get the feeling – although it's hard, if not impossible, to substantiate – that the Windows team went ahead with Windows 8 and Metro before the Office team was fully on board. In fact, the Office team may even have simply waited it out, expecting Metro to be killed or shelved as too risky at some point. Now that Windows 8 is on the cusp of release, we are being served a lukewarm Office update that feels and looks like a wart.
In fact, as I stated on Twitter, I'm actually convinced that the only reason Windows RT has desktop mode in the first place is that the Office team is lagging behind the Windows division. Had there been a Metro version of Office, the desktop mode would not have been part of Windows RT. You cannot install applications in the desktop mode of Windows RT, so it's there almost exclusively to enable Office to work. Save for a few specific settings which aren't Metro-ified yet (and perhaps file management), desktop mode on Windows RT might as well be renamed Office Mode.
As far as file management goes – it's worth having, and there's actually no reason to drop back to desktop mode for something like this. Android has several fantastic touch file managers, and heck, even Windows 8 itself has an excellent base to build a Metro file manager upon: the file picker.
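The file picker is already exposed to Metro apps as a regular WinRT API, so a Metro file manager would largely be a matter of building a UI on top of the same storage classes. A minimal sketch of calling it, written here in TypeScript against the JavaScript projection of WinRT (the Windows global only exists inside a Metro app container, so the declare line is just there to keep the compiler quiet; pickAnyFile is an illustrative name, not an existing API):
```typescript
// Sketch: invoking the Windows 8 file picker from a Metro (WinJS-style) app.
// The Windows.* namespaces are the WinRT projection available to app-container
// JavaScript; the declaration below only satisfies the TypeScript compiler.
declare var Windows: any;

function pickAnyFile(): void {
    var picker = new Windows.Storage.Pickers.FileOpenPicker();
    picker.viewMode = Windows.Storage.Pickers.PickerViewMode.thumbnail;
    picker.suggestedStartLocation =
        Windows.Storage.Pickers.PickerLocationId.documentsLibrary;
    picker.fileTypeFilter.replaceAll(["*"]); // show every file type

    picker.pickSingleFileAsync().then(function (file: any) {
        if (file) {
            // file is a StorageFile: it can be opened, copied, moved or renamed,
            // which covers most of what a basic Metro file manager would need.
            console.log("Picked: " + file.name);
        }
    });
}
```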
In the comments on the article at The Verge, people suggested that users increase the DPI, plug in a keyboard and mouse, and other workarounds that have nothing to do with tablet computing and do not actually address the core issue. We've been offered a rudimentary glimpse of what an Office for Metro would look like with OneNote MX (this article is written in OneNote MX), but looking at Office's past release cycles, a Metro Office might still be 2-3 years away.
In those few years, Windows tablet users – ARM and x86 – will be dropped back to the desktop whenever they open an Office document, or whenever they need to change a setting which for some mysterious reason hasn’t been Metro-ified yet or which doesn’t belong on a tablet in the first place. Considering Windows 8 still has windows that do not even respond to a scrollwheel or touchpad scrolling (I kid you not – check the Character Map, for instance), that’s pretty damn pathetic.
Is it a tablet? Is it a laptop? I don’t know…
I really, really hope that they will split Windows into two in the future.
+10 That is EXACTLY what they need to do, although I'd argue that it needs to be split into THREE, not two.
1. Windows Pro x86-64: a light, business-oriented release without any extra bling but with full DirectX support, similar to what they did with Win2K. This would give businesses a nice conservative desktop (performance gamers would probably pick this too) without any extra fluff; basically "get out of my way and let me run my programs".
2. Windows Home x86-64: the consumer-oriented, more "family friendly" version, with a prettier desktop, an app store, and plenty of multimedia stuff. Very home-user friendly.
3. Windows RT: the ARM "Metro" release designed around cell phones and tablets, with lots of app store focus, all touch, and no desktop mode.
Sadly, what we are getting instead is MSFT throwing a "Hail Mary" by trying to force desktop users onto a touch UI, in the hope that if they "get used to it" at home they might buy a tablet with it. I don't think it's gonna work unless Ballmer is willing to flush another billion by selling Surface tablets below cost, but that seems to be the plan.
Oh, and I'd say Mr Holwerda is incorrect: desktop mode isn't "just" for Office. Frankly, most of the deep-level system stuff, like the file explorer, Notepad, Paint and the deeper system tools, isn't touch-friendly either, so they really have no choice but to hang onto desktop mode. Remember the first versions of Vista, and how many programs still had XP dialog boxes and icons? We are seeing the same thing here: another rushed-out release with a LOT of the low-level stuff not converted over to the new UI.
Personally I think that, like Vista, it's gonna be another bomb. Anybody who uses it for any length of time can feel how unfinished the whole thing is, but MSFT is getting curbstomped so badly in mobile that they have to throw something out there to try to gain some share. What I think is gonna slaughter them is that the carriers are royally p*ssed at MSFT for buying Skype, and they aren't gonna give WinRT units the subsidies the others get, because Skype competes with their core business: gouging on minutes and SMS.
The problem is that Microsoft can't do "very home user friendly." They never could. Back in the days when DOS/Win3.x dominated home computing, home users were simply forced into using a business computing environment at home. My first PC shipped with "Windows for Workgroups" on a 486DX2 that wouldn't even be connected to the Internet. Win 9x was friendly, but it brought all the business stuff along for the ride and made things extra complicated.
Even Windows Media Center, which some people seem to like, is incredibly confusing and way too much work for a simple TV environment.
The only friendly thing about recent Microsoft products is that they shortened the names from “Windows Ultimate Home Edition Professional Live Edition” kind of nonsense.
I know several people who use WMC. All of them ordinary home users, none of them needed any help. It’s probably safe to say more than some people like it, and I doubt it’s remotely close to your “incredibly confusing and way too much work” claim. If Windows and/or WMC is that challenging for somebody, my first thought would be either they have crap issue-laden hardware, or they’re really lame when it comes to computers.
Or just use applications designed for both platforms. Not that big of a deal.
You want notepad? Run notepad.
You want some scribble application? Run that instead.
I assume applications will be designed for both. By not splitting it up, you at least keep the option of running the application you really need, even if its user interface isn't perfect for how you use it.
Edit: It's early days for Windows 8, to say the least…
A few things:
I've actually used Office on a touch device. It is nowhere near as frustrating as he made it look. I'm guessing he ran into preproduction Windows RT issues with touch sensitivity.
But I agree that in the long run Desktop mode should not exist. However, I think it’ll go farther than Windows RT. I can envision the Desktop being obsoleted across all form factors in the long term.
Obviously for now, given that Office 13 isn't fully Metro, we need it, but over time we'll need it less and less.
It’s difficult, to say the least.
With the release of Windows 8 and Office 13, Linux distributions have been handed a never-before-seen window of opportunity to take over, one much bigger than the Vista fiasco. It remains to be seen whether the geek community will do something about it or just sit there with arms crossed.
LOL.
Lol whut?
You're dreaming if you think that will ever happen. There's only one category of Linux that has gotten anywhere with the average consumer, and that is called Android. If the "geek community", as you call it, wants Linux to take over, it must do at minimum the following:
1. Consolidate and agree on which distribution should target the end users.
2. Target said end users rather than just themselves and those who want to tweak. That means, among other things:
* Ditching X.org for a working graphics stack (Wayland does look promising)
* Fixing the audio subsystem (ditch ALSA, Pulseaudio, and the rest of that mess and put your resources into OSS 4 which actually works)
* Providing API and ABI compatibility (yes, the Linux lovers say it’s unnecessary but it damn well is necessary for commercial software)
* Stop trying to tell every commercial developer that they need to GPL everything or open source their drivers
* Thoroughly test all updates to ensure that a simple software update won't result in the login screen of death
* Document all system APIs in a consolidated manner and publish that for third-party developers
We’ll start with that list. If the Linux community is willing to stop breaking everything and do the remaining 20% that isn’t fun, then just maybe they’ll have a chance. As for me, I’m betting my money on none of this happening. Oh well, FreeBSD already has most of this and they could always make a stab at the desktop or a FreeBSD tablet. The *BSD folks know what needs to be done to make an operating system rather than a mishmash of components that come apart at the first opportunity.
Lol FreeBSD.
The BSDs are way behind Linux in every way, WTF are you talking about?
I would bother to answer except for two things: this thread is about Windows, and you do not include any actual evidence as to how the BSDs are so far behind Linux. People who make such claims must present evidence to be taken seriously. I simply claimed that the *BSDs are a complete operating system designed with the goal of each component working with every other component. This, if you have ever used a BSD for even five seconds, is obvious. That alone puts it ahead of Linux for desktop use, as the various parts of the OS must work together for a smooth and consistent user experience. Windows has this, the *BSDs have this, as does OS X. Linux does not, and until it does, and until the community acknowledges this, desktop GNU/Linux will go absolutely nowhere.
As long as it's possible to brick your system with a simple security update that prevents X.org from running (yes, I've seen that, and to an average user that is a bricked system), you'll gain no traction. Ignoring it, or shouting at us "haters" who point this stuff out, won't help. We don't do it out of malice, you know. Plenty of us, including me, would love to see a real alternative to Windows on generic hardware emerge, and if there were ever a time for that to happen, it is now. Either it happens now, or it doesn't happen for another twenty years. I don't need to remind anyone here that, if Microsoft has its way with Secure Boot, that might be twenty years too late.
Are you claiming that BSDs have the advantage that the kernel and user-space are made by the same people?
What about Linux and things like Wayland and systemd? Those are Linux-only and were designed with Linux in mind.
Anyway, I can see the advantage of having a unified system, but I don't think NIH syndrome is an advantage. To each his own, I guess.
Because there's no NIH syndrome in the Linux camp. Right.
Case in point: Upstart vs systemd vs runit/daemontools. Why the need for systemd when we already had Upstart? Why the need for Upstart when runit/daemontools have been around for a very long time?
darknexus: I don't know what OS you normally use, but you seem to be talking about the Linux of years ago. I run Fedora on the desktop, and while I have run into some issues, none of them were ever bad enough to keep me from being able to use the system. Part of the issue is that Linux is constantly updating; it is under continual advancement, whereas Windows goes years between versions and OS X at least a year. About your concerns:
- While X is certainly a bit old, it has enterprise features that other OSes still don't support, especially when it comes to multi-user systems, which Windows and OS X are NOT.
- I have used Pulse since it was put into Fedora. In at least the last two years I haven't had a single problem.
- Your kernel APIs are going to change. It's a fact of life. They do in Windows and OS X as well. Your userspace APIs are stable; I can run software that was produced half a decade ago (like Loki games) with no problems.
I won't deny that Linux has some issues to work out to get to the desktop. But it's not as bad as you make it out to be. Half or more of the problems you listed will never affect normal desktop users anyway. The key here is that if the user is going to have to learn a new system (i.e., Windows RT) then they could just as easily learn a new Linux OS.
You’ve just said it yourself. Most people don’t want to be constantly updating, and they’re quite content to have a few years between versions. The bleeding edge is called that for a reason, you know.
Hmm, where to begin? First, the home user doesn't care one bit whether their OS has enterprise features. What they see is this: graphics crashed, all my programs went down with it. They do not care if it's because of unstable, half-implemented video drivers, nor that some enterprise-level feature is present. They want it to work, and to do so relatively reliably. X, due to a combination of age and driver instability, is not reliable enough in an unpredictable home-user situation. Yes, you can do amazing things with it on the corporate desktop, but to 99% of users those features will never see the light of day. The other 1%? Well, they're already using X.org, aren't they?
As for multi-user systems: you accuse me of talking about the Linux of years ago. I'll come right back and say you are talking about the Windows of years ago. And OS X? OS X was built on a UNIX userland (FreeBSD-based, to be precise) from day one. You do not get much more multi-user than that, and yes, I have administered Macs in a multi-user configuration. They are just as multi-user as Linux with X.org, with the advantage of easy configuration in most situations. If you can claim, with a straight face, that OS X and Windows (at least the NT-based versions) are not multi-user, then I'm sorry, but I have to take away your geek card.
I’m going to go out on a limb and say that you don’t do much audio recording or editing, and that you don’t use VOIP software to keep in contact with people. Sure, you can play your music through Pulse and usually it will work, but for recording or gaming the latency is just staggering.
That doesn't matter if distro X (Ubuntu, I'm looking at you) decides to patch glibc and break binary compatibility. It does happen. Add to that, drivers are not stable in Linux, by which I mean that I cannot simply download a driver and load it. No, it has to match the kernel version and build of whichever distribution I'm running. That's ridiculous. I can run drivers on Windows 7 now, 32-bit ones at least, that were created when XP was released. It's not recommended of course, but there are times when it just has to be done. I don't care, and neither does the average user, what happens to the internal kernel APIs. That's not what I'm talking about. If you are going to change those internal APIs, though, you need to keep the external interfaces the same (a userland for drivers, if you will). This is what Windows and OS X do, and guess what, it works. All the user has to do is download a driver for their version of the OS (or the closest available if there isn't one), install it, and go. It's not that simple in Linux and, until it is, no one except developers and corporate IT departments will adopt it. It will never be that simple, because maintaining such a stable external API isn't fun. It's that 20% that no one in the Linux community ever wants to do, because it doesn't have the shiny factor.
Come back when you’ve actually tech supported normal users on Linux as I have. Every problem I’ve listed is a recurring theme and has been for the last ten years, with the exception of Pulseaudio which is newer. It works for you. It can work for me. We both have the technical knowledge to fix it when something unexpected happens. Most people do not and, while both Windows and OS X can and do break, overall they break far less often. The breakage needs to keep to a minimum for a home desktop and, though programmers don’t like to hear it, that means giving up new feature X and fixing old bugs in their programs. It’s not fun. It’s not rewarding. You’ll never get recognized for fixing bugs and keeping stability. It is necessary, however, and this concept is fundamental to the desktop experience.
If you don't want breakage, use Ubuntu LTS (just like Google does) or Debian. Fedora is a bleeding-edge distro made for Linux enthusiasts. Don't try to compare bleeding-edge systems (like Arch) with "long-term" systems (like Windows).
PulseAudio is a very nice sound system designed with either low latency or (XOR) low power usage in mind. It does not even overlap with ALSA; it actually lets ALSA do what it's good at (kernel/driver handling) and replaces the ALSA userspace, which was really broken and featureless.
And even though the PulseAudio transition was pretty messy, nowadays you're a fool for not using it.
That just shows how ignorant you are of "the Linux way". You know, they don't need ABI/API stability for drivers because, hey look, 99.999% of all drivers are included inside the kernel (yay, score for monolithic kernels), so every single change they make automatically takes those drivers into account. Just saying that "you can't simply download a driver and load it" shows how much of a Windows mindset you're trying to bring into the Linux world. That will never change, for really GOOD reasons.
Guess what, Linux does THE SAME for the external, userland APIs. You can load up programs statically compiled more than 10 years ago and they will run fine (look at my Unreal Tournament 99 post). The reason we don't usually do that is that, since we already have the source code for pretty much everything, we can dynamically link everything, reduce RAM/disk usage, fix bugs and just recompile it all again. That's what distros do and will keep doing.
Again, since 99.999% of drivers come with the kernel, the user does not need to do anything in Linux. It’s actually easier. Oh, and the plug ‘n play is much faster. Things only get ugly when no driver is available.
In my experience it has only happened with the (pre-AMD) ATI *cringe* and Nvidia binary drivers. The Mobile Radeon chipset in particular is such a crock that it can make a strong man weep. I don’t remember X.org crashing on any other hardware.
Good point, but I have yet to see a desktop Linux user download drivers from the Net. Most of them, third-party drivers included, can be found in the distribution repositories.
For some reason these features are promoted as enterprise stuff, so that’s where you get them. Debian and Scientific Linux in my experience are great desktops but don’t get as much attention as Ubuntu and Fedora.
Nice in theory, but for quite a few device classes there simply aren't Linux drivers. Case in point: quite a few modern printers and almost all modern scanners. Granted, it's because these drivers haven't been developed, and the reason is simple: there's no standard to target. Printers and scanners are not kernel drivers anymore (thank goodness for that); however, with scanners in particular, you face several problems when developing such a driver: the version of SANE used, the version of glibc and other libraries it is compiled against, 32-bit or 64-bit, and so on. My scanner (CanoScan 5600F) has Linux drivers, but they're almost impossible to get working and require endless amounts of fiddling, because the drivers Canon have provided do not match any version of SANE other than the one in RHEL. I'll stress that I understand the reasons and I know how to get it working; I bought this model, in fact, because it's compatible with all the major operating systems I use, no matter how fiddly the Linux support is.
I know the Linux community would blame Canon at this point and claim they should GPL their drivers (another argument I hate) but, if I were them, I wouldn't even bother making Linux drivers until I could reasonably target the majority of systems out there. None of this helps the average home user who just wants to scan a few pictures, however. To them, it doesn't work and there's nothing they can quickly install to make it work. It either works or it doesn't, and it doesn't help that most hardware doesn't even indicate, when they buy it, whether it's Linux compatible or not. If the driver is in the repositories and it fully supports your device, great. If it's not, and you're not a tech, you're royally screwed even if said driver does exist, because the odds of being able to install it yourself (whether in source or binary form) are close to zero even if you follow the instructions exactly. Further, if your distro updates its kernel and it is a kernel driver, it gets wiped away and you have to do it all over again, and it might not work this time around if an API has changed.
I haven’t even gone into the trouble of trying to use Linux to communicate with an iDevice which, like it or not, is something a lot of home users will want to do. That’s an entire can of worms in and of itself and yes, I know this is because of the fact that the developers of libimobiledevice and usbmuxd have to reverse engineer the USB communication protocol that Apple keeps changing. A test for you though: Say that to a typical home user and see if they understand it. My guess is their eyes will glaze over a bit until you finish and then ask: “Well, will it work or not?” Note, this isn’t actually a guess.
I never said Fedora was the choice to use for a desktop OS. My point was that even in Fedora, which is pretty much as far on the cutting edge as you can get, you don’t see the problems you are listing.
As for multi-user, Windows is not multi-user. Why would you claim the abilities of the NT base, but then complain about how certain features on Linux are enterprise-based? The desktop version of Windows is not multi-user. Never has been, probably never will be. As for OS X, it wasn't until Lion, apparently. I admit, I don't use OS X very much. But still, for most of its life, OS X could only spawn one GUI session at a time. But I will grant that it has now progressed to that point.
I have NEVER had X crash and take all my programs with it. I have had it refuse to start. But once started, it has always been rock solid.
I am going to go out on a limb and say that if professional film studios can use Blender and Autodesk on Linux to work on professional films, then the state of Linux audio must not be that bad really.
You run 32-bit Windows XP drivers on Windows 7? Well, good for you. But I guarantee that more drivers for Windows XP don't work on Windows 7 than do. All three OSes break compatibility on different levels all the time. At least on Linux it's not a bloody trade secret when it happens; it gets documented.
It's funny you mention supporting Linux at work. Sounds like you hate your job; I would if I had to use a system I felt that way about. The simple fact is that distros like RHEL and SUSE and a few others get 10 years of support. They don't get the bleeding-edge stuff, they don't break compatibility, they don't have driver issues, they just work. I don't use them for my desktop, but then I like the bleeding edge. You are complaining about a problem that only really exists in a small portion of the Linux community; a portion, I might add, that frequently advertises itself as cutting-edge development distros that will have issues.
I actually support only a few Linux people; mostly I deal with Windows and OS X users. And I have to say that I have vastly more problems with Windows, and to a lesser extent OS X, than I do with Linux. My sister runs a computer support company, and she has a business because Windows just doesn't work for most people. Between stupid user behavior and malware/virus-laden software, it's a giant mess. Linux is immaculate compared to what is out there for Windows users.
You do realize that the last version of Windows not to come from the NT base was Millennium Edition back in 2000, right? Home users have been on NT since the release of XP back in 2001. Home versions of Windows have been fully multi-user for over a decade now.
That may be, but it doesn't change the fact that Windows can only have one person logged into the machine at a time, even if one is remote. Period. Windows doesn't even offer command-line access for other users. The only way to get multi-user access is through Windows Terminal Server, which is NOT a desktop OS, not to mention that you need extra licenses to do it. But this isn't really a point that has anything to do with anything, other than the fact that X offers features that don't exist in OS X and Windows.
On a home computer, the vast majority of the time one person is using it while sitting in front of it. Multiple simultaneous logons are not a common use case.
Wrong. The limitation is that you can only have one user interacting with the GUI at a time. Starting with XP, you can have multiple users logged in simultaneously and switch between them in just a few mouse clicks (plus typing in the password on the account you’re switching to). That usage probably works perfectly for 99.99% of home users. The “one user at the computer plus another accessing remotely” case is pretty rare for the average person.
Not sure what you’re getting at here, but you can certainly open up a command prompt as a different user with the “Run as” feature. I think that’s been in every NT based Windows.
Linux has no place in the world of pro audio & video. While a very short list of software may be available, you'll be hard-pressed to find anyone who actually uses it. Real pro audio & video is completely dominated by Windows and OSX systems.
Of course you'll have people who have had good luck with Linux. The same can be said for Windows and OSX as well. But all the Linux forums and mailing lists tell another Linux story. There IS constant breakage, there ARE ongoing problems, there IS no shortage of users fighting to get it to work properly.
Also, claiming that Windows doesn’t work for most people doesn’t do anything but signal sane people not to take you seriously.
As for professional uses, you are so wrong it's not even funny.
http://news.softpedia.com/news/Hollywood-Loves-Linux-45571.shtml
That is just one article from Google. There are tons more like it. Most production houses are using Linux.
As for Windows working, I guess that's a matter of perspective. I don't consider the hoops and hoopla that the average Windows user goes through to be "working". Considering the vast amount of Windows malware and viruses out there, I would say that there are way too many Windows machines out there that are "broken". Otherwise that crap wouldn't spread.
No, I'm not wrong. You're mistaking CGI rendering for film & audio editing.
The average Windows user doesn’t have to jump through any hoops. Unless of course you consider installing Windows (if it isn’t pre-installed), and then booting the computer a hoop.
So your criterion for whether a computer is "broken" or not depends on whether malware/viruses/etc. are able to infect the box... You do realize that while Linux offers a little more control over security, the average user doesn't know how to make use of it, or turns it off. I see just as many, if not more, typical Linux users being scolded for doing stupid things as I do Windows users.
By the way, the vast number of Windows malware and viruses aren’t able to infect the vast number of Windows systems anymore so, yeah.
http://www.ovnivfx.com.br/
All their work is done with Blender running on Ubuntu, not just rendering but also video editing.
I’m sure if you look hard enough you can still find people using Session 8 or Pro24 as their multitrack system to record with too. It doesn’t change the reality that 99% of the field doesn’t use the stuff.
Sure most people don’t use that, but the example I gave is a professional company, not someone playing with video stuff at home.
The point is that it’s perfectly possible to produce professional quality content on Linux. Whether companies choose to use this or that tool is a different issue.
It’d be nice if we could see who votes comments down and why
Read this article from 2001:
http://www.cgw.com/Publications/CGW/2001/Volume-24-Issue-9-Septembe…
Linux was the best choice for Hollywood to replace all its SGI/IRIX machines, and to this day it's still their choice, not only for rendering but also for workstations.
However, that doesn't mean everything is Linux compatible. Smaller studios might find it better to just pay for standard Windows/OSX tools instead of highly specialized Linux ones. Game developers generally use Windows tools, and audio artists prefer Macs.
Most people in this thread say that Linux is unsuccessful, but for the wrong reasons. It's more a matter of it not running your software of preference than of missing some particular API, and of not having OEM backing rather than not working with your hardware.
No, I'm not wrong. You fail at reading. If you had looked at the article I specifically linked for this reason, you would have seen that the vast majority of desktops in the company run Linux. For editing, not for render farming.
So what you’re saying is that it is exactly like Windows and OSX?
Wow! Have you worked in the entertainment industry? I'm a VFX artist myself who has: I have my own company, and I'm currently employed by one of the major studios to do VFX for massive blockbuster movies (the last two were major blockbuster releases this year). Yes, there are a lot of Macs here at work, but only among management, and that is not the main bulk of people. Our main OS IS Linux, for three major reasons:
1) We have total control over the OS; there is no black magic happening behind the scenes. We can modify it to the extent we want, so it fits the needs of the studio.
2) It's free. Yes, this is a major point for us. If we need to extend our workstation base (bring in emergency freelance artists), we don't need to buy 150-200 more licenses on top of the cost of the workstations, or for our render farm with about 500-1000 nodes (they throw around crazy numbers here at work, at least compared to my own small company, where we talk about say 5-20 new machines).
3) We have A LOT of in-house tools developed only for our pipeline, to make the work we do faster. We base our pipeline on an off-the-shelf application and then extend it so our needs are met. Sure, we're not using Blender to any extent (or at all; it's installed and the artists are allowed to use it to create models, but no one does). That's because, unfortunately, it's not up to par with the packages we use here, and for us to switch now would be a massive setback since we have a lot of time and money invested in this pipeline. Still, if I were starting a new company, I would not choose either Windows or OS X, mainly for the points above.
And no, we aren't using Photoshop. We're using a 3D painting app that allows us more flexibility and interconnection between our main app and the paint app. Our editorial office here at work uses an in-house cutting/editing app that our colorists and R&D team have created, on Linux.
But then this whole argument can be circumvented depending on what you consider to be "pro video". Creating a still image for a product visualization? Yes, a lot of people use Windows/OSX for that, and that is usually only (big generalization here) because they don't need the pipeline for managing all the data that an animation/VFX production needs.
But hey that is just the way we deal with our stuff here at work (and at the places I’ve been to before).
just my two cents.
Ps.
Just looking at the "video" comment, since I don't work in audio. I have absolutely no say in that, nor any knowledge about its pipeline or the normal setup at a recording studio.
Ds.
To answer your first question, yes I work in entertainment/film/music.
Anyways, this whole thing started because of a blanket claim about linux being extremely popular in film & audio (no, that isn’t a direct quote). Of course there are explicit environments (as you described) where it may be preferable, but to make the claim as a blanket for the entire industry is simply absurd.
You may as well have antennas coming out of your head if you want to go into a recording studio talking about Linux. I suppose saying that will attract protests from the lurking bedroom-studio "pros".
Also, my comments are in response to the original & outlandish blanket claim. We can bring up specific use cases and debate it all day but there will never be a winner. The fields are vast and so are the opinions and setups. That being the case it’s best to focus on the original claim and apply KISS.
Have you ever taken a look at some Windows forum, like windows7forums.com ?
I have, yes. Of the 27,819 registered members there (I just checked that number), even assuming 100% of them are real people with real Windows problems, that is still a far cry from the hundreds of millions of Windows users around the world. Certainly not even in the same universe as `Windows is broken for most users`.
There isn’t a shortage of people with Windows problems just like there isn’t a shortage of people with Linux problems, or OSX problems. But to say that describes the most users of any of those OSes is ridiculous at best.
Please correct me if I'm wrong, but AFAIK the fact that your apps go down along with the X server is because the frameworks running on top of X don't take advantage of some of its features.
That is, X itself does actually support resuming your apps when recovering from a crash, so replacing X with something else wouldn’t magically fix that problem.
Hardly anyone is telling commercial developers to GPL everything. At least, far more people are complaining about that kind of behaviour than there are people actually behaving that way. It’s a very trollish point to make.
Simple software updates don’t break the login screen. The only time I’ve seen this happening was in OS X, where an error in a WLAN library made the login splash screen hang indefinitely with no error message. No Linux distro offers that kind of unneeded complexity.
FreeBSD is years behind Linux on the desktop. At least if you include things like ditching X for Wayland, never mind that FreeBSD is years behind Linux with X development as well. But yeah, it does have OSSv4. It’s not really all that much better than ALSA + Pulse in all configurations (certainly didn’t work for me), but if you prefer the FreeBSD fantasy to the Fantasy of The Year of Linux on the Desktop, then just go right ahead and dream on.
BS. Ditch X.org for Wayland? I have very few good words to say about X, but ditching a thoroughly tested and working, though somewhat aged and quite clunky, system for some vapourware is plain stupid. PulseAudio is a bitch, right, but no one is forcing you to use it. OSS has already been ditched for ALSA, so the chances of ALSA being ditched for OSS4 are quite slim. Re: ABI compatibility, it's right there. Install the libraries you need, load the modules you need and stop whining. I have, and occasionally run, programs from every incarnation of libc: a sourceless ZMAGIC roguelike, some Loki games, a commercial tape backup program from the 2.0 kernel days, some really old version of ApplixWare. That's pretty damn good for me. If you want to say something about GCC ABI compatibility, well, I feel your pain. APIs stay mostly compatible within the projects I track; can't say anything about the general trend.
Nagging vendors for open source drivers is actually useful in the long run — most of them eventually got the message, even Broadcom. The last two are good points, but this is not a single team with a single policy.
Seriously, none of that is a problem. I can install Unreal Tournament 99 for Linux, ported by Loki around 2000, and it works fine in this modern 64-bit GNOME 3 Arch Linux install.
If your poor binary-only software depends on dynamically linked libraries that will surely change over the years, then sure, it will break.
Since this is Linux you can also do cool stuff like route audio through JACK and apply any kind of DSP; mind you, this game uses the very old legacy OSS for actual sound output:
http://img803.imageshack.us/img803/9997/capturadetelade20120902.png
Please consider using http://ompldr.org/ or http://imgur.com/ next time, ImageShack sucks and I can’t see your image without registering to that site.
Sorry, I didn’t know about ompldr and imgur uses very aggressive compression.
http://ompldr.org/vZmM3YQ
That is kinda the whole point of his comment: on Windows, I have a reasonable expectation of a library staying the same.
I would add to your list: create/support a usable desktop (like Cinnamon or Mate) or go back to something like KDE 3.5.
I can never stress that enough. The direction taken by the Linux desktop has disgusted so many users.
But they said that users didn’t matter, didn’t they?
Not a chance, dude; they'll just stick with 7 like they stuck with XP. As we have seen posted here lately, there are just too many problems in Linux land that keep software developers from supporting it, and without native ports of the software everyone uses it has ZERO chance.
Joe Average simply isn't gonna deal with the flaming mess that is Wine, the hardware roulette when shopping for devices that will work with his computer, the incredibly dumb breakages that happen (like PulseAudio crapping out or the wireless getting hosed), and the general unfriendliness of the design.
The simple fact is that to get Linux to even OS X levels of adoption, fundamental changes would have to be made to the way things are done, and the devs are elitist ubernerds who think the entire world should do things THEIR way instead of conforming to the way the world works.
As a wise man on one of the forums said, "Linux is 3 years away from being ready for the masses. It was 3 years away 10 years ago, it is 3 years away now, and it'll be 3 years away 10 years from now", because the devs simply won't go along with the changes needed to get more people on board.
Yeah, because the main reason people used Windows was the "Start" button. Keep telling yourself that 😉
Uh huh… just like the huge opportunity that Linux had back when Win2k was released because of the “OMG 60,000 bugs!”. Or the huge opportunity that Linux had when XP was released because of “OMG product activation!”. Or the huge opportunity that Linux had when Vista was released because of “OMG DRM!”.
Remind me, how did those “huge opportunities” work again?
They would all have helped, if Microsoft didn't have such a strong arm to push their products through OEMs.
It's not about who has the best product, but about who is powerful enough. Look at Android: the first versions were fairly poor, and yet Google was able to force it through many manufacturers.
You don’t even need to go that far, Linux servers, HPC and supercomputers are mostly backed by powerful companies like Red Hat and IBM.
No, no, you’re doing it all backwards. You’re supposed to wait until AFTER this latest “huge opportunity for Linux” amounts to nothing, and THEN you start making excuses for it.
It's like watching a great whale die. Sad, a bit embarrassing, but I can't stop watching. They really have no idea; they just panicked because of the iPad-led tablet wave and tried bolting everything together into a weird mess. It's not so much that you can see the seams where everything is glued together, but that the seams come apart before your very eyes.
I have no idea how the average Windows consumer will cope with all this, let alone enterprise IT. If MS had had the courage to just go with Metro as a new and separate OS, and developed a completely new set of productivity apps redesigned for touch, they might have stood a chance of making it work. They could have just focussed on making feature-limited but well-designed touch versions of Word and Excel (with file compatibility with their desktop versions), and with Metro minus the absurd desktop mode it might have looked good to the IT departments who are uncomfortable with Apple i-devices. But they just couldn't do it.
The Innovator's Dilemma indeed.
I think enterprises are much more interested in full Windows 8 tablets because of the superior enterprise management capabilities.
How is enterprise management support for the iPad looking right now? What actual advantages can MS have in real-world scenarios?
Joining an NT domain and understanding domain policies. Even if iPads match the configurability of NT (and I doubt they do just yet) an NT tablet can just be dropped into the existing infrastructure.
Except for the fact that Windows RT can't join a domain…
You really think MS are going to die anytime soon!!?
really?
I don't think Win 8 or Win RT will be massively rip-roaring successes over and above Win 7, and I personally hate the idea of devices which on the surface of it are, or might be, good hardware (no pun intended; I mean all Win RT/ARM tablets and notebooks, not just the eponymous MS ones) being Secure Boot-tied to the OS. The same goes for Apple tablets/phones, and both of them with their walled-garden app stores. Yes, MS might be slightly copying the Apple direction at the moment, but I won't be at all surprised when MS learns from the Win 8/RT mistakes and removes or adds features or functionality, maybe even opens the whole platform back up a bit more if there's enough backlash (though I do think most of the backlash will probably be about the UI change, rather than owt to do with platform lockdown), and bit by bit they'll (probably) turn it all around and make a reasonable success out of it.
They're not fools. MS and Apple will BOTH be around in 20, 30, probably 50 years' time. They're the kind of companies that are big enough, amongst other things, that they can (pretty much) ignore stock market issues, at least for a long time.
I think EVENTUALLY web apps (web-technology-based, I mean, obviously, so including offline apps) will kill off most of the dregs of the OS wars (outside of specialist scientific, financial, HPC and high-end media stuff: NLE, animation, rendering software, etc.).
Therefore maybe eventually BOTH Apple and MS will "simply" be vendors of the "shiniest, bestest, fanciest of all the hardware options out there!"
Rock on, 50 years' time. All the fanboys can then maybe begin to give it a bleeding rest and go make sweet love to each other instead, or whatever else they're into.
The funny thing is nobody has yet really figured out why the iPad is successful.
I agree with this part.
Had Windows 8 been a nice and gradual evolution from Windows 7, then enterprises (and dare I say it, a lot of us computer geeks) would have been very happy. Maybe they could have called it Windows 7.5 🙂 (or adopt a version numbering structure that doesn’t smack of marketing foo-foo.)
Here's an idea: Metro could have (should have) been a separate OS, as you state, but there is no reason it couldn't be spawned when required from within Windows, as a virtual machine, to run a touch application… much in the way you can run Android applications from Windows now (see BlueStacks).
This would have satisfied the need to run the occasional touch application, whilst not destroying the desktop experience that many depend upon or simply like.
With that said, I’m already tailoring ways to make Windows 8 more Windows 7-like, so perhaps this will all be a storm in a tea-cup…
Other people have touched on this, but I really doubt Microsoft is going anywhere soon. Have you even given much thought to what would be required for an organisation (even a small one) to fully abandon that ship? The cost would be outrageous, and what exactly is the gain?
IMHO: We’ll see another decade of Windows 7 installs (with Windows 8 licenses) before that happens!
It's very important to understand that a technology like a passive digitizer is just one form of touchscreen technology. There are others. Thus, the claim of complete touch-unfriendliness on the basis of how well a passive digitizer is supported is not very rational or sincere. Active digitizers, like those provided by Wacom, are another form of touchscreen technology, and in my experience, Windows 8 desktop mode and Office 2013 make better use of these types of touchscreens than any other operating system or software. I guess it's convenient to pretend that things like handwriting recognition aren't driven by touchscreens if it fits an agenda of discounting Windows on touchscreens (or perhaps just a lifestyle of general intellectual carelessness), but it's not accurate.
It's also very important to understand the relationship between the DPI of your desktop environment and the PPI of your display panel when making use of human fingers on a slate. Traditionally, Windows has defaulted to 96 DPI, which is LOWER than the PPI of most display panels. The "PI" in both of these settings stands for "per inch". When your DPI is lower than your PPI, you are lying to your operating system. You are telling it to use fewer pixels to render an inch's worth of graphics than are actually required for your panel to display it. This is why fonts "appear smaller" when you drop the DPI. You have misled your operating system, convincing it to under-deliver. If instead you set your DPI to a value higher than the PPI of your display panel, then you are telling a different lie. You are forcing your operating system to render an inch's worth of graphics using more pixels than your panel has available in an actual inch. Thus, graphics take up more room on your panel than they otherwise would.
Most people who complain about the "touch friendliness" of the Windows desktop and its applications are completely ignorant of the above issues. You can't in good faith allow the operating system to render an inch's worth of graphics using fewer pixels than your display needs, thus creating smaller physical targets on your screen than there should be, and then complain about the results. Sure, Windows should do a better job of taking care of this, and Microsoft should do a better job of educating people on the issues (as we all should). Still, I can't imagine a website that reviews cars complaining about how terrible the driving experience was because the car was delivered with the parking brake engaged and they never bothered to release it before getting started. Car reviewers aren't that stupid, but today's technology reviewers are (probably because today's technology consumers are relative idiots when it comes to technology).
As a veteran of tablet computing (as in actual tablet computing, as in using actual tablet input hardware in actual tablet usage scenarios with actual operating system and software support, and not the stupid little touchscreen slates that gadget noobs and tech shopping sites call "tablets"), I can assure anybody who will listen that setting the DPI of your operating system equal to the PPI of your panel will result in a highly useful, satisfying, and finger-friendly experience in the traditional Windows desktop. In fact, cranking up the DPI to something above parity does an even better job of this. Even if you disagree with my claims after doing so, you will at least put yourself in a position to have your complaints taken seriously (since you at least released the parking brake before whining about the car).
Microsoft would be wise to provide a "quick DPI boost toggle" button somewhere in the user interface that would switch between a low-DPI "mouse and keyboard" mode and an equal/high-DPI "touchscreen" mode. The old Sony Vaio P series, with its 1600×768 panel squeezed into an 8-inch width, had a hardware button that did exactly this, although it was offered in the spirit of providing greater readability. In fact, given that many slates are used more for content consumption than for creation, a "quick DPI boost toggle" would be doubly useful: enhancing readability and touch-ability.
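To put rough numbers on the DPI/PPI point, here is a back-of-the-envelope sketch in TypeScript using the Vaio P figures above (1600 pixels across roughly 8 horizontal inches, so about 200 PPI); the 0.35-inch touch target is just an illustrative number, not anything Windows or Microsoft specifies:
```typescript
// How big a UI target physically ends up on screen, given the desktop DPI
// setting and the panel's actual PPI. All numbers are illustrative only.
function physicalInches(intendedInches: number, desktopDpi: number, panelPpi: number): number {
    var pixels = intendedInches * desktopDpi; // pixels the OS renders for that size
    return pixels / panelPpi;                 // inches those pixels actually cover on the panel
}

var panelPpi = 1600 / 8; // ~200 PPI: Vaio P, 1600 pixels across an 8-inch width
var target = 0.35;       // an intended touch target of roughly 0.35" (about 9 mm)

console.log(physicalInches(target, 96, panelPpi).toFixed(2));  // "0.17" - about half the intended size
console.log(physicalInches(target, 200, panelPpi).toFixed(2)); // "0.35" - DPI matches PPI, full size
```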
MS is a tired, worn-out company that can't manage to take a firm dump to save its own life!!!
It's the same thing year after year after year… they come out with something new that bombs, and 18 months later they pull it off the market.
It's my hypothesis that Linux killed any hope of a Windows desktop replacement! Companies see that Linux failed on the desktop, see the carnage left by the also-rans of the 1990s (BeOS, OS/2, AmigaOS…), and no one is willing to put up any money to bankroll a competitor to MS.
Ballmer has drilled his company into the ground! And yet nobody wants to take on this limping, one-legged gazelle of a company! Why? Because Linux on the desktop is dead (save for techies, which are <1% of the desktop market), and companies see this and say: if a FOSS OS can't do it, nobody can!
OS X is not in the running because it only runs on Apple HW, and while it is quite superior to a commodity PC HW/OS solution, large companies will not fire someone for choosing MS! So the mediocrity continues its long and slow slide into the outhouse we call the future of MS!
I recently asked my IT folks to put me back on XP, because the crap that came with my latest PC was nothing but eye-candy updates that added no real value to my system for me as a user, and I did not want to spend the next two weeks learning/fighting the system for no clear advantage! It's amazing how much faster and fresher XP feels on the new HW.
When will the slow death of MS end… Not until Ballmer is let go so they can put a real leader in charge and not a dancing monkey. The board has to see the latest fiasco, the Windows Phone/Nokia alliance, as the last straw for Ballmer. Even Nokia has distanced itself from MS by saying they have a "Plan B" in case Windows phones fail… again!
So far MS is about 12 years into its death slide… It probably has the money for another 12 years' worth of Coriolis effect before the final gurgle.
A big problem with dumping money into the Linux desktop is that your competitors get the fruits of your labor thanks to the GPL. Red Hat has managed to get around this by selling a brand name and pricey support to enterprise but that same model doesn’t work on the desktop (and Linux fans please note that the Red Hat CEO acknowledged this).
If someone really wanted to do it, they could put their own graphics stack (non-X11) on top of a Linux distribution without giving away the goose, as it were.
However, the way Apple did it was to use BSD, so the GPL was not an issue. The GPL means it's a non-starter. I think someone could do the same, as long as they rolled their own graphical front end, as Apple did.
The problem is getting the necessary threshold of apps available so there is a viable market. If it was kept inexpensive enough, the open source developers would probably come along. There are a lot of great apps out there that are 90-95% complete. If they could be polished enough to make them usable… who knows. But then again, that is what contributed to killing the Linux desktop: unusable apps.
The last 5%-15% is where most devs get bored, because the major interesting problems in the application are “solved”.
I always joke that the last 20% of any development takes longer than the first 80%.
Lol, Red Hat built their brand name (and fortune) by providing stellar support for a kernel and software ecosystem (for both of which they wrote tons of code) which anyone could download and use for free; that is quite a feat. Pricey? In comparison to Microsoft?
It’s certainly true though that the type of technical support on offer to the enterprise is not something you can sell to the end user desktop. In fact there is really no market for support on the end user desktop at all.
As such, there is very little business potential on the Linux desktop, as it's not only open source but also GPL-licensed, which means there's no "we'll keep the best parts proprietary as a competitive edge which you will have to pay for" option.
Again, the only really serious attempt at pushing money into the Linux desktop is Canonical's. I suppose their endgame is to make Ubuntu the enterprise desktop choice, plus OEM deals where it's preinstalled?
That said Ubuntu has certainly made a huge splash in the (albeit small) Linux desktop pond, and I’d certainly attest to it being the most user-friendly and polished out-of-the-box distro I’ve come across (yes, even with Unity!).
And while I personally prefer distros like Arch, Gentoo etc where you have total control, I find that Ubuntu is the one I recommend to Linux newcomers and it’s also what I’ve installed for my parents. It will also be interesting to see how they will leverage the potential that native Steam brings, and how (if) it will mesh with their own app store.
Canonical has reported about 20 million desktop/laptop users (based on unique IP addresses to their update servers), but more importantly, they have a serious monetization strategy that includes:
* An excellent store for both free and paid products (Software Center) built on apt that tightly links to their OS (e.g., Unity Dash shows both installed and available products matching my search, and purchasing and installing the available products is as simple as clicking the one I want)
* An excellent cloud service (UbuntuOne) that sells media and offers automated off-site backups, highly customized multi-OS synchronization, collaboration, etc.
* An active vendor affinity program for pre-installs
* A cross-product strategy covering desktops/laptops, servers, tablets, smartphones, TVs, and other embedded opportunities.
You can argue over whether they will succeed or fail in the long term, but their founder has exceptionally deep pockets, extensive business experience, and a strong ideological commitment to "paying forward" the FOSS philosophy that he credits with enabling him to make his billions.
I’m a paying customer, so you needn’t ask on which side I would argue. 🙂
Some fruits, not all. And you get some fruits of the labor of your competitors thanks to the GPL. This way your company is not forced to develop a whole operating system and its applications.
Nope. The company I work for uses Linux as our base. We do contribute back to the community, but we also maintain some proprietary code. There is a fortune to be made in maintenance, support contracts and consulting for the enterprise. That's where the real $$$ are generated.
Can't figure out why you were modded down, as you raise some seriously valid points. Ballmer is incompetent, pure and simple. I'm convinced that the board keeps him around just for entertainment value. He has no idea how to bring a product to market and, further, has no vision for the end result. I also think MS was a bit too quick to jump on the "me too" bandwagon out of fear that they would be left behind. Now, they have no idea what to do with it; or, at least, the top brass are clueless. They want a tablet OS, but they can't break compatibility; they want to lock it down like iOS, but that would piss off their enterprise customers; and so on. They've got a lot of baggage with no idea where to drop it, and I think with Ballmer and the rest of that crew in charge, that baggage will drag them under the water. They need new blood in management, pure and simple, and I hope they get it. As much as I don't like Microsoft, a monopoly controlled by Apple at the top scares me even more.
OS X can run on any modern Intel equipment. Case in point: I'm running Lion right now as I post this. It does NOT have to be on Apple HW, just Intel. I wanted an OS X box that, sadly, Apple either doesn't want to sell or is phasing out, so I built my own. Whether that's legal or not I don't care; I have what I wanted. My point is that OS X will run on hardware other than strictly Apple's.
Irrelevant to the mass market.
I didn’t think Windows RT (ARM) had a desktop mode.
It does; it's just locked down to run only Explorer and Office, since porting Office to Metro was too large an undertaking, and a good file browser is simultaneously important and a bad fit for the tablet side of things.
This thing just needs a stylus, like older Windows Mobile.
See this video by ZDnet:
http://www.youtube.com/watch?v=GOyOqzLKRKg
It’s a bit odd that touch mode is not enabled by default on a Windows RT tablet, and it’s obvious that it’s still not optimal for touch. But it seems to be somewhat better than the interface The Verge tried out.
Edit:
Lol, it seems the ZDNet guy actually failed to enable touch mode too. This is how you do it:
http://www.youtube.com/watch?v=2e5fTetHbp8
Touch mode IS on in The Verge video.
Really? When touch mode is on there is a small blue dot in the title bar. There is no such dot in the video by The Verge.
http://cdn4.computerworlduk.com/cmsdata/slideshow/3370958/07_Office…
You do realise that dot is in both shots there, right?
?
The first screenshot shows touch mode enabled but not turned on, the second shows touch mode enabled and on. If touch mode is not enabled then there is no dot at all, such as in the video by The Verge.
Take a look at the video I posted again:
http://www.youtube.com/watch?v=2e5fTetHbp8
OK, so touch may not work really well in desktop mode, BUT I for one want full Office even if it means using a keyboard and mouse when I use it. In fact, I couldn’t imagine writing a long document on a tablet without a keyboard.
Lots of people seem to agree: just look at the popularity of VNC/RDP-type apps on iOS and Android, and the number of Bluetooth keyboards being sold.
Purchase the $600+ version of the Surface that will have Windows 8 on it rather than Windows RT.
Surely you can just connect a keyboard to a Windows RT tablet as well?
You can, but Office and Explorer will be all you can run in desktop mode. Feel like running LibreOffice instead? Then you bought the wrong device.
[Of course, in theory LibreOffice could devise a Metro version; they’re working on Android support, so compiling to ARM isn’t the intractable part. Being touch-friendly enough to pass Microsoft’s store rules while still being friendly to mouse and keyboard “docked” usage might be tricky!]
You know, this whole issue of tablet vs desktop is becoming a fight of the grognards vs the innovators. It’s cross-platform. With Windows, it’s Metro vs the desktop versions of Windows. With Linux, it’s GNOME 3 vs the GNOME 2 clones. Everywhere you go, you have one group designing for the future, where tablets and phones will dominate, and another group, who hate change, who want to design for 1980s technology.
I have no dog in this hunt. I like it both ways. But for God’s sake, if you’re going to design something for tablet/multi-touch computing, then do so. Stop trying to cross-dress applications. If Microsoft wants Office on Windows 8, then it needs its Office team to design a version of Office specifically for it. At the same time, it should keep the old design for the grognards who resist change. Windows 8 will stand or fall on its own. Office needs to be able to go either way, because it makes more money for Microsoft.
Hi,
Tablets are good for some things, like reading a few web pages, keeping in contact with people while you’re on the move, reading an ebook in bed, etc. They’re completely useless for anything serious (like writing documents, creating software, CAD, modern 3D games, etc.).
Desktops are good for almost everything, except when size is a problem. They’re too big to carry around all day, and I’m sure I wouldn’t want my full “dual 30 inch screen” setup for reading a book in bed.
Obviously they’re entirely different form factors, designed for entirely different uses. The idea that tablets will make desktops obsolete is completely idiotic, as stupid as thinking that the Segway is going to make cars obsolete.
What we have at the moment is developers trying to make desktop stuff work on tablets, or trying to make the same user interfaces work in both cases at the same time. My prediction is that this “first wave” of user interfaces will fail badly (either being useless for tablet, being useless for desktop, or being useless for both); and this failure is going to be an important lesson that forces developers to realise that different usage patterns require different user interfaces.
– Brendan
Um… Multi-Touch is desktop, not Tablet. Though Tablets do have multi-touch technology.
Can’t tell if trolling…
I really do believe that Windows 8 is going to be the new Windows Me. Vista was at least usable.
Windows 8 is very usable.
It works well enough in my testing, though I don’t use it exclusively or as my primary OS.
On a laptop or desktop, it basically feels like Windows 7 with a tablet OS bolted on the side. [I’m currently running it under Parallels 8 on my MacBook Pro.] The tablet-oriented apps are sometimes a bit awkward under touchpad or mouse control; getting at the “charms” through hot corners feels like a hack, and activating the app bar through right-click is pretty funky too.
The rest of the OS (Desktop view and desktop apps) still appears to look and work just like Windows 7, with a more “flattened” widget style. If they’d just left in the start menu and booted to desktop mode on non-touch devices, you’d probably see a lot less whinging.
On a tablet, that touch-oriented OS is interesting; I actually rather like it and I’m curious to see how it pans out when real devices hit the market.
On a laptop with a touchscreen, Metro is a little nicer since you can use the gestures like on a pure tablet, but you’re still able to use desktop apps with a real keyboard and touchpad.
What’s going to be *really* interesting is seeing how things go on hybrid touch/keyboard devices. These exist already in the Android space with things like the ASUS Transformer’s keyboard+trackpad dock. iOS has keyboard support but no mouse support, and neither iOS nor Android will run your traditional desktop apps from other operating systems.
Intel-based touch devices will actually run real apps in desktop mode… I can use Git and Visual Studio on the Samsung Series 7 tablet to develop apps — obviously they’re not very touch-friendly, but hook up a monitor and keyboard/mouse and it’s a “real computer”.
On my Dell Inspiron Duo (mini laptop with a touchscreen that swivels to use in either “laptop” or “thick tablet” mode) I can also do some development, but the screen and keyboard on that particular device are not great. (The Inspiron Duo’s fatal flaw is a lack of a video-out connector, so you can’t fully “dock” at your desktop by connecting a bigger monitor.)
Of course, if Windows RT won’t run desktop apps other than Microsoft’s, the ARM-based devices will be more limited… but they’ll still be dockable for Office and web apps.
Actually, Windows 8 is usable; I’m running it right now (I never thought I’d say that). You just need to tweak it a lot. For example, you can enable a local account, uninstall all the Metro apps, hide the Store, and disable the hot corners. Then Metro becomes an overview of all installed applications, and the desktop is like the one in Windows 7, but faster and with an IMO nicer theme. The only thing I’m missing from previous versions is actually “recent files”.
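For what it’s worth, the app-removal step can be scripted rather than clicked through by hand. Here is a minimal sketch of my own (not an official procedure): it simply shells out from Python to the Get-AppxPackage/Remove-AppxPackage PowerShell cmdlets that ship with Windows 8, removing the bundled Metro apps for the current user. Treat it as an illustration; it is not easily reversible.

# Sketch: remove the current user's Metro (Appx) apps by calling PowerShell.
# Assumes a Windows 8 machine with Python 3.5+; run at your own risk.
import subprocess

subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-AppxPackage | Remove-AppxPackage"],
)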
I’m pretty sure people who don’t want to spend some time adapting and tweaking their desktop will hate Windows 8, though; it’s really annoying by default.
I don’t believe Office is simply lagging behind; obviously not even Microsoft has figured out how to make a Metro version of a larger application, as opposed to just some app with limited functionality.