Software engineer Satoshi Nakajima, the lead architect of Microsoft’s Windows 95, picked up a Mac for the first time two years ago.
He was so impressed, he says he'll never touch a PC again.
Satoshi loves Apple products so much, he started a company in April, Big Canvas, to develop for Apple’s iPhone platform full-time.
“We have chosen iPhone as the platform to release our first product (for) several reasons,” explains his company’s website. “We love Apple products… You need love to be creative.”
Interesting stuff. I’m on a Mac right now. It’s *okay*. OSX looks nice, but I think the UI and applications are sometimes a bit too elementary. Just my opinion.
BTW, on the main page the title of the article is "Microsoft's Windows 95 Architect Is a Happy Mac Convert", with an 'â' where the apostrophe should be.
There are both good and bad apps on every platform. Personally, I find that Mac apps tend overall to be better designed and of higher quality.
I have found that most of the commercial programs I need to work with are considerably cheaper and more functional on my Mac than were the PC equivalents. And a lot of what I had to purchase for Windows came free with the Mac. But, to each his own. I know people who are happy with Vista as well.
I guess it is because Mac users (like me) tend to be more anal about whether the application not only accomplishes the task but is actually usable as well. Fulfilling a task isn't enough; the whole experience has to be top notch, from the interface to the quality of the implementation.
The only other 'desktop environment' that comes close to that sort of anal retentiveness is GNOME, where GTK+-based applications are routinely updated so that they conform to the GNOME HIG specifications.
In the Windows world, unfortunately, it appears that very few developers spend any time making sure their application is not only aesthetically appealing but also usable. Far too much time these days seems to be spent on 'skinning' and making things look cool rather than making sure that end users are as productive as possible in a given environment.
And this is the #1 reason why I (a long-time KDE user) recently tried GNOME for getting some real work done and haven't logged back into KDE since. This is not to say that one is better than the other, but I found GNOME to be so much more elegant and a lot more "polished", and because of my distro's level of polish toward things like OpenOffice etc., I found I was able to work more productively.
Not only that, but Compiz works great in GNOME but still has many small issues in KDE (for me, anyway), so I was able to fully utilize the eye candy as well, which is a huge plus for me. The whole system just feels so much more refined and stable.
Off-topic, I know – my apologies…
There are a lot of UI problems in GNOME…
Toolbar sizes are not always the same, icon sizes, icon bars….
Google it if you want to know the others.
Applications are poor if I compare with KDE: K3b, Konqueror, KOffice, Kopete, Amarok.
I think KDE 4 is a lot better and easier to use.
Toolbar sizes are not always the same, icon sizes, icon bars….
I have not noticed such issues during all the years I’ve used GNOME.
Applications are poor if I compare with KDE: K3b, Konqueror, KOffice, Kopete, Amarok.
As has been said, these things are matters of personal preference. I for one absolutely hate Konqueror, and I wouldn't use K3b either (I like to just right-click on image files and select burn, or just drag files to a Nautilus window and select burn. So much easier). eMesene is a perfect alternative to Kopete, and Rhythmbox is just somehow visually a lot cleaner than Amarok. That being said, there are good applications in both DEs, just as there are good applications for Windows and OS X.
It is deceptively basic for the most part. I have a Mac at home and, at work, a Linux box with advanced effects on. Apple's approach is that in the UI everything does what is expected. Linux with full eye candy, while it looks cool, is less productive for the most part; dragging across different apps, for example.
… Blue Screen of Death for iPod!
Sorry, couldn’t resist
Don’t you mean, segmentation fault?
Do Macs have BSODs?
A BSOD is just a kernel panic, and OS X can also have panics. They used to be a bit frightening:
http://www.osxbook.com/book/bonus/chapter5/panic/images/panic_1.jpg
but now they look like this:
http://www.leussler.net/wordpress/wp-content/uploads/2007/10/osxker…
True, but Apple's kernel doesn't produce those gloriously useless and cryptic register addresses that reveal nothing about what the hell just puked.
Linux and OS X are truly missing out on that one.
True, but Apple's kernel doesn't produce those gloriously useless and cryptic register addresses that reveal nothing about what the hell just puked.
I write file system drivers for Windows. When the machine blue screens (which sadly is very common, due to my great programming skills – lol) it gives me a detailed report of what just happened. You just have to know how to read it. It has saved me hours of debugging.
The Mac’s SOD is nicer than the BSOD, but no more enlightening. It doesn’t tell you why your machine just “puked”, only that it did and how to restart it. There is a dump that you can look at however (just like Win and Lin), but again, you have to know how to read it…
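For what it's worth, here's a rough sketch of what "knowing how to read it" can look like on Windows: open the crash dump in WinDbg and let the debugger take a first pass. The dump file name here is hypothetical; !analyze -v is the real WinDbg command that prints the stop code, the probable faulting module, and a stack trace.

```
C:\> windbg -z C:\Windows\Minidump\Mini072208-01.dmp    (hypothetical dump file name)
...
0: kd> !analyze -v    (automated first-pass analysis: stop code, faulting module, stack)
```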
I’ve not seen the Linux SOD for a while (I hardly run Linux sadly), so can’t comment on the amount of detail there…
When writing file systems you also have access to the debug code, tracers, and more to help give feedback, besides the reference manual describing each error in detail.
The average mortal doesn't have access to this, and to them it's garbage. How revealing and discernible the error messages are speaks volumes about the filesystem developers who took that into consideration.
Here is a kernel oops for you:
http://fopref.meinungsverstaerker.de/div/oops.png
(this one didn’t lead to a kernel crash)
You see the information there … a complete backtrace.
And even for someone who doesn't read hex daily, that's potentially informative.
Reiserfs that was acting up?
I made this "screenshot" to write a bug report about a reproducible reiserfs bug on a Seagate disk. Create a new reiserfs partition, then copy some hundred megabytes of files over. It crashed every time.
Still, I didn't get any feedback on this critical bug report. After over a year I got a "is this still in version 2.16.xy?" Needless to say, I just didn't care anymore…
There is a reason why I chose ext3.
In Mac OS X, after a reboot forced by a kernel panic, you will find the stack trace dumped to a file.
Additionally, in Mac OS X you can put the kernel in debug mode. This means that, in case of a kernel panic, you can have a dump of the entire kernel memory sent to another machine and analyze it with gdb.
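For the curious, here's a rough sketch of that two-machine setup, based on Apple's kernel debugging documentation of that era. The debug flag value and the IP address are illustrative assumptions, not something from this thread:

```
# On the target Mac: have the kernel wait for a remote debugger on panic
# (0x144 is the flag combination commonly cited in Apple's docs; treat it as an example)
$ sudo nvram boot-args="debug=0x144"

# On the second machine, once the target panics, attach Apple's gdb
# to the panicked kernel over the network via KDP:
$ gdb /mach_kernel
(gdb) target remote-kdp
(gdb) attach 192.168.1.10      # IP of the panicked machine (hypothetical)
```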
You also get a dialogue when you log in after a kernel panic, asking if you want to report the issue. This dialogue holds the details about the kernel panic and usually gives you the relevant info pertaining to the crash. I used it to find out that 10.5.2 was having issues with FireWire interfaces, though that was obvious since I would get a kernel panic every time I connected my audio interface to my MBP.
Back in about 2000 the BSOD helped me realise that my audio driver was causing crashes (which I would never have thought, seeing as the driver came from the CD).
Downloaded a new driver and installed it; no more crashes. You just have to know what you're looking for. I've used it for a long time, and you tend to pick things up as you go.
Through no fault of my own I once had half of one OSX version installed over a different version of OSX…. When I started it up, I got the initial blue screen you usually get when starting, and then suddenly half the screen had these runs of monospaced characters (white on a black background, like from the command line) which looked like they were running down the screen, covering some of the blue.
It was static, but it looked like the Matrix was trying to take over my computer…. Well, it failed, but then so did OSX.
I like the Sad Mac a lot more (http://en.wikipedia.org/wiki/Sad_Mac)
Yes, they have some equivalent thing that displays a nice message saying you have to do a hard power-off.
What "good" OS is hardware-limited? My friend has a major problem with Mac OS, and I use it all the time, but he does have a good point: that's a major flaw with this OS.
I can think of few accolades deserving more derision…
At the time, Win95 was quite a step up from Win3.x…
Sure, but you can probably imagine what happens when you try to polish a turd.
Heh, I have seen computers with WinME first-hand, so yes, I know…
Still, I'm not really sure these WinNT-based versions are an improvement. At least with Win9x one could swap a motherboard without being faced with a BSOD at boot…
…this doesn't become a "this OS is better than that OS" debate again…
Aren't we all tired of them? I've never ever seen the following happen:
Person A “My OS is the best OS ever, you should switch”
Person B “How wrong I was, I will switch, thank you A”
Every time someone does switch (or just realises they were misinformed about another one), it's because they have actually used an OS for a while and given it a chance. Do that with most OSes and you will find your opinion will change…
Most people (including me) argue out of passion and ignorance.
Tools for the job…
This guy may or may not ever use Windows again, that’s OK. Right now the OS on the iPhone is just perfect for what he needs…
So true … you use what works for you. I'm pretty happy with my Windows. It may be Mr. Satoshi Nakajima's opinion that developing for the iPhone is the way to go, but surely coding ObjC is a step back from the comforts of, say, Java or C#.
Now don't get me wrong: I LOVE the iPhone, but having to use XCode (as opposed to IntelliJ/Visual Studio) and a not-too-well-documented API is a lot of unnecessary work.
Since Apple switched to Intel, you just use whatever you like whenever you like.
“Software engineer Satoshi Nakajima, the lead architect of Microsoft’s Windows 95, picked up a Mac for the first time two years ago.”
Really?
I find it hard to believe that the lead architect of Microsoft’s flagship OS never thought that it would be a good idea to at least look at what the competition was doing.
I suppose the Start Menu, Recycle Bin, Explorer, and Help buttons were just a coincidence.
Maybe he used OS/2.
Maybe he should have done a better job on Windows, or maybe worked at Apple.
Maybe he was expressly forbidden, so as not to risk a lawsuit?
Apple already went after Microsoft once over the similarity between Windows and MacOS. No need to give them more ammo, right?
Also, back in the '90s there was no OS X, just the awful, awful MacOS – which I suspect they looked long and hard at and decided to replicate everything that was wrong with it.
Am I to believe that a principal architect of a piece of software never before picked up the OS of a major competitor?
Either it’s a marketing ploy, or the guy’s just an idiot, and saying he’ll *never* go back I suppose actually indicates the latter.
From article:
How was it developing under the limitations of Apple’s SDK?
Well, I think it was a good decision to limit applications to running one at a time, if that’s what you mean. I think that limitation is very beneficial to power consumption and memory usage and we didn’t find it difficult to work around Apple’s limitations for our application at all.
So he praises Apple for tying developers' hands. In the very next sentence he admits that they had to break this limit.
He didn't state he "broke the limit" as you put it, in the sense of violating things: a more accurate way of looking at it is that they figured out how to get things done within the stated constraints.
Microsoft's big oops for Windows Mobile on phones is the mistaken idea that a phone is a desktop computer, and the OS should act like one. If you've ever actually done embedded software that you want to run a long time, on a resource budget (power and RAM and CPU), and reliably, how Microsoft went about it is entirely wrong for that task, and that especially extends to the user interface.

Modern desktop operating systems have swap files: that isn't a very viable option on a cell phone, because of various factors. Thus, you run what can fit in actual RAM and make that work efficiently. Certainly, you can use memory-mapped files from flash (I'm guessing Windows Mobile can, and I know the iPhone OS can) and that'll help a lot, but once you have several processes running in RAM at once on a limited-RAM platform without swap, and they're running for extended periods of time, heap fragmentation is a growing problem (great, now I'm sounding like something out of a drug commercial!) that'll eventually make things very unhappy, and chances are it'll cause problems for more than just the one application with fragmentation.

Of course, there's always the alternative way of designing such an application for an embedded space: set fixed limits for resources, such as memory usage, and never allocate or free new resources after the first initialization. This works perfectly fine when you have full control and knowledge over the resources of the machine, but… if you've got several applications of unknown origin that may be running at the same time as yours, you may not even be able to predict being able to get the required resources.
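Not from the article, just for illustration: a minimal C sketch of that second approach, where everything comes out of a pool sized once at startup and nothing is ever handed back to the system heap, so fragmentation can't accumulate over a long run. The pool size and the two allocations are made-up examples.

```c
#include <stddef.h>
#include <stdint.h>

/* Fixed-size pool carved out once at startup; the system heap is never
 * touched again after init, so long-running fragmentation can't build up. */
#define POOL_SIZE (64 * 1024)

static uint8_t pool[POOL_SIZE];
static size_t  pool_used = 0;

/* Bump allocator: hands out pointer-aligned chunks from the fixed pool.
 * Returns NULL when the budget is exhausted -- the caller must handle it. */
static void *pool_alloc(size_t bytes)
{
    size_t aligned = (bytes + sizeof(void *) - 1) & ~(sizeof(void *) - 1);
    if (pool_used + aligned > POOL_SIZE)
        return NULL;                /* over budget: fail loudly, don't grow */
    void *p = &pool[pool_used];
    pool_used += aligned;
    return p;
}

int main(void)
{
    /* All long-lived structures are allocated exactly once, at init. */
    void *framebuffer = pool_alloc(32 * 1024);
    void *msg_queue   = pool_alloc(4 * 1024);
    (void)framebuffer;
    (void)msg_queue;
    /* ... run indefinitely without another allocation or free ... */
    return 0;
}
```

The trade-off is exactly the one described above: if the budget is wrong, allocation fails outright, which is manageable when you control the whole device but not when arbitrary third-party apps share the RAM.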
Microsoft has made their engineering decision to attempt to replicate a desktop OS in an embedded device, and look what it got them. Time will tell if Apple loosens up their restrictions to be more like Microsoft’s OS policy, but if application developers can work with more of the assumption that they’re developing for something closer to a console in nature where things are more deterministic, all other things being equal, long-term better software stability should result. Remember: there’s no such thing as a free lunch!
“working around” != breaking. It can also mean “Working within those limits”.
Personally, I wouldn’t give any deference to what this guy has to say about anything. Technically speaking, the Win95 architecture was C-RAP. Granted, it moved users from the Win16 model to Win32; however, it did so in the kludgiest way possible — thunking, global locks, unstable driver model, etc. It was a stinking pile of code. I don’t think that I would tell ANYONE that I was the architect of Win95 under any circumstance… maybe my priest, for purposes of confession, but that’s about it.
Operating systems are way more than the technical aspect. Geeks are quick to call something crap while ignoring a good GUI, and quick to praise something that only a geek would know how to use.
I dare call the taskbar they introduced, if not an innovation, at least an improvement! A quite important one. Now people had a visual way to see what windows they had open when they had many. (Multitasking was about to really become multitasking.)
Take Ubuntu. It's easy to use, but probably quite far from the best Linux distro from a technical point of view. Despite that, I think it's way better than most distros simply because it's easier to use.
Apple should copy the Windows taskbar. At the very least there should be text on the Dock so that you can read what the icons are for. More text and smaller icons = I can find things faster.
Win95 was bad on the inside, but there is a reason why the first thing people do when they get Vista or XP is to make it look like Win95. I don't want an OS that's jumping up and down saying "look at me! look at me!", but that's what you get with Mac OS X or XP/Vista.
If I had led something that was the beginning of a horrible era (9x/ME), I'd go far away too. I never fully understood why Microsoft decided to split their Windows division into a consumer and a prosumer division. I mean, if Windows NT4 had simply been released instead of Windows 95, the world would be so much better today!