The new M1 MacBooks are fast, beautiful and silent, and the hype is absolutely justified. There’s still a lot to do on the software front to catch up, and the bugs around older iOS Simulators are especially problematic.
All of that can be fixed in software, and the whole industry is currently working on making the experience better, so by next year, when Apple updates the 16-inch MacBook Pro and releases the next generation of their M chip line, it should be absolutely possible to use an M1 Mac as a main dev machine.
For the vast majority of people, it’s going to be very hard to resist these new Macs. They’re just so far ahead of the competition in performance, power draw, battery life, and noise (or lack thereof) that any x86-based laptop just can’t compete, unless they go all-in on price.
I’d love to have one to test and review here for OSNews, but financially that’s not in the cards for now.
The majority of people will stay on x86 and Windows, because that is what they know. The Apple circles will flock to these new machines. OEM builders will try to emulate what Apple did, but fall short anyway. Alternative OS users will shun these machines if they only boot macOS.
Yeah, I think he probably meant the majority of Apple users rather than everyone. They’re still not for me, though; as long as I’m forced to use macOS or ARM Windows, they’re not even on the radar.
I’ve had a fanless Samsung octa-core ARM-based Chromebook, bought in 2017 for about US$500. With Android apps and an embedded Linux VM, it can do everything that I need. ARM-based laptops are great.
Welcome Apple peoples.
Except the much greater performance, much better screen, much better touchpad and just about everything else.
It’s plainly obvious you’ve never used a higher end chromebook, but let me expand your universe a bit, if I may. I’m guessing that you’ve experimented with the bargain chromebooks at Best Buy and have a half-formed opinion.
This chromebook has a 2400×1600 touchscreen display, a fabulous touchpad and a terrific keyboard; furthermore, its screen flips and it can be used as a tablet complete with stylus or touch. What is true in your comment is that the 8-core Exynos processor is 3 years old now and it’s RAM constrained. I’m sure that the M1-based MacBooks will look dated 3 years hence, but by then you’ll be able to run iOS apps, just like this chromebook runs Android apps today. I bought a more expensive 2017 Dell XPS13 around the same time as this chromebook and actually use the chromebook as my daily driver. It’s that good.
Welcome Apple peoples and richter.
Imagine creating a new account just for this comment, what a belmet
After about 20 years of x86 dominance we are seeing more diversity and fragmentation, which is actually a good thing as it will lead to more choice, competition and lower prices for similar performance.
It will also mean that truly open standards are going to be needed for interoperability, so those companies still peddling their proprietary formats and services to lock customers in will be finding themselves more and more at a disadvantage for adopting and propagating such policies.
psychicist,
I agree, it’s good to have alternatives catching up to x86. Many of us wanted high performance ARM PCs for a very long time.
That’s not a given. Apple became the most profitable mobile manufacturer using walled gardens & vendor lock-in. We may not like it happening on PCs, since it would be devastating to owner freedoms, but they may yet find a way to make the same business model work in the PC space. We’ll see what happens, but I don’t think it’s wise to just assume that openness automatically wins. If we don’t stay vigilant it becomes easy for our digital freedoms to be taken away.
Nah. What this means is that we’re in the post-hardware stage of this industry. Computing/information is just another commodity now. Which is cool.
So there are two business models, depending on whether you’re targeting consumer or professional markets.
– Walled ecosystems tend to rule the consumer market
– Whereas Professional markets are now mainly ruled by subscription models.
With some obvious overlaps between the two.
In this case, the M1 means that an “ecosystem” (Apple) is now capable of making its own competitive SoCs. Hardware is now less relevant in the consumer space, because it is not the main driver of profit/market.
Most buyers of M1-based Macs really don’t care much about what’s inside their laptops. As long as it performs as well as or better than the alternatives, there’s no value proposition for them to leave the Apple ecosystem (and for new customers, choosing Apple is a no-brainer if you’re in the lifestyle/purchasing-power bracket Apple targets).
javiercero1
This is rather reminiscent of the IBM days when the normal mode of computing & information processing was to use IBM’s offsite assets rather than running them locally.
Yeah, well, this ignorance-is-bliss attitude in consumers is liable to come back and bite us as the corporate-governmental complex increasingly takes full control of our information and devices and 1984 ceases to be a work of fiction.
I don’t think this means anything for open standards. I’m not sure how you’re jumping to that conclusion. Adobe will still read Adobe files, MS Office will still read MS Office files, etc. With everything moving to the cloud, most things most consumers use aren’t even stored locally, requiring even fewer open formats.
In an open world the likes of Adobe and MS Office could be rendered irrelevant in a heartbeat due to cross-platform and open file formats becoming the norm.
However, if entire industry branches stick to their old ways due to Stockholm syndrome and extend monopolies to the cloud, nothing will change for them. Maybe it’s time to stop being stooges and grow some spines.
I agree I want open file formats, I’m just not sure how that’s related to M1 arm chips.
As the client landscape becomes more diverse with various operating systems running on chips implementing various instruction sets, the incentive to stick with entrenched proprietary file formats will become less and less. The spread of devices with M1 chips might play a role in this, especially if they end up running Linux in addition to macOS.
I think the Apple M1 demographics won’t be the driving force behind that seeing the demand for Microsoft and Adobe applications. But I am not going to let myself be limited to those proprietary applications just because others keep sending files in their closed file formats.
Of course there are situations in which there are no alternatives, but that should be the case less and less. The adoption of applications such as Blender could be an indication of such a future, but there are still many legacy applications (often for Windows) without open source replacements.
I’m a bit surprised by the response I’ve been getting from my Apple-based associates: they love the M1 and gloat about performance and battery life, but they have no intention of buying one at the moment. Also, I’ve only had lukewarm responses about the iPhone 12 models; I’m sure they’ll sell, but perhaps not with as much gusto as many thought. It’ll be interesting.
I gather this is a situation caused by COVID.
This reminds me of when I switched to iPhone from a Nokia WinPhone. I bought straight into the 10S, which I think at the time was Apple’s top of the line, and I was quite impressed in terms of performance, but my Apple associates said Yeah, …….. Nah!
cpcf,
I don’t know, if I could run linux on these M1 laptops, they’d be very desirable since they’re the best ARM CPUs available. If I cannot run the OS of my choice though there’s no desirability whatsoever. Alas, I don’t believe apple’s M1 strategy requires it to play well with others, especially niche desktops like BSD/linux. I’d rather this didn’t happen, but apple’s M1 computers may end up being walled off to alt-os users 🙁
That niche user isn’t going to pay Apple’s bills, the Apple users I know have zero interest in anything Linux or MS!
I’d like to hear from a user of the new M1, how the new user experience compares to relatively recent alternatives. I suspect this is part of the problem, the extraordinary numbers aren’t really making much difference to the end user experience, they are becoming less and less relevant.
I have heard from an iPhone 12 user; after a few days they were a bit “it’s OK!” It seems if you’re not into shooting ED images or photos it’s not got much to offer.
cpcf,
Yeah, I can believe that. However it’s the opposite of what I’ve encountered in my own social circles where the majority of mac users I meet actually do have a linux background and use it for some stuff. It’s kind of rare that I meet a macos user and when I do it’s usually in the context of CS peers and software work.
That’s true. Unless you’re running heavy workloads, the performance gains over the years haven’t done much to change our user experiences. Improving responsiveness has diminishing returns… going from 100ms to 50ms makes a big difference, from 50ms to 25ms can be noticeable, but much less so. When you start getting to 25ms, 12.5ms, and then 6.25ms, these improvements are likely to go completely unnoticed even with huge CPU gains, and a lot of desktop applications are already at this point. Even so, the battery life may be another compelling reason to choose a CPU.
The type of users who are most likely to notice CPU performance (good and bad) are those who run heavy workloads, and in this case the low MT benchmark scores probably point to disappointing performance for such workloads. The M1 is probably a bad choice for them, at least until future CPUs show up and improve on MT performance.
I have an M1 Macbook Air. I can confirm it is extremely fast and very power efficient. It’s pretty much always at room temperature, the battery lasts more than a day of regular use. The x86-64 emulation isn’t noticeable at all, no transpiling pauses or glitches that I’ve seen.
It seems so effortless in its blazing speed it’s a little uncanny.
I’m a dev so I’m still watching to see how the developer tools I need shake out (ex. golang in Feb, apparently), but my hot take after a week is that it’s the best machine I’ve had since my Amiga 500.
Don’t get me wrong, I’m not bagging the platform; it looks super impressive for macOS use. I’m cynical, though, of first-use experiences, because everything I’ve ever owned looked super impressive on day one.
But in reality I’ll have to wait and see if it heads towards Linux usability; if so, I’ll be onto a Mac Mini in the blink of an eye! The macOS stuff is irrelevant for me: I can’t work with a bodged BSD, and I don’t usually need to open a dozen desktop apps with the click of a single button, so that sort of benefit is a bit intangible. While much of what I do is cloud-based and as such makes the hardware irrelevant, I still need a local base, as do most of us at least some of the time! Every week I hear about or experience a bad story relating to the cloud going wrong, so that sort of bridge isn’t viable either!
> > > I’d like to hear from a user of the new M1, how the new user experience compares to relatively recent alternatives.
> > [provides subjective experience]
> I’m cynical though of first use experiences, because everything I’ve ever owned looked super-impressive on day one.
LOL! Why did you ask?
I suppose even though I’m cynical, and I think there is very good reason to be cynical of corporations hawking technologies, I still trust people 1000x more than any corporate collective!
Apple could maybe sell many of these systems for use as servers if they made it easy to run Linux on them.
It will depend on real-world performance, etc., of which we still have limited knowledge. But it looks very promising:
https://info.crunchydata.com/blog/postgresql-benchmarks-apple-arm-m1-macbook-pro-2020
So, long story short: if you’re a developer of anything other than Apple-platform-only software, wait a year for things like Homebrew, Java, Docker, etc. to put out ARM versions, not to mention for a 32GB RAM model to be available. With any luck maybe Microsoft will by then finally allow ARM Windows to be licensed to non-OEMs and tools like Parallels and Crossover/Wine will also have adapted, giving the new machines nearly all the functionality of the old ones.
Perspective from a developer who has no interest in dealing with any proprietary environment (like Apple’s).
If this development means that Apple is indeed so far ahead of the competition and the rest won’t catch up any time soon, it means that typical developers will end up using this Apple hardware as anything else would slow them to a crawl compared to colleagues.
Previously I have seen developers having to put up with Windows just because that was what the rest of the company was using. In a way this development points out how irresponsible it is to provide developers with a standard corporate environment. At the same time I don’t know if I’d like the standard developer environment to be Apple.
You seem to contradict yourself with the claim that inside corporations there is a dominant platform that people simply must submit to using for fear of missing out, while also proposing that it is so easy to switch that Apple has already won the war despite not having released a single ARM-chipped pro-level computer yet…
Typically there’s a corporate standard that people adhere to because they feel uncomfortable to challenge it.
It has nothing to do with being difficult to switch or fear of missing out.
@Z_God
That is a prime example of what I’m talking about, in relation to the futility of benchmarks as a measure of UI performance. The OS being ready 15ms sooner makes little or no difference to the user experience or productivity, and claiming those remaining on older hardware are slowed to a relative crawl is a prime example of misrepresenting real-life user interface experience.
Your post is making me have second thoughts about my criticism. A responsive UI is very, very important. Most people will have a difficult time understanding the difference 15ms makes. Look at some of the demos of the M1 Macs and see how quickly apps launch. It is insane. Lag causes the mind to wander and focus to get lost. The faster a device responds to a user request, the more productive they are going to be overall.
So cloud-based development is good/going to get better, but, man, the UI responsiveness is going to be a hurdle.
Bill Shooter of Bul,
You still need to look at diminishing returns though. Let’s make a simple assumption that every year the CPU speed doubles and response time halves. Assume response time starts at 10s.
0 years = 10000ms
1 years = 5000ms
2 years = 2500ms
3 years = 1250ms
4 years = 625ms
5 years = 312ms
6 years = 156ms
7 years = 78ms
8 years = 39ms
9 years = 20ms
10 years = 9.8ms
11 years = 4.9ms
12 years = 2.4ms
13 years = 1.2ms
14 years = 0.6ms
15 years = 0.3ms
…
At some point there’s not much to be had by going faster. Anyway, the CPU is rarely the real bottleneck these days; IO (disk & network) is responsible for far more latency in most applications. You’ll get much more bang for the buck optimizing IO.
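For what it’s worth, here’s a minimal sketch of the halving arithmetic above, purely illustrative and using the same assumption of a 10s starting point with response time halving every year:

```go
package main

import "fmt"

func main() {
	// Illustrative only: assume response time starts at 10,000ms
	// and halves every year as CPU speed doubles.
	responseMs := 10000.0
	for year := 0; year <= 15; year++ {
		fmt.Printf("%2d years = %.1fms\n", year, responseMs)
		responseMs /= 2
	}
}
```

Past roughly year 10 the differences are well below anything a user can perceive, which is the whole point about diminishing returns.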
Indeed, there’s no denying that remote servers are always going to involve lag, but you can mitigate this by mostly running locally and making connections less frequently.
Google’s game streaming service Stadia took the opposite approach, with every single frame incurring network latency. This “cloud gaming” concept may appeal to some users, but even on Google’s own broadband service it’s relatively laggy. Meh, I’m not a fan of that approach. I think streaming games might make sense if you download the resources for each level and then play the levels locally… but sending down every single frame across the internet creates lag, and I imagine it may take up more bandwidth than the original game assets to boot.
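To put a rough number on that bandwidth point, here’s a back-of-the-envelope sketch; the stream bitrate and download size below are my own assumed figures for illustration, not measurements:

```go
package main

import "fmt"

func main() {
	// Assumed figures, not measurements: a 1080p60 stream at ~15 Mbps
	// versus a hypothetical 40 GB one-time asset download.
	const streamMbps = 15.0
	const assetDownloadGB = 40.0

	gbPerHour := streamMbps / 8 * 3600 / 1024 // Mbit/s -> MB/s -> MB/h -> GB/h (approx)
	hoursToMatch := assetDownloadGB / gbPerHour

	fmt.Printf("~%.1f GB streamed per hour\n", gbPerHour)
	fmt.Printf("streaming passes the %.0f GB download after ~%.0f hours of play\n",
		assetDownloadGB, hoursToMatch)
}
```

Under those assumptions, a handful of evenings of play already outweighs the one-time download, on top of the per-frame latency.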
Yes, this is exactly the point.
So what happens is the hardware developers then start showing us examples of launching multiple apps as a selling point: launching Photoshop, RAD Studio, Eclipse, Chrome, Firefox, Docker and Netflix with a click of a button, something we’ll never do in reality. Sure, I get that it is impressive, but does it really make a difference to my day-to-day life? It’s a bit like owning a Ferrari while living in the centre of the CBD, just walking distance from work!
I’m not sure why you mention UI performance. I was basing this on the compilation times that are in the article.
I don’t know. Maybe I’m old, but I remember the G5s and how they were so much more powerful and cheap than x86 that universities were building top 500 supercomputers out of them. Then Apple switched to Intel. I still have my doubts that Apple will continue to invest in desktop CPUs, and that the x86 duo of Intel/AMD will take this lying down.
That, combined with the migration to the cloud, makes me unsure local performance even matters that much. If companies can pay a flat price per developer per month for adequate cloud hardware, I think they might jump on that bandwagon soon, just as they have with software licenses.
Yes, that’s true. I also foresee an increase in hosted VMs for dev environments.
“For the vast majority of people…”
I don’t think so… maybe for Apple addicts, but for others… what’s important is the software ecosystem…
Yes, this is clearly the issue. I’d love to go down the new hardware route, but I’m not getting burnt by that again; I’ve seen this all before!
Some commentary is trying to paint this as similar to the phones, using the explosion of phone hardware like it’s some sort of predictor for the M1, but phones were a new platform and are almost independent of / disconnected from 3rd party software.
If this ARM direction was so attractive to software devs, why didn’t we see more of the “phone as a portable corporate desktop”? They’ve had more than enough capability for years and years, and the economic argument is self-explanatory, but the uptake is virtually nil, with early attempts effectively becoming museum pieces. They didn’t even need discrete graphics, and the uptake is still nil!
It is exactly the 3rd party legacy applications holding such new usage scenarios back. I never allowed myself to become dependent on proprietary software, so I can easily switch between operating systems and hardware architectures.
When we finally realise that those legacy applications aren’t that great or could be rewritten from scratch, the incentive to use a phone as a compute unit that could be attached to a keyboard, mouse and a monitor to serve as a desktop would become much higher.
In a sense I blame Windows addiction.
I bought an adapter lead for my Samsung Galaxy Nexus to experiment using it as a desktop. It chugs a bit now so I mostly use it as my home deskphone and forgot about this until you mentioned it.
I have no problem with the idea of using a smartphone as a desktop especially if it had a dock to plug in a keyboard, display, and extra storage. Android still doesn’t properly support mouse and keyboard though does it?
… other than high end gaming, most people can find the same apps in OSX and Windows.
Is that a gross generalisation in this case?
I was forced to take a whole commercial design department off PowerPC platforms years ago due to losing access to updates and bug fixes for software products from several major vendors. Now what few Apple Macs are left we keep for legacy/compatibility support issues and not much else.
It was never about hardware performance.
Similar and same are not the same thing…
If I have used software X for Y years… there’s very little chance I’ll want to start over learning software Z with similar functionality… I’d need to put in time and money.