Apple Inc. is planning to use its own chips in Mac computers beginning as early as 2020, replacing processors from Intel Corp., according to people familiar with the plans.
The initiative, code named Kalamata, is still in the early developmental stages, but comes as part of a larger strategy to make all of Apple’s devices – including Macs, iPhones, and iPads – work more similarly and seamlessly together, said the people, who asked not to be identified discussing private information. The project, which executives have approved, will likely result in a multi-step transition.
This shouldn’t be at all surprising. Apple’s own Ax chips are quite amazing, but still limited in how far they can be pushed because of the small form factors they’re being used in. On top of that, everything seems to be pointing towards the latest Windows-on-ARM devices having multiple-day battery life, with which Intel chips simply can’t compete. It makes 100% sense for Apple to put its own processors inside Macs.
Apple wants to own the whole stack and not pay anyone royalties, so they could go the RISC-V route. That would do more damage to Arm than to Intel.
RISC-V is an unproven architecture without major tested OS or compiler support. Yes, Linux and GCC 'support' it. That means something very different from being production-ready.
It would be insane to totally move a commercially supported OS to it, especially since Darwin/XNU doesn’t support it at this time.
Besides that, this isn’t about patents, it’s about fabrication.
No, but it’s also about royalties.
I don't think money is an object for the richest company in the world. They have cash piles larger than the GDP of some countries.
So? They’re always looking for ways to get more money nonetheless.
I bought into Apple with a dual-G4 machine, and a few years later was left high and dry when they absconded over to Intel. As much as I liked my Mac, I was bitter and wasn’t going to follow them onto a new platform.
Luckily for me, I stayed disgruntled and didn’t bite the bullet. If I had, I would be sooo annoyed right now…
So, did you move to windows? How’s life since you left Macs?
My Mac love ebbs and flows with the OS X releases of varying quality and with the hardware releases and refreshes (or the lack thereof).
I'd stayed away for years, but just tiptoed back in. It's kinda awful on the desktop, IMHO. iPhones/iPads serve a purpose.
TBH, I’ve never really settled on anything since.
My main machine is on Windows 7 ‘coz I don’t like 8 or 10, but that also has good days and bad days. Things don’t always work the way they should. At least there’s plenty of software available for it.
I’m constantly changing my laptop between BSDs and Linux distros – I’ll use them until something breaks, or I find I can’t work out how to make it do xyz, then try another.
Occasionally I get to play with Macs (older ones), and although I can’t be specific why, they just seem to feel more pleasing than Windows.
Honestly, that says more about you than it does the OSes.
Yes, it does.
It says I want a machine to do what I expect it to do, and keep doing it until I want it to do something else. Unfortunately that seems just too much to ask.
There are posts on forums out there where people have had similar issues to me, often on dead threads because no-one has an answer…
I’m too old to be pissing about trying to get things working that should work out of the box. Recompiling software just isn’t fun anymore.
Yea.. I’ll admit I don’t have time to debug boot issues. If an update stops allowing me to boot, then I find something else that will work, rather than spend the time and effort debugging everything that could be happening during boot with my obscure hardware.
Note to self: Never buy obscure hardware again. Pay more for really common hardware.
Why would anybody want to do that? Linux distros follow the Taco Bell strategy: they appear to be different on the surface, but underneath they are the same low-quality ingredients with different seasoning on top: awful X.org with bad GPU drivers, an awful audio stack, awful drivers for peripherals with bad power management. Most people soon realise this and go to either Macs or Windows; however, some will keep hopping from distro to distro hoping to one day find one that actually leaves a good taste in their mouths.
I really don’t see anything GNU/Linux and Taco Bell have in common.
However, what I see is a software ecosystem that targets the user, and only the user. Software that does what I want and how I want it. Not encumbered with or impaired by corporate politics, revenue through vendor lock-in, revenue through upselling, and revenue through third-party deals.
The “crappy audio” on my GNU/Linux system works very, very well for me. And it’s not just playing audio. Today I expect a seamless switchover of single applications between line out, HDMI audio, BT headphones, and Chromecast streaming. Which is exactly what I get; my way and without any hassle.
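Concretely, that kind of per-application switchover is just a PulseAudio stream move; a minimal sketch, assuming pactl is available (the sink name used in the example is a placeholder):

```python
# Minimal sketch: reroute one application's audio stream to a different output
# with PulseAudio's pactl, without restarting the application.
# The sink name in the example call is a placeholder; list real ones with
# `pactl list short sinks`.
import subprocess

def sink_inputs():
    """Return the active per-application streams ("sink inputs") as raw lines."""
    out = subprocess.run(["pactl", "list", "short", "sink-inputs"],
                         capture_output=True, text=True, check=True)
    return out.stdout.splitlines()

def move_stream(sink_input_id, sink_name):
    """Move a single stream to another sink (line out, HDMI, BT headset, ...)."""
    subprocess.run(["pactl", "move-sink-input", str(sink_input_id), sink_name],
                   check=True)

if __name__ == "__main__":
    print(sink_inputs())
    # Hypothetical example: move stream 42 to an HDMI sink.
    # move_stream(42, "alsa_output.pci-0000_00_1f.3.hdmi-stereo")
```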
The X server is doing OK; it could do better in several aspects. However, when I compare it to my coworker's Windows setup, it appears to be quite competitive (e.g., my multi-monitor setup with one HiDPI display and one old FullHD monitor works well with some customization; my coworker had to retire his FullHD monitor because he ran into unsolvable issues). The cherry on top is that I can use my favorite window manager and focus on being productive instead of shuffling through windows all the time.
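The multi-monitor "customization" mentioned above is typically a single xrandr call that gives the FullHD output a scaled virtual size so it blends with the HiDPI panel; a rough sketch (the output names eDP-1 and HDMI-1 are placeholders, check `xrandr -q` for yours):

```python
# Rough sketch: mix a HiDPI panel with a plain FullHD monitor under X.
# Output names are placeholders for this example.
import subprocess

subprocess.run([
    "xrandr",
    "--output", "eDP-1", "--auto",                     # HiDPI panel at native resolution
    "--output", "HDMI-1", "--auto", "--scale", "2x2",  # give the FullHD output a 2x virtual size
    "--right-of", "eDP-1",                             # place it to the right of the panel
], check=True)
```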
I have used the very same distribution (Arch Linux) for about 13 years now. During these 13 years, a lot has happened. For example, Windows Vista happened. Windows 8 happened. The forced-down-the-throat Windows 10 updates happened. The Mac world certainly had it better than the Windows world; however, my father is locked out of the newest OS X with his old hardware, which is otherwise perfectly fine for him.
I was fortunate: during the whole time I was pretty happy with the system I was using and never had to look elsewhere. However, as a computer scientist with an interest in operating systems, I always kept myself informed about what was happening in the market. Seriously, this is not Taco Bell. It is also not for everyone, but it is top-notch.
The massive diarrhea you might get after partaking of either one? Oh wait, that’s Windows. </sarcasm>
Though the revenue model common in open source (through support) might be an incentive to make software which is somewhat harder to use… (and indeed GNU/Linux has a reputation of being somewhat harder to use)
You make it sound as if Linux deliberately makes crap components. While Linux components can sometimes be bad, that isn't caused by the people coding for Linux but by the ecosystem. In the Windows ecosystem, if you are a peripheral manufacturer and want to sell your product, you have to create device drivers (and you don't have to share the code). In Linux, because manufacturers don't get paid, drivers often aren't created and the community has to write them, based on incomplete or non-existent documentation.
10 years ago this was pretty crap, but nowadays developers have got better at it and we see more manufacturers opening up (as they realise they don't have to write the drivers themselves).
As to sound: after the initial clusterfuck that PulseAudio was, it seems to work pretty well now. And the closed-source drivers from AMD and Nvidia seem to work well too. I prefer X11 in many cases as it makes it possible to have a desktop over the internet, but I understand it is indeed not great code inside.
I don’t hop distros that much, happy with what I have for a long time.
IMHO one of the reasons that Windows Server is losing the battle against Linux is that Linux is high quality, free (as in beer and in freedom), and there is much more innovation in the Linux kernel than anywhere else.
I used a bit of cash and combined it with my insurance payout to get a PowerBook G4 in 2005, after a ceiling leak sealed the fate of my Toshiba Satellite. Easily the best laptop I have ever owned.
Six months after getting the G4, the transition to Intel began in earnest. The laptop didn’t get upgraded beyond Leopard, but software vendors kept fat binaries for years after. I was routinely using the laptop with up-to-date software for at least four years, after which I started to use older versions. I kept using the laptop routinely for about six years before it assumed a new life as the ultimate portable Classic retro gaming machine.
Fat binaries definitely prolonged the lives of the PowerPC machines for years. A transition like that would not have been viable without that technology. I suspect the same will happen with this fabled Intel–ARM transition, and vendors will continue providing fat binaries for Intel machines so long as there's a market for it.
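For anyone wondering what a fat binary actually is: it's just a thin header in front of one complete executable image per architecture, and the loader picks the slice that matches the CPU. A rough sketch of reading that header (the path in the example is hypothetical):

```python
# Minimal sketch of a "fat" (universal) binary: a small big-endian header that
# lists one slice per CPU architecture, each a complete Mach-O image.
import struct

FAT_MAGIC = 0xCAFEBABE  # big-endian fat header magic
CPU_TYPES = {7: "i386", 0x01000007: "x86_64",
             12: "arm", 0x0100000C: "arm64",
             18: "ppc", 0x01000012: "ppc64"}

def fat_slices(path):
    """Return (arch, offset, size) for each slice, or [] if the file isn't fat."""
    with open(path, "rb") as f:
        magic, nfat = struct.unpack(">II", f.read(8))
        if magic != FAT_MAGIC:
            return []
        slices = []
        for _ in range(nfat):
            cputype, _subtype, offset, size, _align = struct.unpack(">5I", f.read(20))
            slices.append((CPU_TYPES.get(cputype, hex(cputype)), offset, size))
        return slices

if __name__ == "__main__":
    # Hypothetical path; any universal macOS binary would do.
    print(fat_slices("/usr/bin/true"))
```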
Intel Macs exploded in volume compared to the much, much smaller PowerPC market. There’ll be a far larger legacy this time round, which means better support for older machines. And if you’re annoyed by the lack of support, at least you can run Windows (never an option in PowerPC land).
You could have bought a Mac with Intel and used it for its life after your PPC unit got old. My Mac is 5 years old and still trucking – and I use it 8+ hours a day. Even if you buy a Mac now, they aren't switching until 2020 according to rumors, and will support Intel for at least a couple of years after that. You'd have a computer you can use for at least 4 years – which is not too shabby.
I’m not really sure I understand your perspective at all here. It’s not like we buy software we plan to use for 10 years without updates any more (if we ever really did). CPUs are an almost fungible commodity at this point.
I can see Apple using ARM chips in their MacBook Airs, iMacs and MacBooks, but I don’t think they are serious about putting ARM in a Mac Pro. Realistically, only AMD and Intel have CPUs that can go inside a Mac Pro and also support the massive buses that allow the two GPUs and the SSD to be connected.
Which means Apple either has to ditch the Mac Pro or ship dual binaries (or fat binaries). Hmm…
Everybody ridiculed Apple when they introduced their own ARM chips, and now they're twice as fast as anyone else's in the industry. Apple's chip division is more than capable of producing i7/i9/Xeon-level processors.
I’m not familiar enough with the internal R&D structures of Apple, but I can imagine the following scenario:
With their own chips inside their hardware, they have better control over what customers will receive, so they can accommodate the hardware and software much better to fit each other. Their hardware development will be done internally, so any advantage they gain over competitors will be for their own benefit, as it will be present only in Apple hardware, which is intended to run macOS. Their vision of providing a "computing experience" rather than "just hardware and some software" could benefit from this move. Combine this with setting up their own centralized manufacturing lines, and it becomes reasonable to conclude that they can produce better hardware at lower cost (due to less "internal" transportation, lower costs for external contractors, lower patent fees, etc.), making more profit without being in the "sell & forget" business (a.k.a. "we don't care about customers as soon as we have their money").
Time will tell if their decision will be something that leads people to buy their products – and that is the primary motivation of all business.
Apple effectively designed the chips back in the PPC days. It led to constant comparisons between Windows and Mac on Photoshop workloads and similar. When PPC/G4 and G5 fell behind, Apple was left with a tiny % of the market. They were forced to effectively reinvent themselves as x86. Apple moving away from commodity hardware is (in my view) setting themselves up for history to repeat itself
Apple never designed the CPUs, IBM did, and IBM was too busy with POWER to deal with PowerPC just so Apple would have a half-decent CPU to put in their laptops. This is what convinced Apple to switch to x86.
Apple “co-designed” the G5. IBM had no interest in the desktop market for POWER.
Apple was heavily involved in downscaling the chip to be desktop-suitable. Features like AltiVec were co-developed with Apple and shoehorned into the existing POWER design. IBM was so lackluster that Apple even went as far as designing the north bridge.
The 970 chips were so specific to Apple’s needs that I think they only ended up in a blade or two outside Apple branding..
https://www.apple.com/uk/newsroom/2003/06/23Apple-and-IBM-Introduce-…
https://en.wikipedia.org/wiki/PowerPC_970
Until they actually do and prove it, those words hold no water.
Apple benchmarks (and mobile-device benchmarks in general) are invariably as fake as a $3 bill. They have no relationship to real-world performance.
…
This is fucking ridiculous!
No, everyone didn't, and no, they haven't proven capable of producing such processors. Are you trolling?
When the rumors of Apple going for ARM chips in their computers first surfaced, people with some knowledge said "no, that would be stupid" – and it would have been. They didn't have the experience of producing processors, they didn't have the people needed to produce a top-of-the-line processor, and they didn't have the practical capacity or capability to do it. A large part of their software was x86-based, and even if written for portability, that creates problems.
So now they have gathered experienced processor designers, they have been creating a good team of professionals capable of making very good processors and they have invested in manufacturing.
So yes now they could make their own desktop processors. But would they be as good as Intel chips? Probably not.
I don’t remember anybody ridiculing Apple when they released their own chips. IIRC, it was more of a curiosity than anything else.
Are you sure? The 2W Apple A11 CPU in the iPhone X has the same performance as a 21W Intel CPU. In a standard 45W laptop TDP budget, Apple should be able to put 12 high-performance cores running at 3-4GHz with another 12 low-performance cores running at 1-2GHz. That should be a lot faster than the latest 4-6 core CPUs from Intel. Performance-wise, Apple has shown 3 times the performance per watt.
Imagine having the performance of the high-end 18-core 145W Xeon CPU in a battery-friendly 45W case. That's what Apple can deliver with current technology. By 2020 Intel should be toast.
AMD has done wonders with the Ryzen architecture, but it only caught up with Intel. x86 cannot push further. ARMv8, on the other hand, can push a lot further. The 1W BCM2837 CPU can push 4 cores at 1.4GHz. That means a 3GHz quad-core ARM Cortex should be under 10W TDP, unlike the Intel or AMD parts, which require 20-40W for such performance.
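Taking those figures at face value, the implied arithmetic is easy to check yourself; a quick sketch that only reuses the claims above (none of these numbers are measurements, and the implied ratio doesn't quite line up with the quoted 3x, which is a reason for caution):

```python
# Back-of-the-envelope check using only the figures claimed above.
A11_CLAIMED_W = 2.0      # claimed power of the A11 at Intel-equivalent performance
INTEL_CLAIMED_W = 21.0   # claimed power of the comparable Intel CPU
LAPTOP_BUDGET_W = 45.0   # a common laptop TDP budget

# Implied performance-per-watt advantage if both chips really deliver the
# same performance at those power levels:
implied_ratio = INTEL_CLAIMED_W / A11_CLAIMED_W
print(f"implied perf/W advantage: {implied_ratio:.1f}x")  # 10.5x on these claims

# Naive count of A11-class clusters that would fit in the laptop budget,
# ignoring uncore, memory, thermals and frequency scaling:
print(f"A11-class clusters in {LAPTOP_BUDGET_W:.0f} W: "
      f"{int(LAPTOP_BUDGET_W // A11_CLAIMED_W)}")
```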
Makes me want to just avoid getting a Mac now (I was sort of in the market for one by the summer). Seems like whenever they transition – from the 604e to the G-series to Intel – my timing is screwed up and I'm left with an outdated machine. lol
I’d avoid buying Macs at the moment because Apple outright refuses to provide a stable roadmap for the product line. Pro users have been left out in the rain for years which, for a smaller company, would be a fatal mistake.
Unfortunately, most desktop OSes are garbage and macOS is the best of the bunch for most uses. On the other hand, many people successfully run macOS on commodity hardware — a practice that has become far more widespread with the product vacuum Apple has created.
Right?! I was looking forward to moving back to a Mac Pro from a Hackintosh…ehh I don’t know what to do now.
I've been an Apple user at various points, by force and by choice. I've had some great experiences with their products but, if I tallied it up, I bet more horrendous ones. These days I'm glad to be (nearly) rid of them. Apple can go jump off a cliff for all I care.
So, in two years another massive break in compatibility?
Something I realised recently is that breaking compatibility isn’t something Apple historically did to the Mac until they moved away from PowerPC.
PowerPC Macs were capable of running almost all Mac software released up to that point (1984–2005). The first main break came when Classic wasn’t migrated across to Intel or PowerPC installs of Leopard.
Other breaking changes have happened since, which is what we now think of as “normal” Apple behaviour.
I can see the rationale for removing burdensome code for the sake of backwards compatibility, but not at the expense of huge swathes of users. Conspicuously, the removal of Rosetta seemed premature. I really don’t understand why they’re removing 32-bit software support in the upcoming release — it seems like pissing off too many people for such little gain.
Owning a software development company that builds cloud services, we need Intel-based Linux VMs on the developer machines. If we can no longer run a Linux VM at a decent speed on a developer's laptop, my company will have to stop using Macs as development machines. If this rumour materializes, we'll move the developers to Intel Linux boxes (e.g. a Dell XPS).
Yeah! To get that, Linux should run on ARM CPUs, and this will NEVER happen! Oh, wait a moment…
Don't be dense: there are benefits to running, on the dev machine, a VM configuration which closely matches the servers; and if those are on x86…
Apple’s own Ax chips are quite amazing…
When massively overclocked for a few seconds on a synthetic benchmark.
In the real world… not so much. [It's pretty much the same for all ARM chips.]
When massively overclocked for a few seconds on a synthetic benchmark.
This is BS. They are not overclocked, and have no problem with working at base frequency.
It’s pretty much the same for all ARM chips.
LOL. How about ThunderX2?
https://www.avantek.co.uk/avantek-thunderx2-arm-workstation-thunderx…
IMO it’s pretty pointless because power consumption and heat (the main advantages of ARM) aren’t major issues for workstations.
So not all ARM chips are equal?
Since they aren't equal to each other (cores, features, …), no. Why would they have a special kind of magic that makes them all special? ARM is an IP; implementations may differ from each other. A WonderMedia chip might not compete on the same level of performance as an Nvidia Tegra or an Apple Ax. Pretty much like an Intel xxx wouldn't perform the same as an AMD yyy, despite both being x86-64.
ROTFL. Why did you try to explain it to me?
The previous poster, who wrote that drivel about "Ax benchmark overclocking" and "all ARM chips are equal", seems unaware of this.
LUL WUT because I answered your concern? Add the <sarcasm> tags to be more obvious next time. And, hey, don’t forget comments are public so the one you were replying to can also read my comment, so don’t take everything too personal.
Apple was forced to go to Intel CPUs because they did not have the resources to improve PowerPC.
And they got nice discounts from Intel because they sold millions of Intel flash chips in iPods.
Now they finally can make their own CPUs, so it is a good move for them and for their users to have custom chips.
Please note that OS X derives from NeXTSTEP, which had multi-architecture support with fat binaries by default.
So they can support ARM and Intel together if they want.
Mario
I think it’s more correct to say that Intel simply doesn’t compete in this market.
They tried in the past, and a few years ago even sold an Atom SoC that was roughly on par in both power efficiency and performance with the contemporary ARM SoCs being sold, and they were on a trajectory to surpass them. Their Atom SoC even found its way into a couple of phones.
Then they left the market, for basically two reasons. First, Qualcomm, Samsung, and others are thoroughly entrenched in the smartphone market, and it would have been a difficult and expensive fight for Intel to make much headway there; and second, the margins are too low. The per-wafer profit Intel was generating with these tiny Atom SoCs was significantly lower than with the desktop and server chips they make.
There is one problem: you have missed a factor.
The silicon area of an Atom CPU is larger than that of an ARM CPU performing at the same speed. Atom might be able to match ARM on power and performance, but it failed to match ARM on silicon area. This area problem becomes an issue when you are attempting to do a SOC (system on chip) or SIP (system in package). The silicon area problem comes from the complexity of x86 vs ARM. This area issue is part of why Western Digital is looking at making 1 billion+ RISC-V CPUs a year.
Next, try to do a custom SOC or SIP with Intel or AMD processors. This is where you start running into x86 licensing issues.
SIP (system in package) is very interesting. It turns out that making a SIP, where you seat multiple bits of silicon inside one housing, is more dependable than soldering those parts individually to a board. You only have to compare the BeagleBone Black with the PocketBeagle: the normal-board BeagleBone Black is electrically almost identical to the PocketBeagle, except that the Pocket uses a SIP and so has over a hundred fewer parts soldered to the board. On the x86 side, apart from a few vendors making 486-class parts, there is nothing x86 for which you can buy a license to do your own SIP. It also pays to notice the size difference between the BeagleBone Black and the PocketBeagle, remembering that you are looking at electrically equivalent boards. That explains why Atom phones had a nightmare attempting to match features with ARM phones: not being able to go custom SOC or SIP costs you a heck of a lot of area.
RISC-V and ARM both allow you to make your own CPUs using the instruction set. Both allow you to do a SOC, and both allow you to do a SIP where it suits.
With the creative things Apple likes doing, being stuck with a chip demanding a large footprint was not particularly nice.
Maybe, all things being equal, but Intel was a full process generation ahead of its competitors when it was making Atom SoCs aimed at cell phones. That more than made up the difference.
But, yeah, customization was a big factor in Intel getting customers. That's one of the things I was thinking about when I said it would be a hard fight for Intel to get their chips into phones – Atom chips paired with one brand's LTE chips, or paired with a different brand's. Atom with powerful graphics, with weaker graphics, one memory channel, two memory channels. That's sort of antithetical to Intel's style – they normally have one or two sets of silicon and just disable features after the fact to create segmentation without segmenting their manufacturing, and that just doesn't cut it for highly integrated devices like cell phones.
But that's a business decision, not a technological one. Intel's absence from the cellphone market is purely down to business decisions, not technology.
But Atom was large in silicon area compared to an ARM chip a process generation behind. The price of the x86 instruction set is quite high.
I had already decided I am never buying another Mac, after converting my entire family when they went to Intel.
The shitty product updates, the total abandonment of products like the Mac mini (an actual desktop computer) – I have already decided that when my Macs die, I am never buying another.
We already started adding PCs to the family over the last few years. The only thing left out of a few MacBooks, a couple of minis, some AirPorts, a Time Capsule, and an iMac is the iMac. And we recently migrated all our data and photos off of it onto a PC server.
Next is getting rid of all my apple phones and tablets.
I am done with that company who has no clue what they are doing. They totally farked up when they decided they didn’t need to sell anything but phones. They have lost the pro-sumer creative video/photo market too.
In a few years I don't think there will be many Mac sales at all, and they will blame their lack of products on lack of demand and never admit it was their own fault.
Yet, this is all a rumor from Bloomberg. No real facts.
And, as has been pointed out, you would have at least 4 years of full support and updates if you went out and bought a new Mac tomorrow. Compare that with most PCs, especially laptops, whose sales lives are measured in weeks (7-9 AFAIK).
I'm sure there are a lot of people taking a more measured view of the situation than you seem to be.
As the HHGTTG said, "Don't Panic".
Nothing has been shown to make this anything more than speculation by a company that is known to be down on AAPL.
Now where’s that pinch of Salt…
Reading this thread, I see that most people here do not consider an important thing: when you make a CPU you are talking about cost; when you buy from someone else you must consider price.
Let me explain with an example: if I want to put a top-of-the-line Intel Xeon Platinum in my entry-level PC, I cannot, because I must BUY it from Intel and Intel asks me to pay a very big PRICE.
If I am Apple and I build the CPU, I know that the cost of the CPU is not tied to its "speed".
So I can, for example, put a top-speed CPU in my entry-level MacBook just to crush the competition.
Look at the new iPad for schools: it is $300 but it has a very powerful CPU. That is possible because Apple can do what it wants with its CPUs!
This article dates from 2015 but is a good example of how poor ARM CPUs actually are in real tests.
The cross-platform TabletMark 3 (https://bapco.com/products/tabletmark/) shows that the iPad Pro performs more like a budget Atom/Celeron-class mobile chip.
https://www.pcworld.com/article/3006268/tablets/tested-why-the-ipad-…
At least that shows ARM chips are quite enough for daily tasks like internet browsing, office editing, and some gaming. Hence it proves their usefulness in that market segment. Now if you want more power…