With the iPhone X revealed, we really have to start talking about its processor and SoC – the A11 Bionic. It’s a six-core chip with two high-power cores, four low-power cores, and this year, for the first time, includes an Apple-designed custom GPU. It also has what Apple calls a Neural Engine, designed to speed up tasks such as face recognition.
Apple already had a sizeable single-core performance lead over competing chips from Qualcomm (which most Android phones use), and the A11 blasts past those in multicore performance as well. Moreover, the A11 also performs better than quite a number of recent desktop Intel chips from the Core i5 and i7 range, which is a big deal.
For quite a few people it’s really hard to grasp just how powerful these chips are – and to a certain extent, it feels like much of that power is wasted in an iPhone, which is mostly doing relatively mundane tasks anyway. Now that Apple is also building its own GPUs, it’s not a stretch to imagine a number of mobile GPU makers feeling a bit… uneasy.
At some point, these Apple Ax chips will find their way to something more sizable than phones and tablets.
The source of the comparison of the A11 against recent desktop chips is suspect at best.
It is a Twitter table with no explanation of where the numbers come from and no references at all. It amazingly shows the Intel Core i7 just besting the Core i3 series. I think someone manufactured data to get attention.
Nothing suspect about it. That’s Jeff Atwood, and those are Geekbench numbers. You can look them up yourself.
https://browser.geekbench.com
I always thought the problem with benchmarking hardware is that for a fair comparison you would also need to be running the same software. I mean, you could have a hardware platform that could seduce women for you, but if the software running on it only worked in polygamous communities, it wouldn’t be very useful….
http://www.osnews.com/comments/thenextweb.com/apple/2017/09/12/appl…
I think that should be
http://thenextweb.com/apple/2017/09/12/apples-new-iphone-x-already-…
edit : (same as above)
Better link: https://browser.geekbench.com/v4/cpu/search?utf8=%E2%9C%…
Now, if these are true, Qualcomm CEOs will flip some tables.
I love competition, better tech for Android in the end.
Best joke I’ve heard all day. Android could run on a chip twice as powerful as this A11 is reputed to be, and it would still crawl because of its inefficiency by comparison.
You’re not wrong, but you do have to admit that the power is wasted a bit on an iPhone, given iOS’s efficiency and its restrictions on background tasks.
Yeaaaaaah. Something tells me that the software isn’t handling the new iPhone correctly, making ANY figures suspect.
Thom Holwerda,
You can’t always take benchmarks at face value; it takes time for independent benchmarks to confirm the results. I was browsing Geekbench results and I was surprised at how inconsistent they were, even for different runs on the exact same CPU.
For example:
https://browser.geekbench.com/v4/cpu/3988183
https://browser.geekbench.com/v4/cpu/3988073
These huge discrepancies could mean there’s a problem with the benchmark, or they’re being bottlenecked by components other than the CPU, in which case it’s not a good benchmark to use to strictly compare CPU performance.
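If anyone wants to sanity-check whether a gap like that is bigger than ordinary run-to-run noise, a quick back-of-the-envelope comparison is enough. A minimal sketch, assuming you plug in the single- and multi-core scores from the two result pages yourself (the numbers below are made-up placeholders):

```ts
// Rough sketch: is the gap between two runs on the same CPU bigger than normal noise?
// The scores here are placeholders — replace them with the values from the result pages.
const runA = { single: 4300, multi: 15200 };
const runB = { single: 3900, multi: 13100 };

const gap = (a: number, b: number): number =>
  (Math.abs(a - b) / Math.min(a, b)) * 100;

console.log(`single-core gap: ${gap(runA.single, runB.single).toFixed(1)}%`);
console.log(`multi-core gap:  ${gap(runA.multi, runB.multi).toFixed(1)}%`);
// Run-to-run variation on identical hardware is normally a few percent at most;
// a double-digit gap points at something besides the CPU (OS version, thermals,
// background load, memory configuration).
```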
Exact same CPU, but different OS (OS X 10.12 vs. 10.10). In this case, every single sub-benchmark is faster on the newer version of OS X, without exception.
That probably has more to do with it.
Anyone else remember when a new OS X release meant that your existing hardware performed _better_?
Yeah, I sure do miss those days.
Those improvements mostly came from Apple’s and the community’s work on the GCC PowerPC port, or so people said.
Drumhellar,
Could be that, or maybe different memory or some other arcane mainboard controller… who knows, we’re not given enough data. If I owned these devices, I would conduct my own tests to get to the bottom of it.
However, my point was that regardless of the reason for the discrepancy, the mere fact that these wide discrepancies exist is in itself sufficient reason to question the accuracy of Geekbench as a measure of CPU performance.
In other words, this evidence proves that one cannot assume Geekbench scores are representative of CPU performance unless the CPU is the only variable that changed.
I’m curious to revisit this in the future to see where it goes!
Thermal throttling…
Power throttling…
Background tasks…
Many factors could affect benchmark results.
From a microarchitectural standpoint, Apple’s high-performance cores are excellent; they really are that close to Intel’s contemporary *Lake cores in IPC.
They’re not going to outperform an i7 (it has more cores plus SMT), but they’re certainly giving the low-power dual-core i3s and i5s a run for their money.
The performance of the iPad Pro is quite eye-opening when compared with a Microsoft Surface tablet. The CPU performance difference is down to single digits/error margin, and the iPad has much better GPU performance with lower overall power consumption.
Their cost is also significantly lower than that of the low-power mobile Intel parts. Hell, thank goodness Apple is not selling those chips to third parties; that would be problematic for Intel.
I don’t like Apple as a company or care much for their products. But credit where credit is due: they basically came out of nowhere and they’re now one of the top CPU/GPU architecture outfits in the world.
That’s exactly why I do want them sold to third parties. Imagine a blob-free A11 based laptop for Linux, or a convertible that runs Android.
Rome was not built in a single day.
Apple don’t innovate now, do they?
OTOH, they are doing stuff with the ARM architecture and iOS software that most others are still dreaming about.
Google is starting to take this seriously at last.
That close link between the H/W and OS is key to this.
Apple will soon introduce an ARM-powered MacBook and merge iOS and macOS into one single system, for iPhone, iPad, MacBook, iWatch and iEverything.
An ARM-powered MacBook, yes. iOS on a MacBook, no. It will still be macOS. They are related in many, many ways, and the distinction may slowly disappear, but large differences will remain for some time due to the difference in capabilities and expectations for the platforms.
Now the MacBook Air / iPad Pro is kind of melding into an in-between state. Which is a horrifying, ungodly creation that I would burn at the stake if it wasn’t so oddly beautiful. Like a unicorn that mated with a crab.
The MacBook Air is probably going the way of the dodo. I’d bet there’s an iPad Pro with an official keyboard solution in the near future as a replacement.
Do you think they’ll stick with the iPad name? I’m guessing something odd like iPad X or MacBook Air X, or maybe even Bionic X with a Bionic X Pro.
Bionic X being iOS and Bionic X Pro being macOS, but otherwise indistinguishable. Sort of like the Surface RT / Surface Pro.
Good luck getting Photoshop and all the other apps besides the iLife suite and Apple’s own software on that, though. Microsoft has had no big successes on ARM, and Apple hasn’t shown that it can convince vendors to support multi-platform chips yet. Sure, going PPC to Intel was smooth enough, but that was migrating toward the most popular chip brand, not away from it.
ARM is by far WAY more popular than Intel and x86.
Only for phones and pads or the like. Not for laptops, and CERTAINLY not for desktops.
I’m not sure if you are aware that ST, Atmel, NXP, Xilinx, Allwinner, MediaTek, Rockwell, whatever – they all run ARM, not x86. The world is not just desktops, tablets and smartphones. Ask yourself what your TV, car or even fridge runs on.
I am 100% aware of the popularity of embedded applications and systems. I am also aware that Linux has completely failed to make a dent in desktop market share. This idea that people will magically drop Photoshop to take up Krita and GIMP is false. And in the interest of full disclosure, I only use Linux systems at home and, where possible, at work. I’ve been using Linux for 16 years now. I can compile my own kernels, and I write my own desktop software using GTK. I also live with an artist who would burn the house down before giving up Photoshop and the Autodesk suite of programs. This idea that ARM will magically take over everything is a fantasy from people who hate Intel. Legacy applications are a massive driver of IT purchasing decisions. Sure, ARM is popular on mobile/tablet devices where content is consumed and barely created; raw capture is one thing, deep editing is another. ARM isn’t going to suddenly storm into workstations and servers just because people want it to.
It’s not going to matter much what CPU it’s running soon enough.
You may not have noticed, but there are proof-of-concept in-browser versions of the heavy hitters like CAD packages and Photoshop.
With either wasm or Electron, it’s trivial to use the full extent of a platform’s power, and do you really think Adobe is going to keep letting their software get pirated like it does?
Google’s office suite and MS Office Online are only the start; it’s all going completely subscription-based, and what better way than by requiring everyone to use an OTA-updated, universally compatible app?
Hell, look at desktop and mobile already – a huge percentage of modern software is web apps in wrappers.
Even if you don’t see the writing on the wall for desktop apps, there are large numbers of “professionals” for whom an iPad Pro is already sufficient. Stick a keyboard on it and resurrect the “iBook” branding or something, and do you really think the Adobes of the world are going to stand around while competitors like Sketch eat their breakfast?
Ah, the mythical Web based thingy.
Naturally it relies upon an always-on internet connection that charges by the bit for data going over it.
So there I am on a shoot, taking a whole bunch of images with my new Nikon D850 (47 MP). Say around 24 GB for a decent day in the field.
1) How long to copy that lot up to the cloud for the cloud version of Lightroom to work on it?
2) How much will that cost me from the middle of the Amazon rain forest? Do you want an extra arm with that?
Sure, for a lot of people the cloud/web versions will work. But for a huge percentage of photographers out of the studio? Forget it.
Have laptop, will travel and process images.
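To put rough numbers on point 1, here is a minimal sketch; the uplink speeds are illustrative guesses, not anyone’s real connection:

```ts
// Back-of-the-envelope upload time for a day's worth of raw files.
const shootGB = 24; // roughly a day's shooting, per the comment above
const uplinksMbps = [1, 5, 20, 100]; // hypothetical uplink speeds

for (const mbps of uplinksMbps) {
  // 1 GB = 8 * 1000 Mbit (decimal units, to keep the arithmetic simple)
  const hours = (shootGB * 8 * 1000) / mbps / 3600;
  console.log(`${mbps} Mbps uplink: ~${hours.toFixed(1)} hours to upload ${shootGB} GB`);
}
```

Even at a generous 20 Mbps up, that is several hours of uploading before the “cloud Lightroom” could even start.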
shotsman,
Yeah, I was hearing on the news about people who got their power back after the hurricanes but still had no internet or cell service. Internet may be extremely limited for some until the infrastructure is fixed. Obviously this is a major failure mode for “cloud apps” that would otherwise work fine as local apps (things like GPS could be extremely useful, but you’re screwed if you rely on an online service like Google Maps).
Of course these are drastic circumstances, but cloud services can and do fail under normal circumstances too: Amazon outages, Google outages, Microsoft outages, ISP outages… One of the very few games I reluctantly bought off Steam was a multiplayer party game from jackinthebox games. I thought it would be fun to use during a holiday party. Lo and behold, the jackinthebox cloud service was connecting and disconnecting all night. This failure mode would not have been an issue with a local version. And despite the fact that I own a perpetual license, it will stop working whenever they decide to take down the service. With local software, you can run it on your terms, but with remote “cloud” software, you become completely dependent.
Technology has swung between local and “cloud” since the earliest mainframe days. The main difference is that back then the trends were motivated by cost and technological factors. These days the decision to have remote services is often made for advertising, snooping, and marketing reasons, even when it conflicts with robust engineering.
Service Workers.
There’s no reason those exact same apps won’t work without the internet as long as you’ve loaded the page even once.
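For the unfamiliar, the offline part isn’t magic. A minimal cache-first worker looks roughly like this sketch (the cache name and asset list are made up for illustration):

```ts
// sw.ts — minimal cache-first Service Worker sketch (illustrative names only).
const CACHE = "app-shell-v1";
const ASSETS = ["/", "/app.js", "/app.css"]; // hypothetical app-shell files

self.addEventListener("install", (event: any) => {
  // On first load, pre-cache the app shell so later visits work offline.
  event.waitUntil(caches.open(CACHE).then((cache) => cache.addAll(ASSETS)));
});

self.addEventListener("fetch", (event: any) => {
  // Serve cached responses when available; fall back to the network otherwise.
  event.respondWith(
    caches.match(event.request).then((hit) => hit ?? fetch(event.request))
  );
});
```

Register it from the page with `navigator.serviceWorker.register("/sw.js")` and the cached shell keeps loading even with no connection at all.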
Jesus, the more things change… the more they stay the same.
I remember reading similar posts where people used to film cameras could not envision the feasibility of digital imaging and processing photos in the field.
When the auto was introduced, people used to horses were wondering about what happens when you run out of hay.
Nice examples, though I don’t see how they apply here.
Both your examples were obvious: of course those things would eventually improve and become better than the old alternatives; anyone thinking otherwise had too little knowledge to judge it. Just like driverless transportation is obviously the future now.
Whether the web approach to all software is anywhere near as obvious an improvement is much less clear, just like so many other things that were previously hyped to take over the world but never did.
Haha, no.
You can’t even use the full extent of a platform’s power in Java (at least, if you plan on compiling to JAR files instead of native executables), using it in Electron is a complete joke, and anyone trying to say otherwise is either being paid to do so or has no idea what that many levels of indirection do to performance. Electron is why VS Code and Discord’s desktop app get such crap performance compared to natively compiled tools. The same has conventionally applied to things built on Adobe AIR and Mozilla’s XULRunner.
WebAssembly makes things better, but it’s still limited in performance because of the translation overhead.
Portability is the enemy of performance. Portable builds of software written using Java, or a CIL language, or even things like WebAssembly, are all bytecode that has to be interpreted or JIT-compiled at run time, not ahead-of-time machine code. That hurts performance. In fact, the only case I’ve ever seen where that can perform better is Lua, and that only applies in very specific situations where the algorithm being used happens to be more efficiently implementable in Lua’s interpreter runtime than in whatever native runtime you’re comparing it to.
By the same virtue, iOS is so efficient because it’s single platform. macOS is as well. Conversely, Android supports at least 3 CPU architectures, not accounting for varying bit width (x86, MIPS, and ARM), and it runs on a kernel that supports a lot more (SPARC, POWER, SH, Alpha, HPPA, M68k, S/390, RISC-V, and more than a dozen architectures most people have never heard of).
Note that I’m not saying that portability is bad, just that it’s not always the best answer (especially if it’s not a client node you’re running on).
Then why are there ARM-based servers, or at least attempts at them? What is so particular to x86 that ARM cannot do? It plays video and games, displays pictures and the internet, can be used for office work; even lightweight clients run on ARM.
Virtual machines (.NET, Java) and JITs make it so easy to port or even just run almost any kind of software on ARM that it is baffling you believe it won’t work because desktop Linux failed. I’m pretty sure Adobe could port Photoshop and whatever else suits them if they found an economic interest in doing so.
I was a 68k fanboy but turned agnostic, because the flaws in x86 were slowly removed and because the CPU implementation isn’t really important provided it counts reliably and accurately. Pretty sure ARM will slowly take over the world, sooner or later. I’m not even sorry for the 50-year-old x86.
RISC/ARM has been available for decades, and Apple wasn’t always using Intel x86 CPUs. The main reason they switched was compatibility and attracting lots of developers.
The main problem with Unix these days is that Linux does pretty much everything at a fraction of the cost.
IBM, Oracle, HP and I guess even Apple are fully aware they have to let it go at some point.
You cannot always ask premium prices for just a POSIX recompilation. Sometimes you have to add real value.
Thanks to Linux, FreeBSD, OpenJDK, and Free Software in general, there are already a bunch of applications that compile and run on ARMv8/arm64. If there’s no Adobe on ARM, people will learn GIMP or Krita. Office suites, browsers, dev tools, media players, and basic desktop stuff all work fine on ARM. It’s pretty much where x86 desktop Linux was six or seven years ago, PLUS all the Android stuff that could conceivably run in an emulator or chroot.
You’re joking, right?
And people will install Linux because Windows is proprietary and spies on them.
Fixed that for you…
People were OK with most past Windows editions, so Windows being proprietary is of no concern to them. As for spying: Google is also spying on people, and they happily use Google services regardless; I don’t see you predicting an exodus away from Google.
I’m not, actually. That’s why Adobe will recompile for ARMv8 sooner or later. Photoshop already had ports to m68k and PowerPC for prior generations of Apple hardware. Apple laptops and mobiles will sell well for the foreseeable future, and companies will chase those profitable consumers.
With this post, I agree. But above, when asking if you’re joking, I was referring to the idea that if Photoshop isn’t available on ARM, people will switch to GIMP or Krita…
They will, if Apple does it. Apple fans are going to stick with Apple platforms regardless of nearly anything, and they have money. That’s why Apple’s ISVs are willing to chase them across six hardware architectures (m68k, ppc32, i386, amd64, ARMv7, ARMv8) and dozens of OS revisions.
Though GIMP or Krita would hardly be part of that ecosystem… (I can see another commercial app largely displacing Photoshop if it weren’t available.)
Apple is not going to compete on the market with their pricing, no matter what insane tech magic they whip out.
And I highly doubt they will license their tech to other companies. Either they won’t do it, or the licensing will drive the pricing to absurd levels.
If anything, this may either drive competition or create a market for top-end phones. That is, if people can actually handle the price tag on this model.
Since Apple doesn’t sell chips, how do you know what their pricing would be? And why would it be so hard to command a high price for a chip that is likely twice as fast as the competition while being more energy efficient?
As to pricing of finished goods, there is no direct way to compare Apple’s prices to those of other vendors, because you’re not just buying hardware. When you buy an iPhone, you automatically get free and timely OS upgrades, the development of which is funded from your purchase. You get access to AppleCare, the best support available in the industry. You get access to carry-in service, support, and training at Apple stores all over the world. And you get a product that holds its resale value much better than competing products. How do you compare that to the price of an Android phone?
I just guess at their pricing given the markup on their products.
Android phones tend to have budget models, which are cheap, and those models are good enough for what I want out of them. I just moved on from an Android 2.x phone, mostly because the screen was broken.
I’d rather pay significantly less for a device that will last just as long. I do not need all the bells and whistles.
Also, your perception of Apple support is probably predicated on being in the US. I live in a different part of the world, the situation is not quite as you describe, and Apple’s pricing is just plain ridiculous.
Apple is building high-end processors; such processors would never be put into budget models of anything.
There are plenty of low end budget processors available, and their performance is inferior. People are often willing to pay more for a superior product.
I’m very surprised to see ARM with such single-core performance at such low TDPs. It might be worth building an ARM machine for emulators in the near future.
I am not sure it is an apples-to-apples comparison. Aren’t we trying to compare the RISC (ARM) and CISC (x86) architectures?
https://www.quora.com/What-are-CISC-and-RISC-architecture-How-do-the…
These chips all still compute the same numbers for us. You can run GNU/Linux on ARM and use most of the same software as on x86. Also, the ARM processors are 64-bit now, just like x86.
So it is fair to compare the performance of these chips. Maybe there are some technicalities in the Geekbench score computation that make the comparison unfair? I would also like to mention that performance per watt is probably a more interesting score these days than raw performance, which should be even more advantageous for the ARM processors. In theory, an apples-to-apples comparison is possible.
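As a toy illustration of why performance per watt changes the picture (the scores and TDPs below are placeholders, not measured values for any real chip):

```ts
// Toy performance-per-watt comparison; all numbers are made-up placeholders.
interface Chip { name: string; score: number; watts: number; }

const chips: Chip[] = [
  { name: "hypothetical phone SoC", score: 10000, watts: 5 },
  { name: "hypothetical laptop CPU", score: 12000, watts: 28 },
];

for (const c of chips) {
  console.log(`${c.name}: ${(c.score / c.watts).toFixed(0)} points per watt`);
}
// A chip can lose slightly on raw score and still win by a wide margin per watt,
// which is what matters in a phone or a fanless tablet.
```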
Pure benchmarks have always had a problem with working better on some architectures than others, making some CPUs look better than they are. The best benchmarks are a large group of actual tasks: how long to encode this video; how long to compress this file; how long to crunch this block of data; how long to reformat this document; how fast does this game run. Most sites have gotten much better about this.
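In that spirit, a “real task” benchmark can be as simple as timing an actual workload. A rough sketch that times gzip-compressing a local file (the file path is a placeholder, point it at any large file):

```ts
// Sketch: time a real task (gzip compression) instead of a synthetic score.
import { readFileSync } from "node:fs";
import { gzipSync } from "node:zlib";

const input = readFileSync("./sample.bin"); // illustrative path; use any large local file
const start = process.hrtime.bigint();
const compressed = gzipSync(input);
const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;

console.log(
  `compressed ${input.length} bytes down to ${compressed.length} in ${elapsedMs.toFixed(1)} ms`
);
```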
Real innovation would be Apple turning the iPhone into a full PC: iOS on the go, and macOS with a wireless display, mouse and keyboard at the office.
Technically it’s quite simple, and Apple has the capability to build a good hybrid OS.
Really surprised nobody is doing that.
Microsoft tried it, but apparently nobody is interested in something that would potentially hurt PC/tablet sales.
Right?!
Because the only way I’m going to pay $1000 for a phone is if, for another $400 in peripherals, I can have it completely replace the $1500 laptop I use for development.
I would imagine some sort of clamshell dock device that I carry around, with a bigger screen, a full keyboard, and a battery pack.
Ubuntu did it already.
https://liliputing.com/2015/05/ubuntu-phone-that-works-as-a-desktop-…
https://en.wikipedia.org/wiki/Motorola_Atrix_4G
The Atrix did it before that, and the Palm Foleo as well (not a phone, though).
Does this mean that if Adobe ever released an ARM version of Photoshop, it would be at the same performance level as on an i3?
Or is this high performance related to how apps are compiled and run in the iOS world (almost like cheating)?
They could optimize it perfectly for the iOS instruction set and that’s it. The benchmark shows incredible results, but it would be unusable outside the iOS/iPhone SDK/compiler/whatever.
AR is mundane? https://www.youtube.com/watch?v=iw9MPZoPqCQ
https://www.youtube.com/watch?v=z7DYC_zbZCM
https://www.youtube.com/watch?v=2lFeT6lo78s
https://www.youtube.com/watch?v=rIPfpGCxONQ
How telling that you get no response to this comment on the Apple hatefest that is OSnews. What I don’t understand is why everyone and their dog has been superhyped on VR for years, when anyone with an interest in technology and half a brain understands that AR will be a thing on mobile years before VR is.
Basically all the negative analyses calling the new iPhones completely “meh” have left out the AR story, even going to great lengths to make a case for all that power not really being needed in a mobile phone. They (meaning Android phones, I guess) are apparently “good enough” as is.
Has the techworld gone mad?
Apple is building their own CPUs and GPUs specifically to leave everyone else in the dust.
Now that they are designing their own CPUs and GPUs, they can go down whatever path they want, and unless Google starts doing the same, Android phones are going to start dropping further and further behind, and very quickly.
Why? Companies that design CPU and GPU chips only want to make just enough improvement, and not a bit more, to keep the vast majority of their customers coming back to them over and over again.
Compare this to Apple. They don’t want small gains in performance and reduced power consumption. They want to push the envelope hard to walk/run away from the competition, using the combination of hardware and software to create an increasingly large performance gap.
The bigger this gap is, the better it is.
Eventually I expect them to make every part of their devices: cameras, sound, and everything else you can think of.
The very LAST thing they would ever want is to license out their chips to other companies, allowing them to duplicate Apple’s products.
Manufacturers of CPUs and GPUs may want to work that way, but they can only do so because their competitors are doing the same.
If Apple are producing massively superior hardware, then there will be demand for competitive designs from other suppliers, as well as reduced demand for the inferior designs.
Other manufacturers will be forced to step up their designs to compete with Apple, in the same way Intel have tended to step up their game shortly after AMD do.