Apple introduced a 5K Retina iMac today.
iMac has always been about having a huge, immersive place to see and create amazing things. So making the best possible iMac meant making the best possible display. The new 27‑inch iMac with Retina 5K display has four times as many pixels as the standard 27‑inch iMac display. So you experience unbelievable detail. On an unbelievable scale.
At a comparatively modest $2500 (a Dell 5K display will set you back just as much, and that’s just a display), this is an amazing machine. It’s not useful for me (certainly not at that price point), but professionals are going to eat this thing up.
Finally. And thank you, Apple, for kicking off the desktop resolution race. I think 5K will be just about enough at 27″.
I’m holding out for 6K!
Hard to justify getting just the monitor for $2500 when you can get one with a free Mac attached.
The old iMacs could function as standalone monitors; I wonder if these will be able to as well.
The answer would seem to be no. According to AnandTech, it won’t support Target Display Mode.
http://www.anandtech.com/show/8623/hands-on-apples-imac-with-retina…
That’s too bad. Better pass on this one, then. Wait until they’ve sorted out the input issue with newer DisplayPort or HDMI connectors that can properly handle these resolutions.
Judging from Dell’s other displays in that price range, it will be 10-bit with a 12- or 14-bit LUT.
The display Apple is using is 8-bit at best.
It’s a bit hard to compare prices under these conditions…
You don’t know, yet you’re drawing conclusions?
It’s 5120×2880, which comes to 14.7 million pixels, in case you were wondering.
Regarding Dell: their comparable display won’t actually ship until the fourth quarter.
What does the number of pixels have to do with color gamut?
In other words, you know nothing about these displays but are eager to blame Apple for something, even if they used the same panel.
Where am I blaming Apple?
I just stumbled across a news article about the Dell.
It is a 10-bit display.
And there are about 3 apps that any person would reasonably be using that support 30-bit color depth; the second most likely app, Lightroom, does not. And the entire chain needs 30-bit support: display, interconnects, graphics card, OS (Windows support is flaky and requires Aero to be disabled), and the apps.
30-bit color depth uptake will significantly trail 4K adoption, never mind 5K adoption.
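For scale, here’s the raw arithmetic behind 24-bit vs. 30-bit color, as a quick Python sketch (nothing display-specific, just bit counting):

    # Colors representable at 8 vs. 10 bits per RGB channel.
    for bits_per_channel in (8, 10):
        total_bits = 3 * bits_per_channel  # R, G, B
        print(f"{total_bits}-bit color: {2 ** total_bits:,} colors")
    # 24-bit color: 16,777,216 colors
    # 30-bit color: 1,073,741,824 colors

Over a billion colors only pays off if every link in that chain actually passes 10 bits per channel through.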
And have you experienced, or read up on, the horrors of Dell’s flaky 4K firmware, the poor tiling, and the quirkiness of using MST?
I assume you mean the combination of two DP connections to drive a single monitor (isn’t MST the opposite?).
But it seems Apple is using it too in the new iMac.
Do you see the 8+2 (not 4+2) lanes coming from the connector and going into the controller?
https://d3nevzfk7ii3be.cloudfront.net/igi/GRYRXBZjbTWKKBeW.huge
No, I very much mean MST. Dell does not have a timing controller that can handle 30-bit color depth at 60 Hz at 4K, so they use two TCONs, treat one monitor as two, and tile the content back together. It has been highly buggy and problematic; you can often get a monitor displaying only half the screen, or numerous other problems.
Apple is touting their new TCON, so it’s very likely they don’t need to use MST and tiling, but exactly what they are doing is not yet known. Since there is one TCON, they would probably need to be using one DisplayPort connection, but the current spec can’t handle what Apple is pushing, so they may have modified their DP connection against the spec (one of the benefits of controlling the whole package and doing this internally). The iFixit hardware teardown does not answer this question; I would expect AnandTech to be the first to piece together what the iMac is actually doing. Apple could also be doing something very similar to Dell, but more stably and robustly; contrary to Anand’s theories, maybe they use two connections and somehow integrate those signals before they hit the TCON. That seems unlikely, and I’m just theorizing, but I understand the limits of the DP spec. We simply don’t know yet.
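To put rough numbers on “the current spec can’t handle what Apple is pushing”, here’s a back-of-the-envelope bandwidth check in Python; it ignores blanking intervals, which only widen the gap:

    # Raw pixel bandwidth of 5K at 60 Hz vs. a single DisplayPort 1.2 link.
    width, height, refresh_hz, bits_per_pixel = 5120, 2880, 60, 24

    pixel_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
    print(f"5K@60Hz pixel data: {pixel_gbps:.1f} Gbit/s")  # ~21.2 Gbit/s

    # DP 1.2: four HBR2 lanes at 5.4 Gbit/s, minus 8b/10b encoding overhead.
    dp12_gbps = 4 * 5.4 * 0.8
    print(f"DP 1.2 effective:   {dp12_gbps:.2f} Gbit/s")   # 17.28 Gbit/s

Roughly 21.2 Gbit/s of pixel data into a 17.28 Gbit/s pipe doesn’t fit, which is why Dell resorts to MST and why Apple needed something beyond a stock DP 1.2 link.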
What I do know is that 30-bit color depths are currently very difficult to achieve and not highly practical or useful for 99.9% of all applications. I have real-world experience trying to implement 30-bit color using Dell monitors and Windows, and it is a horror show, and it’s only useful for Photoshop in a very limited fashion. Most people aren’t capable of making manual edits at the billions-of-colors level anyway, even on images that already have much higher bit depths; most edits you’d want to make at those levels are applied programmatically to many colors across regions.
But, again, the question I posed to you is: have you worked with 30-bit color depth, implemented such a setup, troubleshot such a setup, and justified the need for 30-bit color depth in your particular application? You avoided answering that, but your lack of knowledge of how Dell’s doing it suggests an answer. As I said (but accidentally misstated), 30-bit color depth adoption will significantly lag 5K adoption, never mind 4K adoption.
Interested to know how well that mobile GPU will hold up driving that many pixels.
Seriously? We’re talking about an AMD Radeon R9 M290X.
These are typically found in gaming machines such as the Alienware 17.
It’s still a piss-poor card if you want to game on it.
That GPU should be fine for 2560×1440 (3.7 MP), but not for 14 MP.
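The pixel math behind that, as a quick sketch:

    # Pixels the GPU must fill per frame at each resolution.
    qhd = 2560 * 1440    # 3,686,400  (~3.7 MP)
    uhd5k = 5120 * 2880  # 14,745,600 (~14.7 MP)
    print(f"{qhd / 1e6:.1f} MP vs {uhd5k / 1e6:.1f} MP ({uhd5k // qhd}x the fill work)")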
It’s a very, very decent card at 2560×1440. No need to run at full resolution. It’s trolling to call it a mobile GPU simply because the best mobile laptop for gaming happens to use it.
It can easily drive a 5K display; rendering scenes, however, is a different issue.
This is being marketed as a 5k screen for use by video editors using FCPX, not people playing games at half resolution. I’m sure it’s great for games running at 2560×1440; the 680MX I have in my iMac is still very good for that as well.
What’s everyone smoking? Half resolution on a 5K monitor doesn’t sound like any kind of imposition.
If you really have to have native, get a gaming monitor at 1/3 the resolution and run it as an external display. BOOM! Full-resolution graphics!
Or, get a dedicated machine with two Titans and a 5K external monitor, and tell me how much you spent on it.
Oye.
This is only a ‘bargain’ if you want a 5K monitor. You can buy a 28″ 4K monitor for $500 and an upgradeable desktop with similar hardware performance for considerably less than $1000.
When you buy less… you spend less. Nothing new about that.
Apple gear is typically less expensive than, or at worst priced the same as, the competition. Apple allows fewer options to buy less (or spend differently) and thus spend less. This has been their business model for many years now and has served them very well… though it’s perpetuated the notion that their gear is more expensive than similarly spec’d competition.
Thankfully, they have product ranges to suit most consumers (unless you want really low-end gear or have very specific needs).
Just keep repeating this absurd mantra to yourself as you curl up in a ball in the corner. The truth is that Apple provides a very small range of products with very high profit margins.
Every Apple product uses mass produced industry standard components (RAM, CPU, hard drives, GPU, displays etc) and is assembled under contract in China by third parties. In many cases Apple components are cheaper and lower quality than used by other major brands. The only real difference between Apple and other brands is the physical form factor and software.
In this case Apple has a short term advantage in selling 5K (a premature technology) hardware relatively cheaply. Within a year 5K displays will become much cheaper and the iMac product will be just another overpriced Apple product.
This is a false statement.
December 18th, 2013: “We have begun manufacturing the Mac Pro in Austin [Texas/USA],” Cook wrote…
Your middle name must be Troll.
Translation: “We install extra RAM and hard drives in Texas”.
The average suburban whitebox builder probably does more ‘manufacturing’ than the Austin plant.
Hope you can read:
http://www.brandchannel.com/home/post/2013/12/20/131220-Apple-Mac-P…
BTW: For your information: I’m a Linux guy…
The only thing I saw being ‘Made in the USA’ was the case. That involves semi-automated extruding, machining and finishing of an extremely simple aluminium shape. Pretty much routine manufacturing by modern standards.
There’s this little thing we like to call ‘design.’ That perhaps eluded you as you raced to troll another Apple thread.
It is either an oversized Coke can or an R2D2 clone. Take your pick.
Pots and pans have been looking like that for years.
It’s not really modern.
Compare a 15″ Apple MacBook Pro to a *similar* 14″ Razer Blade, and tell me the Mac doesn’t get more bang for the buck.
Fanboi-ism is bad, but the opposite is worse. Far worse.
And don’t try to tell me a gray box or a clunky, failure-laden plastic HP laptop is “similar” hardware – it isn’t.
No, because the CPU/GPU combo on Macs always tends to be worse than the competition’s.
For me, an avid 3D graphics user, that is all that matters.
It depends on what you consider to be “the competition”.
An iMac/Mac Mini may not be more expensive than a similarly designed competitor in the same specialist all-in-one/mini-PC niche, but that’s only really relevant if you demand a computer with that specific design.
I’m actually quite happy with a boring old desktop PC, and because of that, when I’m shopping for a computer, that’s what any Mac is competing against.
For example, I look at what I could get for my money from the UK Apple Store, and see a Mac Mini with a dual-core i7, 16GB RAM, and a 256GB SSD, costing £1,119.
Then I look at the quad-core i7, with much faster dedicated graphics, a 500GB SSD, and a 4TB HDD for storage, that actually cost me less money.
Of course that’s not a fair comparison, with the Mac Mini a very different design, using more compact and lower power components. But in practice, sitting on my desk, they’d both be doing the same job, only the PC would get it done faster for less money.
So your argument is that you can get inferior equipment for cheaper? Wow, what amazing insight. By the way, you can’t get a 4K monitor for $500. You can get a 4K TV for that, but it is inferior in many ways.
Since when has 4K been crap? There is still SFA 4K content, so a move to 5K is massively premature. Very few video cards can handle 4K gaming, let alone 5K gaming.
Oh dear…
Let’s take a simpler example. I can get a 4GHz Intel i7 for $370.
I can also get a 3GHz Intel i5 for $200.
The i5 is perfectly adequate, but it’s equally clear that the i7 will be faster. Hence it costs more.
Just like a 4K Seiki TV can be had for under $500. It’s a good value, but it has several important deficiencies compared to the 5K panel in this iMac: no power-saving support, lower image quality, a 30Hz max refresh rate, etc.
Therefore it is cheaper. Is it really so hard to understand?
That $500 display is sure to be a TN panel, so it’ll likely have washed-out colors and blotching, you won’t be able to color-calibrate it worth a damn, and it’ll have horrible viewing angles.
It’s not even remotely comparable to even a medium-quality non-TN display (IPS or something else).
Plenty here for under $500.
http://www.amazon.com/s/ref=sr_nr_p_n_size_browse-bin_3?rh=n%3A…
http://accessories.us.dell.com/sna/productdetail.aspx?c=us&cs=19&l=…
Sorry, I made a mistake. I didn’t realise how low the Apple specs actually were (dual-core i5). I’ll downgrade it to a $500 machine with an expensive screen.
But why wouldn’t you want that? A 28″ 4K monitor has a pixel density of only 157 PPI. That isn’t nearly good enough to display text properly. My laptop delivers 220 PPI, which still isn’t good enough at laptop distances.
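Those PPI figures fall straight out of the diagonal formula; here’s a quick Python check (the 15.4″ diagonal is my assumption for a 2880×1800 laptop):

    import math

    def ppi(w_px, h_px, diagonal_in):
        # Pixels per inch: diagonal pixel count over the diagonal in inches.
        return math.hypot(w_px, h_px) / diagonal_in

    print(f"{ppi(3840, 2160, 28):.0f} PPI")    # 28-inch 4K       -> ~157
    print(f"{ppi(5120, 2880, 27):.0f} PPI")    # 27-inch 5K iMac  -> ~218
    print(f"{ppi(2880, 1800, 15.4):.0f} PPI")  # 15.4-inch laptop -> ~221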
Wait until you are 50 and you’ll find 100 DPI is ample.
Maybe. But I’m sure I can get an 8K screen at a decent price before I turn 50.
I was really hoping for an upgraded Mac Mini, but they took away the quad core option. So, the best processor option on the Mac Mini is actually a downgrade.
Guess my next machine will have to be a 21″ iMac, even though I don’t need another monitor.
They basically screwed the guts from a mid-range laptop onto the back of a 5K screen.
Even this $449 Dell has a better CPU.
http://www.dell.com/us/p/inspiron-3847-desktop/pd?oc=fddnrt201&mode…
The Dell PC has a:
4th Generation Intel® Core™ i5-4460 processor (6M cache, up to 3.4GHz)
(actually 3.2GHz, with turbo to 3.4GHz)
The iMac has:
3.5GHz quad-core Intel Core i5, Turbo Boost up to 3.9GHz
or
4.0GHz quad-core Intel Core i7, Turbo Boost up to 4.4GHz
How does a 3.2GHz CPU beat a 3.5GHz CPU? Please explain for the rest of us who are really in the dark here.
The CPU specs were cut and pasted from the Dell and Apple sites.
That’s simple: the el cheapo Dell PC has the -=PC-M@5TER-RACE=- performance multiplier. Every Apple product is crappled by definition and made entirely by starving workers in Chinese villages.
So, as we learned today, the iMac display has 4x less dynamic range and only a fraction of the performance compared to non-Apple products.
The quad core was only in the absolute top-of-the-range Mac Mini, though. The base and middle tiers always came with a dual-core i5. In fact, the new bottom one is slightly slower (by 0.2GHz with speed boost) and the middle one is actually faster. I have a base 2011 Mac Mini and it’s not a slouch. The RAM was far more of a limiting factor, and now it comes with 8GB as standard from what I can tell.
Still, I would have liked to see them keep the quad-core i7 at the top end. I’ve been using Minis as development machines, but the new one might actually be a step backwards: better graphics, but possibly a slower CPU. Will wait for the benchmarks.
I’m not sure why they might have done that – maybe the quad core models weren’t selling well, maybe it’s thermal, maybe they just felt like nerfing the line at the top end intentionally, to incentivize upgrades. Could also be that they’re using a dual core CPU with hyper-threading and thought that is enough.
Whatever the case, I’d like to see some benchmarks before making a judgement.
Additionally: They also have to rely on an Intel SoC here, which may have something to do with it. Intel simply may not be willing to sell them a quad core SoC with Iris graphics, due to the expected volume of sales of Mac Minis.
They also soldered the RAM directly to the motherboard, so now if you want to upgrade, you’d better order the Mini with as much RAM as you think you might need down the road. Ridiculous!
Why would you want such high PPI in a desktop machine? I want to see my pixels damn it!
Somehow we managed with 96ppi for many years.
Multiple windows on a single desktop.
I use a tiling WM, and have 2 or 3 browser windows/terminals side by side.
In order to actually see text at a reasonable size, you have to zoom out quite a lot. The trouble with 96dpi is that when you do that, letters are no longer recognisable.
High DPI screens can show smaller text, so you can fit more on the display.
I have dual 24″ 1920×1200 displays at work, and constantly miss my 15″ screen’s 2880×1800 pixel density.
You can’t even display two browser windows side by side on 1920×1200 without having to scroll.
I love the 5K iMac, but I think the new $500 Mac Mini is a much more interesting product because it opens a new market for the Mac (or “re-opens” it, if you wish).
$500 is a magic price tag. It’s accessible for ordinary people, even people outside the US… and I think that’s where the Mac must aim. A beautiful computer for the masses, not for the classes. 😉 C=
Don’t get me wrong, I love Apple’s shiny, expensive Macs (I’m an iMac and Mac Pro user myself), but I think the Mac as a platform deserves much more than 5% global market share.
Never underestimate the lack of interest in Apple products outside the USA.
I can buy a complete gaming computer for 500€, with a CPU/GPU combo that beats the hell out of a Mac Mini.
And Apple actually downgraded the CPU, if I’m reading the posts going around the Internet correctly.
So, no thanks.
Of course you can; the Mac Mini has never been about speed or performance. It’s about getting a cheap Mac with OS X and a visually pleasing design.
The new CPU uses the Haswell architecture. Just comparing clock speeds will not give a proper picture of how it performs compared to the last-generation Mac Mini.
If your machine is larger, weighs more or generates more sound – you fail …
Unless it is under the desk where none of these factors really matter.
If your machine is slower, sealed shut, or tied to the RAM it comes with – you fail…
The point is that in the bulk of cases, it makes no sense to argue that a machine is objectively better or objectively worse. Whether a machine is better or worse is usually a subjective question. It depends on whether it fits better or worse with my individual needs. An iMac does not come anywhere near meeting my needs at work, but the type of machine which does meet my needs (and which, from my point of view, is therefore better) probably won’t meet your needs. For my needs, whatever system offers the fastest processor, maximum RAM, and easiest upgradability at a given price point is always going to be the better one. For others, dpi and noise will weigh more heavily than they do for me.
Given that, it is pretty pointless to argue whether my beige box is better or worse than a retina iMac. It all depends on what you’re looking for.
If your machine has fewer cores, pushes fewer polygons per second, doesn’t support the latest OpenGL version or 5.1 sound – you fail …
4K, 5K, or a billion K means nothing to me if I need X11/Xorg to be able to use it. I want several hundred lines of terminal space. Now THAT would be amazing. Using dvtm on one screen for all text work, with one buffer for the graphical needs, would simply be “amazing”.
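“Several hundred lines” roughly checks out at native resolution; here’s a quick sketch with a couple of hypothetical character-cell sizes (real cells depend on your font):

    # Terminal rows/columns that fit on a 5K panel at 1:1 pixels.
    screen_w, screen_h = 5120, 2880
    for cell_w, cell_h in [(8, 16), (6, 12)]:
        print(f"{cell_w}x{cell_h} cell: {screen_h // cell_h} rows x {screen_w // cell_w} cols")
    # 8x16 cell: 180 rows x 640 cols
    # 6x12 cell: 240 rows x 853 cols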
1. Open iTerm 2 fullscreened.
2. Shrink the text down to your preferred size, changing font, colorscheme, keybindings, etc. while you do it.
3. Open dvtm.
Let’s be honest, anything in a terminal that requires more CPU than a retina iMac running OS X can handle is probably best done over SSH on a proper server anyway.
Indeed.
You can rent 32-core systems with 256GB of RAM in data centers sitting on 1Gbps Internet trunk lines. With or without Nvidia Tesla cards.
Shrinking the fonts decreases their readability as the glyphs get compressed; higher resolution, on the other hand, keeps the number of pixels per character while making the text smaller, without compromising readability.
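A quick sketch of what that means in pixels per glyph (points are 1/72 of an inch; 218 PPI is roughly the 27″ 5K panel’s density):

    # Pixel height of a 9 pt glyph at two pixel densities.
    for screen_ppi in (96, 218):
        px = 9 / 72 * screen_ppi
        print(f"9 pt at {screen_ppi} PPI: {px:.0f} px tall")
    # 9 pt at 96 PPI: 12 px tall
    # 9 pt at 218 PPI: 27 px tall

Same physical size on screen, but more than twice the pixels in each dimension to draw each glyph with.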
I can just imagine all the people who will be using their 5K monitors with incredibly tiny fonts and a magnifying glass.
If the font is too small to read comfortably, then it doesn’t matter whether there are 200 pixels per character or 200 million. If the font is large enough to read comfortably, then 5K is a waste of power/money.
– Brendan
Shh! You don’t want any facts to get in the way. If the fonts are blurred on a 1080p+ monitor, you’re probably sitting far too close to the screen.