“I decided to write this post after having too many heated discussions with many users across many blogs. After hearing repeatedly; ‘The iPad will have a better display’ or ‘It sucks because it’s not Retina’ I figured it was time to break the argument down and dispel the ‘Retina’ myth.” Fantastic post at The Verge.
Look at all of the pictures that depict tablets in “correct” use. I think every user would be better off using an actual laptop with a bigger screen in almost all of those.
In any case, I’m of the school where you buy twice what you think you’ll need in technology, as a margin of error. So I’d still want my screen to be retina at 12-13 inches even if I were usually viewing it at 24 inches.
In fact, the argument against viewing it at close range seems to be reduced to a “you look stupid” argument, which is not a very persuasive technical point. If I didn’t do things that looked stupid, I wouldn’t be as successful as I am today. Part of being a Nerd is daring to be stupid.
In all of the pictures shown, using a laptop would be stupid, ranging from inconvenient to very inconvenient. I’m not a huge fan of tablets, but there are plenty of situations where using a laptop just does not make any sense and is overkill when you have the option of using a tablet.
Really? Even the two guys actively typing away on the device? Wouldn’t a full keyboard and a screen that can be tilted be better?
Just get a Surface with a keyboard cover. Always mobile, optionally productive.
No thanks. I’d take even an iPad over that.
You have the amazing ability to either be two people at once, or to reply to a comment not aimed at you.
Unfortunately I have to disappoint you and say I am only one person and I can comment on whatever I want to.
Awww iPad lovers voted me down. So predictable and pathetic.
Stop making me laugh, you pathetic iPad lovers.
Do your parents know that you are still not in bed yet?
That is a good point that I was thinking about after my post. The Surface is designed to be both a tablet and a laptop. I just wish the screens were larger, like 14-15 inches, and that it was loaded up with Plasma Active.
MS has yet to let people use the keyboards. Only time will tell just how productive one can be on those…
The one guy has a keyboard. The other guy is sitting on a couch. The last thing I’d want is a laptop while sitting on a couch.
Probably depends on the primary use of the device. While I see that a laptop can do it all, there aren’t enough stylish ARM laptops (that can last, say, 10 hours before requiring a battery charge).
But if the use of a keyboard is only sporadic, the trade-off compared to a laptop is probably insignificant for a category of people, especially tablet users, who tend to consume more (I know that tablets can produce content, but chances are the proportion of people producing non-photographic content is insignificant).
F*ck Style. Form after function.
Yeah! http://www.smbc-comics.com/comics/20120501.gif
I had an iPad2 and I sold it because I had very little use for it, other than playing some simple games. As others have said, I normally need a full keyboard, a mouse and a separate screen.
I bought a “top” netbook, but that was a mistake as well: CPU too slow and screen too small and low quality.
I want to buy an ultrabook in the near future.
People use their tablets for reading in bed. No one reads a tablet from 2 feet or more when lying in bed. Duh!
I think you have a reading comprehension problem. The article addresses that, saying:
2′ == 24″. Duh?
…And it also shows how people normally use (or hold) their tablets.
It is you who has a reading comprehension problem. “At more than 22 inches” means that at less than 22 inches it is not retina.
The article uses pretty odd examples of typical use. Reading/browsing in bed is not one of them. People tend to hold their books or tablets at about 12 inches, maybe 14 inches at times, when in bed.
Sorry to say this, but how is this news? Marketing hype is shown for what it is – marketing hype? I’m sorry for all the MacBook Pro ‘Retina’ devotees – I can’t see the bloody difference between my MacBook Pro (2011) and the current MacBook Pro with ‘Retina’ display. Sure, I wear glasses, but my eyesight isn’t totally shot, given that I sat at the computer looking at the two side by side in the shop and couldn’t tell the difference. It reminds me very much of the point I made when chatting to a Sun Microsystems Java engineer about evangelising Java: generate hype so that people demand something even if they don’t know what the hell it is, because they’ve heard that they must have it in their device. Same situation here – how many people do you see demanding stuff without having the slightest clue as to why they should have it? I’ve seen it many times. Quite frankly, I put Retina and Siri into that category – marketing hype used to drag customers in, but when the rubber hits the road and it is evaluated with a discerning eye, the hype is seen for what it is – hype.
I must be in the other category, as I can tell a Retina display from a few paces. For me Retina is all about text: web pages, documents and PDFs look a lot better, as do a lot of the elements of the UI. I don’t have a Retina laptop, but when playing with one I could see the difference straight away.
I do own an iPad 3, and again I can really see the difference when looking through web pages and large PDFs. It’s a real pleasure to use one of these; the only device which displays text as well is a Kindle (or any kind of ebook reader).
I have a laptop which, 80% of the time, I use for reading. I’m not feeling the need for a tablet or e-reader yet, but I can tell you this: when I start reading, I don’t notice how the book looks; the fonts, the colors, the shapes, everything goes away.
I remember when I got my first 1024×768 monitor and most websites were optimized for 800×600… I don’t want that experience again.
I guess for me, since my laptop is sitting at 1680×1050 on a 15-inch screen, the results aren’t as dramatic as if I were running a MacBook Pro without the high-resolution BTO option. I have to admit, though, I don’t really spend much time worrying about the shapes/curves/etc. of the text; instead I quickly read, get what I want and move on. Btw, the original post wasn’t dismissing the idea of higher resolution, but the obsession with Retina as if it were some sort of cure-all to life’s problems. Higher resolution is nice, but let’s not get all obsessive about it – as if it were the ultimate thing in existence that you must have, or the major differentiating factor compared to the list of other things to consider when purchasing a computer.
What do you expect from the frothing-at-the-mouth Apple fanbois? Must… buy… more… Apple… stuff.
Reminds me very much of an Apple product launch where a reporter was talking to customers waiting in line for it. In that line you saw a variety of responses, but what I thought took the cake was the response from a lady who said, “I don’t know what it is or what it does, but it is from Apple so it must be incredible” (to paraphrase her). I don’t know whether it says more about society in general or the Apple fanbase in general, because at least from where I stand, what I see is the Microsoft fanbase being critical of Microsoft when they screw up, while what you hear from the Apple fanbase is a litany of ‘hold on’ and ‘it’ll be fixed soon’ and ‘it’s only an x.0 release so give it time’; funny how these same people never gave Microsoft the time to release a service pack.
Well, religions ( http://www.bbc.co.uk/news/business-13416272 ) don’t need to be rational…
Yeah, I’ve seen that in the past, but I don’t think we need a study to point out the blatantly obvious. I’ve never been able to work out the attachment so many people have to a particular operating system, given that there is more to life to be concerned about than loyalty to one company over another; then again, in NZ we have the Holden vs. Ford thing, which has never made any sense. Right now I’m checking out Windows 8 Pro RTM, and to be perfectly honest I think Apple should be worried, very worried. We’re not talking about the old Microsoft of ‘let’s throw the operating system at the OEMs and hope for the best’; it appears, based on the documentation, that Microsoft is really involved with many of the OEMs out there to ensure that the experience is smooth from first boot-up to everyday usage.
Depending on how the finances work out at the end of the year, I’m very tempted to jump ship to the Windows side. Whilst the fanboys on the Mac side keep telling themselves that ‘OpenGL 4.1 will come in the fall’ and ‘Tim Cook promised a Mac Pro refresh along with an iMac refresh’, there are those of us who are seeing the writing on the wall and leaving 🙂
Generally speaking, studies exploring the “blatantly obvious” (and/or giving their results more exposure than they currently have) are perhaps needed the most… for example, in light of this “fun” list: http://en.wikipedia.org/wiki/List_of_common_misconceptions (and the “See also” section there; or a list of cognitive biases, which are really our primary mode of operation; it’s a bit scary how many views and decisions ultimately revolve around them, including big political or environmental ones).
Personally, I suspect the attachment thing is also a vestige of tribal dynamics: our minds evolved for them and are still mostly stuck in them.
No sense because both are from Australia? (…but then, that’s the case with most things in NZ, I think ;p)
Anyway, this one does have a clear answer: whoever made the Pursuit Special ;> (so Ford, it would seem)
Everybody sing along, you know the words: speculative future competitor product Y will be better than Apple’s actually shipping product X, and the Apple fanboys are stupid for not admitting it, because I’m mad at Apple over nitpick Z.
X = iPad
Y = Surface
Z = Retina
We have our terms; now let’s see the proof.
“Pro” in this case means “Intel”, and Intel means less battery life and more heat than ARM. (They showed off prototype Intel tablets at CES with cooling fans in them. Fans. In tablets.) I haven’t heard or seen any direct comparisons with WP7, but iOS always significantly outclasses Android on touch responsiveness, so where’s the problem supposedly introduced by Retina’s GPU demands? Is the argument just that if anyone other than Apple made a Retina device, it would suck? That doesn’t seem to fit the tone (/agenda) of the article. Also, Surface doesn’t even work yet (froze on stage, they didn’t let anyone touch it), so I doubt it’ll set any new benchmarks for smoothness when/if it ships.
I don’t own any Retina devices (or even any iOS devices, for that matter), but I totally get it. It would be fantastic to zoom in and out on a PDF with my own eyes and face, as if it were a printed sheet of paper, rather than fiddling around with zoom and scroll controls. It’s like a whole second layer of visual data constantly available. Displays have been the weak link in Moore’s Law for a long time, and I like that Apple is trying to push them forward.
So… yeah. Once again we’re fantasizing about a future product that might be better than Apple’s current product for various specious reasons. And probably still won’t be. But you know what probably will be better than Apple’s current product? Apple’s future product. This is why boasts about vaporware are not interesting. Also, the Surface still doesn’t work, and is still priceless. Great product. Huge success, I’m sure.
(This Microsoft advertising blurb is a “fantastic post” while the careful breakdown of Apple’s trade dress claims was a “terrible visual guide”? Thom, give it a rest. If you have any credibility left at all, it can’t survive much more of this crusade against Apple.)
While it’s true that the resolution offers little benefit from 2 feet away, there are use cases (such as reading books in bed, etc.) where you will use the tablet from much closer than 2 feet… Also instances where you will look more closely to see detail, e.g. when viewing pictures.
For cases where the extra resolution doesn’t provide a benefit, you could always render at a lower resolution and scale with the GPU – scaling up video with a GPU takes very little power compared to rendering it at the higher resolution to start with.
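To put rough numbers on the distance point, here is a minimal sketch (Python, not from the article) that estimates the distance beyond which a panel’s pixels blur together, assuming the common 20/20 figure of one arcminute of visual acuity and the published 132/264/326 ppi panel densities; a different acuity assumption shifts the cutoffs accordingly.

import math

ARCMIN = math.radians(1 / 60)  # one arcminute of visual acuity, in radians

def retina_distance_inches(ppi):
    # Distance at which a single pixel subtends exactly one arcminute;
    # beyond this a 20/20 eye can no longer resolve individual pixels.
    pixel_pitch = 1.0 / ppi                # inches per pixel
    return pixel_pitch / math.tan(ARCMIN)

for name, ppi in [("iPad 2", 132), ("iPad 3", 264), ("iPhone 4", 326)]:
    print(f"{name} ({ppi} ppi): pixels blur together beyond ~{retina_distance_inches(ppi):.0f} in")

Under that assumption the iPad 2 only reaches “retina” territory at roughly two feet, while the iPad 3 gets there at around 13 inches, which is exactly why the in-bed reading distance matters to this argument.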
…will move to Retina whether we like it or not. The sooner the better, as the prices for the screens will drop to current non-Retina levels.
Apple has started the ball rolling, and it will only pick up speed from here.
If Samsung had had Retina first, I can guarantee the arguments would be something along the lines of “Apple is falling behind”, “Retina is an important technology”, etc., etc., etc.
Just so you know, I don’t care either way. I have an iPhone 4 and I don’t love it any more than my 3G I had earlier.
That article is full of sour grapes, picking and choosing stats to be overly pedantic and trying to disprove the obvious (that the new iPad and Retina MacBook Pro screens look amazing). And yes, some of it is the color, but a lot of it is the DPI.
We had the same anti-Apple hysteria when the Retina display came out for the iPhone 4, about how it’s a gimmick, how it doesn’t make any difference. Then Android phones started showing up with similar DPI displays, and that debate suddenly went away.
Higher-PPI displays are coming, and not just for Apple. Apple is the first to get the ball rolling on making it a consumer standard (yes, of course there have been high-DPI displays before, but they were rare and/or expensive).
Is the Surface’s 1080p display enough to sate my desire for high DPI? I’ll have to see. It could be.
But either way, mainstream high DPI is coming, and that’s a good thing. It looks gorgeous (and looking gorgeous doesn’t require having an Apple logo on the case), and the biggest benefit is that it makes things easier to read. Text just looks better at higher DPI.
TL;DR: Hey Verge, u mad?
Hardly. The author does not claim that the Retina displays aren’t good, just that the extra resolution is superfluous. If you had actually read the article, you’d see the author posits that the Retina displays benefit more from the better contrast and color gamut than from the 400% more pixels.
What’s more, the iPad 3 has 70% more battery capacity than the iPad 2 but 10-20% less battery life.
If Apple hadn’t gotten lazy by not making a properly scalable UI, then they could have easily gone 1080p. Instead they’re adding invisible pixels at a massive battery-life cost.
Just because our conclusions are different, doesn’t mean I didn’t read the article.
In the article, they made a big deal out of VA, trying to say a much lower resolution display is also “retina”. They also dismissed the high DPI as a reason the display is gorgeous, instead saying it was the better colors.
Part of their argument is that high DPI isn’t worth it. I disagree. And pretty soon, when most non-Apple tablets and displays have high DPI, that argument will magically disappear.
Of course it is. If you look at it from further away, then your eyes cannot distinguish the pixels. The author makes a very well-reasoned case that at the average distance people hold their tablets, the extra pixels beyond 1080p do not make the slightest difference.
He does, however, state that there is a huge difference between the iPad 2 and iPad 3. This is not only because of colour gamut, but because the iPad 2 is not “retina” at the average reading distance.
If you read recent high-end phone reviews, they always mention the pixel density of the various displays. They also mention that on high-end phones the differences are fairly indistinguishable despite the variations in DPI.
So expanding beyond a certain DPI is stupid. Apple probably only chose the resolution it did in the iPad 3 because they backed themselves into a corner, not because 2048×1536 is massively better than, say, 1200×1020 at 10 inches.
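On the “backed themselves into a corner” point: iPad apps lay out their UI on a fixed 1024×768 point grid, and only an integer scale factor keeps every element on whole-pixel boundaries, which is presumably why Apple jumped straight to 2048×1536. A small sketch (my own illustration, using a hypothetical 1200-pixel-wide panel for comparison) shows the difference:

def scaled_edges(points, scale):
    # Pixel coordinates of 1-point grid lines at a given scale factor.
    return [p * scale for p in range(points + 1)]

# Exact doubling (iPad 2 -> iPad 3) vs. a hypothetical 1200-pixel-wide panel.
for label, scale in [("2.000x (2048 wide)", 2.0), ("1.172x (1200 wide)", 1200 / 1024)]:
    edges = scaled_edges(8, scale)
    off_grid = [e for e in edges if e != int(e)]
    verdict = "pixel-exact" if not off_grid else f"{len(off_grid)} of {len(edges)} grid lines land between pixels"
    print(f"{label}: {verdict}")

With a non-integer factor, hairlines and bitmap assets straddle pixels and need blurry filtering, so without a resolution-independent UI the only clean step up from 1024×768 was the full quadrupling of pixels.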
I’m fairly amused at how it’s taken as a given that the “retina” screen is the best the eye can see. Our photoreceptors can detect a single photon. There are ~25,000 cone cells (color) per eye in the fovea, which sees about four square inches at arm’s length. So 12,500 cone cells per inch at a couple of feet, plus some rods as well.
Obviously, that’s not the resolution we see at. Our visual system heavily compresses information in an analog system, then our brain reconstitutes it. Given the overlap, we do a fair bit of superresolution processing while we’re at it, so individual photons aren’t the lower limit.
Apple’s own research points to about 450 pixels per inch as the most a normal person can see at two feet. If you’re nearsighted, then you can distinguish more; if you’re very nearsighted, then a lot more. Does it matter? Probably only with text, where printers and book publishers have long noticed this problem and increased DPI to ridiculous levels (perhaps infinite, as nearby ink dots merge).
And then there’s the issue with color. Many men can only see two types of color (r-g & b), while some women can see four (r, g, g, b). The exact response to light at each wavelength also differs based on genetics; I remember that in my high school chemistry class one classmate could see “red” well past 900 nm, while others could barely see it at 700 nm. I think moving beyond RGB color would be cool, but I doubt most people would notice.
Nope. Our eyes are like low-resolution pinhole cameras being waved around randomly. Our visual cortex makes up a picture by creatively interpolating about 99% of the image.
I’m not sure how you disagree. I was speaking of how the ganglia primarily transmit differences between photoreceptors down the optic nerve (e.g. only one of the three reporting to that ganglion sees light). There’s also some motion processing in the retina.
It’s more the lateral geniculate nuclei in the thalamus that “interpolate” data from the optic tract and transcode it for the cortex (IMHO it’s more akin to lossy compression, but there’s decentralized signal processing as well). The cortex is where all signals merge and we get additional postprocessing. I’m not sure if it’s the frontal or occipital lobe’s cortex that gives rise to optical illusions, hysterical blindness, and the various forms of blindsight, which are the more obvious examples of “interpolation” going wrong.
The image from the lens is inverted and then must pass through the blood vessels of the retina before being converted to an electrical signal. This produces a massively distorted image which must be rectified via postprocessing in the visual cortex.
In theory the eye has at least 16 MP (and possibly >500 MP) of resolution, but only the central 2 degrees of vision have accurate colour resolution.
Our vision is essentially a low-resolution, continuous “scan and pan” movie which is comprehensively filtered, upscaled and interpolated to produce a “meaningful” (if largely inaccurate) visual narrative. A crude analogy is getting a drunk with a handheld VCR camera to film Lawrence of Arabia, then postprocessing the raw video using a massively powerful computer to convert the images to 70mm wide-format film.
For the most part we can assume that ~300 DPI of full-grayscale pixels in a handheld device is more than enough for all except synthetic use cases.
You’re right that there are no simple metrics for evaluating our senses. For example, we are unable to detect a ten-millisecond absence of light, but anyone would easily see a sudden burst of light even if it lasted only one millisecond. Same with resolution: you could argue that “sufficient” resolution is reached when you can’t see a single fully-lit white pixel on a black background. That could indeed require a higher resolution, but frankly speaking, who cares, especially when a good solution already exists (just dim the pixel).
Besides, resolution is only one of the things to improve (true, it was long neglected). Refresh rates, contrast, and color reproduction are still lagging quite a bit, and there are many tradeoffs involved there too (resolution vs. contrast or refresh rate, etc.).
By age 45 most of us would be unable to tell the difference between a Retina display and a 640×480 display without wearing spectacles.
I’m pretty sure I have read somewhere that 10 to 20% of people can’t tell the difference between a regular screen and stereoscopic 3D, and this has yet to prevent companies from trying to sell that expensive tech.
And possibly something like 80% experience the inherent flaws of that approach… Seriously, not only does it offer just a few more cues than “2D” (a deceptive misnomer; normal TV does have most of the spatial cues our minds use), it gets those additional ones very wrong (notably, the parallax is completely… well, wrong; and overall, aspects which are tightly linked IRL are decoupled here).
“They” are trying to sell this fad every few decades… (stereography is not much younger than photography)
Then they need better glasses. I am past 50 and I have no problem distinguishing between, say, the iPad 2 and the new iPad if I look at them at close range, e.g. when reading in bed. Fonts look sharper and you get truer font forms. It makes a big difference.
Lol!
It is impressive how much energy and time people waste trying to prove that Apple is wrong about something, or that there are better solutions…
Bravo for the Surface Pro, but calling the article “Dispelling the ‘Retina’ myth” is pure nonsense.
Anyway: go, go, Thom, with “dispelling” Apple’s success!
Retina Display was just marketing jargon for “good screen”. That’s all it ever was. Everyone please calm down.
Well that was a ridiculous read not worthy of a rebuttal.