According to Intel, 2013 is finally going to be the year we see truly high-resolution displays: Retina-class screens for laptops and desktops, for everyone. Considering promises of HDPI have been thrown our way for years now, it’s high time they became reality. As the article mentions, there’s one interesting potential issue: Windows 8’s desktop mode. How will it handle HDPI displays?
My paranoid prediction: Windows 8’s desktop mode will not support high DPI, Microsoft will take great care not to patch this, and they will then present this as an advantage of their Metro interface in order to get people to use it.
Forget Windows 8. Windows 7 doesn’t even support manually entering the DPI. You can choose normal or high. Sometimes you can get it to accept a percentage.
I’d settle for just being able to manually enter the DPI so that I can reduce it on my low-res monitors (like the SD TV). “96 DPI” on a 27″ TV with only 480 vertical pixels looks horrible! And “120 DPI” is worse.
Wouldn’t it be nice if the OS included methods to query the monitor’s physical dimensions and the video card’s current resolution, and figure out what the *ACTUAL* DPI of the display is? And then used that figure for rendering, so that a resolution change wouldn’t change the size of icons, text, or images? Or so that changing monitors wouldn’t change text/image/icon sizes?
Oh, wait, those capabilities already exist (EDID, for example). But none of the OSes out there use them.
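For the curious, here’s a minimal Python sketch of how an OS could do exactly that on Linux, where the kernel exposes each connector’s raw EDID through sysfs, and bytes 21 and 22 of the EDID block carry the panel’s physical image size in centimetres. The connector path and the resolution below are made-up placeholders, not auto-detected:

from pathlib import Path

EDID_PATH = Path("/sys/class/drm/card0-HDMI-A-1/edid")  # hypothetical connector
RES_X, RES_Y = 1920, 1080                               # current mode (assumed)

edid = EDID_PATH.read_bytes()
# EDID bytes 21 and 22: maximum image size in cm (0 means "unknown", e.g. projectors).
width_cm, height_cm = edid[21], edid[22]

dpi_x = RES_X / (width_cm / 2.54)
dpi_y = RES_Y / (height_cm / 2.54)
print(f"panel: {width_cm} x {height_cm} cm -> {dpi_x:.0f} x {dpi_y:.0f} DPI")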
X11 does do that: if the monitor returns valid information (many don’t), X11 will set the DPI accordingly.
It’s a pet hate of mine that systems other than X11 completely ignore this information; it makes high-DPI screens a pain in the ass to use.
In theory, it should. But I’ve never seen XFree86/Xorg use anything other than 96 DPI unless I manually specify it.
Granted, I don’t have any high-resolution/high-DPI screens available (19″ 1280×1024 and 21″/24″? 1680×1050) to test with.
Are you using GNOME? For a long time (and I think it is still the case), GNOME has enforced a DPI of 96, no matter what the hardware or X11 says, with some hidden option to overrule this.
It’s not a “hidden option”; it’s simply under the “Appearance -> Fonts -> Details” dialog. But yes, it is sad that in recent years the Linux desktops seem to have given in to the “let’s ignore X11 DPI suggestions and just force 96 DPI” attitude. I think this attitude was adopted simply to make things like websites look like they do under Windows.
But alas, you can still override this and let X11 correctly detect the DPI value and use it. All my Ubuntu Linux systems are set up like this.
No, I’m KDE across the board. Can’t stand GNOME, or even GTK+ (but that’s not germane to this discussion).
Hrm, interesting, looking through xdpyinfo output on various systems, it looks like Xorg is now adjusting the DPI. It’s all below 96 DPI, but at least it’s different across systems now.
Guess I just haven’t checked in a long time.
Bert64 is correct. X11 has done this for years! X11 set my DPI to 154 (I think that was the number) on my 8-year-old laptop… umm, 8 years ago. Windows XP (which was included with the Dell) always hard-coded it to 96 or 120 DPI. Most Windows apps couldn’t cope with 120 DPI (buttons appearing outside a non-resizable window, etc.), so I was forced to use 96 DPI on a 1920×1200 screen – making for damn small text. Luckily I haven’t run Windows on that Dell laptop in years.
Anyway, the hard-coded 96 DPI (from Windows) or 72 DPI (from Mac) is what is keeping software and hardware from moving to high-DPI displays! Maybe they can actually learn something from X11 and Linux apps (which are much, much more high-DPI friendly).
XP isn’t hard-coded to 120 DPI… I dunno where you got this nonsense from.
The creator of ClearType was running 200 DPI on his desktop in 2000… he said it was difficult to set up but not impossible.
I meant it had two pre-configured settings: 96 and 120 DPI. OEMs gave you one or the other; they NEVER set up your OEM PC or laptop with something else – that was left up to the end user to experiment with.
You can manually change the DPI setting in the registry… not ideal, I know.
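For example (a hedged sketch using Python’s stdlib winreg module): on Vista/7 the per-user value is LogPixels under Control Panel\Desktop; on XP the equivalent lives under HKLM\…\FontDPI. Log off and back on for it to take effect:

import winreg

# 144 here means 150% of the 96 DPI baseline.
with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop",
                    0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "LogPixels", 0, winreg.REG_DWORD, 144)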
High-DPI support has been in Windows since XP.
It is hidden, but it can do it.
Nice troll, though.
Yeah, and hardly any apps worked in high-DPI mode (back then), as I explained in my other post. Maybe things are slightly better now – I don’t know; I moved away from Windows a long time ago.
Well, that is kind of the problem. Most complaints about Windows assume that the latest version is still XP.
Is Windows’ high-DPI support just about adjusting text size – a setting I do remember finding somewhere in Windows 7’s control panel – or have Windows GUIs become able to resize such things as icons, fixed-size windows, and toolbar buttons in a way that makes high DPI actually usable without me being aware of it?
GNOME and KDE have supported SVG icons and arbitrary resizing of fonts for a while now.
Well, 120 DPI mode was available in XP. That’s not exactly “high DPI”. And there’s no way to easily change it to anything higher via the GUI tools. And there are so many hard-coded pixel sizes in XP that setting it to even 120 DPI made things wonky (text labels overrunning the edges of buttons, menus overrunning the edges of windows, etc.).
To say Windows XP supports “high-dpi settings” is disingenuous at best, and certainly misleading.
Windows has fully supported high-DPI apps written using Windows Presentation Foundation since Vista, for example. GDI/GDI+ apps had to be made DPI-aware by the developer (and that’s XP, in 2001). But every occasion is a good one to bash Microsoft.
Can we please stop using “Retina display” to mean higher resolution? It’s as ridiculous as “Web 2.0”.
I like the name, though I hate Apple. It makes it clear we are talking about an extremely high-DPI display. How would you market it?
“Retina Display” is an Apple trademark and does not refer to any specific resolution, screen size, pixel size, etc. It’s just a marketing term that means different things for different displays (the iPhone 4S and the iPad 3 don’t share a resolution, screen size, or DPI, but both have “Retina Displays”™).
In terms of Apple’s definition, a “Retina display” means a >300 ppi (pixels per inch) display.
300 ppi was chosen because it’s supposed to be around the maximum pixel density the human eye can resolve – something along those lines.
I can’t remember the exact Apple URL where I read this. I’ll try and find it again.
The new iPad says no to that; it only has 264 ppi.
Right. It’s not strictly a defined ppi, but a minimum pixel size at the typical usage distance. From their blurb, it is ‘57 arcseconds per pixel’.
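You can sanity-check that figure with a few lines of Python (the viewing distances below are my assumptions, not Apple’s numbers):

import math

ARCSEC_PER_PIXEL = 57  # Apple's stated threshold, per their blurb

def retina_ppi(viewing_distance_in):
    """PPI at which one pixel subtends 57 arcseconds at this distance."""
    theta = math.radians(ARCSEC_PER_PIXEL / 3600)
    return 1 / (viewing_distance_in * math.tan(theta))

print(retina_ppi(12))  # phone at ~12 in: ~302 ppi (iPhone 4's 326 clears it)
print(retina_ppi(15))  # tablet at ~15 in: ~241 ppi (iPad 3's 264 clears it too)

Which neatly explains how a 264 ppi iPad still qualifies.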
Exactly. It’s just a marketing term, not a specification based on resolution, screen size, pixel size, etc.
Thus, it shouldn’t be used to describe anything other than Apple Retina Displays (TM).
It’s actually kind of revolting that Intel would try to market “retina displays” instead of listing the actual DPI of the screen.
Hear, hear. If not even Apple herself knows what a “retina display” actually is, since the definition changes with each new device, why should we use it in a serious discussion? If you want to say greater than 300 dpi, say “greater than 300 dpi”.
We should never use terms that come from marketing; they just cause confusion, like “is this video HD or Full HD or Super Ultra Mega HD?”
It’s crazy how these days you can’t get decent 4:3 monitors. I absolutely hate widescreen monitors; I use my monitor for work, not for watching porn. If I want to watch movies, I have a TV for that. Why did they have to screw with something that was perfectly good?
Wide is actually good for work too: two pages side by side on one screen.
Best of all, if you don’t like side-by-side, most monitors rotate 90 degrees, so you can have two pages up/down instead.
My favorites are 16:10; unfortunately, for cost reasons, most are going 16:9 now (like TVs). The difference ain’t that big anyway.
Agreed. There are MANY work uses for higher resolution, wide screen displays. I would be in a world of hurt without my wide screen for development. It’s an efficiency thing.
With 17″ displays, the 4:3 and 5:4 formats yielded the most panels per production cycle. With 19″ displays, widescreen formats yielded more monitors. And once monitors get larger than 20″, widescreen starts getting more useful.
Why is widescreen a problem for work? For me, it’s perfect – I can display two documents side by side, e.g. a code diff tool, a code window with console output on one side, a spec document next to the code, etc. Sure, vertical space is important too, but the more horizontal pixels, the better…
+100000000
Computer monitors should definitely not be put in the same category as TV displays.
Unfortunately, the hardware manufacturers had a shortage of LCD panels back in 2009, and found that if they used 16:9 displays (instead of 16:10) they could cut 18 15″ panels out of a sheet instead of 15 – scoring 3 extra monitors for no real extra cost. So they started pushing 16:9 LCD monitors, giving us the absolutely hideous 1366×768 laptop screens, which now cover 56% of all laptops on the market! Very sad indeed. I think only Apple still sells 16:10 laptops – but at a premium.
Give me the good old fashioned 4:3 monitors any day!! I want more vertical space, which makes much more sense for computer use.
The sad thing is that 4:3 monitors actually have more pixels than their widescreen counterparts, but because the widescreen displays are advertised as 15.6″ or 17.3″, consumers think they are getting a bigger monitor – which they aren’t!! 🙁
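The geometry is easy to check (a quick Python sketch; the diagonals are the usual advertised sizes):

import math

def panel_dims(diagonal_in, aspect_w, aspect_h):
    """Width and height of a panel from its diagonal and aspect ratio."""
    unit = diagonal_in / math.hypot(aspect_w, aspect_h)
    return aspect_w * unit, aspect_h * unit

for name, diag, aw, ah in [("15.0in 4:3 ", 15.0, 4, 3),
                           ("15.6in 16:9", 15.6, 16, 9)]:
    w, h = panel_dims(diag, aw, ah)
    print(f"{name}: {w:.1f} x {h:.1f} in, {w * h:.0f} sq in")
# 15.0in 4:3 : 12.0 x 9.0 in, 108 sq in
# 15.6in 16:9: 13.6 x 7.6 in, 104 sq in -- bigger number on the box, less glass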
Does nobody tile windows on their screen?
That again would give you too little horizontal space per window. E.g., 56% of all laptops now have 1366×900, so that would give you 1366/2 = 683px per window if you have two windows tiled. Now include window borders, scrollbars, etc., and you have even less “usable space” per window.
600 pixels is wider than the text box I am typing in.
I am running a slightly higher resolution (1440×900) than that and don’t have any problems tiling two windows per monitor (I have two).
I usually tile VS 2010, SQL Server Management Studio, Firefox, and Firebug, and it works fine.
You mean 1366×768, which is the 16:9 widescreen resolution most common nowadays. 1366×900 isn’t 16:9 or 16:10 (1366×853 is 16:10).
1440×900 is the “standard” 16:10 resolution, and is way too common on 19-24″ widescreens.
768 vertical pixels is not enough to do anything. Really, 1024 vertical pixels should be the minimum. And it’s very hard to find widescreen monitors that are over 9″ tall with over 1024 vertical pixels.
Migrating from 19″ 1280×1024 (5:4) screens requires 23″ 16:9 screens in order not to lose vertical screen real estate (physical or logical). Which takes up a *lot* more physical desk space.
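Back-of-envelope, with standard panel geometry (a rough Python check; the screen sizes are assumed):

import math

def panel_height(diagonal_in, aspect_w, aspect_h):
    """Physical height of a panel from its diagonal and aspect ratio."""
    return diagonal_in * aspect_h / math.hypot(aspect_w, aspect_h)

print(panel_height(19, 5, 4))   # 19in 5:4 (1280x1024): ~11.9 in tall
print(panel_height(23, 16, 9))  # 23in 16:9 (1920x1080): ~11.3 in tall, but 1080 > 1024 lines
print(panel_height(24, 16, 9))  # 24in 16:9: ~11.8 in -- what it takes to match physically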
I miss my old 21″ IBM CRT that supported more-than-“Full HD” resolutions. Unfortunately, it was over 80 lbs and warped my desk. But the resolutions it supported…
2013–2014 is going to be good for GPU and CPU companies.
I can see chipmakers being behind this, because higher resolutions mean beefier hardware: gamers just love max FPS at the highest resolution, and 3840×2160 is four times the pixels of 1920×1080. That will drive a helluva upgrade cycle.
I can also see chip companies becoming much more involved in pushing extreme-spec PC titles – which are becoming a rarity. Epic has a PC-only project planned:
http://www.gamezone.com/news/epic-games-planning-pc-only-project
FYI, next-gen gaming engines:
http://www.youtube.com/watch?v=1EH42eqtqjU&feature=related
Then again, I’m reading this on a 13″ 1920×1080 screen, and have been since 2010.
Oh, and if you don’t boost the DPI, it’s unreadable (that being said, it’s nice – can’t see no pixels here, ma’am!).
I, for one, was disappointed when I found out that this “Retina” thing varies with viewing distance. I doubt we’ll be seeing true 300+ ppi monitors (300 ppi roughly simulates 133–150 dpi printer resolution, give or take) soon.
You have it the wrong way around: a printer with 5760×1440 dpi (like many Epson photo printers) can generate 180 ppi with 8-bit color depth at best.
That’s the reason they need these insane dot resolutions.
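One rough way to see where a figure like 180 comes from (halftone-cell arithmetic; the 16×16 cell is the textbook approximation for 256 tone levels, not Epson’s actual screening):

import math

H_DPI, V_DPI = 5760, 1440  # printer dot resolution (binary dots per ink)
LEVELS = 256               # 8-bit tone per channel

# Faking N tone levels with on/off dots takes a halftone cell of roughly
# sqrt(N) x sqrt(N) dots, i.e. 16 x 16 for 256 levels.
cell = math.isqrt(LEVELS)  # 16
print(math.sqrt(H_DPI * V_DPI) / cell)  # -> 180.0 effective ppi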
http://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors
Funky monitors with 204 dpi (OK, actually four monitor inputs merged into one panel). I would have almost dreamed of having one, if it weren’t for the lousy refresh rate.
Pretty much ahead of their time.
The desktop display ecosystem is embarrassing. I’m still waiting for the 2560×1600 27″ monitor I was promised after the Trinitron FW900 came out last fucking century. It still sells for more than the pitiful, weak LCD displays that replaced it. Embarrassing.