I am a programmer. I do not deal with digital painting, photo processing, video editing. I don’t really care for wide gamut or even proper color reproduction. I spend most of my days in a text browser, text editor and text terminal, looking at barely moving letters.
So I optimize my setup to showing really, really good letters. A good monitor is essential for that. Not nice to have. A MUST. And by “good” I mean as good as you can get. These are my thoughts, based on my own experience, on what monitors work best for programming.
There’s a lot of good advice in here. We all know higher pixel densities make our user interfaces and text crisper, but a surprising number of people still don’t seem to know just how much of a gamechanger high refresh rates can be.
If you’re shopping around for a new monitor and you have to choose between a higher pixel count and a higher refresh rate, you should 100%, without a doubt, go for the higher refresh rate. The difference 120 Hz or 144 Hz makes in just how smooth and responsive a UI feels is astonishing. I think the sweet spot is 1440p at 144 Hz, preferably with FreeSync or G-Sync.
Both Windows and Linux support high refresh rates out of the box, but as the linked article notes, macOS basically has no clue anything above 60Hz exists, and you’ll have to be very careful about what display you buy, and be willing to jump through annoying hoops every time you load up macOS just to enable high refresh rates.
I have a 144 Hz Samsung monitor connected to my MacBook Pro right now, and it has no trouble driving it at 144 Hz by default…
Same here. I have a 2560×1440@165Hz and a 3440×1440@100Hz connected simultaneously to my MacBook Pro without any issues. On the contrary, macOS seems to handle the monitors better than when I switch to Windows.
I get what the article is talking about, but it’s still ultimately an aesthetic personal preference.
I mean – I still use bitmap fonts (or scalable fonts with bitmap tables for specific smaller sizes) for most things, and that works fine on a bog standard display. If the font was designed to fit pixels a certain way, then it looks great (to me) at low DPIs.
Higher resolution matters more on Macs, because macOS uses weak hinting (a stylistic choice that keeps glyph outlines closer to the original design, but makes everything blurrier). On Linux and Windows, strong hinting can make antialiased text sharp and readable at lower resolutions, at the expense of perfectly placed strokes on every character.
Of course, the defaults have shifted, and hinting is slowly getting weaker on both Windows and Linux as average resolutions increase, but it is still far stronger than on Macs.
I’ve had a 4K HDMI TV for over 5 years and it works well. It is 30 Hz.
Refresh rates matter little when editing text and the like. I did need to tweak the sharpness/brightness/contrast/etc. to just say “print the damn pixel you are sent!”.
“Bug” is flat Lucida Console 8–12, either black or white pixels. Very clear, but much closer to the original CLI screen fonts (which I also do on the Linux console – add consolefonts, setfont, etc.).
I’m looking for an 8K but I need a graphics upgrade.
For better or worse my computer budget goes into the PC itself and a high end monitor isn’t a priority, at least not until my old one dies.
All the suggested monitors are significantly above what I could afford. 🙁
I’m pickier about laptop resolutions… many of them are still no better than 768 pixels tall, which is absolutely pathetic. That was bad in 2000, to say nothing of 2020.
+1
I wouldn’t pay an extra dime for the 61st or the 161st hz.
But getting a resolution that was not available during the Clinton administration is a must.
So is a matte screen. I don’t understand why people still accept a laptop with a glossy screen. Perhaps they don’t know matte screens exist.
A bad glossy screen is much preferable to a bad matte screen.
How so?
Bad glossy screens just reflect light in bright environments; text is still crisp and colours consistent.
Bad matte screens blur text and can add a “static” like effect across the panel, making image reproduction less accurate, and the static effect can be distracting across large amounts of whitespace.
A glossy screen looks better in the store display. It is only when you work on the device that the problems become obvious. Also, for a long time you couldn’t get touch screens that were also matte (not even sure you can yet).
So for tablets and hybrids there still isn’t even a working compromise.
Glossy screens are better for media consumption (watching youtube)
Matte screens are better for media creation (writing code)
guess what most people do on their screen most of the time
I disagree with that. For content consumption you still want to see what is going on on the screen. A glossy screen is just worse.
> for a long time you couldn’t get touch screen that were also matte (not even sure you can yet).
My laptop has a matte touch screen, and it’s 6 years old. It took me a while to find it, though. I hope the situation has not worsened meanwhile. I had a glossy one before and I would never go back. For me, a stationary monitor set in a controlled environment can be glossy, but not a laptop.
I think that personal taste varies a lot in these sorts of issues. Many, many years ago I had an Acorn Archimedes, and it had relatively terrible graphics output (colour at TV resolution, which is what most of the early PCs did before there was a market for actual monitors) but the OS did a brilliant job with anti-aliased fonts. I could word-process with actual 10-point text on a 500-line (interlaced) screen. No, it wasn’t pretty, but it was legible. Perhaps that biassed me, but I’ve _always_ preferred Apple’s properly anti-aliased text (at the correct physical size!) to Microsoft’s wonky-but-sharp “clear type” text. At least Microsoft gave you the option to turn that off in the settings, and use properly-anti-aliased text if you chose.
I’ve never used a high refresh rate display, but I doubt very much that it would be of advantage to me. I don’t find the smooth window scrolling to be a problem on 60Hz displays. Indeed, for all the time in the 90s when people were complaining of jaggies and tearing when doing solid-window-moves in X, and switching to outline-moves, I was happy with the solid moves. Just don’t seem to see these “jaggy” effects.
To each their own. I have no doubt that eventually high-refresh-rate will become commodity and it won’t be possible or reasonable to buy the old “slow” 60Hz stuff any more, and that will be fine, but I’m not going to spend more for it. Correct colours (I do photography) and high resolution: I’ll pay for those.
areilly,
Absolutely, the higher you go, the more marginal the gains and their importance is obviously subjective.
This has a lot to do with the software/drivers. With double buffering and vertical sync you can eliminate screen tearing (which can be very severe when there are no mitigations); with those in place you don’t see it even on old equipment.
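For what it’s worth, here is a minimal sketch of that fix using the pyGLFW bindings (an assumption on my part – any windowing toolkit with a swap-interval knob works the same way): render into a back buffer and ask the driver to swap it in sync with the display’s refresh, which is what makes tearing disappear even on old panels.

    import glfw  # pip install glfw

    # Minimal sketch: double buffering plus vertical sync.
    # swap_interval(1) asks the driver to wait for the vertical blank before
    # swapping buffers, so a half-drawn frame is never shown and tearing goes away.
    if not glfw.init():
        raise RuntimeError("GLFW init failed")
    window = glfw.create_window(640, 480, "vsync sketch", None, None)
    if not window:
        glfw.terminate()
        raise RuntimeError("window creation failed")
    glfw.make_context_current(window)
    glfw.swap_interval(1)  # 1 = one buffer swap per monitor refresh (vsync on)

    while not glfw.window_should_close(window):
        # ...draw the frame into the back buffer here...
        glfw.swap_buffers(window)  # present the finished frame at the next vblank
        glfw.poll_events()
    glfw.terminate()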
Another thing that used to happen on older screens (phosphor CRTs or LCDs that are slow to respond) is ghosting and smearing, which is especially severe with fast pans. But I haven’t seen this in a long time. Think of the phosphor on an oscilloscope, which takes a while to fade completely; this used to be visible on ordinary video monitors.
We’ve had 60 and 70 Hz graphics modes for ~35 years now…
https://en.wikipedia.org/wiki/Video_Graphics_Array
https://en.wikipedia.org/wiki/Enhanced_Graphics_Adapter
…just saying, 120hz might just be one of those luxury specs that never becomes a commodity. I mentioned this in an earlier post, but 20 years ago I would have predicted 768 pixel height would have been totally obsolete and no longer used in new laptops, yet I would have been wrong.
The thing is, they’ll keep building the low end stuff until people stop buying it, which may be never if they’re “good enough”.
I doubt your old Acorn actually rendered a 10 pt font at the correct size. Font sizes in those days were just a scale rather than a real-life measurement, and the machine itself often had no idea of the physical size of the screen it was connected to.
I was a big RISC OS user too, and the fonts were one of the things people raved about in the community, so you piqued my interest. Most people used the standard Acorn monitor that came with the machine, so I wondered if Acorn had done some calibration based on them. The default was a 14 inch display with 640 x 512 resolution giving 60dpi (you could also get a monochrome 1152 x 896 display giving 100dpi) [1].
The units of the RISC OS font API are millipoints (1/72000th of an inch, since the original ARMs had no FPAs so integers were used wherever possible). The docs state that the OS assumes 180 units to the inch by default [2].
So probably most people were reading the fonts at about 3 times physical size. Having said that, the docs also point out that you can tune the millipoint-to-os-unit scaling to suit your monitor. In practice I don’t think many people did though.
[1] http://chrisacorns.computinghistory.org.uk/docs/Acorn/Brochures/Acorn_APP230_Archimedes4001.pdf
[2] http://www.riscos.com/support/developers/prm/fontmanager.html
I am still waiting for 4K 120Hz monitors that don’t cost an arm and a leg.
All the monitors in the article are pretty expensive. Everything is currently gated behind ‘Pro Gaming’ or ‘professional’ monitors.
A 4K screen at 30+ inches is still a low res screen in his DPI based optics. Damn mac idiots.
Where did he advocate using 30+ inches? The highest I saw him specifically mention was 27 inches, and he was saying that was still too tiny for his eyes at 1x. Later he clarified he uses 24 inch and 27 inch monitors at 4k, following up later that some larger monitors would be harder to use. Perhaps I missed where he said *using* 30+ inch 4k displays is a good idea.
Or maybe you’re just rounding up and tacking on the + as an exaggeration to get the gist of your point across.
Also, no, it’s not a “low res screen in his DPI based optics.” If you’re talking about any given screen at a particular physical size, obviously doubling the screen resolution at that size is much higher res and far more in line with his goal.
Not sure what about his overall point makes him an idiot either, unless you’re saying that because he uses a computer from a brand you dislike, he’s inherently dumb.
I’ve been fighting a losing battle with our company executives for years. It’s the old “what do they need bigger and better monitors for, they’re just reading documents all day long”. Then they complain about staff who are slow or go home with eyestrain or headaches; apparently that’s just an excuse for being lazy.
I’ve even had trouble getting executives to accept that large, high-quality monitors are justified for CAD operators. They think that because they spend five minutes browsing a 3D model on a 15″ $3000 laptop, the people who made the model can work 8-hour days on an $8000 CAD workstation connected to a $250 19″ flatscreen. In the meantime, some of the execs have 30″+ external monitors to plug their laptops into, some even multiple-monitor arrays in their own office for looking at spreadsheets, which are apparently a requirement for collaborative purposes or video conferencing! What a bunch of Norbits!
In the lobby they load up receptionists with 27″ Retina display iMacs that get used mostly for web browsing, while the guy running the CRM server farm works on a 14″ rack mounted fold out.
cpcf
I’ve seen this as well. The quality & expense of one’s computer at a company is often more a function of one’s position than one’s need. I’ve seen some ridiculously high end setups for executives that barely have a need at all, meanwhile other employees are left to fight over old scrap parts.
They should have made an episode about this on “The IT Crowd” (what an awesome show that was).
Well, it’s about image. The receptionist & front office area is usually the nicest area. All too often the IT folks are in the slums at the back of the building, haha.
This guy starts by claiming a Twitter poll is “my research among developers” and then continues to push his imaginary facts with self-reinforcing evidence. I love a good recursion as much as any developer, but I would like to see some other arguments before I agree to drink the Kool-Aid.
I also love how he waxes lyrical about the whole ergonomics of working, then places his laptop flat on the table.
Yeah, what a dumbass. I think he has been looking too much at zoomed out text.
He is right, surely, that a good monitor is much more comfortable than a bad one, point taken. But man, he is so smug and self-righteous.
The best setup is the one you can afford. If you cannot afford it, you should not care that his preferred setup is the bestest in the world. Buy good stuff at the upper end of your price range, keep it for a really long time, that’s it!
And yeah, his setup just sucks. I would not touch that laptop keyboard with a stick. A good TKL mechanical keyboard is cheap (100 eurodollars), so just go for it and do not waste your money on a 4K monitor even if text looks nicer. Buy something demonstrably useful like a good armchair.
It is funny how personal experience can differ. I am writing this on a 144 Hz monitor, but you could swap it for a 60 Hz one and I would not care. I will not claim I cannot feel any sort of difference, but I certainly do not find it important.
I do have a slightly higher pixel density than a normal full HD desktop display (which I assume to be 24″), as I am writing this on a 32″ 16:9 monitor at 1440p. I also use it a bit for gaming, and a GPU that can comfortably drive 4K is more than I am willing to pay for, so the choice was 1440p, or running at 1080p on a 4K display. It was a compromise, but I think I made the right choice. I sometimes work with 4K displays, and they are pretty, but after the first couple of minutes I don’t notice the difference anymore.
The main factors for me were black level, colors, average and peak brightness, height adjustment range, VESA compatibility in case I need it later, price obviously, and it absolutely could not be glossy.
32-inch QHD is exactly the same pixel density as 24-inch FHD, because both the QHD resolution and the 32-inch diagonal are exactly 4/3 of the smaller display’s values.
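A quick back-of-the-envelope check confirms it (a rough Python sketch, assuming exact 16:9 panels and ignoring bezels):

    from math import hypot

    # Pixels per inch = diagonal resolution in pixels / diagonal size in inches.
    def ppi(width_px, height_px, diagonal_in):
        return hypot(width_px, height_px) / diagonal_in

    print(round(ppi(1920, 1080, 24), 1))  # 24" FHD -> ~91.8 PPI
    print(round(ppi(2560, 1440, 32), 1))  # 32" QHD -> ~91.8 PPI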
I decided to get myself a 49″ 4K TV as a PC display. Not sure I’ll like it, but the nominal parameters are good – IPS, 120 Hz, VRR, etc. All in all it is like having four 24″ 1080p displays arranged in a 2×2 grid; the DPI is roughly the same.
Fuck off. My screen has a pixel density of 96 but is 40″. It doesn’t get better than that currently.
you forgot to add “you wanker”.
fretinator,
British terms of endearment are funny.
I still think bitmap fonts in text mode on a decent CRT look great. Trying to force displays to render arbitrary shapes is always going to look bad until you have at least 300 PPI (laser printers are typically 600 DPI).
I use a 27″ 2560×1440 display (Dell UP2716D) with no scaling (except I zoom some websites that use small fonts). That’s only 108 PPI, but I’ll wait until OLED is widespread before considering an upgrade. I’m not really impressed with the lame “gaming” monitors recommended in the article.
Sounds like the spoiled rich kid who was “hired” because he’s the boss’s son/nephew/whatever. This kind of whining drivel from a normal employee would get you fired. If the font you’re using to program in doesn’t look good on a 1K monitor, choose a font that does. “But I don’t like that font! I want to use my own font!” Suck it up and do your damn job! God! The entitled little shits you have to put up with tend to be 95% of the job. Always has been, always will be.
“So I optimize my setup to showing really, really good letters. A good monitor is essential for that. Not nice to have. A MUST.”
I wonder if these kinds of posts could carry an income-bracket tag, to warn off those of us who have to take what we’re given, or can’t even drive our existing 100 dpi monitor at its native resolution because the work laptop is too dated, and yet mysteriously still manage to edit photos, game, do CAD and (gasp!) write text.
Having said that, thanks to the author for a neat tip about setting macOS back to 2x scaling. That has helped my girlfriend, who has eye strain from all this lockdown screen time (when this is over, is anyone ever going to want to use Zoom again?).
M.Onty,
Yeah. 4K 120 Hz monitors won’t have much impact on my quality of life. IMHO office environments have much more of an impact. Too many office workspaces are in cramped, dimly lit, dull, windowless cubicle farms. Under those circumstances, convincing an employer to invest in a new monitor would feel like a shallow victory for me personally. In other words, I would not be jealous of someone with a better monitor, but I would be jealous of someone with a better view. Most likely, though, the person with the better view also has a better monitor, just because. Haha.
To be fair, FHD (i.e. 1920×1080) monitors seriously lack the resolution to do productive things with them. Too wide to display a website, document or IDE full screen, too narrow to display it in a vertically split half.
QHD (i.e. 2560×1440) monitors are not really expensive (€150–€200 will already get you a very nice model) and add a third to each dimension. Suddenly those halves are 1280 px wide instead of 960 px, which finally makes working in half of the screen doable. Suddenly the vertical resolution is 1440 px instead of 1080 px, so that’s an extra 360 px of just more lines of text! Isn’t that great?
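Here is the same arithmetic as a tiny Python sketch (the 20 px line height is an arbitrary assumption, just to make the comparison concrete; window chrome is ignored):

    # Usable half-screen width and rough visible line count, FHD vs QHD.
    LINE_HEIGHT_PX = 20  # hypothetical editor line height

    for name, (w, h) in {"FHD": (1920, 1080), "QHD": (2560, 1440)}.items():
        print(f"{name}: half width {w // 2} px, ~{h // LINE_HEIGHT_PX} lines of text")
    # FHD: half width 960 px, ~54 lines of text
    # QHD: half width 1280 px, ~72 lines of text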
Sure, the author of the referenced blog post sounds a bit like a spoiled brat, but saying FHD monitors are good enough is really outrageous.
Funny, I (and many other millions of people) were quite productive with 640×480 (and sometimes smaller!) monitors. I was productive on a 320×192 screen for years. I did my greatest amount of programming on an 800×600 display. I’ve used a 1920×1080 display for the last decade, and I’d like to think I was still productive.
Increasing the resolution of a display often doesn’t add any more lines to your editor – instead, it goes into making the current number of lines look better. Many people simply cannot read fonts sized down to the native resolution of these new super-HD monitors. Hell, the entire point of the article linked above was that supposedly, getting an ultra-high-res display and using it to make the fonts look super smooth will make you an awesome programmer capable of things you couldn’t do before with a silly “low definition” monitor. Because not having smooth fonts is what keeps programmers from doing their best. /s
Gargyle,
That’s a bit exaggerated. You realize it’s quite subjective right? Many of us are still using older monitors productively. Like JLF65 said more resolution doesn’t strictly equate to better productivity. Beyond a certain point those pixels are not used to increase screen real-estate since there’s no value in making text/UI too small. When characters are clearly legible, adding more detail becomes less important.
480P -> 720P very significant
720P -> 1080P very significant
1080P -> 1440P less significant
1440P -> 2160P less significant
2160P -> 4320P hardly significant
I wouldn’t say no to more pixels, but for me the text and UI are already small enough for my viewing distance, so the only value of additional pixels is to increase the detail on existing lines. Yes, it provides more detail, but nothing dramatic, and personally I have to look very closely to see the pixel array on 1920×1080. 1440p is better, but IMHO it’s already reaching diminishing returns.
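That intuition about diminishing returns can be put into rough numbers. Here is a hedged Python sketch that estimates pixels per degree of visual angle at an assumed 24″ (~60 cm) viewing distance; 20/20 acuity is usually quoted as roughly one arcminute, i.e. about 60 pixels per degree, so gains well beyond that are hard to see:

    from math import atan2, degrees, hypot

    # Angular resolution: pixels per degree of visual angle at a given distance.
    # Assumes a flat panel viewed head-on; only a rough estimate.
    def pixels_per_degree(width_px, height_px, diagonal_in, distance_in=24):
        ppi = hypot(width_px, height_px) / diagonal_in
        pixel_in = 1 / ppi                # physical size of one pixel in inches
        return 1 / degrees(2 * atan2(pixel_in / 2, distance_in))

    for label, args in {'24" 1080p': (1920, 1080, 24),
                        '27" 1440p': (2560, 1440, 27),
                        '27" 4K':    (3840, 2160, 27)}.items():
        print(label, round(pixels_per_degree(*args)))
    # roughly 38, 46 and 68 pixels per degree respectively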
High DPI is nice but not nearly the top of _my_ priority list:
1. More screen real estate – higher resolution + larger size at ~constant DPI
2. Faster refresh rates, faster response times
3. Better color reproduction, viewing angles, contrast, bits per pixel/HDR
4. Higher DPI
I’ve upgraded my home set up to a 32″ 4k 60Hz monitor (~140dpi). Point 1 is now sorted and it made a huge improvement to my workflow and comfort, but getting 2 and 3 as well would currently cost too much. As for point 4 – let’s talk about it when 40″ 8k@144Hz monitors become available and don’t cost much more than 4k equivalents.
That’s me. Someone working in DTP will value 3&4 more than 2. Someone playing games will likely need only 2&3.
Who writes these articles? Best Buy employees on commission?
If your monitor works, keep it. You’re gaining pretty much nothing other than emptying your bank account by playing these kinds of silly upgrade games.
If you want a new monitor, buy a TV that doubles as a monitor. That way you gain the best of both worlds.
See my comment below on why “buying a TV that doubles as a monitor” can be a really bad idea unless you do some serious research.
If I may add a couple of things:
1. Buying 4K TVs as large PC monitors is a dangerous game, despite all the sweet lies about “convergence” supposedly enabled by HDMI. You have to make sure the TV has a true “PC mode” with true 1:1 pixel mapping (no “subsampling”), which if it exists is usually enabled by labelling the HDMI input as “PC”, that you weren’t conned with an RGBW pixel format (or any non-RGB pixel format for that matter), and that the panel can display 6500K colour temperature in PC mode. Also please note some TVs with IPS panels have a pronounced chevron pattern in their subpixels instead of the traditional 3 vertical stripes, which makes ClearType look slightly different even in true PC mode (I personally find it cute and inoffensive, your mileage may vary) and that all 4K VA panels have bad horizontal viewing angles.
2. Make sure your RGB levels match. That is, “full range” for RGB 0-255 displays (PC monitors), “limited range” for RGB 16-235 displays (TV monitors). Sometimes the drivers don’t get this right. If it’s wrong and you get “limited range” output on an RGB 0-255 display, you will get washed-out blacks instead of inky blacks.
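A minimal sketch of the mismatch (assuming the standard 16-235 “video levels” mapping) shows why the blacks wash out:

    # "Limited range" squeezes full-range RGB 0-255 into 16-235 (video levels).
    def full_to_limited(v):
        return round(16 + v * (235 - 16) / 255)

    # If the GPU outputs limited range but the monitor expects full range,
    # it displays the values literally: black becomes dark grey, white goes dim.
    print(full_to_limited(0), full_to_limited(255))  # -> 16 235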
What the hell am I doing with my monitor that I should actually care about any of this?
The above won’t make one bit of difference in most people’s lives.
For me the single most important factor is the format, and the choice is easy as there is only one contender: EIZO EV2730Q.
Or you could just use a 16 by 9 monitor that’s large enough and has a high enough resolution, and treat it as two (almost) squares glued together.
Nope, it would cause a constant stress as I would not be able to decide to keep it in portrait or landscape orientation. 🙂
I spent years programming my Commodore 64 using 320×200 pixels for the whole screen in 40×25 text mode on a small colour TV which got a really garish RF signal to work with. Did I even mention it defaulted to light blue text in ALL CAPS on a dark blue background? Now get off my lawn and go write some code. I’ll take the 14″ foldout with its half broken VGA in the server cabinet, no problem. 80×25 still feels wide after all these years.
BvdW,
Occasionally I do go back to 80×25 text modes for some servers that don’t use graphics modes…and speaking for myself, it is painfully lacking in information density. The characters are so huge, 80 characters per line is too few, and 25 lines is just too short and I find it necessary to pipe everything through “less” for a scrollback buffer. I remember 320×200 mode too. 8bit color was nice, but the resolution was terrible even back in the day. Anyone remember when a 320×200 program aborted and you’d be bumped into the DOS prompt in this mode? Fun times, haha.
On a 14″ screen, standard VGA text mode isn’t huge. It’ll work just fine if you write your code in a style that doesn’t favour huge lines. The 8-bit colour mode was a graphics mode, not intended for editing text, but very much like what even earlier systems offered as standard, resolution-wise. I wouldn’t recommend 80×25 on a 50″ TV, though… I recently booted a Raspberry Pi into FreeBSD using the living-room TV as an improvised monitor. That was some big text, readable from well across the room. Somehow I don’t think family programming will take off as a trend, though.
BvdW,
Well, I was around programming in DOS back in the day and I’ll be honest, I found 80×25 text mode sub-optimal even on a small screen, but I realize that’s where the technology was at the time. It was obviously workable, just not ideal; IMHO 800×600 and 1024×768 were significantly better. They are not good enough today, but they were a huge improvement at the time.
It doesn’t help that vendors are pushing consumers towards ever more restricted devices that have more hurdles to programming them. In the 80s it was normal and expected that users would program the computers, they came with development tools in the OS and even in the BIOS. It was even normal to package detailed device schematics along with the hardware. The DIY nature of the industry has seriously regressed since then and many vendors actively impede owners 🙁
Filthy rich Europeans don’t know what to do with the ton of money they’ve got.
1st world people problems.
As if a new monitor will make you magically more productive.
Damn, double facepalm.
I like high refresh rates for games. I think it makes a nice desktop response too. But I would rate pixel density as much more important.
My favorite monitors and laptops are all 4K displays. Except for this one I’m using right now which is a 30″ HP ZR30w which is 2560 x 1600 at 60 Hz.
The only high-refresh-rate one is a 28″ 4K G-Sync HDR display that I run at 98 Hz so it can do full 10-bit color. It is in a kind of in-between area where 100% scaling is too small and 200% is too big, so I run it at 150% in Windows. Text seems to scale well and looks great. I don’t really notice the extra refresh rate, although it is sort of there when moving windows around or scrolling fast.
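That 98 Hz ceiling is presumably a link-bandwidth thing. A rough sketch, ignoring blanking overhead and assuming DisplayPort 1.4 without DSC (about 25.9 Gbit/s usable), shows why 4K at 10-bit tops out below 120 Hz:

    # Uncompressed video bandwidth = width * height * refresh * bits per pixel.
    DP_1_4_USABLE_GBPS = 25.92  # HBR3 x 4 lanes after 8b/10b encoding, no DSC

    def gbps(width, height, hz, bits_per_channel):
        return width * height * hz * bits_per_channel * 3 / 1e9

    for hz, bpc in [(98, 10), (120, 8), (120, 10)]:
        need = gbps(3840, 2160, hz, bpc)
        fits = "fits" if need < DP_1_4_USABLE_GBPS else "does not fit"
        print(f"4K @ {hz} Hz, {bpc}-bit: ~{need:.1f} Gbit/s -> {fits}")
    # 4K @ 98 Hz, 10-bit:  ~24.4 Gbit/s -> fits
    # 4K @ 120 Hz, 8-bit:  ~23.9 Gbit/s -> fits
    # 4K @ 120 Hz, 10-bit: ~29.9 Gbit/s -> does not fit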
Long before this article I bought a trio of Dell 27″ IPS 1440P monitors. Love them for text editing and the little gaming I do. Refresh rate is probably rubbish for the die hard gamers but I have never felt I needed more.
Also, years ago, I got a pair of 4Ks for work. The cheapest ones I could find on Black Friday: Acer 27″ TN displays. Absolutely crap color reproduction, and also 60 Hz refresh. Text editing is ‘niiiice’ in my opinion. I don’t need color accuracy for my job, so these are great.
I don’t know what a higher refresh rate will do for text editing. I have never worked with a monitor that has higher than 60Hz. Maybe there is a benefit but I find it a hard argument to accept since most text editing is dealing with a fairly static image most of the time. I would even say 30 Hz wouldn’t really make much of a difference for that task. All I can say is that resolution makes a difference in text rendering crispness. I got tired of looking at pixelated text and wanted something closer to print quality or, hell, modern cell phone quality with their higher DPI.
One warning though. I don’t know about Mac or Linux, but on Windows, if you mix UI scaling factors across monitors, you will have a poor experience. I would suggest every monitor be at the same UI scaling; otherwise the display can get weird. E.g., my work laptop works great with JUST the two 4Ks. If I try to incorporate the laptop display (1080p) into the set, the 4K monitors’ UI rendering becomes ‘upsampled’ and you lose your resolution benefit. Could be a video card issue, though; I’m not a fan of the HP laptops the company provides. With my Microsoft Surface, I connect to a 1080p TV (the Surface screen is higher resolution); scaling works for the most part, but the Yahoo Mail website seems to have scaling fits on the TV and not on the Surface screen. In my opinion, Windows and the video card vendors haven’t managed to do mixed DPI properly yet. Also, some software hasn’t been updated to work well with different UI scalings; some UIs really get messed up.
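On that last point, part of the blame is on the applications themselves: on Windows 10 a program has to opt in to per-monitor DPI awareness, otherwise the OS just bitmap-stretches its window when it lands on a monitor with a different scale factor, which is exactly the blurry “upsampled” look. A minimal sketch of that opt-in from Python (the API and the -4 context value come from winuser.h; it only exists on Windows 10 1703 or later, so treat this as an assumption-laden illustration rather than production code):

    import ctypes

    # Per-Monitor v2 DPI awareness context from winuser.h:
    # DPI_AWARENESS_CONTEXT_PER_MONITOR_AWARE_V2 is the pseudo-handle -4.
    PER_MONITOR_AWARE_V2 = ctypes.c_void_p(-4)

    # Without this call a top-level window gets bitmap-scaled (blurry) when it
    # moves to a monitor with a different scaling factor.
    ok = ctypes.windll.user32.SetProcessDpiAwarenessContext(PER_MONITOR_AWARE_V2)
    print("per-monitor DPI awareness enabled:", bool(ok))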