Am I the only person for whom that many pixels in a 28" display sounds like overkill?
Maybe I'm just getting old, but the resolution on a cinema display is more than sufficient to make things unreasonably small. And getting closer to my desktop screen isn't really appealing either (again, maybe I'm getting old, but to see the pixels on my current screen I have to get my nose almost right up to the display, which I'm never going to do).
Retina displays on mobile devices are more understandable because you tend to be closer to them, or because it's useful to render really tiny text on a small screen, sometimes. But on a desktop display? Seems like the display version of clock-speed fetish.
> again, maybe I'm getting old, but to see the pixels on my current screen I have to get my nose almost right up to the display, which I'm never going to do
The point of going to higher and higher resolutions is that eventually you don't see the pixels. Pixels are an implementation detail; what you really want to see is the image they represent.
And for anyone who struggles to see the pixels on a typical desktop monitor, that's probably because a lot of tricks are used to disguise them. Try turning off anti-aliasing or sub-pixel font rendering and then tell me you can't see the pixels.
So the point of more pixels is that you can't see the pixels, and to prove this point you tell people to turn off other (cheaper) technologies that already hide the pixels.
There seems to be some faulty logic here. What is the point of higher-resolution displays if the pixels are already hidden by other technologies?
Gee, thanks. All these years, and I've been looking at the pixels and not noticing the image! You've changed my life!
ahem...
Condescending explanations aside, you know that there's a limit to the human eye's ability to resolve detail, right? We can resolve up to about 150ppi at 2ft. The Apple Cinema Display is at 109ppi. There's room for improvement, but not 60% more...
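A back-of-the-envelope check of that 150ppi figure, assuming the common rule of thumb that 20/20 vision resolves about 1 arcminute of angle (a heuristic, not a hard physiological limit):

```python
import math

def ppi_limit(distance_in, acuity_arcmin=1.0):
    """Finest pixel density (ppi) a viewer can resolve at a given
    distance in inches, given an angular acuity in arcminutes."""
    pixel_pitch_in = distance_in * math.tan(math.radians(acuity_arcmin / 60))
    return 1 / pixel_pitch_in

print(round(ppi_limit(24)))  # at 2 ft: ~143 ppi, close to the 150 quoted above
```

By this estimate, 109ppi at two feet is indeed within sight of the limit, though as later comments note, the 1-arcminute rule understates what the eye-plus-brain system can pick up on high-contrast edges.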
Yes, but when you're actually working on a hi-PPI display, you can then lean in to view more detail, rather than zooming in.
Much like we inspect things in the real world.
As a photographer, this means a great deal. I can verify the sharpness of an image (a key component in deciding whether to keep it or chuck it) at a glance. Saves a lot of time.
Yeah, maybe...but there's still a practical limit. In order to see the pixels on a thunderbolt display (again, 109ppi) I have to get my face about 7 inches away from the screen.
Closer than about 5 inches, and I lose the ability to focus because the screen is too close -- so there's a band of about 2 inches where I can gain from a higher pixel density than 109ppi, without losing due to eyestrain. And in any case, I'm not going to spend much time in that zone. It's hard to work with your nose in the screen.
YMMV, but I think I'm fairly typical. Most people dramatically overestimate the precision of their eyes.
You can see image degradation from pixelation long before you can make out individual pixels. I can't really make out individual pixels on my MBA (130 ppi) at one foot, but looking at a MBP Retina at the same distance looks dramatically better. On the MBA, the fuzziness from the heavy anti-aliasing used to hide the pixelation is quite apparent, but on the MBP Retina pixels look like sharp-edged solid shapes.
I don't know whether it's due solely to the resolution, but I was pretty shocked to realise I can tell the difference between 300 and 600 dpi photographic prints (assuming there's enough detail in the image to do so, you need to print a 23 MP DSLR shot with high detail at a 7x10" print size to get there).
I have had other photographers tell me that they don't see any benefit to retina screens at all... you're giving me the reason why here. :)
109ppi is good enough, but there is a significant difference. It's true that beyond a certain point it doesn't make a difference (1080p phones, I'm looking at you!) but 109ppi is not that point, for most people. Retina web content and applications look decisively better.
Bear in mind that higher resolution display != smaller UI elements. It gives you that option of course, but for most people the benefit is that fonts, icons and images look so much crisper and more readable.
You're definitely not the only one who thinks it's overkill. I've shown my retina iPad and MacBook to a few people who don't get the fuss at all. I guess this reaction could be due to a) not caring much about aesthetics, b) bad eyesight, c) not spending the time to get familiar with it and use it for some actual tasks. Myself, I think the screens are amazing, and each time I go back to my 23" 1920x1080 monitor is a little painful.
I've shown my retina iPad and MacBook to a few people who don't get the fuss at all...Myself, I think the screens are amazing
Over a decade ago, I bought one of those Sharper Image ion air purifiers. I told all my friends how great it was - totally silent but really cleaned the air and left it with a fresh scent.
About a year later I started reading scientific assessments and reviews of it. It in fact did nothing to clean the air and the "fresh clean scent" it put out was potentially harmful ozone.
It was then I realized that I was as able to fall for marketing hype as much as anyone else. That made me much more wary.
I know what you mean, that it's easy to fool yourself and fall into marketing traps. But in this case of monitor resolution, there is a very noticeable difference (to me at least) between current desktop monitors and retina ones. If in 5 years some company tries to push the trend even further with 8K extreme retina or something like that I'm sure I'll be on your side, but for now, please, bring on the 4K desktop displays!
(Actually, when it comes to TVs I do think we're getting into overkill territory with 50" 4K displays, when for typical TV viewing distances 1080p is fine)
With regard to TV, one of the big effects of HD was that TVs got larger but viewing distance remained constant. With 4K, at standard TV-viewing distances, it doesn't really make much difference below 100 inches or so. On the other hand, I want one on my desk.
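The viewing-distance point can be sketched with the same 1-arcminute acuity rule of thumb; the 9 ft couch distance here is an illustrative assumption:

```python
import math

def ppi_limit(distance_in, acuity_arcmin=1.0):
    # Finest resolvable pixel density at a given distance (1-arcmin rule)
    return 1 / (distance_in * math.tan(math.radians(acuity_arcmin / 60)))

def panel_ppi(diagonal_in, h_px, v_px):
    # Pixel density of a panel from its diagonal size and native resolution
    return math.hypot(h_px, v_px) / diagonal_in

couch_in = 9 * 12  # assumed 9 ft viewing distance
print(round(ppi_limit(couch_in)))           # ~32 ppi resolvable at 9 ft
print(round(panel_ppi(50, 1920, 1080)))     # 50" 1080p panel: ~44 ppi
print(round(panel_ppi(50, 3840, 2160)))     # 50" 4K panel: ~88 ppi
```

On these numbers, a 50" 1080p set already exceeds what you can resolve from the couch, which is why 4K at TV distances looks like overkill while the same density on a desk (at 2 ft) does not.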
The air purifiers were misleading. Better resolution is not. That said, applications/OS should be able to use the resolution. UI elements should become crisper not smaller.
Completely agree, we should remember that resolution is one thing and definition another, although they are clearly related.
I think that anybody should be able to tell the difference between Retina and a non-Retina device side by side. At home I have a set-up with a Macbook Retina and an external 1080p monitor as you do, and the difference is quite noticeable when I visually switch between the two.
I agree that there is a physical limit to improving the definition of monitors, but I don't think we have quite reached it yet.
>> Maybe I'm just getting old, but the resolution on a cinema display is more than sufficient to make things unreasonably small.
You're either getting old, sit very far away, or have bad vision - the Cinema display has a laughably low ppi for very large UI elements, and the pixels are incredibly visible.
In theory stuff should be the same size but a lot sharper. That being said, I don't really feel a lot of difference when looking at my 24" monitor and my 5-inch phone, which has almost the same resolution.
For me a huge 40-inch 4K monitor would be more useful, as the DPI would be roughly the same as my setup now (2x 1920x1200) but with more vertical space and no bezels.
This[1] one is 500 US dollars and is 120. I don't have access to non-U.S. sites here at work, apparently, so I can't be certain whether they have it over there or not.
For coding, you might not notice it as much. I run my 27" monitor at about 30 Hz, and it seems Just Fine. YouTube looks okay when not full screen, but frankly I use it for editing code, and I happily trade the refresh rate for the increased resolution.
No way. Reading text on a desktop display sucks donkey balls compared to my GS4. You can't get nice crisp fonts on PCs like you can on phones these days, and it's horrible. I've been saving articles from my desktop to read on my phone ever since I got one with a hi-res display. Now I won't have to.
I think there are two really great things about retina screens on desktops/laptops:
- Firstly, I definitely notice the difference. The higher res screen is noticeably nicer to read. Sure, you don't need to go to the crazy extremes of some newer high res phones, but standard desktop monitors are noticeably ugly by comparison.
- Secondly, the res is high enough that you can finally change (apparent) resolution. My parents are in their late 60s and have bad-ish eyesight, so they lower the resolution on their screen. The result is incredibly ugly. Unfortunately resolution independent display seems to be a forgotten goal in modern OSes, so having screens that can change res is definitely worthwhile.
A human eye with 20/20 vision can distinguish ~150ppi at 2 feet (a typical screen-to-eyeball distance). Beyond that, it's overkill. The Apple Thunderbolt Display is at 109ppi, so there's a bit of room for improvement, but not 60% improvement.
Beyond this limit, I suppose there are people with 20/1 vision or something, but they're pretty rare.
That is, you can distinguish lines that are about 1/150 inch apart.
But for a display, you want to be able to have diagonal (or curved) lines at 1/150 inch apart, with pixellation artifacts that are small relative to those lines.
That probably translates to something like 450ppi (line width of 3px, so the artifacts should probably be about 1/3 of the line size).
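The arithmetic behind that figure, as a quick sketch (the 3-pixels-per-line oversampling factor is the assumption stated above, not a measured value):

```python
# To render a line at the ~1/150" acuity limit cleanly without
# anti-aliasing, pixelation artifacts should be small relative to
# the line itself -- here, roughly a third of its width.
resolvable_lines_per_in = 150  # acuity limit at ~2 ft
px_per_line = 3                # assumed oversampling factor
print(resolvable_lines_per_in * px_per_line)  # 450 ppi
```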
Explanation: "The eye is not a single frame snapshot camera. It is more like a video stream. The eye moves rapidly in small angular amounts and continually updates the image in one's brain to "paint" the detail. We also have two eyes, and our brains combine the signals to increase the resolution further. We also typically move our eyes around the scene to gather more information. Because of these factors, the eye plus brain assembles a higher resolution image than possible with the number of photoreceptors in the retina."
I've never gotten this argument. I mean, I have totally normal (nearly 20/20, but not quite... thanks to aging) vision, yet I can see the blurred edges of diagonal and rounded images perfectly fine on an iPhone 5s at 2 feet. I think people just aren't really looking. I tried a Thunderbolt Display recently, and it was painfully low resolution. I haven't seen the math recently, but I have a hard time believing that any phone today, let alone any monitor or television, really reaches the resolution of print.
I have my monitors mounted on arms, so they can be anywhere from 1.5 meters away with me leaning back, to almost right in my face. The ability to physically "zoom" the whole monitor adds another dimension to use, and it's one where you really appreciate higher pixel density.
Aliasing is another big deal. Jaggies are very visible until you get into quite high pixel densities.