As Black Friday approaches, you’re probably being inundated with ads for bigger, better televisions. But just how good is good enough? Are there limits to what our eyes can even make out?
Visual perception researcher Maliha Ashraf joins Host Flora Lichtman to describe her new study on display resolution—including a display calculator she and her colleagues developed to help you determine the optimal display characteristics for a given room. And retinal neuroscientist Bryan Jones joins the conversation to delve into the workings of human vision.
Further Reading
- Check out the Display Resolution Calculator from the Computer Laboratory at the University of Cambridge.
Segment Guests
Dr. Maliha Ashraf is a postdoctoral researcher at the University of Cambridge in the UK.
Dr. Bryan W. Jones is a professor of ophthalmology at the University of Utah.
Segment Transcript
FLORA LICHTMAN: I’m Flora Lichtman, and you’re listening to Science Friday.
[AUDIO LOGO]
Today in the show, a close look at the biology of vision. We’ll get the big picture.
MALIHA ASHRAF: Yes, there is an upper limit for an average observer to how many pixels they can resolve. If we add any pixels beyond that, then the human eye is not able to perceive that extra resolution.
FLORA LICHTMAN: As Black Friday approaches, you’re probably being inundated with ads for bigger, better televisions, 4K, 8K, super-duper, mega crystal, QLED, UHD+. But how good is good enough? Are there limits to what our eyes can even make out? And just how similar is my visual experience to yours?
Today we are talking about how we see the world with two experts. Doctor Maliha Ashraf is a postdoc at the University of Cambridge in the UK, and she’s the lead author on a new study about display resolution, published in the journal Nature Communications.
And Doctor Bryan Jones is a retinal neuroscientist and professor of ophthalmology at the University of Utah. Welcome, both of you, to Science Friday.
BRYAN JONES: Thank you.
MALIHA ASHRAF: Thank you so much.
FLORA LICHTMAN: Maliha, your team studied what resolution people can actually see. Is there a point at which more pixels don’t matter?
MALIHA ASHRAF: Yes, so that’s exactly what we measured. In quantitative terms, it’s 94 pixels per degree. So in each degree of your field of vision, you can resolve up to 94 pixels if your stimulus, or the content that you’re seeing, is black and white fine lines.
And in terms of technology, if we translate that, then, yes, there is an upper limit for an average observer to how many pixels they can resolve. If we add any pixels beyond that, then the human eye is not able to perceive that extra resolution. And so it’s computationally a waste.
FLORA LICHTMAN: I’m trying to picture how you did this research, and I’m imagining an eye test from the ophthalmologist’s office on a sliding track, like moving closer and further away and at different angles. How far off base am I with that?
MALIHA ASHRAF: Yeah, so we didn’t use the traditional eye chart that you see in an optician’s office. Instead, we used a normal 4K 27-inch monitor, and we mounted it on a rig which can move back and forth. And the purpose of this was we controlled the viewing distance between the observer and the screen. And this change of viewing distance effectively changes the resolution or how many pixels which are falling on one degree of your retina.
FLORA LICHTMAN: It seems like one of the take-homes here is that the resolution your eye can make out is related to the size of the screen, the distance you are away, and then how good your eyes’ personal resolution is. And so what I gathered was that the farther you are away from the screen, the less resolution matters. Does that sound right?
MALIHA ASHRAF: Yes, that’s correct. So the further you are, the higher the pixel density will appear. Take 4K screens, for example, since you hear the word 4K very often. That simply means there are about 4,000 pixels in a row, across the width of the screen.
Now, that 4K, or 4,000 pixels, can be on a 40-inch screen, or they could be on an 8-inch screen. So the number of pixels is the same, but because the size of the screen is different, that means that how closely packed the pixels are is different. So the pixel density is different.
So for the same number of pixels, you will have different effective resolution depending on the size of the screen. And then similarly, you can watch the screen from a meter or two meters. But the further you are, the less you can resolve the details because more pixels will fall on less area of your retina.
FLORA LICHTMAN: So, listeners, if you want to nerd out on this in advance of your personal entertainment system upgrades, we have this calculator on our website at sciencefriday.com. And in what scenario would you even be able to tell you’re watching an 8K TV versus a 4K TV? Do you need to be sitting like a couple feet away to even make that difference meaningful?
MALIHA ASHRAF: Yes, so again, it depends on the size of your TV, the resolution of the TV, and your viewing distance. So for example, if it was a 40-inch TV and you were watching it from a distance of 1 meter, then you might be able to see a difference between 8K and 4K. So this distance is close enough and this screen size is large enough that the higher resolution might make a difference.
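The geometry Dr. Ashraf describes can be checked with a short calculation. The sketch below is an illustrative approximation (not the published calculator): it assumes a flat 16:9 screen viewed on-axis and compares the result to the 94 pixels-per-degree grayscale limit mentioned above.

```python
import math

def pixels_per_degree(diag_inches, h_pixels, distance_m, aspect=(16, 9)):
    """Angular pixel density at the center of a flat screen viewed on-axis."""
    diag_m = diag_inches * 0.0254                # diagonal, inches -> meters
    w, h = aspect
    width_m = diag_m * w / math.hypot(w, h)      # physical screen width
    pixel_m = width_m / h_pixels                 # width of one pixel
    # Angle subtended by one central pixel, in degrees
    deg_per_pixel = math.degrees(2 * math.atan(pixel_m / (2 * distance_m)))
    return 1 / deg_per_pixel

# The 40-inch TV viewed from 1 meter, as in the example above
ppd_4k = pixels_per_degree(40, 3840, 1.0)  # roughly 76 ppd: below the 94 ppd limit
ppd_8k = pixels_per_degree(40, 7680, 1.0)  # roughly 151 ppd: above it
```

Under these assumptions, the 4K screen at 1 meter sits below the 94 ppd threshold while the 8K screen sits above it, which is why this is one of the setups where the upgrade could be visible.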
FLORA LICHTMAN: But 1 meter is like three feet away. Most people aren’t watching their TV– well, my children are. But besides children, many people are not watching that close.
MALIHA ASHRAF: Yeah. So that’s TV, but you have other applications. Like right now, I’m watching my screen from, I don’t know, 40 centimeters.
FLORA LICHTMAN: Yes, true. Yep.
MALIHA ASHRAF: And if you’re playing games, or if you’re an office worker, there are varying distances which might fall into that range. And also, our research was not focused on TVs. We wanted to get just a general resolution limit in terms of pixels.
And now pixels can be in anything: on your phones, on tablets, on VR/AR headsets. We just use TVs because that’s a bit easier to explain. But for anything which uses pixels, you can apply this resolution calculator and find out if your setup is below or above the human resolution limit.
[MUSIC PLAYING]
FLORA LICHTMAN: We got to take a break, but don’t go away. When we come back, is color just a mass shared illusion?
BRYAN JONES: The biology of vision is actually a lot more complicated than a lot of people understand.
FLORA LICHTMAN: Stay with us.
[AUDIO LOGO]
Bryan, let’s take a step back. Is our ability to make out resolution the same as how well we score on a vision test? How is it related to having 20/20 vision or nearsightedness or farsightedness?
BRYAN JONES: So most of what we’re talking about is something called acuity, our ability to resolve differences in contrast at some defined distance. The standardized test that most of us are familiar with defines normal vision by what people can readily resolve at 20 feet.
And the biology is interesting. So you have light coming in. There’s the cornea, the clear part at the front of your eye. That’s what surgeons operate on when they do LASIK or radial keratotomy, to change the refractive properties so that light falls on the photoreceptors of your retina, at the back of your eye, in the right way.
If you can see things up close and not things at distance, you’re myopic. If you can see things far but not up close, you’re farsighted. And there are ways of correcting that so that you can improve the acuity of your optical system by operating on the cornea.
There’s also the lens. The lens is behind that, behind the pupil. The pupil constricts or dilates depending upon the amount of light that’s available. And pupil size also factors into acuity.
The tighter your pupil, the higher the acuity; the more dilated your pupil, the lower the acuity. So if you go to the doctor and you get drops that dilate your eyes so the doctor can get a good view of the back of the eye, then everything is a little less sharp.
And then the lens also shapes the light falling on the back of your retina as well. And as we get older, our lenses get a little less flexible. The muscles that control the shape of the lens get a little less able to do their job. And so we start having trouble accommodating or changing the acuity of light that falls on the photoreceptors of the backs of our eyes. And then there are other issues, cataracts and things like that, that can also impact that.
FLORA LICHTMAN: Does that mean skip the 8K TV if you’re getting up there?
BRYAN JONES: Yeah, so basically as we get older, the visual performance of our eyes drops. And in terms of televisions, a lot of this argument was, I think, more important a few years ago when screen sizes were smaller.
And it’s like when Apple first came out with it, when Steve Jobs was up there on the stage and he was talking about a retina display, a lot of us in retinal science were like, oh, come on. And then we actually did the calculations, and yeah, at the standard viewing distance of a screen, it really was, quote, retinal resolution. You could not distinguish the pixels at that density.
Televisions have gotten so big now and you’re sitting far enough away that a lot of this argument really doesn’t matter. But where it’s coming back into play now is where we’ve got VR devices.
FLORA LICHTMAN: Because they’re right on your face.
BRYAN JONES: They’re right on our face. And we’ve got these displays very much closer. And resolution for those devices is going to come back into play again, absolutely.
FLORA LICHTMAN: Are some people’s retinas more sensitive than other people’s?
BRYAN JONES: So yes and no. The biology of vision is actually a lot more complicated in some ways than a lot of people understand. Certainly there’s variation in what people perceive in terms of acuity. That’s what we’ve been talking about. That’s for sure.
In terms of retinas, there are retinal conditions which affect how many colors some people can see. So about 8% of males are red-green color blind. So they have trouble distinguishing reds and greens. There are some other rarer forms of colorblindness. There are some genetic women who can distinguish four colors. They’re tetrachromats, whereas most humans are trichromats.
FLORA LICHTMAN: That’s a cool, fun fact.
BRYAN JONES: That is a cool, fun fact. So the genes for the opsins in our cone photoreceptors that determine color are carried on the X chromosome. And you can have mutations in those genes.
And if you’re a male and you have a mutation, you’re out of luck. But if you’re a female, you have a backup copy. So females that have mutations can still see in trichromacy, but some of the mutations result in a shift in sensitivity that allows them to see a slightly wider chromatic range. So they’re tetrachromats. And there are other species of animals that can see far richer worlds than we can in terms of color and actually even acuity.
FLORA LICHTMAN: Really?
BRYAN JONES: Avians, a lot of bird species, have much better acuity. They can see better than we can in terms of resolution. They can see better than we can in terms of speed, the speed at which they can see things. And they can see more colors.
FLORA LICHTMAN: Did I read right that turtles can see more colors than we can?
BRYAN JONES: Turtles can see– so most trichromatic humans see a world with a mix of reds, greens, and blues, so three colors. Turtles see nominally in seven spectral channels, and maybe even more, because they have additional oil droplets at the ends of their photoreceptors that may act as additional spectral filters. So yeah, it’s an interesting biological world out there.
FLORA LICHTMAN: Bryan, you call color a mass shared illusion. Tell me about this.
BRYAN JONES: Yeah, so a lot of people think of retinas as cameras, but the way that they process information is more sophisticated than that. So the retina is like this multilaminar structure.
You’ve got photoreceptors at the back, so like a sensor array at the back. And then in front of that sensor array is a compute array. And that breaks the visual scene down into color and contrast and luminance and vector and velocity, and starts calculating all the primitives, all the mathematical primitives, if you will, that form the algorithms that we see with, and then sends those to the brain.
Some of those primitives are channels of information that just do color. Others calculate edges. Others calculate edge movement over time and give us velocity. And then it sends those data back to different places in our brains for processing.
Color is funny. In general, trichromatic humans tend to agree on what color is, on what shade of red is the same shade of red, on what shade of green or blue is the same shade or brightness or luminance or what have you.
FLORA LICHTMAN: So if I see a blue and my neighbor sees a blue, we probably interpret that blue the same way. In our brain, we see the same color.
BRYAN JONES: Yes. And so the thinking was that humans would have the same ratios of red, green, and blue photoreceptors across all humans, and it turns out we don’t. We have wildly different ratios of red photoreceptors versus green photoreceptors. But we still tend to agree on what the color is.
So neural systems are difference engines. They’re good at saying, this is similar to this, and this is different from this. And in some ways, it doesn’t matter what the stimulus is.
But to get back to your original question, there is no color. Photons, the things that fall in and stimulate our photoreceptors, have a wavelength. But there’s no color associated with that wavelength. The color is a biological percept that we have and we perceive as a difference in color.
And we communicate that. And coming back to the social construct, it’s an agreed-upon social construct of what that color is. But in terms of color in the universe, you could make an argument that there is no color.
FLORA LICHTMAN: Yeah, that’s fascinating because it makes you wonder, what are the other biological percepts that I’m creating in my brain? Maliha, what about for acuity? Do we have the same acuity in color as we have in black and white?
MALIHA ASHRAF: We don’t. So one of the lowest-level tasks that the retina does is divide the content into three main channels, which are achromatic, red-green, and yellow-violet, roughly. And without getting too technical, the achromatic channel is the one which encodes the spatial information of what the structure looks like, so what the outlines are, what the edges are, or the shape and form.
FLORA LICHTMAN: The black and white channel?
MALIHA ASHRAF: We won’t call it black and white, more like achromatic, because it just strips off the color information and only works in terms of luminance differences, or how bright or how dark the different areas of the image are. And then the red-green and yellow-violet are the color-encoding channels, which strip off all the luminance information and just work on the pure color signal.
So most of the heavy lifting of our perception is done by the achromatic channel, but the color channels add information on top of it. So in terms of spatial discrimination, such as visual acuity tasks, the achromatic channel definitely has way more acuity or resolving power, and the color channels carry less detail.

They add bigger, more global information rather than very, very fine detail. So if you had very high detail in pure color, then we would not be able to perceive it as well.
FLORA LICHTMAN: Doctor Maliha Ashraf is a postdoc at the University of Cambridge in the UK, and Doctor Bryan Jones is a professor of ophthalmology at the University of Utah. Thank you both for joining me today.
BRYAN JONES: Thank you so much.
MALIHA ASHRAF: Thank you so much.
FLORA LICHTMAN: And if you want to calculate your perfect screen size, we have a link to Doctor Ashraf’s display calculator on our website at sciencefriday.com/display.
[MUSIC PLAYING]
Today’s episode was produced by Charles Bergquist. I’m Flora Lichtman. Thanks for listening.
Copyright © 2025 Science Friday Initiative. All rights reserved. Science Friday transcripts are produced on a tight deadline by 3Play Media. Fidelity to the original aired/published audio or video file might vary, and text might be updated or amended in the future. For the authoritative record of Science Friday’s programming, please visit the original aired/published recording. For terms of use and more information, visit our policies pages at http://www.sciencefriday.com/about/policies/
Meet the Producers and Host
About Charles Bergquist
As Science Friday’s director and senior producer, Charles Bergquist channels the chaos of a live production studio into something sounding like a radio program. Favorite topics include planetary sciences, chemistry, materials, and shiny things with blinking lights.
About Flora Lichtman
Flora Lichtman is a host of Science Friday. In a previous life, she lived on a research ship where aperitivi were served on the top deck, hoisted there via pulley by the ship’s chef.