Are Roses Red, And Violets Blue? Depends On Your Species
Over the millennia, animal eyes have evolved along different paths, adding or subtracting capabilities as they adapt to specific niches in the world. The result of all that evolution is that a bee, bird, or bull doesn’t see the world the same way you do. There are differences in the spatial resolution different animals can see, in the speed of their visual response, in the depth of focus, and in the way they process color.
Dogs, for instance, can’t really see red—their vision is best at seeing things that are blue or yellow. Birds and bees can see into the ultraviolet part of the spectrum, making a flower look quite different from the way humans perceive it.
This week, researchers published details of a video camera system that tries to help make sense of the way different animals view color. By combining different cameras, various filters, and a good dose of computer processing, they can simulate what a given video clip might look like to a specific animal species. It’s work that’s of interest to both biologists and filmmakers. Dr. Daniel Hanley, one of the researchers on the project and an assistant professor of biology at George Mason University, joins guest host Arielle Duhaime-Ross to describe the system and its capabilities.
A person applying UV-blocking sunscreen as seen through the eyes of a honeybee. Sunscreen looks white to the human eye because it reflects broadly over the range of wavelengths visible to us. In this depiction of a honeybee’s point of view, the sunscreen appears yellow, showing how the bee can see that sunscreen absorbs UV light. Credit: Vasas et al., 2024
A rainbow as seen through the eyes of a (A) mouse, (B) honeybee, (C) bird, and (D) human. Credit: Vasas et al., 2024
Dr. Daniel Hanley is an assistant professor of Biology at George Mason University in Fairfax, Virginia.
ARIELLE DUHAIME-ROSS: Earlier, we heard from science writer Ed Yong about the inner lives of animals and their senses. But what if you could experience some of that yourself?
For the rest of the hour, we’re trying to take a peek, literally, into how animals see the world.
Over the millennia, animal eyes have evolved along different paths, adding or subtracting capabilities to adapt to a specific niche in the world. And the result is a bee doesn’t see the world the same way you do. There are differences in the spatial resolution different animals can see, in the speed of their visual response, in the depth of focus, and of course, in the way different animals process color.
For instance, I learned that my dogs can’t really see red. Their vision is best at seeing things that are blue or yellow. Hot tip– many dogs are big fans of the TV show Bluey because the show’s use of blues pops out at them.
This week, researchers published details of a video camera system that tries to help make sense of the way different animals view color. By combining different cameras, various filters, and a good dose of computer processing, they can simulate what a given video clip might look like to a specific animal species.
It’s work that is of interest to both biologists and to filmmakers. Joining me now to talk about this is one of the researchers on the project. Dr. Daniel Hanley is an assistant professor of biology at George Mason University, in Fairfax, Virginia.
Welcome to Science Friday.
DANIEL HANLEY: Thank you, Arielle. Thank you for having me.
ARIELLE DUHAIME-ROSS: And we have some example videos from the project on our website, at sciencefriday.com/animalvision.
So Daniel, how different are animal eyes from one species to another?
DANIEL HANLEY: There are lots of different ways of constructing eyes. We focus mostly on how those different eyes can receive color information. And our paper looks at many different types of eyes. These are what are termed dichromatic, meaning that they have two different kinds of photoreceptors; trichromatic, like us, meaning organisms with three different types of photoreceptors; and even tetrachromatic– birds actually have an extra photoreceptor, so they have four different photoreceptors.
And even if there are commonalities– so even if we happen to be trichromats and a bee is also a trichromat– there are lots of differences between organisms.
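The idea Hanley describes– the same spectrum of light stimulating different sets of photoreceptors in different species– can be illustrated with the standard “quantum catch” calculation from visual ecology: project a reflectance spectrum onto each receptor’s sensitivity curve. The sketch below is illustrative only; the Gaussian curves, peak wavelengths, and bandwidths are rough textbook-style approximations, not the values or method used in the paper.

```python
import math

def gaussian(peak, width):
    """A simple Gaussian model of one photoreceptor's sensitivity curve."""
    return lambda wl: math.exp(-((wl - peak) ** 2) / (2 * width ** 2))

# Illustrative peak sensitivities in nanometers (approximate, for demonstration):
# humans and honeybees are both trichromats, but the bee's set is UV-shifted;
# birds are tetrachromats with an extra short-wavelength receptor.
RECEPTORS = {
    "human":    [gaussian(p, 40) for p in (420, 534, 564)],
    "honeybee": [gaussian(p, 40) for p in (344, 436, 544)],
    "bird":     [gaussian(p, 40) for p in (370, 445, 508, 565)],
}

def quantum_catch(reflectance, sensitivities, wavelengths=range(300, 701, 10)):
    """Discrete integral of reflectance x sensitivity for each receptor."""
    return [sum(reflectance(wl) * s(wl) for wl in wavelengths)
            for s in sensitivities]

# A hypothetical surface that reflects strongly in the UV (below 400 nm)
# but only weakly at longer wavelengths.
uv_flower = lambda wl: 1.0 if wl < 400 else 0.1

human_view = quantum_catch(uv_flower, RECEPTORS["human"])
bee_view = quantum_catch(uv_flower, RECEPTORS["honeybee"])
```

The same spectrum yields a strong signal in the bee’s UV receptor but only a weak one in the human’s shortest-wavelength receptor, which is the kind of species-by-species projection the camera system automates for moving images.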
ARIELLE DUHAIME-ROSS: Why all of these differences? Why isn’t an eye just an eye?
DANIEL HANLEY: Well, there are interesting evolutionary histories in the evolution of those photoreceptors that allowed organisms, for example, to detect food items and whether they were ripe or fresh. If you look at our own green photoreceptor, for example, there’s an interesting evolutionary history there. It’s much more similar in its sensitivity to our red photoreceptor. And other organisms that have green photoreceptors see different kinds of green.
And so it really all comes down to what that organism needed to detect. If you look at some other objects, like flowers for example, they’re very beautiful to our eye, but they’re not really intended for us to be looking at.
ARIELLE DUHAIME-ROSS: Meaning that species who really, really rely on flowers, like bees for instance, they’ll see flowers differently?
DANIEL HANLEY: Precisely.
ARIELLE DUHAIME-ROSS: Can you tell me about this camera system? What does it let you do that’s new?
DANIEL HANLEY: Yeah, for sure. The camera system is actually built around two different cameras. And the reason that we did this was because we didn’t have good tools at our disposal that would allow us to capture animal-perceived colors in motion.
There were lots of tools that we had– for example, spectrophotometers and other multispectral cameras– that provide really accurate and high-quality data that we can use for understanding colors in general. But what they didn’t allow us to do was to see the same kind of display or activity or colors if they were moving.
For example, if a bird was dancing or showing off its feathers in a certain way, you would have to switch out your lens or switch out a filter and you would actually miss much of that behavior, miss the spatial components, the temporal components. And of course, by the time you switched your lens, either the bird or the animal would be gone or that behavior– half of the behavior– would be missing.
ARIELLE DUHAIME-ROSS: So is this the kind of tool that could be useful in a documentary about nature? Am I going to end up seeing what a lion sees, for instance, when it’s hunting one day?
DANIEL HANLEY: Absolutely. I would love to see what the prey of the lion is seeing. And so that was part of the intent, yeah. When we made this camera, we really wanted to bridge the gap between science and science communication. It’s a National Geographic-funded project.
And that was one of the main goals– that the storyteller, somebody who can go to the field, capture these beautiful images, and captivate audiences about nature, can actually go and record primary data. And that data would actually be, if collected properly, useful for a scientist.
ARIELLE DUHAIME-ROSS: And is this camera system tuned for a specific animal or is this a general purpose system where you have this library that you can just plug in a specific species and all of a sudden you see what they would see?
DANIEL HANLEY: Yeah, it’s definitely more of the latter. It’s a general purpose system. I love the way you phrased that. As long as you know a little bit about the organism’s vision, you can project into their visual space. And we’ve preloaded it with a few different organisms that the user could choose from. But many users might actually want to optimize it for their own species of interest.
ARIELLE DUHAIME-ROSS: Your system can go beyond the colors that people see, right? There are things that we as humans don’t see. You mentioned bees and bird vision, but what other animals make use of those extra colors?
DANIEL HANLEY: Right. Our camera sees into the ultraviolet. Many other organisms– many insects, birds, lizards, many mammals– have access to visual information that’s outside the range of what is detectable to humans. One that’s quite interesting is caribou and reindeer. And so you can imagine the utility of having the ability to peek into the ultraviolet range if you’re a caribou that lives in a habitat dominated by short shrubbery, snow that reflects all light, and polar bears.
Like our hair, polar bears’ hair absorbs ultraviolet radiation. And so having the ability to peek into the ultraviolet will allow them to see their predator against an otherwise white backdrop. So that’s our focus. But somebody else might modify this design.
So one of the things that we really liked about our project– or that we tried to do for it– is make it very modular. So if you were more interested in longer wave bands rather than shorter wave bands, you could focus on that with an infrared camera. You could probably modify this, to some extent, if you’re interested in polarized light– it’s not a polarized light camera– but some people are interested in other domains, other types of visual information.
ARIELLE DUHAIME-ROSS: I feel like you’ve touched on this a little bit, but I still want to ask you. Why is this camera system so important?
DANIEL HANLEY: So many of the signals and behaviors that we learn about in evolutionary biology and ecology have some element of motion in them. Imagine a male bird that’s dancing and showing off his feathers and trying to impress a female. Think of a bird of paradise, for example. That bird of paradise is using its space, it’s showing off its feathers in its best light. And some individuals are just better at doing this than others.
And so to really understand that signal, we have to capture it from the eye of the female. Many times, that female might be impressed by that male that’s displaying in this way because he’s managed to survive. That means that if she mates with him, her young will also have these beautiful feathers and have whatever agility, strength, ingenuity, because he’s avoided predation despite these conspicuous and elaborate colors.
Baked into that question, then, is that the same display, that same motion, that same activity, has to be seen from the eyes of two individuals that might see differently– the female he’s trying to impress and the predator he’s trying to avoid.
And we didn’t have a tool that allowed us to capture and measure that same activity, that same display, from two eyes.
ARIELLE DUHAIME-ROSS: Right. I mean, we look at a mating display and we think we understand what is attractive about it. But really, we’re missing so much information or taking in the wrong information. And your system really just puts that on display. What would you want to do to extend this work to push it even further?
DANIEL HANLEY: So there are lots of different potential directions. Right now, we’re interested in capturing as many of these visual displays in motion as we can. And one of the reasons that we want to do that is because I think that we’ve missed a lot because we didn’t have the tools necessary to ask the right questions.
And so lead author Vera Vasas and I are both really excited by the potential that this has to instill curiosity and genuine inquiry, which really forms the basis of the scientific method. A tool like this can really excite people to explore what’s in their own backyards.
And so much of the 19th and 20th centuries, and even the 21st, has been spent testing hypotheses that develop from these primary observations and discoveries. We think that there could be a lot out there yet to be uncovered.
ARIELLE DUHAIME-ROSS: Well, that sure sounds exciting. Daniel, thank you so much for your time. Thank you for coming on the show.
DANIEL HANLEY: Thank you so much for having me. It’s been a pleasure.
ARIELLE DUHAIME-ROSS: Dr. Daniel Hanley is an assistant professor of biology at George Mason University, in Fairfax, Virginia.
We have some example videos from the project on our website, at sciencefriday.com/animalvision.
Arielle Duhaime-Ross is science reporter for The Verge in New York, New York.