Scientists have developed a cutting-edge system, a blend of hardware and software, that converts standard color video into the visual perspective of different animals, offering insights that were previously beyond our grasp.
Until now, researchers have relied on false color imagery to approximate animal vision. This method, while clever and capable of rendering useful results, has significant limitations. It’s labor-intensive and only works with still images under specific lighting conditions. The new system, however, promises to overcome these constraints, introducing a dynamic and versatile approach that could transform our understanding of non-human visual experiences.
A Novel Window into Animal Vision
Humans, with their trichromatic vision, can discern approximately 10 million colors. That's pretty great for a mammal, but not all that impressive compared to a bee. Our eyes, equipped with three types of photoreceptors sensitive to red, green, and blue light, miss out on ultraviolet light and on certain rapid movements that other species detect effortlessly.
Birds, for instance, are tetrachromats: they have an additional photoreceptor type that allows them to see ultraviolet light. Mice, meanwhile, are dichromats. Their eyes are sensitive only to green and ultraviolet light, though the missing third cone doesn't make their color vision any less intriguing. Now we finally have access to how different species see the world. It's pretty wild.
The key to the new technology is a beam splitter, which separates ultraviolet from visible light and directs each to its own camera. Together, the two cameras capture four channels: blue, green, and red, plus ultraviolet. The software then reconstructs each frame according to the specific wavelengths perceived by an animal's photoreceptors.
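To give a rough sense of what that reconstruction step might look like, here is a minimal Python sketch, not the authors' published pipeline. It assumes each photoreceptor's response can be approximated as a weighted sum of the four camera channels; the `to_animal_view` function, the `RECEPTOR_WEIGHTS` table, and every weight value in it are hypothetical placeholders, since real weights would be fitted from measured spectral sensitivities.

```python
import numpy as np

# Hypothetical per-species weight matrices mapping the four camera
# channels (UV, B, G, R) onto that species' photoreceptor responses.
# Real weights would be derived from the overlap between camera and
# receptor spectral sensitivities; these are placeholders only.
RECEPTOR_WEIGHTS = {
    # Honeybee: trichromat with UV, blue, and green receptors.
    "honeybee": np.array([
        [0.9, 0.1, 0.0, 0.0],   # UV receptor
        [0.1, 0.8, 0.1, 0.0],   # blue receptor
        [0.0, 0.1, 0.8, 0.1],   # green receptor
    ]),
    # Mouse: dichromat with UV and green receptors.
    "mouse": np.array([
        [0.9, 0.1, 0.0, 0.0],   # UV (S) receptor
        [0.0, 0.1, 0.8, 0.1],   # green (M) receptor
    ]),
}

def to_animal_view(frame_uvbgr: np.ndarray, species: str) -> np.ndarray:
    """Map a (H, W, 4) UV+BGR frame to per-receptor responses.

    frame_uvbgr: float array in [0, 1], channels ordered UV, B, G, R.
    Returns an (H, W, n_receptors) array of estimated receptor signals.
    """
    weights = RECEPTOR_WEIGHTS[species]          # (n_receptors, 4)
    return np.einsum("hwc,rc->hwr", frame_uvbgr, weights)

# Example: a single random frame "seen" by a honeybee.
frame = np.random.rand(480, 640, 4)
bee_view = to_animal_view(frame, "honeybee")
print(bee_view.shape)  # (480, 640, 3)
```

A linear mapping like this is only the first-order idea; among other things, a real pipeline would also need to register the frames from the two cameras before any such mapping is applied.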
The idea of recording video in ultraviolet is not new. In fact, the first UV video dates back to 1969. However, the technical challenges have limited its application until now.
In their study, researchers at George Mason University and Queen Mary University of London tested their system with honeybees and UV-sensitive birds. They compared the results against spectrophotometry, the gold-standard method for measuring color. The system proved 92 to 99 percent accurate, depending on environmental conditions.
“We’ve long been fascinated by how animals see the world. Modern techniques in sensory ecology allow us to infer how static scenes might appear to an animal; however, animals often make crucial decisions on moving targets (e.g., detecting food items, evaluating a potential mate’s display, etc.). Here, we introduce hardware and software tools for ecologists and filmmakers that can capture and display animal-perceived colors in motion,” said senior author Daniel Hanley of George Mason University.
In the accompanying footage demonstrating the technology, you can see how different species, including peafowl, humans, bees, and dogs, perceive a peacock feather. Remarkably, peafowl see an enhanced iridescence crucial for mating displays, a detail that escapes human eyes.
Seeing From a New Perspective
Besides offering a precious window into the animal world, this technology could have valuable practical implications. In ecology, it could help scientists better understand the delicate balance between the mating and hunting dynamics of birds. For instance, how a male bird's flamboyant display appeals to mates while evading the eyes of predators can now be studied from each species' perspective using the camera and software.
In conservation efforts, this system could help reduce avian fatalities caused by window strikes, which are estimated to kill 100 million birds annually in the U.S. By viewing windows and deterrent decals from a bird's point of view, researchers could develop more effective preventive measures.
Closer to home, and to our love of popular science, I'm delighted to hear that the researchers plan to use this technology to enhance future nature documentaries. David Attenborough is going to have a field day with this!
The findings appeared in the journal PLOS Biology.