

Our eyes have a focal point -- but images don't seem to focus on it, weirdly

This, unexpectedly, makes our vision a bit better.

Alexandru Micu
August 3, 2021 @ 7:18 pm


New research says that if you want to see something better, you shouldn’t look directly at it. At least, that’s what our eyes seem to believe.

Image via Pixabay.

Researchers at the University of Bonn, Germany, report that when we look directly at something, we’re not using our eyes to their full potential. When we do this, they explain, light doesn’t hit the center of our foveas, where photoreceptors (light-sensitive cells) are most densely packed. Instead, light (and thus the area where images are perceived) is shifted slightly upwards and towards the nose relative to this central, highly sensitive spot.

While this shift doesn’t seem to impair our perception in any meaningful way, the findings will help improve our understanding of how our eyes work and how we can fix them when they don’t.

I spy with my little eye

“In humans, cone packing varies within the fovea itself, with a sharp peak in its center. When we focus on an object, we align our eyes so that its image falls exactly on that spot — that, at least, was the general assumption so far,” says Dr. Wolf Harmening, head of the adaptive optics and visual psychophysics group at the Department of Ophthalmology at the University Hospital Bonn and corresponding author of the paper.

The team worked with 20 healthy subjects from Germany, who were asked to fixate on (look directly at) different objects while the researchers monitored how light hit their retinas using “adaptive optics in vivo imaging and micro-stimulation”. An offset between the point of highest photoreceptor density and where the image formed on the retina was observed in all 20 participants, the authors explain. They hypothesize that this shift is a natural adaptation that helps improve the overall quality of our vision.

Our eyes function similarly to a camera, but they’re not really the same. In a digital camera, the light-sensitive elements are distributed evenly across the sensor, all with the same size, properties, and operating principles. Our eyes use two types of cells to pick up light: rod and cone photoreceptors. The former are useful for seeing motion in dim light, while the latter are suited to picking out colors and fine detail in good lighting conditions.

Unlike in a camera, however, the photosensitive cells in our retinas aren’t evenly distributed. They vary quite significantly in density, size, and spacing. The fovea, a specialized central area of the retina that produces our sharpest vision, packs around 200,000 cone cells per square millimeter. At the edges of the retina, this can fall to around 5,000 per square millimeter, 40 times less dense. In essence, our eyes produce high-definition images in the middle of our field of view and progressively less-defined images towards the edges. Our brains kind of fill in the missing information around the edges to make it all seem seamless, but if you try to pay attention to something at the periphery of your vision, you’ll realize how little detail you can actually make out there.

It would, then, seem very counterproductive for the image of whatever we’re looking at to form away from the fovea’s densest point. Wouldn’t we want the best view of whatever we’re, you know, viewing? The team explains that this is likely an adaptation to the way human sight works: two eyes, side by side, peering out in the same direction.

All 20 participants in the study showed the same shift, slightly upwards and towards the nose relative to the center of the fovea. The offset was larger in some participants and smaller in others, but its direction was the same in everyone, and it was symmetrical between the two eyes. Follow-up examinations carried out one year after the initial trials showed that these focal points had not moved in the meantime.

“When we look at horizontal surfaces, such as the floor, objects above fixation are farther away,” explains Jenny Lorén Reiniger, a co-author of the paper. “This is true for most parts of our natural surroundings. Objects located higher appear a little smaller. Shifting our gaze in that fashion might enlarge the area of the visual field that is seen sharply.”

“The fact that we were able to detect [this offset] at all is based on technical and methodological advances of the last two decades,” says Harmening.

One other interesting conclusion the authors draw is that, despite the huge number of light-sensitive cells our retinas contain, we only use a small fraction of them — around a few dozen — when focusing on a single point. What’s more, we probably rely on the same cells throughout our lives, as the focal point doesn’t seem to move over time. Beyond being an interesting bit of trivia, this is valuable for researchers trying to determine how best to repair eyes and restore vision following damage or disease.

The paper “Human gaze is systematically offset from the center of cone topography” has been published in the journal Current Biology.
