In George Orwell’s eerily prescient novel “1984,” the omnipresent telescreens epitomize the ultimate invasion of privacy. In the novel, telescreens are installed in every Party member’s home and throughout public spaces, doubling as both propaganda machines and unblinking sentinels of the state, watching citizens even though no obvious camera is in sight.
Now, a new study by MIT researchers shows this isn’t such a far-fetched idea. The scientists devised a method to hack ambient light sensors, the tiny components that adjust your screen’s brightness, and turn them into a window into our private lives. It’s the first time anyone has shown that these sensors can essentially double as a second camera.
A camera in disguise
At first glance, ambient light sensors appear harmless. They measure the brightness around you, automatically dimming your screen in sunlight or brightening it in darker settings for optimal viewing. However, the researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have found that these sensors can capture images of what’s happening in front of them, without the need for a camera. It’s a huge privacy risk that has remained under the radar until now.
Unlike apps that require permission to use cameras, these sensors operate without asking, silently collecting data.
“Many believe that these sensors should always be turned on,” lead author Yang Liu, a PhD student in MIT’s Department of Electrical Engineering and Computer Science and a CSAIL affiliate, said in a press release.
“But much like the telescreen, ambient light sensors can passively capture what we’re doing without our permission, while apps are required to request access to our cameras. Our demonstrations show that when combined with a display screen, these sensors could pose some sort of imaging privacy threat by providing that information to hackers monitoring your smart devices.”
How ambient light hacking works
The process is as complex as it is ingenious. The ambient light sensor registers subtle changes in light intensity caused by movements and interactions in front of the screen. When you tap the display to browse a page or type in private data, some of the screen’s light is blocked by your hand and some is reflected off your face, and the sensor picks up these variations.
Using a sophisticated algorithm, the researchers map these variations onto a two-dimensional grid, essentially reconstructing a pixelated image of the activity in front of the screen. The resulting images are nowhere near as sharp as those captured with a traditional camera, but they still amount to an invasion of privacy that could be exploited in various nefarious ways.
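To get a rough sense of how such a reconstruction can work, the sketch below treats the light sensor as a single-pixel detector and the screen as a source of known light patterns: each displayed pattern yields one sensor reading, and many readings together form a linear inverse problem that can be solved for a coarse image. This is an illustrative toy model with assumed parameters (grid size, pattern count, noise level, regularization), not the researchers’ actual code.

```python
import numpy as np

# Illustrative sketch only -- not the MIT team's code. It models the light sensor
# as a single-pixel detector and the screen as a source of known light patterns,
# then recovers a coarse image by solving a regularized linear inverse problem.
# Grid size, pattern count, noise level, and regularization are all assumptions.

rng = np.random.default_rng(0)

H, W = 16, 16            # assumed coarse reconstruction grid
N = H * W                # number of unknown scene pixels
K = 400                  # number of displayed patterns = number of sensor readings

# A toy "scene": mostly open screen, with a dark band where a hand blocks the light
scene = np.ones(N)
scene[60:120] = 0.1

# Each row of A is one known illumination pattern shown on the screen; the sensor
# reading is (approximately) that pattern weighted by the scene, plus a little noise
A = rng.random((K, N))
readings = A @ scene + 0.01 * rng.standard_normal(K)

# Ridge-regularized least squares: estimate the scene from the scalar readings
lam = 1.0
scene_hat = np.linalg.solve(A.T @ A + lam * np.eye(N), A.T @ readings)

pixelated_image = scene_hat.reshape(H, W)   # the "second camera" view, at low resolution
print("relative reconstruction error:",
      np.linalg.norm(scene_hat - scene) / np.linalg.norm(scene))
```

The point of the sketch is that any single reading reveals almost nothing, but many readings taken against known screen content can be combined into a recognizable, if blocky, picture.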
In experiments, the MIT team used an Android tablet to run three demonstrations, ranging from imaging a mannequin placed in front of the device to capturing the nuances of human hand movements. These tests showed that gestures such as swiping, scrolling, and tapping could be monitored, transforming every touch into a potential data point for hackers.
The researchers propose several measures to safeguard our privacy. They suggest tightening app permissions for ambient light sensors and reducing the sensors’ precision and sampling speed, making it harder for unwanted observers to capture detailed information. Throttling the sensors this way may cost a little performance, but consumers would gain peace of mind.
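As a rough illustration of what reducing a sensor’s precision and speed could look like in software, the hypothetical wrapper below quantizes raw light readings into coarse steps and refuses to refresh more than a couple of times per second. It is a sketch of the general idea, not a real operating-system API; the class name, step size, and refresh interval are all invented for the example.

```python
import time

# Hypothetical mitigation layer -- not a real OS API. It wraps raw ambient light
# readings, rounding them to coarse steps (less precision) and refusing to update
# more than a few times per second (less speed).

class ThrottledLightSensor:
    def __init__(self, read_raw_lux, step_lux=50.0, min_interval_s=0.5):
        self._read_raw_lux = read_raw_lux      # callable returning the true sensor value
        self._step = step_lux                  # quantization step in lux
        self._min_interval = min_interval_s    # minimum time between fresh readings
        self._last_time = float("-inf")
        self._last_value = 0.0

    def read(self):
        now = time.monotonic()
        if now - self._last_time >= self._min_interval:
            raw = self._read_raw_lux()
            # Coarse quantization: many distinct raw values map to the same output
            self._last_value = round(raw / self._step) * self._step
            self._last_time = now
        return self._last_value   # returns a stale, coarse value if polled too often

# Example with a fake raw sensor (a real reading would come from hardware)
if __name__ == "__main__":
    fake_raw = lambda: 312.7
    sensor = ThrottledLightSensor(fake_raw)
    print(sensor.read())   # -> 300.0
```

Coarser, slower readings still carry plenty of information for adjusting screen brightness, but they starve an attacker of the fine-grained signal needed to piece together an image.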
Additionally, repositioning the sensors could prevent them from directly facing users. In the vast majority of devices, such as smartphones and laptops, the light sensor sits right next to the front-facing camera, pointed straight at the user.
While the idea of a computer screen watching our every move may sound like science fiction, the reality is that technology is advancing in ways that continually challenge our perceptions of privacy. It’s a reminder that even basic tech features in our devices can be twisted for surveillance.
The findings appeared in the journal Science Advances.