In the age of smart devices, is privacy just an illusion? This question is relevant to most of us, since we are almost always surrounded by smart devices equipped with cameras. We may never know, but some of these cameras might be capturing our private photos and uploading them to someone's cloud server.
For instance, in 2020, photos of a woman on the toilet appeared on Facebook after being leaked from a company's server. The photos were taken not by a person but by her robotic vacuum cleaner, and this isn't the only such incident.
“Most consumers do not think about what happens to the data collected by their favorite smart home devices. In most cases, raw audio, images, and videos are being streamed off these devices to the manufacturers’ cloud-based servers, regardless of whether or not the data is actually needed for the end application,” Alanson Sample, associate professor of computer science and engineering at the University of Michigan (UM), said.
To address this sensitive issue, Professor Sample and his team have developed PrivacyLens, a unique camera that converts people into animated stick figures before storing the photos or sending them to a server.
“That extra anonymity could prevent private moments from leaking onto the internet, which is increasingly common in today’s world laden with camera-equipped devices that collect and upload information,” the UM researchers note.
Privacy issues with conventional smart-device cameras
Traditional cameras used in smart devices such as cellphones, thermostats, laptops, doorbells, and health monitoring devices detect a person and capture their image using RGB data (i.e., the red, green, and blue wavelengths of light).
These devices need the cameras to perform their purpose, but they may also end up taking all kinds of photographs including those with sensitive details and personally identifiable information (PII).
Some companies use a region of interest (ROI) feature to prevent their servers from storing sensitive photos. This feature works like a privacy filter, eliminating certain parts of an image. However, such features often fail when RGB cameras are exposed to changing lighting and environmental conditions.
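In essence, ROI filtering blanks out a rectangular patch of each frame where a detector flagged sensitive content. The following is a minimal sketch of that idea in Python with NumPy; the function name, toy image, and box coordinates are illustrative assumptions, not code from the study.

```python
import numpy as np

def redact_roi(image: np.ndarray, box: tuple) -> np.ndarray:
    """Black out a rectangular region of interest given as (x, y, width, height)."""
    x, y, w, h = box
    redacted = image.copy()
    redacted[y:y + h, x:x + w] = 0  # overwrite the flagged pixels with black
    return redacted

# Toy 8x8 grayscale "frame" with a hypothetical detected face at (2, 2, 3, 3)
frame = np.full((8, 8), 200, dtype=np.uint8)
masked = redact_roi(frame, (2, 2, 3, 3))
print(masked[3, 3])  # 0: pixel inside the ROI is erased
print(masked[0, 0])  # 200: pixels outside the ROI are untouched
```

The weakness the researchers point to lives one step earlier: if the detector misses the person (for example, under poor lighting), the box is wrong or absent and the raw pixels pass through unredacted.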
“Our study found that nearly half of the images with people still leaked personal information when using just RGB methods. Additionally, removing parts of the image often happens on a remote server, which means we need to trust that server to handle the data securely,” the authors of the PrivacyLens study said.
How does PrivacyLens work?
PrivacyLens uses a combination of RGB and thermal cameras to overcome the limitations of normal smart-device cameras. Plus, it has a powerful GPU that enables the device to remove PII and turn a person into a stick figure before the image is sent to a server.
“A smart device that removes personally identifiable information before sensitive data is sent to private servers will be a far safer product than what we currently have,” Sample said.
This animated stick figure lets a smart device operate without disclosing the identity of the person in view. The UM researchers have also built a sliding privacy scale into the device that allows a user to censor their face and other body parts.
“The core idea behind PrivacyLens is that the onboard RGB and thermal cameras can be used together to robustly detect persons and their thermal silhouettes, which are used to ‘subtract’ them from images. Its embedded GPU efficiently removes five forms of PII (face, skin color, hair color, gender, and body shape) before any data (i.e., images or ML features) is stored or transmitted off-device,” the study authors added.
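The subtraction step the authors describe can be sketched as a simple fusion rule: pixels that read as body temperature in the thermal channel are treated as the person's silhouette and erased from the RGB frame. The sketch below is an illustrative assumption about how such fusion might look, not the paper's implementation; the temperature threshold and toy scene are invented for the example.

```python
import numpy as np

def subtract_person(rgb: np.ndarray, thermal: np.ndarray,
                    body_temp_c: float = 30.0) -> np.ndarray:
    """Erase RGB pixels whose thermal reading suggests a warm body.

    `body_temp_c` is an illustrative threshold, not a value from the study.
    A real system would then draw a stick figure over the erased silhouette.
    """
    person_mask = thermal > body_temp_c      # thermal silhouette of the person
    out = rgb.copy()
    out[person_mask] = 0                     # "subtract" the person from the frame
    return out

# Toy 4x4 scene: ambient ~22 C background, a warm region where a person stands
thermal = np.full((4, 4), 22.0)
thermal[1:3, 1:3] = 34.0
rgb = np.full((4, 4, 3), 128, dtype=np.uint8)
cleaned = subtract_person(rgb, thermal)
print(cleaned[2, 2].tolist())  # [0, 0, 0]: person pixels removed
print(cleaned[0, 0].tolist())  # [128, 128, 128]: background preserved
```

The appeal of this design is that the thermal channel is largely indifferent to visible lighting, which is exactly where RGB-only ROI filtering tends to fail.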
When Sample and his team tested PrivacyLens in real-world home, office, and park settings, it removed 99.1 percent of PII, whereas RGB cameras equipped with the best ROI and GPU technologies achieved only 57.6 percent PII removal.
People want this kind of solution
The researchers also surveyed 15 patients, asking whether they would feel comfortable being monitored with PrivacyLens in their homes. The participants responded that being replaced with stick figures in a camera's view would make them feel more at ease.
However, PrivacyLens’ application is not just limited to health monitoring devices. Automakers and home appliance companies can use this tech to prevent their vehicles and devices from being used for surveillance.
“There’s a wide range of tasks where we want to know when people are present and what they are doing, but capturing their identity isn’t helpful in performing the task. So why risk it?” said Yasha Iravantchi, first author of the study and a doctoral student at UM.
Although further research and experiments are required to validate the performance and reliability of PrivacyLens, for the first time we have a camera that promises to keep our privacy protected. This marks a crucial development in a world where some tech companies will stop at nothing to exploit your data, emotions, and identity.
The study is published in the Proceedings on Privacy Enhancing Technologies (PoPETs).