MIT researchers discovered a dark side of smartphone light sensors: they can help hackers track your movements. Malicious actors can use artificial intelligence models to reconstruct images of what’s happening in front of your screen. For example, a stranger could use your light sensor to determine whether your face or hand is in front of your device.
Many people recommend enabling your camera only when you’re using it, and some cover their cameras with tape or disconnect them entirely to block prying eyes. However, light sensors could be another tool in a scammer’s kit for spying on you. Hopefully, tech companies will adjust light sensors in future devices to protect users.
This article will discuss how someone could spy on others using light sensors. Later, I will show how artificial intelligence has improved imaging technology.
How can a light sensor spy on people?
Scientists at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) created a computational imaging algorithm that recovers images of the area in front of a screen using its ambient light sensor.
Most smartphones and tablets use these light detectors to trigger specific features, such as dimming your screen in the dark. Modern sensors are so sensitive that they can identify subtle changes in brightness.
In effect, nearby objects cast shadows that the light sensor can discern. Yang Liu and his research team used an AI program to turn that sensor data into images.
Fittingly, the AI’s images look like shadows contrasted against a gray background. Moreover, the system compiles those pictures into a video of the movement.
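The idea can be illustrated with a toy inverse-imaging sketch. This is not the MIT algorithm, just a minimal assumption-laden model: treat the ambient light sensor as a single-pixel camera, let the screen flash known illumination patterns, and note that an occluder (such as a hand) blocks some of each pattern’s light. Every sensor reading is then a dot product between a pattern and the unknown transmission mask, so stacking many readings yields a linear system that can be inverted:

```python
# Toy sketch (not the published method): recover a coarse "shadow image"
# from scalar light-sensor readings taken under known screen patterns.
import numpy as np

rng = np.random.default_rng(0)
H = W = 8                      # coarse reconstruction resolution
n_pix = H * W

# Unknown scene: 1 where screen light reaches the sensor, 0 where a
# hypothetical hand blocks it.
mask = np.ones(n_pix)
mask[18:22] = 0                # invented occluded region, for illustration

# Screen displays random binary patterns; the sensor logs one value each.
patterns = rng.integers(0, 2, size=(2 * n_pix, n_pix)).astype(float)
readings = patterns @ mask     # each reading = pattern · mask

# Invert the linear system by least squares to recover the mask.
recovered, *_ = np.linalg.lstsq(patterns, readings, rcond=None)
print(np.allclose(recovered, mask, atol=1e-6))  # → True
```

With more patterns than pixels and no noise, least squares recovers the occluder exactly; real sensors add noise and quantization, which is why the actual research needed a far more sophisticated algorithm.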
The researchers showed that their algorithm can capture a user’s gestures, such as swiping, hovering, and scrolling. They conducted three tests to prove it.
- They seated a dummy in front of the device while different objects moved toward the screen: first a human hand pointed at the screen, then a cardboard cutout touched it.
- The researchers demonstrated motion tracking by deploying a faster sensor, which let them capture one frame every 3.3 minutes, quick enough to follow hand movements.
- The last test revealed that people are still at risk while watching videos such as films and short clips. A light sensor can still capture gestures even if the person has a whiteboard behind them.
“This work turns your device’s ambient light sensor and screen into a camera!” said Princeton professor Felix Heide. “As such, the authors highlight a privacy threat that affects a comprehensive class of devices and has been overlooked so far.”
How has AI improved imaging technology?
Light sensor imaging is surprising enough, but recent AI advancements have pushed the boundaries of imaging technology. For example, University of Technology Sydney researchers created an AI that turns thoughts into readable text!
They call their latest invention DeWave, which uses an electroencephalogram (EEG) to decode brain waves. UTS experts put their machine learning algorithm through rigorous training.
“It is the first to incorporate discrete encoding techniques in the brain-to-text translation process, introducing an innovative approach to neural decoding,” said computer scientist Chin-Teng Lin.
“The integration with large language models is also opening new frontiers in neuroscience and AI,” he added. Lin and his team used trained language models, combining the BERT and GPT systems.
They tested the models on existing datasets of people whose eye movements and brain activity were recorded while reading text. That method helped their mind-reading program translate brain wave patterns into words.
They trained DeWave further with an open-source large language model that turns words into sentences. ScienceAlert said the AI tool performed best when translating verbs.
In contrast, it typically rendered nouns as synonymous words rather than exact translations. For example, it may interpret “the author” as “the man.”
“We think this is because when the brain processes these words, semantically similar words might produce similar brain wave patterns,” explained first author Yiqun Duan.
“Despite the challenges, our model yields meaningful results, aligning keywords and forming similar sentence structures,” he added. However, the researchers admitted their method requires further refinement.
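Duan’s explanation can be sketched with a toy decoder. This is not DeWave; the three-dimensional “embeddings” and the noisy “brain signal” below are invented purely for illustration. If the neural features for related words overlap, a decoder that picks the nearest vocabulary embedding can output “man” when the reader saw “author”:

```python
# Toy illustration of semantic confusion in nearest-neighbor decoding.
# All vectors are hand-made and hypothetical, not real EEG features.
import numpy as np

vocab = {
    "author": np.array([0.9, 0.1, 0.0]),
    "man":    np.array([0.8, 0.2, 0.1]),
    "table":  np.array([0.0, 0.1, 0.9]),
}

def decode(signal):
    """Return the vocabulary word whose embedding is closest by cosine."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(vocab, key=lambda w: cos(signal, vocab[w]))

# A noisy measurement of "author" drifts toward its semantic neighbor.
signal = vocab["author"] + np.array([-0.2, 0.15, 0.12])
print(decode(signal))  # → man
```

Because “author” and “man” sit close together in this toy space, a small amount of noise is enough to flip the decoded word, mirroring the synonym-swapping behavior the researchers observed.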
Conclusion
MIT scientists discovered that light sensors in phones could become surveillance devices. An artificial intelligence program enabled them to make silhouette images of objects in front of the screen.
The researchers suggested letting users turn off their light sensors to protect privacy. Also, they recommended reducing the sensor’s precision and speed to deter hackers.
Learn more about this light sensor study on the Science Advances website. Also, check the latest digital tips and trends at Inquirer Tech.