There are ghosts in your machine.

Cybersecurity researcher Kevin Fu, who studies the exploitation of emerging technologies at Northeastern University, has demonstrated a novel form of cyberattack dubbed the “Poltergeist attack.” These attacks take an “acoustic adversarial” approach to machine learning, causing self-driving cars and other autonomous systems to hallucinate false realities.

Fu’s method exploits the optical image stabilization hardware found in modern cameras, including those in autonomous vehicles. By transmitting sound waves that match the resonant frequencies of the inertial sensors used for image stabilization, Fu’s team can trick the stabilization system into shifting the camera lens, producing blurred images. When machine learning algorithms process these distorted images, they can mislabel objects, potentially leading to dangerous scenarios on the road. Fu believes that addressing vulnerabilities like these is crucial for ensuring the safe adoption of autonomous technologies as they become more prevalent in our daily lives.
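The core effect described above is that acoustically induced lens motion smears the image before a classifier ever sees it. The following sketch is not Fu's actual attack pipeline; it is a minimal illustration, assuming a simple linear motion-blur kernel as a stand-in for the smear an acoustically destabilized lens might produce, and a synthetic hard edge (like a road-sign boundary) as the scene.

```python
import numpy as np

def motion_blur_kernel(length=9, angle_deg=0.0):
    """Build a normalized linear motion-blur kernel.

    This approximates the smear from an oscillating lens; the kernel
    length and angle are hypothetical stand-ins for the amplitude and
    direction of the acoustically induced motion.
    """
    k = np.zeros((length, length))
    c = length // 2
    theta = np.deg2rad(angle_deg)
    # Rasterize a line segment through the kernel center.
    for t in np.linspace(-c, c, length * 4):
        x = int(round(c + t * np.cos(theta)))
        y = int(round(c + t * np.sin(theta)))
        if 0 <= x < length and 0 <= y < length:
            k[y, x] = 1.0
    return k / k.sum()

def convolve2d(img, kernel):
    """Naive 2-D convolution with zero padding (no SciPy needed)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

# Synthetic "scene": a sharp vertical edge, e.g. the border of a sign.
scene = np.zeros((32, 32))
scene[:, 16:] = 1.0

blurred = convolve2d(scene, motion_blur_kernel(length=9, angle_deg=0.0))

# Edge sharpness, measured as the largest horizontal intensity step.
sharp_grad = np.abs(np.diff(scene, axis=1)).max()
blur_grad = np.abs(np.diff(blurred, axis=1)).max()
print(f"edge contrast: sharp={sharp_grad:.2f}, blurred={blur_grad:.2f}")
```

The drop in edge contrast is the kind of degradation that can push a vision model's prediction across a decision boundary, which is why even modest blur can cause the mislabeling described above.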