Cars can already see the road and learn from it. Soon, a new wave of acoustic technology could give vehicles the missing sense that cameras and radar can't provide: hearing.
By detecting sirens before they're visible, picking up the chatter of pedestrians, or even transmitting urgent sounds through a driver's headrest, researchers are teaching cars to react to the world the way humans do: by listening.
“Being able to perceive exterior sounds and attribute them accurately is a crucial part of attentively observing the full traffic environment. After all, many situations on the road are preceded by an acoustic signal. Take an approaching emergency vehicle, for example, which alerts people to its presence by using a siren,” said Moritz Brandes, who leads The Hearing Car project at Fraunhofer IDMT.
Unlike optical systems, which need a clear line of sight, acoustic sensors can pick up what’s happening around corners or in crowded streets. That ability could prove essential for autonomous driving, where every millisecond of awareness matters.
The demo vehicle, dubbed The Hearing Car, is packed with microphones and AI software that can recognize and classify sounds from the road.
These sensors are designed to stand up to rain, wind, and extreme temperatures, with careful placement ensuring accurate pickup even at highway speeds. Testing has taken the car from Portugal to the Arctic Circle to stress the technology in real conditions.
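The article doesn't describe the project's actual classification pipeline, but the basic idea of picking a siren out of road noise can be illustrated with a toy detector. The sketch below (all thresholds and band limits are illustrative assumptions, not Fraunhofer IDMT's values) tracks the dominant frequency of short audio frames and flags the oscillating pitch sweep that is characteristic of a siren.

```python
# Illustrative sketch, NOT the project's actual system: a toy siren
# detector that tracks per-frame dominant frequency and flags the
# sweeping pitch typical of emergency-vehicle sirens.
import numpy as np

SAMPLE_RATE = 16_000  # Hz (assumed microphone sample rate)
FRAME_SIZE = 1024     # samples per analysis frame (~64 ms)

def dominant_frequency(frame: np.ndarray) -> float:
    """Return the strongest frequency (Hz) in one audio frame."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    return float(freqs[np.argmax(spectrum)])

def looks_like_siren(signal: np.ndarray) -> bool:
    """Heuristic: a siren sweeps its pitch inside roughly 500-1800 Hz,
    so the per-frame dominant frequency should vary noticeably while
    staying inside that band. Thresholds here are illustrative."""
    frames = [signal[i:i + FRAME_SIZE]
              for i in range(0, len(signal) - FRAME_SIZE, FRAME_SIZE)]
    pitches = np.array([dominant_frequency(f) for f in frames])
    in_band = (pitches > 500) & (pitches < 1800)
    sweeping = pitches.std() > 50  # Hz; a steady tone barely varies
    return bool(in_band.mean() > 0.8 and sweeping)

# Synthetic demo: a 600-1400 Hz sweep (siren-like) vs. a steady 1000 Hz tone.
t = np.arange(0, 3.0, 1.0 / SAMPLE_RATE)
freq = 1000 + 400 * np.sin(2 * np.pi * 1.0 * t)        # instantaneous frequency
siren = np.sin(2 * np.pi * np.cumsum(freq) / SAMPLE_RATE)
steady = np.sin(2 * np.pi * 1000 * t)
```

A production system would replace this heuristic with a trained classifier over many sound classes (horns, shouts, tyre screech), but the pipeline shape is the same: frame the audio, extract features, decide.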
To make sure drivers don’t miss critical cues, important noises can also be piped directly into the cabin via the headrest. That means a siren, horn, or warning call is not just detected but delivered right to the driver’s ear, helping them respond faster.
The project involves close collaboration with automotive suppliers and manufacturers, who see the potential of acoustic sensing as the next big leap in driver assistance.
The same technology that helps vehicles recognize a siren also enables more natural interaction with their passengers. Drivers can issue voice commands like "Open the trunk," while speaker verification ensures only authorized voices can trigger key actions.
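The article doesn't specify how speaker verification is implemented, but a common pattern is to compare a voice embedding of the spoken command against an enrolled reference and gate the action on their similarity. The sketch below assumes embeddings already produced by some speaker model (the vectors and threshold here are stand-ins, not the project's values).

```python
# Illustrative sketch, NOT the project's actual system: gate a voice
# command on cosine similarity between a speaker embedding of the
# utterance and the enrolled driver's embedding.
import numpy as np

THRESHOLD = 0.75  # assumed similarity required to accept a command

def cosine_similarity(a, b) -> float:
    """Cosine similarity between two embedding vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def authorize(command_embedding, enrolled_embedding) -> bool:
    """Accept the command only if the voice matches the enrolled speaker."""
    return cosine_similarity(command_embedding, enrolled_embedding) >= THRESHOLD

# Stand-in embeddings: in practice these come from a trained speaker model.
enrolled = np.array([0.9, 0.1, 0.4, 0.2])
same_voice = np.array([0.85, 0.15, 0.38, 0.22])   # close to enrollment
other_voice = np.array([0.1, 0.9, -0.3, 0.5])     # far from enrollment
```

So "Open the trunk" spoken by the enrolled driver passes the gate, while the same words from an unenrolled voice are rejected before any action fires.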
Inside the cabin, researchers are layering on tools to monitor drivers’ health and attention. Short-range radar can measure heart rate and breathing without contact, while mobile EEG headbands track brain activity for signs of fatigue. Voice analysis detects stress or excitement, feeding real-time feedback to occupants.