Stanford engineers combine light and sound to see underwater
The “Photoacoustic Airborne Sonar System” could be installed beneath drones to enable aerial underwater surveys and high-resolution mapping of the deep ocean.
Stanford University engineers have developed an airborne method for imaging underwater objects by combining light and sound to break through the seemingly impassable barrier at the interface of air and water.
The researchers envision their hybrid optical-acoustic system one day being used to conduct drone-based biological marine surveys from the air, carry out large-scale aerial searches of sunken ships and planes, and map the ocean depths with a similar speed and level of detail as Earth’s landscapes. Their “Photoacoustic Airborne Sonar System” is detailed in a recent study published in the journal IEEE Access.
“Airborne and spaceborne radar and laser-based, or LIDAR, systems have been able to map Earth’s landscapes for decades. Radar signals are even able to penetrate cloud coverage and canopy coverage. However, seawater is much too absorptive for imaging into the water,” said study leader Amin Arbabian, an associate professor of electrical engineering in Stanford’s School of Engineering. “Our goal is to develop a more robust system which can image even through murky water.”
Oceans cover about 70 percent of the Earth’s surface, yet only a small fraction of their depths have been subjected to high-resolution imaging and mapping.
The main barrier has to do with physics: Sound waves, for example, cannot pass from air into water, or vice versa, without losing most – more than 99.9 percent – of their energy through reflection at the boundary between the two media. A system that tries to see underwater using sound waves traveling from air into water and back into air is subjected to this energy loss twice – resulting in a 99.9999 percent energy reduction.
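The round-trip figure follows directly from the per-crossing loss. A quick arithmetic check, a minimal sketch using the article's 99.9 percent number:

```python
# Per the article, more than 99.9% of acoustic energy is reflected at the
# air-water boundary, so at most about 0.1% is transmitted per crossing.
one_way_transmitted = 0.001

# A system sending sound from air into water and back crosses the
# boundary twice, so the transmitted fractions multiply.
two_way_transmitted = one_way_transmitted ** 2

print(f"Round-trip energy loss: {1 - two_way_transmitted:.4%}")
# → Round-trip energy loss: 99.9999%
```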
Similarly, electromagnetic radiation – an umbrella term that includes light, microwave and radar signals – also loses energy when passing from one physical medium into another, although the mechanism is different than for sound. “Light also loses some energy from reflection, but the bulk of the energy loss is due to absorption by the water,” explained study first author Aidan Fitzpatrick, a Stanford graduate student in electrical engineering. Incidentally, this absorption is also the reason why sunlight can’t penetrate to the ocean depths and why your smartphone – which relies on cellular signals, a form of electromagnetic radiation – can’t receive calls underwater.
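The absorption Fitzpatrick describes grows exponentially with depth (the Beer–Lambert law). A small illustration – the absorption coefficient below is a rough, assumed value for red light in clear seawater, not a figure from the study:

```python
import math

# Beer-Lambert attenuation: I(z) = I0 * exp(-alpha * z).
# alpha here is illustrative only (roughly the order of magnitude
# for red light in clear seawater; not from the study).
alpha = 0.5  # absorption coefficient, 1/m

for depth_m in (1, 10, 100):
    frac = math.exp(-alpha * depth_m)
    print(f"{depth_m:>4} m: {frac:.2e} of surface intensity remains")
```

Even with this modest coefficient, almost nothing survives past a few tens of meters, which is why airborne LIDAR that maps land so well fails over deep water.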
The upshot of all of this is that oceans can’t be mapped from the air and from space in the same way that the land can. To date, most underwater mapping has been achieved by attaching sonar systems to ships that trawl a given region of interest. But this technique is slow, costly and inefficient for covering large areas.
An invisible jigsaw puzzle
Enter the Photoacoustic Airborne Sonar System (PASS), which combines light and sound to break through the air-water interface. The idea for it stemmed from another project that used microwaves to perform “non-contact” imaging and characterization of underground plant roots. Some of PASS’s instruments were initially designed for that purpose in collaboration with the lab of Stanford electrical engineering professor Butrus Khuri-Yakub.
At its heart, PASS plays to the individual strengths of light and sound. “If we can use light in the air, where light travels well, and sound in the water, where sound travels well, we can get the best of both worlds,” Fitzpatrick said.
To do this, the system first fires a laser from the air that gets absorbed at the water surface. When the laser is absorbed, it generates ultrasound waves that propagate down through the water column and reflect off underwater objects before racing back toward the surface.
The returning sound waves are still sapped of most of their energy when they breach the water surface, but by generating the sound waves underwater with lasers, the researchers can prevent the energy loss from happening twice.
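Taking the article's figure of more than 99.9 percent loss per crossing, generating the ultrasound in the water – so the signal crosses the interface only once, on the way out – recovers roughly a thousandfold more energy than a system that must cross twice. A quick check:

```python
one_way = 0.001               # ~0.1% of energy transmitted per crossing
two_crossings = one_way ** 2  # conventional airborne sonar: in and back out
one_crossing = one_way        # PASS: sound is generated in the water

print(f"Improvement: ~{one_crossing / two_crossings:.0f}x")
# → Improvement: ~1000x
```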
“We have developed a system that is sensitive enough to compensate for a loss of this magnitude and still allow for signal detection and imaging,” Arbabian said.
The reflected ultrasound waves are recorded by instruments called transducers. Software is then used to piece the acoustic signals back together like an invisible jigsaw puzzle and reconstruct a three-dimensional image of the submerged feature or object.
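The article doesn't spell out the reconstruction algorithm. Delay-and-sum beamforming is one standard approach to this kind of acoustic imaging, sketched below on synthetic data – every name, position and parameter here is illustrative, and this sketch omits the refraction correction the researchers describe next:

```python
import numpy as np

def delay_and_sum(signals, sensors, grid, c, fs):
    """For each candidate image point, sum every sensor's recording at
    that point's travel-time delay; aligned echoes add coherently.
    (Hypothetical helper, not the study's actual algorithm.)"""
    image = np.zeros(len(grid))
    for i, point in enumerate(grid):
        for sig, sensor in zip(signals, sensors):
            delay = np.linalg.norm(point - sensor) / c  # seconds
            k = int(round(delay * fs))                  # sample index
            if k < len(sig):
                image[i] += sig[k]
    return image

# Synthetic demo: one reflector 2 m down, three transducers at the surface.
c, fs = 1500.0, 1e6  # sound speed in water (m/s), sample rate (Hz)
sensors = np.array([[-0.5, 0.0], [0.0, 0.0], [0.5, 0.0]])
target = np.array([0.0, 2.0])

n = 4000
signals = np.zeros((3, n))
for j, s in enumerate(sensors):
    k = int(round(np.linalg.norm(target - s) / c * fs))
    signals[j, k] = 1.0  # impulse echo from the reflector

grid = np.array([[0.0, d] for d in np.arange(1.5, 2.5, 0.01)])
image = delay_and_sum(signals, sensors, grid, c, fs)
best = grid[np.argmax(image)]
print(best)  # peaks at the true reflector depth (~2.0 m)
```

The "jigsaw" intuition maps directly onto the inner loop: each transducer holds one piece (a delayed echo), and only at the true reflector location do all the pieces line up.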
“Similar to how light refracts or ‘bends’ when it passes through water or any medium denser than air, ultrasound also refracts,” Arbabian explained. “Our image reconstruction algorithms correct for this bending that occurs when the ultrasound waves pass from the water into the air.”
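The article doesn't give the correction itself, but the bending Arbabian describes follows Snell's law for acoustics, sin(θ_air)/c_air = sin(θ_water)/c_water. A minimal sketch with textbook sound speeds – assumed values, not figures from the study:

```python
import math

# Textbook sound speeds (assumptions, not from the study):
C_WATER = 1480.0  # m/s
C_AIR = 343.0     # m/s

def refracted_angle(theta_water_deg):
    """Angle from vertical of an ultrasound ray after crossing from
    water into air, via Snell's law: sin(a)/c_air = sin(w)/c_water."""
    s = math.sin(math.radians(theta_water_deg)) * C_AIR / C_WATER
    return math.degrees(math.asin(s))

print(refracted_angle(30.0))  # the ray bends toward the vertical (~6.65 deg)
```

Because sound travels much slower in air than in water, rays bend sharply toward the vertical on exit; a reconstruction that ignored this would misplace every underwater point.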
Drone ocean surveys
Conventional sonar systems can penetrate to depths of hundreds to thousands of meters, and the researchers expect their system will eventually be able to reach similar depths.
To date, PASS has only been tested in the lab in a container the size of a large fish tank. “Current experiments use static water, but we are currently working toward dealing with water waves,” Fitzpatrick said. “This is a challenging, but we think feasible, problem.”
The next step, the researchers say, will be to conduct tests in a larger setting and, eventually, an open-water environment.
“Our vision for this technology is on-board a helicopter or drone,” Fitzpatrick said. “We expect the system to be able to fly at tens of meters above the water.”
Stanford graduate student Ajay Singhvi is also a co-author on the study. The research was supported by the U.S. Office of Naval Research and the Advanced Research Projects Agency-Energy (ARPA-E).