An unprecedented image created by UK and US researchers shows how a submerged human is “seen” by dolphins through echolocation.
Echolocation, also called biosonar, is a biological sonar used by several kinds of animals, including dolphins: they emit sounds and then listen to the returning echoes to locate and identify objects and creatures around them. For this experiment, a female dolphin named Amaya directed her sonar beam at a submerged diver while hydrophones captured the ensuing echoes. To avoid added noise, the diver went in without a breathing device, which could have caused extra bubbles.
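The basic arithmetic behind echolocation ranging can be sketched in a few lines of Python. This is an illustration, not part of the experiment; the ~1,500 m/s figure is a typical speed of sound in seawater, not a value from the article:

```python
# Echolocation ranging sketch: a click travels to a target and back,
# so the one-way distance is (sound speed * round-trip time) / 2.
SPEED_OF_SOUND_SEAWATER = 1500.0  # metres per second (typical value)

def echo_range(round_trip_seconds: float) -> float:
    """Distance to a target given the echo's round-trip travel time."""
    return SPEED_OF_SOUND_SEAWATER * round_trip_seconds / 2.0

# An echo returning after 20 ms implies a target about 15 m away.
print(echo_range(0.020))  # → 15.0
```

Real dolphin biosonar is far richer than this single-number estimate, of course: the returning echo also carries the spectral and timing detail that systems like the one described below turn into images.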
The team, led by Jack Kassewitz of SpeakDolphin.com, used an imaging system known as a CymaScope. The system makes it possible to record and store dolphin echolocation sounds and then create 2D images from them; computer models can then enhance the image and render it in 3D.
“We’ve been working on dolphin communication for more than a decade,” noted Kassewitz in a release. “When we discovered that dolphins not exposed to the echolocation experiment could identify objects from recorded dolphin sounds with 92% accuracy, we began to look for a way to see what was in those sounds.”
The results were so good they surprised even the researchers – for the first time, we get the chance to see what cetaceans “see” through echolocation.
“We were thrilled by the first successful print of a cube by the brilliant team at 3D Systems,” said Kassewitz. “But seeing the 3D print of a human being left us all speechless. For the first time ever, we may be holding in our hands a glimpse into what cetaceans see with sound. Nearly every experiment is bringing us more images with more detail.”
For the future, they plan to determine how dolphins share these images, possibly through some kind of sono-pictorial language that uses both sounds and images. It would also be interesting to learn how dolphins perceive these images, as they almost certainly don’t perceive them the same way we do.