Scientists estimate that more than 95 percent of Earth’s oceans have never been observed, which means we have seen less of our planet’s ocean than of the far side of the Moon or the surface of Mars.

The high cost of powering an underwater camera for long periods, by tethering it to a research vessel or sending a ship out to recharge its batteries, is a significant challenge that impedes large-scale underwater exploration.

MIT researchers have taken an important step to overcome this problem by developing a wireless, battery-less underwater camera that is about 100,000 times more energy efficient than other underwater cameras. The device takes color photos, even in dark underwater environments, and transmits image data wirelessly through the water.

The stand-alone camera is powered by sound. It converts the mechanical energy of sound waves traveling through water into electrical energy that powers its imaging and communications equipment. After capturing and encoding the image data, the camera also uses sound waves to transmit the data to a receiver which reconstructs the image.

Because it doesn’t need a power source, the camera could operate for weeks before recovery, allowing scientists to search for new species in remote areas of the ocean. It could also be used to capture images of ocean pollution or monitor the health and growth of fish raised in aquaculture farms.

“One of the most exciting applications of this camera for me personally is in the context of climate monitoring. We build climate models, but we lack data for more than 95% of the ocean. This technology could help us build more accurate climate models and better understand the impact of climate change on the underwater world,” says Fadel Adib, associate professor in the Department of Electrical Engineering and Computer Science, director of the Signal Kinetics group at the MIT Media Lab, and senior author of a new paper on the system.

Several co-authors join Adib on the paper, which is published today in Nature Communications.

Going battery-free

To build a camera that could operate autonomously for long periods of time, researchers needed a device that could harvest energy underwater while consuming very little power.

The camera acquires energy using transducers made from piezoelectric materials placed around its exterior. Piezoelectric materials produce an electrical signal when a mechanical force is applied to them. When a sound wave passing through water strikes the transducers, they vibrate and convert this mechanical energy into electrical energy.

These sound waves can come from any source, such as a passing ship or sea life. The camera stores the harvested energy until it has accumulated enough energy to power the electronics that take pictures and communicate data.
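The harvest-then-operate cycle described above can be sketched in a few lines. This is an illustrative simulation, not the researchers' actual firmware; the energy units, the `capture_cost` threshold, and the storage capacity are all hypothetical values chosen for the example.

```python
# Hypothetical sketch of the harvest-and-accumulate duty cycle:
# energy trickles in from sound waves, and the camera only wakes to
# capture an image once enough energy is banked. Units are arbitrary.

def run_duty_cycle(harvest_events, capture_cost=5.0, capacity=10.0):
    """Accumulate harvested energy; capture an image whenever the
    store holds enough to power the imaging and comms electronics."""
    stored = 0.0
    captures = 0
    for harvested in harvest_events:
        stored = min(capacity, stored + harvested)  # capacitor-style store
        if stored >= capture_cost:
            stored -= capture_cost  # spend energy on one capture cycle
            captures += 1
    return captures

# Irregular bursts of acoustic energy, e.g. from passing ships or sea life
print(run_duty_cycle([1.0, 2.5, 0.5, 3.0, 4.0, 1.0, 6.0]))  # -> 3
```

The key property the sketch captures is that operation is opportunistic: the camera's capture rate is set by how much ambient acoustic energy happens to arrive, not by a battery budget.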

To keep power consumption as low as possible, the researchers used off-the-shelf ultra-low-power imaging sensors. But these sensors capture only grayscale images. And since most underwater environments lack a natural light source, the researchers also had to develop a low-power flash.

“We were trying to minimize the hardware as much as possible, which creates new constraints on how to build the system, send information and perform image reconstruction. It took a lot of creativity to figure out how to do that,” says Adib.

They solved both problems simultaneously by using red, green and blue LEDs. When the camera captures an image, it shines a red LED and then uses image sensors to take the photo. It repeats the same process with green and blue LEDs.

Even though each captured image appears black and white, the red, green, and blue light reflected from the scene is recorded in the bright portions of each photo, Akbar explains. When the three exposures are combined in post-processing, the color image can be reconstructed.

“When we were kids in art class, we were taught that we could create any color using three basic colors. The same rules apply to the color images we see on our computers. We just need red, green, and blue – these three channels – to construct color images,” he says.
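The reconstruction step described above amounts to treating the three grayscale exposures, one per LED color, as the three channels of an RGB image. A minimal sketch, assuming the frames are already aligned; the toy pixel values are invented for illustration and are not from the paper:

```python
import numpy as np

def reconstruct_color(frame_red, frame_green, frame_blue):
    """Stack three grayscale exposures (each taken under one LED color)
    into a single H x W x 3 color image."""
    return np.stack([frame_red, frame_green, frame_blue], axis=-1)

# Toy 2x2 "exposures": the top-left pixel reflects strongly only under
# the red LED, so it comes out reddish in the combined image.
r = np.array([[200, 10], [10, 10]], dtype=np.uint8)
g = np.array([[5, 180], [5, 5]], dtype=np.uint8)
b = np.array([[5, 5], [170, 5]], dtype=np.uint8)

img = reconstruct_color(r, g, b)
print(img.shape)   # (2, 2, 3)
print(img[0, 0])   # [200   5   5]  -> a reddish pixel
```

Because the sensor itself never records color, all of the color information comes from knowing which LED was lit for each exposure, which is why the flash and the sensor must be triggered in lockstep.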

Sending data with sound

After the image data is captured, it is encoded as bits (1s and 0s) and sent to a receiver one bit at a time using a process called underwater backscatter. The receiver transmits sound waves through the water to the camera, which acts like a mirror, reflecting those waves. The camera either reflects a wave back toward the receiver or switches its mirror into an absorber so that nothing is reflected.

A hydrophone next to the transmitter detects whether a signal is reflected back from the camera. A detected echo is a 1 bit; no echo is a 0 bit. The system uses this binary information to reconstruct and post-process the image.
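The reflect-or-absorb protocol above can be sketched as a round trip between the camera side and the hydrophone side. This is a simplified model under assumed values: the echo power, noise floor, and function names are illustrative, and real backscatter decoding must also contend with multipath and noise in the water.

```python
# Sketch of underwater backscatter signaling, with hypothetical constants.
REFLECTED_POWER = 1.0   # assumed echo strength when the camera reflects
NOISE_FLOOR = 0.1       # assumed detection threshold at the hydrophone

def camera_transmit(bits):
    """Camera side: flip one switch per bit, reflecting the incoming
    sound wave for a 1 and absorbing it for a 0."""
    return [REFLECTED_POWER if b else 0.0 for b in bits]

def hydrophone_receive(echo_powers):
    """Receiver side: any echo above the noise floor decodes as 1,
    silence decodes as 0."""
    return [1 if p > NOISE_FLOOR else 0 for p in echo_powers]

payload = [1, 0, 1, 1, 0, 0, 1]
assert hydrophone_receive(camera_transmit(payload)) == payload
print("round trip ok")
```

The sketch makes the power argument concrete: the camera never generates an acoustic signal of its own, it only toggles a switch, which is why transmission costs so little energy compared with an active acoustic modem.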

“This whole process, since it requires just a single switch to convert the device from a nonreflective state to a reflective state, consumes five orders of magnitude less power than typical underwater communication systems,” explains Afzal.

The researchers tested the camera in several underwater environments. In one, they captured color images of plastic bottles floating in a New Hampshire pond. They also took photos of an African starfish of high enough quality that the tiny tubercles along its arms were clearly visible. And the device repeatedly imaged the underwater plant Aponogeton ulvaceus in a dark environment over the course of a week to monitor its growth.

Now that they have demonstrated a working prototype, the researchers plan to improve the device so that it is practical for deployment in real environments. They want to increase the camera’s memory so that it can capture photos in real time, stream images, or even shoot underwater video.

They also want to extend the range of the camera. They managed to transmit data 40 meters from the receiver, but extending that range would allow the camera to be used in more underwater environments.

“This will open great research opportunities both in low-power IoT devices as well as underwater monitoring and research,” says Haitham Al-Hassanieh, Assistant Professor of Electrical and Computer Engineering at the University of Illinois Urbana-Champaign, who was not involved in this research.

This research is supported, in part, by the Office of Naval Research, the Sloan Research Fellowship, the National Science Foundation, the MIT Media Lab, and the Doherty Chair in Ocean Utilization.