Researchers from MIT and Harvard have joined forces to create an algorithm that could finally allow us to visualize black holes. Their work pieces together bits and fragments of information gathered from all around the world in the hope of creating a fuller picture of some of the most massive objects in the Universe.
The algorithm stitches together data collected by radio telescopes scattered across the globe, effectively turning the entire planet into one huge radio dish, an effort known as the Event Horizon Telescope project.
“Radio wavelengths come with a lot of advantages,” says Katie Bouman, an MIT graduate student in electrical engineering and computer science, who led the development of the new algorithm. “Just like how radio frequencies will go through walls, they pierce through galactic dust. We would never be able to see into the center of our galaxy in visible wavelengths because there’s too much stuff in between.”
However, it’s not all rosy: long wavelengths bring downsides of their own. The first is that they require big antennas. The second is that, for a given dish size, longer wavelengths mean lower-resolution pictures, because a telescope’s resolving power scales with the ratio of wavelength to dish diameter. For example, the largest radio-telescope dish has a diameter of 1,000 feet (300 meters), yet its image of the Moon would look blurrier than one from a commercial backyard telescope.
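To get a feel for the numbers, here is a quick back-of-the-envelope comparison. The dish sizes and wavelengths below are illustrative assumptions, not figures from the article: a roughly 300-meter dish observing at a 21-centimeter radio wavelength versus a 15-centimeter backyard telescope observing visible light.

```python
import math

# Rough diffraction-limit comparison. The specific dish sizes and wavelengths
# are illustrative assumptions, not numbers from the article.
def resolution_arcsec(wavelength_m, aperture_m):
    """Smallest resolvable angle, roughly 1.22 * lambda / D, in arcseconds."""
    return math.degrees(1.22 * wavelength_m / aperture_m) * 3600

radio = resolution_arcsec(0.21, 300.0)      # ~300 m radio dish at a 21 cm wavelength
optical = resolution_arcsec(550e-9, 0.15)   # 15 cm backyard telescope, visible light

moon_arcsec = 0.5 * 3600                    # the Moon spans roughly half a degree
print(f"Radio dish:         {radio:.0f} arcsec  (~{moon_arcsec / radio:.0f} resolution elements across the Moon)")
print(f"Backyard telescope: {optical:.1f} arcsec (~{moon_arcsec / optical:.0f} elements across the Moon)")
```

Under these assumptions the big radio dish gets only about ten resolution elements across the face of the Moon, while the small optical telescope gets a couple of thousand, which is why the radio image looks so much blurrier.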
Because black holes are so compact and so far away, they appear vanishingly small on the sky, and that makes this a big issue.
“A black hole is very, very far away and very compact,” Bouman says. “It’s equivalent to taking an image of a grapefruit on the moon, but with a radio telescope. To image something this small means that we would need a telescope with a 10,000-kilometer diameter, which is not practical, because the diameter of the Earth is not even 13,000 kilometers.”
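A rough sanity check of that comparison, using assumed numbers rather than figures from the article (a roughly 12-centimeter grapefruit, the Moon at about 384,000 kilometers, and the roughly 1.3-millimeter wavelength the Event Horizon Telescope observes at):

```python
import math

# Back-of-the-envelope check of the "grapefruit on the Moon" comparison.
# All inputs are illustrative assumptions, not figures from the article.
GRAPEFRUIT_DIAMETER_M = 0.12      # a ~12 cm grapefruit
MOON_DISTANCE_M = 3.84e8          # average Earth-Moon distance
WAVELENGTH_M = 1.3e-3             # ~1.3 mm, roughly the EHT's observing band

# Small-angle approximation for how big the grapefruit looks from Earth.
angular_size_rad = GRAPEFRUIT_DIAMETER_M / MOON_DISTANCE_M
angular_size_uas = math.degrees(angular_size_rad) * 3600 * 1e6

# Diffraction limit turned around: to resolve an angle theta, you need an
# aperture of roughly 1.22 * lambda / theta.
required_diameter_km = 1.22 * WAVELENGTH_M / angular_size_rad / 1000

print(f"Grapefruit angular size: ~{angular_size_uas:.0f} microarcseconds")
print(f"Required aperture:       ~{required_diameter_km:.0f} km")
```

Merely separating the grapefruit from the background already calls for an aperture several thousand kilometers across; resolving detail a few times finer pushes the figure toward the 10,000 kilometers Bouman mentions, which is why only an Earth-spanning array will do.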
The solution they came up with was to combine measurements from observatories at widely separated locations. Six observatories have already joined in, and many more are set to follow. But even with so many different sources of data, the resolution still isn’t good enough, and there are still wide gaps in the coverage; this is where the algorithm steps in.
“There is a large gap between the needed high recovery quality and the little data available,” says Yoav Schechner, a professor of electrical engineering at Israel’s Technion, who was not involved in the work. “This research aims to overcome this gap in several ways: careful modeling of the sensing process, cutting-edge derivation of a prior-image model, and a tool to help future researchers test new methods.”
They use interferometry, a technique that combines the signals arriving at several different telescopes and derives information from the way those signals interfere with one another.
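As a toy illustration of the idea (a sketch under simplifying assumptions, not the EHT’s actual correlator): two antennas record the same wavefront, one slightly later than the other because of the geometric path difference, and averaging the product of one recording with the conjugate of the other yields a complex “visibility” whose phase encodes that delay.

```python
import numpy as np

# Toy two-antenna correlation. The recorded voltages are modeled as complex
# (analytic) signals for simplicity, and all numbers are arbitrary; this is a
# sketch of the interferometry idea, not the EHT's processing chain.
freq_hz = 1.0e9                 # observing frequency
delay_s = 0.3e-9                # geometric delay between the two antennas
rng = np.random.default_rng(1)
t = np.arange(0, 1e-6, 1 / (8 * freq_hz))

# Antenna B sees the same wavefront as antenna A, shifted by the delay;
# each station also adds its own receiver noise.
noise_a = 0.5 * (rng.standard_normal(t.size) + 1j * rng.standard_normal(t.size))
noise_b = 0.5 * (rng.standard_normal(t.size) + 1j * rng.standard_normal(t.size))
signal_a = np.exp(2j * np.pi * freq_hz * t) + noise_a
signal_b = np.exp(2j * np.pi * freq_hz * (t - delay_s)) + noise_b

# Correlate: average one recording against the conjugate of the other.
visibility = np.mean(signal_a * np.conj(signal_b))

print("Measured phase:", np.angle(visibility))
print("Expected phase:", 2 * np.pi * freq_hz * delay_s)
```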
Usually, the technique requires only two telescopes, but there’s a problem: the Earth’s atmosphere slows radio waves down, and by different amounts at each site, causing large differences in arrival times and introducing errors into the measurements. Bouman adopted a clever algebraic solution to this problem: if the measurements from three telescopes are multiplied together, the extra delays caused by atmospheric noise cancel each other out, an MIT press release explains. This means more data is needed (three telescopes instead of two), but the results are much more precise.
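In radio-astronomy terms this is the closure-phase trick: each station contributes its own unknown atmospheric phase delay, but when the pairwise measurements around a triangle of stations are multiplied together, every station’s phase enters once with a plus sign and once with a minus sign and drops out. A minimal sketch, with made-up visibility values:

```python
import numpy as np

# Closure-phase sketch with made-up numbers: three stations, three baselines.
rng = np.random.default_rng(0)

# Hypothetical "true" visibilities on the baselines (1-2), (2-3), (3-1).
true_vis = {
    (1, 2): 0.8 * np.exp(1j * 0.4),
    (2, 3): 0.5 * np.exp(-1j * 1.1),
    (3, 1): 0.6 * np.exp(1j * 0.9),
}

# Each station picks up its own unknown atmospheric phase delay.
station_phase = {s: rng.uniform(-np.pi, np.pi) for s in (1, 2, 3)}

# A baseline (i, j) measures the true visibility corrupted by the phase
# difference between its two stations.
measured_vis = {
    (i, j): v * np.exp(1j * (station_phase[i] - station_phase[j]))
    for (i, j), v in true_vis.items()
}

# Multiplying around the triangle makes every station phase appear once with
# each sign, so the atmospheric terms cancel.
closure_measured = measured_vis[(1, 2)] * measured_vis[(2, 3)] * measured_vis[(3, 1)]
closure_true = true_vis[(1, 2)] * true_vis[(2, 3)] * true_vis[(3, 1)]

print("Closure phase (measured):", np.angle(closure_measured))
print("Closure phase (true):    ", np.angle(closure_true))   # the same, up to rounding
```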
The algorithm fills in the gaps left by these sparse measurements using prior knowledge of what such images tend to look like, which Schechner illustrates with an analogy. “Suppose you want a high-resolution video of a baseball,” he explains. “The nature of ballistic trajectory is prior knowledge about a ball’s trajectory. In essence, the prior knowledge constrains the sought unknowns. Hence, the exact state of the ball in space-time can be well determined using sparsely captured data.”
Bouman will present her new algorithm — which she calls CHIRP, for Continuous High-resolution Image Reconstruction using Patch priors — at the Computer Vision and Pattern Recognition conference in June.