The southern ring nebula, captured by the James Webb Space Telescope (JWST).
NASA, ESA, CSA and STScI
You saw the first full-color images from the James Webb Space Telescope, right? A stellar nursery revealing previously invisible stars, the analyzed atmosphere of a giant exoplanet, a cluster of galaxies, a beautiful planetary nebula, and the deepest image of our universe ever captured.
Very cool, eh? But were they real?
Of course they were real!
Were they exactly as Webb captured them in a single image, as if you were taking a picture with your phone?
Not at all.
Webb is designed to be sensitive to light that we cannot see. It also has four scientific instruments and 17 observing modes.
“When you get the data, they don’t look at all like a beautiful color image,” said Klaus Pontoppidan, a Webb project scientist at STScI who leads a team of 30 expert image processors. “It’s only if you know what to look for that you can appreciate them.”
Webb’s engineers had to heavily process the images we saw before releasing them, and for fairly simple, common-sense reasons.
So what’s going on?
This is not like just taking a picture on your phone.
Image planning
First comes the selection of shots. NASA was looking for targets that would frame well, show structure, and make good use of color, while also highlighting the science.
Webb cannot see all parts of the sky at any one time. And given that the telescope’s launch was delayed several times, there was no way engineers could meticulously plan the first images until Webb finally launched last December.
When it did, the engineers had a list of about 70 targets, selected to demonstrate the breadth of science Webb was capable of and to yield spectacular color images.
“Once we knew when we could take the data, we could go down that list and pick the highest-priority targets that were visible at the time,” Pontoppidan said. “The images were planned for a long time [and] a lot of work was done to simulate what the observations would look like so that everything could be set up correctly.”
The Carina Nebula, captured by the James Webb Space Telescope (JWST).
NASA, ESA, CSA and STScI
How Webb’s data returns to Earth
Before engineers can start processing Webb’s images, the raw data must be returned to our planet from a million miles away in space. This is done using NASA’s Deep Space Network (DSN), managed by JPL, which is how engineers communicate with and receive data from more than 30 robotic probes across the solar system and beyond, including Webb. The DSN has three complexes, spaced roughly 120° apart around the globe: one in California, one in Madrid in Spain, and one in Canberra in Australia.
Radio waves are very reliable, but slow. The data arrives at a couple of megabits per second (Mbps). However, the DSN will soon be upgraded from slow radio transmissions to super-fast “space lasers” that could massively increase data rates, up to 10 or even 100 times faster.
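To put those rates in perspective, here is a quick back-of-the-envelope calculation. The 1 GB payload is a hypothetical example chosen only for illustration, not an official Webb figure; the rates follow the article’s “couple of Mbps” baseline and the projected 10x and 100x laser speedups.

```python
# Back-of-the-envelope: how long a downlink takes at different data rates.
# The 1 GB payload is hypothetical, not an official Webb data volume.

def transfer_hours(payload_gigabytes, rate_mbps):
    """Hours needed to send a payload at a given rate in megabits/second."""
    bits = payload_gigabytes * 8e9       # gigabytes -> bits
    seconds = bits / (rate_mbps * 1e6)   # megabits/s -> bits/s
    return seconds / 3600

for rate in (2, 20, 200):  # today's radio rate, then 10x and 100x (lasers)
    print(f"1 GB at {rate:3d} Mbps: {transfer_hours(1, rate):5.2f} hours")
```

At 2 Mbps a single gigabyte takes over an hour to come down; a 100x optical link would shrink that to well under a minute.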
“We plan things, upload them to the observatory, take the data and bring it back down to Earth; then we have another long period of time where we process the data,” Pontoppidan said.
Why the colors in Webb’s photos are fake
Are the images from the Webb telescope colorful? Are the colors in the space photos real? No, they are not. The Webb telescope sees in infrared. It is up there specifically to detect infrared light, the faintest and most distant light in the cosmos.
It basically sees heat radiation, not visible light. It looks at another part of the electromagnetic spectrum:
Electromagnetic spectrum; the visible range (shaded part) is shown enlarged to the right. (Photo by Encyclopaedia Britannica / UIG via Getty Images)
Universal Image Group via Getty Images
Think of a rainbow. At one end it is red; at the other, blue or purple. The full spectrum is actually much wider, but those two ends mark the limits of the colors the human eye can perceive. Beyond blue are increasingly shorter wavelengths of light for which we have no everyday names. The same goes beyond red, where the wavelength of light gets longer.
This is where Webb looks: the infrared part of the electromagnetic spectrum.
It uses masking techniques (filters) that allow it to detect dim light sources next to very bright ones. But none of that is in “color”.
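The divide described above can be sketched numerically. This is a toy illustration only: the band edges are approximate (the human eye sees roughly 380 to 750 nanometers), and the instrument ranges in the comments are rounded published figures for Webb’s NIRCam and MIRI cameras.

```python
# Rough classification of light by wavelength (in nanometers).
# Band edges are approximate; the eye sees roughly 380-750 nm.

def classify(wavelength_nm):
    if wavelength_nm < 380:
        return "ultraviolet or shorter"
    if wavelength_nm <= 750:
        return "visible"
    return "infrared"

# Webb's instruments work past the red edge of human vision:
# NIRCam covers roughly 600-5,000 nm; MIRI roughly 5,000-28,000 nm.
print(classify(650))    # deep red: visible
print(classify(2000))   # 2 microns: infrared, NIRCam territory
print(classify(10000))  # 10 microns: infrared, MIRI territory
```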
So how can the photos we see possibly be in color for us?
How Webb’s photos are colored
Webb’s images are shifted across the electromagnetic spectrum, from a part we cannot perceive to the visible-light part we can see.
Webb takes monochrome brightness images using up to 29 different narrowband filters, each of which detects a different wavelength of infrared light. Engineers assign the light collected by each filter a different visible color, from red (assigned to the longest wavelengths) to blue (the shortest). Then they create a composite image.
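That longest-to-red, shortest-to-blue assignment can be sketched in a few lines. This is a simplified illustration, not the STScI pipeline: the three filter wavelengths match real NIRCam filters (4.44, 2.00, and 0.90 microns), but the tiny 2x2 “images” and their brightness values are invented for demonstration.

```python
# Minimal false-color compositing sketch (not the STScI pipeline).
# Three monochrome infrared frames are mapped to visible channels:
# the longest wavelength becomes red, the shortest becomes blue.

def false_color(filters):
    """filters: dict of {wavelength_in_microns: 2D list of brightness 0..1}.
    Returns a 2D list of (r, g, b) tuples."""
    waves = sorted(filters, reverse=True)  # longest infrared first -> red
    assert len(waves) == 3, "this sketch composites exactly three filters"
    red, green, blue = (filters[w] for w in waves)
    rows, cols = len(red), len(red[0])
    return [[(red[r][c], green[r][c], blue[r][c]) for c in range(cols)]
            for r in range(rows)]

# Hypothetical 2x2 monochrome frames at three real NIRCam filter
# wavelengths (microns); the brightness values are invented.
frames = {
    4.44: [[0.9, 0.1], [0.2, 0.8]],
    2.00: [[0.5, 0.5], [0.5, 0.5]],
    0.90: [[0.1, 0.9], [0.8, 0.2]],
}
rgb = false_color(frames)
print(rgb[0][0])  # pixel built from the 4.44, 2.00 and 0.90 micron frames
```

The real pipeline uses many more filters, careful brightness stretching, and expert judgment about which hues best separate physical features, but the core move is the same: each infrared channel gets a visible color.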
Is this cheating? Not really. All engineers are doing is taking radiation from a part of the spectrum that our eyes cannot see and shifting it to a part of the spectrum that we can see.
It’s like playing a song in a different key.
In addition, all cameras, including your smartphone, use filters to take the images you see. No, not Instagram filters, but individual red, green, and blue filters that, when combined, produce a visible image that looks “real”.
If you think Webb’s images aren’t real, you should also think your smartphone photos are fake.
The SMACS 0723 galaxy cluster, known as Webb’s first deep field, features “spiked” stars and even galaxies.
NASA, ESA, CSA, STScI, Webb ERO
How long does it take to process Webb images
It is a complex process that had never been done before with Webb data, so it takes a few weeks for each image to emerge in all its colorful splendor.
“Typically, the process from the telescope’s raw data to the final, clean image that communicates scientific information about the universe can take from weeks to a month,” said Alyssa Pagan, developer of scientific visuals at STScI.
It was definitely worth the wait.
“In the first images we have only a few days of observations,” Pontoppidan said. “This is really just the beginning and we’re just scratching the surface.”
I wish you clear skies and wide eyes.