Are the colors in the Webb Telescope images “false”?

On July 12, the first full-color images from the Webb Space Telescope showed nebulae, galaxies and a gaseous exoplanet as never seen before. But Webb only picks up near- and mid-infrared light, which the human eye can’t see, so where do these gorgeous colors come from?

Image developers on the Webb team are tasked with turning the telescope’s infrared image data into some of the most vivid views of the cosmos we’ve ever seen. They assign various infrared wavelengths to the colors of the visible spectrum: reds, blues, yellows, and so on.

“Something I’ve been trying to get people to think about is to stop hanging on to the idea of ‘what would this look like if I could fly out there in a spaceship and look at it?'” said Joe DePasquale, a senior data imaging developer at the Space Telescope Science Institute, in a phone call with Gizmodo. “You wouldn’t ask a biologist what the coronavirus would look like if you could somehow shrink yourself down to the size of a cell and look at it.”

Webb’s first test images helped check the alignment of its mirrors and captured an orange photo of the Large Magellanic Cloud. Those first snapshots were not representative color images; one used a monochromatic filter (its image was grayscale) and the other only translated infrared light into the visible color bands from red to yellow, so the team could see certain features of the cloud they were imaging. But now, with the telescope up and running, the images being released are full of blazing color, like this recent portrait of the Cartwheel Galaxy.

Astronomy is often done outside the visible spectrum, because many of the most interesting objects in space shine with ultraviolet radiation, X-rays, and even radio waves (which category the light falls into depends on the wavelength of its photons). The Webb telescope is designed to see infrared light, whose wavelengths are longer than red visible light but shorter than microwaves.

Infrared light can penetrate thick clouds of gas and dust in space, allowing researchers to see previously hidden secrets of the universe. Especially intriguing to scientists is that light from the early universe has been stretched as the universe has expanded, meaning that what was once ultraviolet or visible light may now be infrared (what is known as “redshifted” light).
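That stretching follows a simple rule: the observed wavelength equals the emitted wavelength multiplied by (1 + z), where z is the redshift. A rough sketch in Python (the z = 6 value here is merely illustrative of the distant galaxies in Webb’s deep fields, not a figure from the article):

```python
# Cosmological redshift: light emitted at one wavelength is stretched by a
# factor of (1 + z) by the time it reaches us. Illustrative example only.

def observed_wavelength_nm(emitted_nm: float, z: float) -> float:
    """Return the wavelength (in nm) a photon is observed at after redshift z."""
    return emitted_nm * (1 + z)

# Ultraviolet Lyman-alpha light (121.6 nm) emitted by a galaxy at z = 6
# arrives at roughly 851 nm -- no longer ultraviolet, but near-infrared,
# squarely within Webb's sensitivity range.
lyman_alpha_observed = observed_wavelength_nm(121.6, z=6)
print(round(lyman_alpha_observed, 1))
```

This is exactly why a telescope hunting the early universe was built to see infrared rather than visible light.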

“These are instruments that we have designed to expand the power of our vision: to go beyond what our eyes are capable of, to see light to which our eyes are not sensitive, and to resolve objects that we could never see with our eyes alone,” DePasquale said. “I’m trying to bring out the most detail and the greatest richness of color and complexity that is inherent in the data without changing anything.”

Webb’s raw images are so loaded with data that they need to be reduced before they can be translated into visible light. The images must also be cleaned of artifacts such as cosmic rays and reflections from bright stars that hit the telescope’s detectors. If you look at a Webb image before this processing work is done, it looks like a black rectangle with some white dots.

A raw image of the Carina Nebula as seen by NIRCam, before the infrared light is translated into visible wavelengths. (Image: Space Telescope Science Institute)
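Part of why raw frames look so empty is dynamic range: a few stars are vastly brighter than everything else, so a linear display shows only them. A common remedy in astronomical image processing is a nonlinear stretch, such as an arcsinh stretch, which lifts faint detail without blowing out bright sources. Here is a minimal sketch (a generic technique, not necessarily the Webb team’s actual pipeline; the simulated data is made up):

```python
import numpy as np

def asinh_stretch(data: np.ndarray, softening: float = 0.1) -> np.ndarray:
    """Map raw pixel values to [0, 1] using an arcsinh stretch.

    Small 'softening' values compress bright pixels more aggressively,
    letting faint background structure become visible.
    """
    scaled = (data - data.min()) / (data.max() - data.min())  # normalize to 0..1
    return np.arcsinh(scaled / softening) / np.arcsinh(1.0 / softening)

# Simulated "raw" frame: faint exponential background plus one bright star
# that dwarfs everything else, as in an unprocessed detector readout.
rng = np.random.default_rng(0)
raw = rng.exponential(scale=10.0, size=(64, 64))
raw[10, 10] = 50_000.0  # the lone "white dot" on the black rectangle

stretched = asinh_stretch(raw)
# After the stretch, faint pixels occupy a visible share of the 0..1 range
# instead of being crushed to near-zero by the bright star.
```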

“I think there are some connotations that go along with ‘coloring’ or ‘false color’ that imply that there is some process going on where we arbitrarily choose colors to create a color image,” DePasquale said. “Representative color is the preferred term for the kind of work we do, because I think it encompasses the work we do of translating light to create a true color image, but in a range of wavelengths to which our eyes are not sensitive.”

Longer infrared wavelengths are assigned redder colors and shorter infrared wavelengths are assigned bluer colors. (Blue and violet light have the shortest wavelengths within the visible spectrum, while red has the longest.) The process is called chromatic ordering, and the visible spectrum is divided into as many colors as there are filters needed to capture the full range of light represented in the image.
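A toy version of that ordering in Python: sort the filters by wavelength and spread them across visible hues, shortest to bluest, longest to reddest. The filter names below are real NIRCam filters with approximate wavelengths in microns, but the specific hue assignments are illustrative, not the team’s actual choices for any given image:

```python
# Chromatic ordering sketch: each instrument filter captures one slice of
# infrared light; we assign visible hues in wavelength order.

# NIRCam filters with approximate pivot wavelengths in microns.
filters = {"F090W": 0.90, "F200W": 1.99, "F335M": 3.36, "F444W": 4.40}

# One visible hue per filter, running blue -> red.
hues = ["blue", "cyan", "orange", "red"]

ordered = sorted(filters, key=filters.get)  # shortest wavelength first
assignment = dict(zip(ordered, hues))
print(assignment)
# {'F090W': 'blue', 'F200W': 'cyan', 'F335M': 'orange', 'F444W': 'red'}
```

Compositing the filter images, each tinted with its assigned hue, yields the final representative-color picture.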

“We have filters on the instruments that pick up certain wavelengths of light, to which we then apply a color that is closest to where we think it would fall in the [visible] spectrum,” Alyssa Pagan, a science visuals developer at the Space Telescope Science Institute, said in a phone call with Gizmodo.

Chromatic ordering also depends on the elements being imaged. When working with narrowband wavelengths in optical light (oxygen, ionized hydrogen, and sulfur, Pagan suggests), the latter two both emit in the red. The hydrogen might therefore be shifted to green visible light, in order to give more information to the viewer.
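Pagan’s optical example can be made concrete. Hydrogen-alpha (656 nm) and ionized sulfur (~672 nm) both emit red light, so a strictly “true color” mapping would make the two elements indistinguishable; reassigning hydrogen to green, as in the well-known “Hubble palette,” separates them. A sketch (the palette shown is the standard Hubble convention, offered here as an illustration rather than something stated in the article):

```python
# Emission lines and their approximate wavelengths in nanometers.
emission_nm = {"S II": 672, "H-alpha": 656, "O III": 501}

# "True color" mapping: anything above ~620 nm just looks red, so
# sulfur and hydrogen collapse into the same channel.
true_color = {line: ("red" if nm > 620 else "blue-green")
              for line, nm in emission_nm.items()}
print(true_color)

# The Hubble palette spreads the three elements across distinct channels,
# trading literal color for legibility.
hubble_palette = {"S II": "red", "H-alpha": "green", "O III": "blue"}
print(hubble_palette)
```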

“It’s a balance between art and science, because you want to show the science and the features, and sometimes those two things don’t necessarily work together,” Pagan added.

Webb’s first representative color images were released on July 12, more than six months after the telescope was launched from an ESA spaceport in French Guiana. From there, Webb traveled about a million miles to L2, a point in space where gravitational effects allow spacecraft to stay in place without burning much fuel.

The telescope unfolded on its way to L2, so once there, mission scientists could begin aligning the $10 billion observatory’s mirrors and powering up its instruments. The telescope has four instruments: a near-infrared camera (NIRCam), a near-infrared spectrograph, a mid-infrared instrument (MIRI), and a fine guidance sensor paired with a slitless spectrograph for pinpointing targets and characterizing the atmospheres of exoplanets.

The large amounts of dust in some galaxies and nebulae are transparent to NIRCam, allowing it to capture bright stars at shorter wavelengths. MIRI, on the other hand, can observe disks of material that will eventually coalesce into planets, as well as dust warmed by starlight.

When the telescope’s images are being assembled, image processors work with instrument scientists to decide which features of a given object should be highlighted in the image: its hot gas, perhaps, or a cool tail of dust.

When Webb imaged Stephan’s Quintet, a visual grouping of five galaxies, the finished product was a 150-million-pixel image made up of 1,000 exposures taken by both MIRI and NIRCam. In MIRI’s view alone, however, the hot dust dominates the picture. In the background of the MIRI images, distant galaxies shine in different colors; DePasquale said the team calls them “buns.”

DePasquale and Pagan helped create Webb’s images as we might see them, rich in color and cosmic meaning. In the case of the panorama of the Cosmic Cliffs of the Carina Nebula, different filters captured the ionized blue gas and red dust. In initial passes at imaging the nebula, the gas obscured the dust structure, so the scientists asked the image processing team to “tone down the gas a bit,” Pagan said.

Collecting light in Webb’s hexagonal mirrors is only half the battle when it comes to viewing the distant universe. Translating what’s there is another beast entirely.

