The James Webb Space Telescope’s images didn’t start out stunning. Here’s how they began, and how researchers brought them to life

Erik Rosolowsky was like a child at Christmas when the James Webb Space Telescope began sending back images of a distant galaxy he had long awaited.

And the first person he wanted to show his new gift to was Uncle Mike.

Rosolowsky, a professor of physics at the University of Alberta, was among the first Canadian researchers to use the world’s newest and largest space telescope, in his case to observe star formation in the Triangulum Galaxy, also known as Messier 33.

“When my project was in progress, I would pull (data) from the archive as soon as it came in,” says Rosolowsky. “I sent a picture of it to my uncle: ‘Uncle Mike! You and I are the first people on Earth to see these pictures!’”

Uncle Mike, a backyard astronomer, is the one who first put Rosolowsky on the path that eventually led to him receiving images of a galaxy about 2.7 million light-years away from a $10 billion telescope orbiting the sun, 1.5 million kilometers from Earth.

But the photo Rosolowsky sent Uncle Mike – important as it was to both of them – was a far cry from the stunning full-color images released by NASA as JWST continues to garner acclaim for its high-resolution infrared observations of the universe.

Instead, Uncle Mike’s picture – a few spots of light on a dark background crossed with bands – looked something like this:

This stands in stark contrast – literally – with Hubble’s famous image of Triangulum, or even with Rosolowsky’s partially processed image from Webb, both of which show millions of stars, the galaxy’s spiral arms and, in the latter case, some of the turbulent regions where stars are born.

A stunningly detailed image of the Triangulum Galaxy (M33) shows its entire spiral face glowing with the light of nearly 25 million individually resolved stars, as captured by NASA’s Hubble Space Telescope. It is the largest high-resolution mosaic image of Triangulum ever assembled, consisting of 54 Hubble fields of view spanning an area more than 19,000 light-years across.
A view of the Triangulum Galaxy (M33), about 2.7 million light-years from Earth, as captured by the James Webb Space Telescope (JWST) and processed by Canadian astronomer Erik Rosolowsky.

The difference between the images illustrates the effort researchers put into translating raw JWST data into both the images astronomers use in research and the ones that have stunned the general public.

In fact, Rosolowsky’s raw data – part of an actual astronomical image as it appears in his computer code – looked something like this:

A portion of an image of the Triangulum Galaxy (M33), about 2.7 million light-years from Earth, as captured by the James Webb Space Telescope, shown as raw data as it appears in Canadian astronomer Erik Rosolowsky’s code.

What he sent to his uncle, he says, was a “quick and dirty look” at the Triangle Galaxy.

But before these images – or that data – can become remotely useful to him as a researcher, they must be corrected, not least for the quirks inherent in the equipment.

Despite the extreme meticulousness that went into the design and construction of JWST, the data it sends back is, in its rawest form, messy.

The images must be corrected for defects inherent in the cameras themselves.

Cosmic rays striking the telescope can create artifacts in the camera detectors, which are partially corrected for by taking multiple exposures of the same image.
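
To see why multiple exposures help, consider a minimal sketch in Python with invented numbers (not Webb data): a cosmic-ray hit corrupts a pixel in only one exposure, so a per-pixel median across the exposures rejects it.

```python
import numpy as np

# Hypothetical stack of three exposures of the same field. A cosmic-ray
# hit corrupts one pixel in one exposure only.
exposures = np.array([
    [[10.0, 12.0], [11.0, 10.0]],
    [[10.0, 12.0], [11.0, 9999.0]],  # cosmic-ray hit in this frame
    [[11.0, 13.0], [10.0, 10.0]],
])

# A per-pixel median across the exposures rejects the outlier: the hit
# appears in only one frame, so it never wins the median vote.
clean = np.median(exposures, axis=0)
print(clean)  # the 9999 spike is gone
```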

And even the pixels – the smallest light-sensitive units in the telescope’s detectors – have different sensitivities; one of the more than a million pixels in JWST’s MIRI instrument, for example, may be more sensitive than the pixel on one side of it and less sensitive than the one on the other.

Fortunately, JWST engineers have a solution for that – a complete calibration map of how to compensate for the differences in every pixel of every instrument on the Webb telescope.

“This is the work of hundreds of people,” says Rosolowsky. “When we say ‘turn on the telescope’ – it was launched, and then they went through that period where you didn’t hear anything about the results.

“That’s what they were doing there. It finally turned on. They start taking the pictures and say, ‘How do we remove all these patches?’”

All he has to do, he says, is apply JWST’s pixel calibration map to his data to correct for those differences in pixel sensitivity.
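
In practice, that kind of correction boils down to a per-pixel division, a technique astronomers call flat-fielding. Here is a minimal sketch of the idea in Python; the numbers are invented for illustration and are not taken from JWST’s real calibration files.

```python
import numpy as np

# Hypothetical detector readout: four pixels that all received the
# same amount of light but reported different values.
raw = np.array([[100.0, 98.0],
                [103.0, 51.0]])

# Hypothetical per-pixel sensitivity map (1.0 = nominal; 0.51 means
# that pixel reports only about half the light it receives).
sensitivity = np.array([[1.00, 0.98],
                        [1.03, 0.51]])

# Dividing by the map evens out the pixel-to-pixel differences.
corrected = raw / sensitivity
print(corrected)  # all four pixels now read ~100
```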

For most researchers, much of this technical work is carried out in what they universally call a “pipeline”.

Think of it as a data processing highway, where raw data from JWST is refined as it travels from source to destination.

Along this highway, some software processing units may apply corrections based on inconsistencies or aberrations in the detectors, others may apply corrections based on the calibrations of the cameras themselves, and still others may combine data from multiple exposures into single images.

But the highway also has a series of off-ramps from which researchers can pull their data in the earlier stages of processing for their own more specific uses.
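
The real-life highway is the open-source “jwst” Python package maintained by the Space Telescope Science Institute, which splits processing into numbered stages. Roughly, and with hypothetical filenames, chaining the stages looks like this:

```python
# A rough sketch of running the JWST calibration pipeline stage by
# stage (requires the "jwst" package from STScI). The filenames are
# hypothetical placeholders; each stage saves a file the next stage
# picks up, which is what lets researchers exit early with partially
# processed data.
from jwst.pipeline import Detector1Pipeline, Image2Pipeline, Image3Pipeline

# Stage 1: detector-level corrections (cosmic-ray hits, dark current...)
Detector1Pipeline.call("jw_example_uncal.fits", save_results=True)

# Stage 2: per-exposure calibrations (flat field, flux calibration...)
Image2Pipeline.call("jw_example_rate.fits", save_results=True)

# Stage 3: combine the calibrated exposures into a single mosaic.
# (This stage takes an "association" file listing the exposures to merge.)
Image3Pipeline.call("jw_example_asn.json", save_results=True)
```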

In Rosolowsky’s case, he wants some of that raw data, closer to the start of the pipeline.

It’s nice to have something cute to send to a loved one, Rosolowsky says, but researchers don’t do science from pretty pictures.

Much of his work is accomplished using raw numerical data.

“Beautiful photos are great for interpreting, building and telling a story,” he says. “But then supporting that story is what requires the computer processing – figuring out how to measure each star in this image, figuring out what it is, and then using that to support your claim.”

“Most of my job is programming, so I’m writing computer code that’s going to take that big chunk of numbers and convert it into star properties,” he says. “That’s my day job, when you get down to it.”

“Honestly, it’s been great.”
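
That kind of code is, in spirit, source extraction: find the star-like peaks in the grid of numbers and measure their positions and brightnesses. This is not Rosolowsky’s actual code, but a minimal sketch of the idea using NumPy and the photutils library, with an invented image and one injected fake star.

```python
import numpy as np
from astropy.stats import sigma_clipped_stats
from photutils.detection import DAOStarFinder

# Hypothetical stand-in for a calibrated Webb frame: noise plus one
# injected Gaussian "star" (this is not real JWST data).
rng = np.random.default_rng(0)
image = rng.normal(loc=5.0, scale=1.0, size=(128, 128))
yy, xx = np.mgrid[0:128, 0:128]
image += 80.0 * np.exp(-((xx - 60) ** 2 + (yy - 40) ** 2) / (2 * 2.0 ** 2))

# Estimate the background level and noise, then search for peaks that
# stand at least five sigma above the background.
mean, median, std = sigma_clipped_stats(image, sigma=3.0)
finder = DAOStarFinder(fwhm=4.7, threshold=5.0 * std)
sources = finder(image - median)

# "sources" is a table of star properties: positions, brightness, ...
print(sources["xcentroid", "ycentroid", "flux"])
```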

But the situation is different for Alyssa Pagan. She works with NASA as a science visualization developer, and she is one of the people responsible for processing the stunning images from the Webb telescope that have been released to the public.

In fact, the Carina Nebula’s “Cosmic Cliffs” image, one of the first five images released from JWST, was the first she worked on from Webb.

“It was crazy,” she says of working on the first images from Webb. “It was great, frankly, from the start.”

“The fact that I was one of the first people to see the image and … process it and put it out to the public – it was a huge honour. It was also a time to reflect on how far we have come, and on all the people involved in making that possible.”

For Pagan, the images she started working with were retrieved from the end of the pipeline – with most of the corrections done and the artifacts removed.

In fact, the image of the Carina Nebula was a composite of six monochrome images, taken with six different filters, each showing the nebula at a slightly different wavelength of infrared light – the Webb telescope’s specialty.

But the initial images she started working on weren’t much to look at – just a few white spots on a dark field, with no detail at all.

That’s because the dynamic range of the JWST cameras – the difference between black and white in the images – was too great for even Pagan’s high-end computer monitors to render. The information – the detail in the darker parts of the image – was there, but her monitors couldn’t display it.

Images of the Carina Nebula taken at different infrared wavelengths by the James Webb Space Telescope are displayed before processing.

The solution was to compress this dynamic range – bringing black and white closer together – to fit within the range of her screen. Somewhat confusingly, this is called “stretching” the image.
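
Astropy’s visualization tools provide exactly this kind of nonlinear stretch. A minimal sketch, run on an invented image rather than the Carina data, might look like this:

```python
import numpy as np
from astropy.visualization import AsinhStretch, ImageNormalize, PercentileInterval

# Hypothetical image with a huge dynamic range: faint structure
# everywhere, plus one pixel vastly brighter than the rest.
image = np.random.default_rng(1).exponential(scale=10.0, size=(64, 64))
image[10, 10] = 1e5  # one extremely bright "star"

# "Stretching": a nonlinear (asinh) mapping that compresses the range
# so faint and bright features both fit on a monitor's 0-to-1 scale.
norm = ImageNormalize(image,
                      interval=PercentileInterval(99.5),
                      stretch=AsinhStretch())
display = norm(image)  # values now in [0, 1]; faint detail is visible
```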

Images of the Carina Nebula taken at different infrared wavelengths by the James Webb Space Telescope are displayed after their dynamic ranges were compressed.

After stretching, she was able to see details in the images – the many faint stars and the clouds of dust and gas that characterize the nebula. But she was still looking at a series of black-and-white images; the next step was to assign a color to each of them. That isn’t done at random.

How does your brain understand light?

What we humans consider visible light is only a small part of the electromagnetic spectrum – the part our eyes are sensitive to. To one end of this band, where light has shorter wavelengths, our brain assigns bluish colors; to the other end, where wavelengths are longer, it assigns reddish hues.

On either side of that visible band are wavelengths our eyes can’t detect, though in some cases other animals – or specially designed instruments – can detect them.

The Webb telescope operates in the infrared; that is, beyond the red end of what we can see, in the section of the spectrum with wavelengths longer than we can perceive. That has advantages for astronomy: infrared light often penetrates clouds of dust and gas that block visible light.

But this means that to see what Webb observed, we have to create a representation of each image within the part of the electromagnetic spectrum our eyes are sensitive to – the visible-light band.

And to add color to those images, Pagan followed roughly the same convention as our brains: shorter wavelengths are assigned bluish colors, and longer wavelengths reddish hues.

Images of the Carina Nebula taken at different infrared wavelengths by the James Webb Space Telescope are displayed after dynamic range compression and with colors assigned to the wavelengths.

When that was done, she could stack those color photos on top of each other to make one full color photo.
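
Astropy ships a standard routine for that final combination step. A minimal sketch with three channels follows; the arrays are random stand-ins, and the real Carina composite blended six filters, with intermediate hues for the in-between wavelengths.

```python
import numpy as np
from astropy.visualization import make_lupton_rgb

# Hypothetical stand-ins for three monochrome filter images, ordered
# by wavelength (these are random arrays, not Webb data).
rng = np.random.default_rng(2)
longest = rng.exponential(3.0, size=(64, 64))   # longest wavelength -> red
middle = rng.exponential(3.0, size=(64, 64))    # middle wavelength -> green
shortest = rng.exponential(3.0, size=(64, 64))  # shortest wavelength -> blue

# Stack the three channels into one full-color (64, 64, 3) image,
# following the chromatic ordering described above.
rgb = make_lupton_rgb(longest, middle, shortest, stretch=5)
```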

A full-color image of the Carina Nebula’s “Cosmic Cliffs,” created by stacking a series of images taken at different infrared wavelengths by the James Webb Space Telescope, appears before final processing.

At this point, she says, she began balancing science and aesthetics.

“This is where it becomes completely subjective,” she says. “We’re working with scientists through this process…to make sure we’re showing the data as accurately as possible and being as honest as possible.

“But it’s subjective now because we adjust like tonality, contrast, color and all those things to make the features in each filter more prominent or just to show and highlight those different areas.”

Working in Photoshop, Pagan tweaked the image to produce something aesthetically pleasing — “organic” being the term she likes to use — and scientifically accurate.

A full-color image of the Carina Nebula’s “Cosmic Cliffs,” created by stacking a series of images taken at different infrared wavelengths by the James Webb Space Telescope, appears after final processing.

When it was released by NASA on July 12, the image of the Carina Nebula dazzled astronomers and the public alike, many of whom had waited decades to see the depths of the universe through the eyes of the most powerful telescope ever built.

It was, in the words of JWST’s Canadian Scientific Director, René Doyon, “the beautiful bridge between science and art.”

“It’s great to see everyone talking about it,” says Pagan. “It feels like a beautiful, connected and universal moment to appreciate science and humanity.”
