Who, How And Why Processes Images From Space - Alternative View


Photos from space published on the websites of NASA and other space agencies often attract the attention of those who doubt their authenticity: critics find traces of editing, retouching, or color manipulation in the images. This has been the case since the birth of the "Moon hoax" theory, and by now the pictures taken not only by Americans but also by Europeans, Japanese, and Indians have come under suspicion. Let us examine why space images are processed at all and whether, despite this, they can be considered authentic.

To correctly assess the quality of the space images we see on the Web, two important factors must be taken into account. One relates to how the agencies interact with the general public; the other is dictated by the laws of physics.

Public relations

Space imagery is one of the most effective means of popularizing the work of research missions in near and deep space. However, not all footage is immediately made available to the media.

Images obtained from space can be roughly divided into three groups: raw, scientific, and public. Raw, or original, files from spacecraft are sometimes available to everyone and sometimes not. For example, images from the Mars rovers Curiosity and Opportunity, or from the Cassini probe at Saturn, are published in near real time, so anyone can see them at the same time as the scientists studying Mars or Saturn. Raw photos of the Earth from the ISS are uploaded to a separate NASA server. Astronauts shoot them by the thousands, and no one has time to preprocess them. The only thing added to them on Earth is geo-referencing to facilitate search.

This is not the case with Messenger, New Horizons, or Dawn. Raw images from these spacecraft are not published immediately upon receipt but are released with a delay of weeks, months, or even years. This gives the scientists working on the corresponding projects time to analyze the data in peace and, in the event of a discovery, report it first at conferences.

Files with scientific imagery often come in specialized formats that only dedicated programs or applications understand. Such files contain a large amount of information about the circumstances of the shot (time, position of the spacecraft, position of the subject, lighting angle, exposure parameters, etc.). This information, while not classified, is so uninteresting to most spaceflight enthusiasts that it is usually posted in places convenient for scientists but off-putting to outsiders because of their complex interfaces. Publicly accessible examples of such sites and FTP servers are NASA's PDS, ESA's PSA, and the JAXA archive. Even China posts footage from the Moon on the website of its Academy of Sciences (whose server periodically crashes). When the previous Russian meteorological satellite Electro-L was still imaging, its frames could be found on the NTsOMZ server; there are no images from the new satellite in the public domain at all. From the remote sensing satellites, only preview images can be viewed; the full images have to be ordered on the Roscosmos Geoportal.


But it is usually the public shots attached to press releases from NASA and other space agencies that are criticized for retouching, because these are what catch the eye of Internet users first. And if you look, you can indeed find a lot there. Color manipulation, for example:

Photo of the landing platform of the Spirit rover in the visible range of light and with the capture of the near infrared.

And the overlaying of multiple shots:

Earth rise over the lunar crater Compton.

And copy-and-paste manipulation of image fragments:

Copy-paste traces on a composite image of the Earth.

And even outright retouching, with some parts of the image erased. NASA's motivation for all these manipulations is so simple that not everyone is ready to believe it: it just looks better.

And in truth, the bottomless blackness of space does look more impressive when it is not disturbed by debris on the lens or charged-particle traces on the film. A color frame is indeed more attractive than a black-and-white one. A panorama is better than a single shot. At the same time, it is important that in NASA's case you can almost always find the source frames and compare one with the other. For example, here are the original version (AS17-134-20384) and the print version (GPN-2000-001137) of an image from Apollo 17 that is cited as almost the main proof of lunar photo retouching:

One of the frames from the Apollo 17 mission.

A highlighted version of the original image.

A highlighted version of the published image.

Or find the "selfie stick" of the rover, which "disappeared" when creating its self-portrait:

Curiosity snapshot dated January 14, 2015, sol 868.

Physics of digital photography

Typically, those who blame space agencies for manipulating color, using filters, or publishing black-and-white photographs "in this digital age" do not consider the physics of digital imaging. They believe that if a smartphone or camera produces color frames instantly, a spacecraft should be all the more capable of it, and they are often unaware of the complex operations required before a color image can appear on the screen.

Let's go over the theory of digital photography. The sensor of a digital camera is, in essence, a solar battery: where there is light, there is current; no light, no current. Only the sensor is not one battery but an array of tiny ones, the pixels, each of which is read out separately. Optics focus light onto the sensor, and electronics read the intensity of the charge in each pixel. From this data an image is built in shades of gray, from zero current in the dark to the maximum in bright light, so the raw output is black and white. To make it colored, color filters must be applied. It turns out, oddly enough, that color filters are present in every smartphone and every digital camera from the nearest store! (For some this information is trivial, but in the author's experience, for many it will be news.) In conventional photographic equipment, an alternating pattern of red, green, and blue filters is laid over the individual pixels of the sensor: the so-called Bayer filter.

A Bayer filter is half green, while red and blue each take up one quarter of the area.
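The idea of recovering color from a Bayer mosaic can be sketched in a few lines. This is an illustrative toy, not any agency's actual pipeline: it uses an assumed RGGB layout and the crudest possible scheme, collapsing each 2x2 cell into one RGB pixel, whereas real cameras interpolate a full-resolution value for every pixel.

```python
def demosaic_rggb(mosaic):
    """Toy Bayer demosaic for an RGGB mosaic laid out as
       R G R G ...
       G B G B ...
    Each 2x2 cell becomes one (R, G, B) pixel; the two green
    samples in the cell are averaged. Returns half-resolution rows."""
    rgb_rows = []
    for y in range(0, len(mosaic) - 1, 2):
        row = []
        for x in range(0, len(mosaic[y]) - 1, 2):
            r = mosaic[y][x]                                  # top-left: red
            g = (mosaic[y][x + 1] + mosaic[y + 1][x]) / 2     # two greens
            b = mosaic[y + 1][x + 1]                          # bottom-right: blue
            row.append((r, g, b))
        rgb_rows.append(row)
    return rgb_rows

# Raw sensor readout: one brightness value per pixel, no color yet.
raw = [
    [200, 90, 180, 85],
    [95, 40, 100, 45],
    [190, 88, 170, 80],
    [92, 38, 98, 42],
]
print(demosaic_rggb(raw))
```

The point of the sketch is simply that color never comes off the sensor directly; it is always computed from single-channel samples.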

NASA is not tasked with delivering beautiful photographs for press releases and media. Spacecraft cameras are primarily engineering or scientific instruments that help control these spacecraft or obtain information about space. We have already discussed this in detail in the article "How the planets are investigated with the help of light."

Here we will repeat: navigation cameras produce black-and-white images because such files weigh less and because color is simply not needed there. Scientific cameras can extract more information about space than the human eye can perceive, and so a wider range of color filters is used with them:

Sensor and filter wheel of the OSIRIS instrument on Rosetta.
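With a filter wheel, each exposure is a grayscale frame taken through one filter, and a color picture exists only after someone assigns those frames to channels. A minimal sketch of that compositing step, with hypothetical filter wavelengths in the comments:

```python
def compose_rgb(frame_r, frame_g, frame_b):
    """Zip three grayscale exposures (2D lists of 0-255 values),
    each shot through a different filter, into per-pixel
    (R, G, B) tuples. The channel assignment is a human choice."""
    return [
        [(r, g, b) for r, g, b in zip(row_r, row_g, row_b)]
        for row_r, row_g, row_b in zip(frame_r, frame_g, frame_b)
    ]

red_filter   = [[10, 200], [30, 120]]   # e.g. an exposure near 750 nm
green_filter = [[12, 180], [28, 110]]   # e.g. an exposure near 530 nm
blue_filter  = [[8, 150], [25, 90]]     # e.g. an exposure near 430 nm
print(compose_rgb(red_filter, green_filter, blue_filter))
```

This is also why "false color" is so easy to produce: map a near-infrared exposure into the red channel instead of a red one, and the same code yields a redder scene.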

The use of a near-infrared filter, invisible to the eye, in place of red led to the reddening of Mars in many frames that went to the media. The accompanying explanations about the infrared range were not always reprinted, which gave rise to a separate controversy, which we also discussed in the article "What color is Mars".

However, the Curiosity rover has a Bayer filter, which allows it to shoot in colors familiar to our eyes, although a separate set of color filters also comes with the camera.

Separate filters are more convenient when you want to choose the ranges of light in which to look at an object. But if the object moves quickly, its position changes between the frames taken in different ranges. In the Electro-L footage this was noticeable on fast-moving clouds, which had time to shift in the few seconds it took the satellite to change filters. On Mars, the same thing happened when the Spirit and Opportunity rovers, which have no Bayer filter, shot sunsets:

Sunset shot by Spirit on sol 489. An overlay of images captured with filters at 753, 535, and 432 nanometers.

On Saturn, Cassini has similar difficulties:

Saturn's moons Titan (back) and Rhea (front) in Cassini images.

At the Lagrange point, DSCOVR encounters the same situation:

The lunar transit across the Earth's disk in the DSCOVR image of July 16, 2015.

To get a beautiful, media-ready photo from such a shot, you have to work in an image editor.
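The fix for those colored fringes is channel registration: shift the channel that was exposed later back to where the subject was, then composite. A minimal sketch, assuming the offset is already known; real pipelines estimate it, for example by cross-correlating the channels.

```python
def shift_frame(frame, dx, fill=0):
    """Shift a 2D grayscale frame horizontally by dx pixels,
    padding the vacated edge with `fill` (unknown data)."""
    out = []
    for row in frame:
        if dx > 0:
            out.append([fill] * dx + row[:-dx])   # shift right
        elif dx < 0:
            out.append(row[-dx:] + [fill] * (-dx))  # shift left
        else:
            out.append(list(row))
    return out

# A bright subject drifted one pixel to the right between the
# exposures; shift the late channel one pixel left to realign it.
late_channel = [[0, 0, 255, 0]]
aligned = shift_frame(late_channel, -1)
print(aligned)
```

Only after each channel is aligned like this can they be zipped into RGB without the Moon (or a cloud) wearing a green edge on one side and a red edge on the other.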

There is another physical factor not everyone knows about: black-and-white images have higher resolution and sharpness than color ones. These are the so-called panchromatic images, which collect all the light entering the camera without filtering any part out. That is why many "long-range" spacecraft cameras shoot only in panchromatic mode, which for us means black-and-white frames. The LORRI camera on New Horizons is one such camera, as is the NAC camera on the LRO lunar orbiter. In fact, virtually all telescopes shoot in panchromatic mode unless filters are deliberately used. (This is where the "NASA is hiding the true color of the Moon" myth came from.)

A multispectral "color" camera, equipped with filters but having much lower resolution, can be paired with a panchromatic one. Its color images can then be superimposed on the panchromatic ones, yielding high-resolution color images. This method is often used when imaging the Earth. If you know about it, you can spot in some frames the telltale halo left by the blurry color frame:

Pluto in New Horizons imagery.
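The superposition described above is known in remote sensing as pansharpening. One crude variant, sketched here per pixel, rescales the low-resolution color so its brightness matches the co-located high-resolution panchromatic sample; production methods (Brovey, PCA, wavelet-based) are considerably more elaborate, so treat this only as an illustration of the principle.

```python
def pansharpen_pixel(rgb, pan):
    """Rescale one (R, G, B) pixel from the blurry color frame so
    its mean intensity equals `pan`, the sharp panchromatic sample.
    Hue comes from the color camera, detail from the pan camera."""
    intensity = sum(rgb) / 3
    if intensity == 0:
        return (pan, pan, pan)   # no hue information at all
    scale = pan / intensity
    return tuple(min(255, round(c * scale)) for c in rgb)

# The blurry color frame says "reddish here"; the sharp pan frame
# says "and this exact pixel is this bright".
print(pansharpen_pixel((120, 60, 30), pan=140))
```

The halo mentioned above appears exactly where the low-resolution hue spills past the sharp panchromatic edge it is draped over.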

It was through this overlay that the very impressive frame of the Earth above the Moon was created, which is given above as an example of overlapping different images.

You often have to resort to graphic editors when a frame needs cleaning up before publication. The idea that space technology is impeccable is not always justified, so debris on space cameras is a common thing. For example, the MAHLI camera on the Curiosity rover is simply filthy; there is no other way to put it:

One of the panoramas shot by Curiosity with the Mars Hand Lens Imager (MAHLI) on sol 1401.

A speck of dust in the STEREO-B solar telescope gave rise to a separate myth about an alien space station constantly flying over the north pole of the Sun:


Even in space, charged particles are common, and they leave traces on the sensor in the form of individual dots or streaks. The longer the exposure, the more traces remain, and "snow" appears on the frames, which does not look presentable in the media, so it too is cleaned off (read: "photoshopped") before publication:

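One standard, non-deceptive way to clean such "snow" is median stacking: take several exposures of the same scene and keep the per-pixel median, since a particle hit spikes a given pixel in only one frame of the stack. A minimal sketch:

```python
from statistics import median

def median_stack(frames):
    """frames: a list of equally sized 2D grayscale frames of the
    same scene. Returns one frame whose every pixel is the median
    of that pixel across the stack, suppressing one-off spikes."""
    return [
        [median(pixels) for pixels in zip(*rows)]   # per-pixel median
        for rows in zip(*frames)                    # matching rows
    ]

exposures = [
    [[10, 11], [12, 255]],   # a particle hit the bottom-right pixel
    [[11, 10], [12, 13]],
    [[10, 12], [11, 12]],
]
print(median_stack(exposures))
```

The cosmic-ray hit (255) vanishes because it appears in only one of the three frames, while real scene detail survives in the majority.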

Therefore, we can say it: yes, NASA "photoshops" images from space. ESA "photoshops". Roscosmos "photoshops". ISRO "photoshops". JAXA "photoshops"… Only the National Space Agency of Zambia does not "photoshop". So if someone is not satisfied with NASA's images, they can always use Zambia's space imagery, which shows no signs of processing whatsoever.

Vitaly Egorov
