
[Question] Why are spacecraft images always black and white?



That's the question! Black-and-white cameras are, like, very old now. Why do we still use them for missions that were launched only a decade ago (like Rosetta and New Horizons)? Also, why are the colour images always "false" colour? Forgive me for being so utterly uninformed. I'm sure there's a big difference between a space camera and a regular colourful digital camera, but I don't know what. :)


Black and white cameras are not old and obsolete. If I ever become loaded with money, I'm gonna buy a monochrome camera.

It's because a monochrome camera uses every pixel of its sensor for every exposure, so you get more resolution and sensitivity out of the same chip. If you have a regular digital colour camera, this is how its sensor works.

[Image: Bayer colour filter pattern laid over a sensor]

Basically only a fraction of your sensor is used for each colour channel (in a Bayer pattern, a quarter of the pixels for red, a quarter for blue and half for green). With monochrome you have very densely packed image data, and if you want colour images, you put a filter over the camera that passes only the band you want (for example one that transmits around 523 nm and blocks the rest of the spectrum) and you get that channel.
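To make the pixel bookkeeping concrete, here's a tiny Python sketch. The sensor size and the RGGB layout are just illustrative, not any real camera's, and it only counts how the pixels are shared out between channels:

```python
import numpy as np

# Illustrative RGGB Bayer colour filter array on a tiny "sensor".
h, w = 4, 6
mask = np.empty((h, w), dtype="<U1")
mask[0::2, 0::2] = "R"           # red pixels
mask[0::2, 1::2] = "G"           # green pixels on red rows
mask[1::2, 0::2] = "G"           # green pixels on blue rows
mask[1::2, 1::2] = "B"           # blue pixels

for channel in "RGB":
    frac = np.mean(mask == channel)
    print(channel, f"{frac:.0%} of the pixels")
# R 25%, G 50%, B 25% of the pixels -- a monochrome sensor uses 100% of its
# pixels for every exposure, whatever filter is in front of it.
```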

With three channels like typical primary red, primary green and primary blue, a computer can synthesize a colour RGB image and display it on a monitor for us to see.

Sometimes probes don't have R, G and B filters. They might have a near-infrared band-pass filter, an orange filter and a soft ultraviolet one. Those can also be fed to the software, telling it to synthesize an RGB image from those channels. The resulting image does not have true colours; it has an approximation of true colour, which is usually good enough.
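For anyone curious what the software side of that synthesis looks like, here's a rough Python/numpy sketch. The three frames are random placeholders standing in for co-registered exposures taken through, say, near-infrared, green and blue filters; a real pipeline would also align and calibrate them first:

```python
import numpy as np

# Combine three monochrome exposures (taken through different filters) into one
# possibly false-colour RGB image. The arrays below are placeholders.
rng = np.random.default_rng(0)
nir_frame   = rng.random((512, 512))   # mapped to the red channel
green_frame = rng.random((512, 512))   # mapped to the green channel
blue_frame  = rng.random((512, 512))   # mapped to the blue channel

def stretch(frame):
    """Normalise a frame to 0..1 so the channels are comparable."""
    lo, hi = frame.min(), frame.max()
    return (frame - lo) / (hi - lo)

# Stack the channels into an H x W x 3 cube and scale to 8-bit for display.
rgb = np.dstack([stretch(nir_frame), stretch(green_frame), stretch(blue_frame)])
rgb8 = (rgb * 255).astype(np.uint8)
print(rgb8.shape)  # (512, 512, 3) -- ready to save or display as a colour image
```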

I've done lots of such experiments over the years. Just recently I did two images using several filters for an RGB model.

This one used, if I recall correctly, a near-infrared band, a green band, and a very deep, quite narrow blue band.

[Image: rgbsynth1.png]

This one used something close to primary colors. Still a bit off.

[Image: RGB_test1.png]

Edited by lajoswinkler

Because any sensor CAN'T tell the difference between red, green, and blue photons. Only filters ensure that the light it receives is restricted to green, blue, red, or sometimes a range (e.g. green to blue, red to green, etc.). Knowing which photons were recorded (via the filters), you can colour them (that is, map their values into RGB space). Then you get colour images. This is true for your phone, a DSLR, or even some reconnaissance satellite cameras.

Three ways to deal with it:

1. Take three consecutive images, one for each colour, through different filters. Of course that's a problem if you want to capture a very specific moment, like an explosion or someone jumping into a pool.

2. Use three separate sensors, like lajoswinkler said, with the light split between them. This suffers from lower intensity (the light has to be divided between three sensors, remember?) and is more costly, which is often offset by using CMOS sensors (astronomical observation prefers CCDs for their sensitivity).

3. Literally put a filter in front of each sensor pixel. This results in lower resolution (you need to merge several sensor pixels into one image pixel) or the need for a larger, more expensive sensor, which again is often offset by using CMOS sensors. (There's a toy sketch of this merging at the end of this post.)

[Image: Bayer pattern on a sensor]

That's why scientific, astronomical observations prefer the first way... It's not like the object is going to disappear anyway, so a few more images won't hurt.
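Here's the toy sketch I mentioned under point 3: binning each 2x2 RGGB block of a made-up Bayer mosaic into one colour pixel, which shows the resolution cost most directly. Real cameras interpolate (demosaic) rather than bin, so treat this purely as an illustration:

```python
import numpy as np

# Merge each 2x2 RGGB block of a Bayer mosaic into a single RGB pixel.
# The raw frame is a random placeholder with even dimensions.
rng = np.random.default_rng(1)
raw = rng.random((8, 8))

r  = raw[0::2, 0::2]                     # red sites
g1 = raw[0::2, 1::2]                     # green sites on red rows
g2 = raw[1::2, 0::2]                     # green sites on blue rows
b  = raw[1::2, 1::2]                     # blue sites

rgb = np.dstack([r, (g1 + g2) / 2, b])   # one colour pixel per 2x2 block
print(raw.shape, "->", rgb.shape)        # (8, 8) -> (4, 4, 3): half the linear resolution
```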

Edited by YNM

B&W gives better shading; colour gives better composition. You can extract point-and-shoot colour with a spectrograph, which is more desirable anyway: digital cameras only pick up what is filtered, while a spectrograph records the whole spectrum. You can't really verify fluorescence with digitized colour either, but you can with a spectrograph. It's more efficient to send the shading and the spectrum information separately. https://en.wikipedia.org/wiki/Space_Telescope_Imaging_Spectrograph (note the image to the right)

BTW, comet 67P appears to be largely grey tones, as does Ceres. I would rather have the spectrographic information, since that informs on chemical composition.


Since others have covered the basics, I just want to point out that the filter wheels carried by spacecraft cameras often result in *more* colors than the standard RGB. (Though chances are you'll only see a few at a time)

In the case of New Horizons, it's worth noting that we already have low-resolution colour, but have not received the vast majority of the data. That will be arriving in lossily compressed form over the coming weeks and months, with the full images, etc. taking a year or more.


Quote: "They might have a near-infrared band-pass filter, an orange filter and a soft ultraviolet one."
Depending on how the camera is lensed, the best instruments use separate cameras for non-visible light. Lensing distorts the light because different wavelengths bend by different amounts in a lens (chromatic aberration). Of course modern computers can transform filtered images, correcting for the lensing effect. This is the reason that large telescopes use mirrors instead of lenses to focus the image.
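To put a rough number on that wavelength dependence, here's a quick thin-lens calculation in Python using the lensmaker's equation, 1/f = (n - 1)(1/R1 - 1/R2). The refractive indices are approximate catalogue values for BK7 glass and the lens radii are invented for the example:

```python
# Back-of-the-envelope chromatic aberration for a simple biconvex lens.
R1, R2 = 0.10, -0.10     # surface radii in metres (made-up biconvex lens)

indices = {
    "blue  (486 nm)": 1.5308,   # approximate BK7 index at the F line
    "green (588 nm)": 1.5168,   # approximate BK7 index at the d line
    "red   (656 nm)": 1.5143,   # approximate BK7 index at the C line
}

for name, n in indices.items():
    f = 1.0 / ((n - 1.0) * (1.0 / R1 - 1.0 / R2))
    print(f"{name}: focal length ~{f * 1000:.1f} mm")
# Blue light focuses a few millimetres closer than red. A mirror has no such
# wavelength dependence, which is one reason large telescopes use mirrors.
```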

-------------------

Quote: "Because any sensor CAN'T tell the difference between red, green, and blue photons."
Photoelectric effect: sensors can do a binary sorting of photons; each photon either has sufficient energy to trigger the effect or it doesn't. Therefore sensors with thresholds in the visible spectrum can.
"The photons of a light beam have a characteristic energy proportional to the frequency of the light. In the photoemission process, if an electron within some material absorbs the energy of one photon and acquires more energy than the work function (the electron binding energy) of the material, it is ejected. If the photon energy is too low, the electron is unable to escape the material." (Wikipedia, Photoelectric effect)
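A quick back-of-the-envelope in Python makes the threshold idea concrete. The photon energy is E = hc/lambda; the 2.1 eV cut-off below is an assumed value roughly in the caesium ballpark, not any real detector's specification:

```python
# Photon energy versus an assumed photoemission threshold in the visible range.
HC_EV_NM = 1239.84          # h*c in eV*nm

wavelengths_nm = {"blue": 450, "green": 530, "red": 650}
threshold_ev = 2.1          # assumed threshold, purely for illustration

for colour, wl in wavelengths_nm.items():
    energy = HC_EV_NM / wl
    verdict = "ejects an electron" if energy > threshold_ev else "too weak"
    print(f"{colour:5s} {wl} nm -> {energy:.2f} eV: {verdict}")
# blue ~2.76 eV and green ~2.34 eV clear the threshold; red ~1.91 eV does not.
```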
Link to comment
Share on other sites

Quote: "Photoelectric effect: sensors can do a binary sorting of photons... Therefore sensors with thresholds in the visible spectrum can."

But then... how would getting a colour index from CCDs by photon count alone work? AFAIK if there's a difference it's too small (and so somewhat neglected). After all, CCD materials are photosensitive semiconductors; they're not quite photoelectric in that sense - any photon above the band gap will do (otherwise you wouldn't be seeing STIS with a 200-1030 nm wavelength range).
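Running the same kind of number for a silicon CCD (using the standard textbook band gap) shows why the count alone can't give you a colour index - everything from the UV out to about 1.1 microns just frees electrons:

```python
# Cut-off wavelength implied by the silicon band gap: any photon shorter than
# this frees (at least) one electron, and the pixel only counts electrons.
HC_EV_NM = 1239.84
SI_BAND_GAP_EV = 1.12                          # textbook silicon band gap

cutoff_nm = HC_EV_NM / SI_BAND_GAP_EV
print(f"silicon cutoff ~{cutoff_nm:.0f} nm")   # ~1107 nm, just past the near-infrared
# Every wavelength shorter than that (including all of the 200-1030 nm that
# STIS covers) produces the same kind of signal, which is why filters are
# still needed to tell colours apart.
```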


Many probes have imaging spectrographs, where you get readings for usually a few hundred wavelength channels per pixel. You can easily get true-colour images from that, but they usually aren't in the visual range and tend to have much lower resolution than the CCD sensors.
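As a sketch of how that works in software, here's a minimal Python example that collapses a fake (y, x, wavelength) datacube into an RGB image using crude Gaussian band weights. A real pipeline would use the instrument's calibrated response curves (and many real imaging spectrometers work outside the visible anyway):

```python
import numpy as np

# Collapse an imaging-spectrograph datacube into a rough RGB image.
rng = np.random.default_rng(2)
wavelengths = np.linspace(400, 700, 200)          # nm, 200 channels (visible only here)
cube = rng.random((64, 64, wavelengths.size))     # placeholder spectrum at every pixel

def band(centre_nm, width_nm=40):
    """Crude Gaussian band response, a stand-in for a proper colour-matching curve."""
    return np.exp(-0.5 * ((wavelengths - centre_nm) / width_nm) ** 2)

weights = np.stack([band(610), band(550), band(465)], axis=1)  # R, G, B columns
rgb = cube @ weights                              # (64, 64, 3) colour image
rgb /= rgb.max()                                  # normalise for display
print(rgb.shape)
```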

