Camera system for Mariner 4


phoneix_007


9 minutes ago, phoneix_007 said:

So the photo was a bunch of numbers when printed?

It's basically an old TV-camera type of thing. I didn't know the signal was digitized for transmission back in the day, but that makes sense.

What about printing? I don't see what you're after; printing pictures isn't exactly a dark art, is it?


30 minutes ago, phoneix_007 said:

So the photo was a bunch of numbers when printed?

The image was transmitted as a sequence of numbers and then converted to an actual image down on Earth, i.e. just like a modern camera, with the only real difference being the type of camera itself.

And if you are hardcore enough, you can even draw the image from the sequence of numbers by hand:

[Image: 800px-First_TV_Image_of_Mars.jpg]

(Source: https://commons.wikimedia.org/wiki/File:First_TV_Image_of_Mars.jpg from the Wikipedia article https://en.wikipedia.org/wiki/Mariner_4)

That above is the first image of Mars transmitted by Mariner 4, hand-drawn from the sequence of numbers, as the engineers didn't want to wait for the computers to process and print it. (If you zoom in on the original image, you can even read the sequence of numbers printed on the strips.)
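For anyone curious what the computers were eventually doing with those strips of numbers, here is a minimal sketch in Python of the same reconstruction step: reshape a flat sequence of brightness values into scan lines and save the result as a grayscale picture. The 200-pixel line width and 6-bit brightness range are the figures commonly quoted for Mariner 4, and the pixel values themselves are randomly generated stand-ins.

```python
import numpy as np
from PIL import Image

LINE_WIDTH = 200        # assumed pixels per scan line
BITS_PER_PIXEL = 6      # assumed quantization, so values run 0..63

# Pretend this arrived over the downlink as one long list of numbers.
rng = np.random.default_rng(0)
downlinked = rng.integers(0, 2**BITS_PER_PIXEL, size=LINE_WIDTH * 200)

# Reshape the flat sequence into scan lines, then stretch 0..63 up to 0..255
# so ordinary image viewers show it as a grayscale picture.
frame = downlinked.reshape(-1, LINE_WIDTH)
gray = (frame * (255 // (2**BITS_PER_PIXEL - 1))).astype(np.uint8)

Image.fromarray(gray, mode="L").save("reconstructed_frame.png")
```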


50 minutes ago, phoneix_007 said:

So the photo was a bunch of numbers when printed?

A digitized photo (in grayscale, the Mariner camera didn't have color) would be transmitted and stored as an array of numbers, which would define the level of brightness of each pixel in the array -- akin to a grayscale BMP format image file.  I also hadn't been aware that Mariner had transmitted digitally; I don't think even NASA had the technology to do that in the mid-1960s.  The camera would have had to be slow-scan in order to digitize an image at even NTSC broadcast resolution (approximately 1/3 megapixel) with the electronics available at the time, and there was no way to store a video still other than a magnetic tape loop or a storage scope.
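As an aside, that "array of brightness numbers" idea is almost literally what the ASCII PGM image format stores: a tiny header followed by the pixel values written out as plain numbers. A minimal sketch (PGM is used here instead of BMP only because its text form is human-readable; the pixel values are made up):

```python
# A tiny grayscale image written as an ASCII PGM file: a header followed by
# the brightness values as plain numbers, much like the printed strips the
# hand-colourers worked from. Pixel values below are invented.
pixels = [
    [ 0, 16, 32, 48],
    [16, 32, 48, 63],
    [32, 48, 63, 63],
]

with open("tiny_frame.pgm", "w") as f:
    f.write("P2\n")                               # magic number for ASCII grayscale
    f.write(f"{len(pixels[0])} {len(pixels)}\n")  # width and height
    f.write("63\n")                               # maximum brightness value
    for row in pixels:
        f.write(" ".join(str(v) for v in row) + "\n")
```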

Apollo footage from the lunar surface was transmitted in analog form (albeit at higher resolution than then-current broadcast TV) -- why the step backward if Mariner had the ability to send digital data?

Edit: D'oh!  I'm confusing Mariner with Ranger and Surveyor on the Moon.  By the mid-1970s, the technology to digitize a TV image could work in nearly real time, and the sensors were solid state (at least in NASA equipment).

Edited by Zeiss Ikon

3 hours ago, Zeiss Ikon said:

A digitized photo (in grayscale, the Mariner camera didn't have color) 

Wikipedia says that it had red and green filters, though; any idea what was done with them? I'm assuming the camera was too slow to allow compositing them into color images.


3 hours ago, Tullius said:

The image was transmitted as a sequence of numbers and then converted to an actual image down on Earth [...] That above is the first image of Mars transmitted by Mariner 4, hand-drawn from the sequence of numbers, as the engineers didn't want to wait for the computers to process and print it.

Color By Numbers, Elite-level.


1 hour ago, kurja said:

Wikipedia says that it had red and green filters, though; any idea what was done with them? I'm assuming the camera was too slow to allow compositing them into color images.

I got the sense that they stored the analog data from the camera onto analog tape and Mariner then digitized that data for transmission back to Earth. This system makes sense because they wouldn't have had to do the processing in real time and it could be managed by slow electronics. The Wikipedia article also says that they transmitted all the images twice for redundancy.
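Whichever way round the spacecraft actually did it (analog on tape digitized at playback, or digitized before recording), the analog-to-digital step itself is just sampling and quantizing. A minimal sketch, with an invented "analog" brightness signal and the 6-bit word size commonly quoted for Mariner 4:

```python
import math

BITS = 6
LEVELS = 2**BITS - 1   # 6-bit words give brightness codes 0..63

def quantize(level, full_scale=1.0):
    """Map an analog level in [0, full_scale] to a 6-bit code."""
    clipped = min(max(level, 0.0), full_scale)
    return round(clipped / full_scale * LEVELS)

# Stand-in for slow playback from the tape: a gently varying brightness signal.
samples = [0.5 + 0.5 * math.sin(i / 10) for i in range(20)]
codes = [quantize(v) for v in samples]
print(codes)   # the "sequence of numbers" that would be downlinked
```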


3 hours ago, PakledHostage said:

I got the sense that they stored the analog data from the camera onto analog tape and Mariner then digitized that data for transmission back to Earth. This system makes sense because they wouldn't have had to do the processing in real time and it could be managed by slow electronics.

Correct.  Mariner 4 had neither the onboard processing power nor the bandwidth to send images in real time.   In fact, the whole system was so slow that it only took 22(!) images during the entire flyby.
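Some rough arithmetic shows why. The figures below (roughly 200x200 pixels, 6 bits per pixel, a downlink on the order of 8 bits per second) are the commonly quoted approximations for Mariner 4, not exact values:

```python
# Back-of-the-envelope numbers; all three figures are rough approximations.
PIXELS_PER_FRAME = 200 * 200   # roughly 200 x 200 pixel frames
BITS_PER_PIXEL = 6             # 64 brightness levels
DOWNLINK_BPS = 8.33            # downlink rate of roughly 8 1/3 bits per second

bits_per_frame = PIXELS_PER_FRAME * BITS_PER_PIXEL      # 240,000 bits
hours_per_frame = bits_per_frame / DOWNLINK_BPS / 3600
print(f"{hours_per_frame:.1f} hours per picture")       # about 8 hours each

frames = 22
print(f"{frames * hours_per_frame / 24:.1f} days to play the full set back once")
```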


4 hours ago, PakledHostage said:

I got the sense that they stored the analog data from the camera onto analog tape and Mariner then digitized that data for transmission back to Earth. This system makes sense because they wouldn't have had to do the processing in real time and it could be managed by slow electronics. The Wikipedia article also says that they transmitted all the images twice for redundancy.

Any idea what the use of those color filters was?


58 minutes ago, kurja said:

Any idea what the use of those color filters was?

I presume that they would have mounted each filter separately, captured the analog data for that image through the filter, and then digitized and downlinked the data at a later time. Post-processing on the ground would then combine the colour-filtered images to render some sort of false-colour image.

If you think about it, that isn't too different from how a modern digital camera works. At least not in principle... Modern camera sensors just have the A/D conversion built into their hardware.

Each "pixel" on my camera's sensor is really three individual sensors. A microscopic filter is mounted in front of each of the three sensor elements at each pixel to make them pick up only one of red, green or blue light. The A/D converter hardware in each of those three sensor elements then converts the observed red/green/blue light intensity into a 12-bit number. The resulting 36 bits of RGB data acquired at each pixel are then stored directly into a 'RAW' image file or converted in camera to some other compressed format (like .JPG).

In other words, a modern camera is really taking three images simultaneously, each with a different filter installed, and then combining the data in post-processing to render a "colour" image. It just does it in near real time. But if someone were crazy enough to try, they could also print out the data from each of the red, green and blue sensors onto a grid and hand-colour it to generate an image...


I'm familiar with how color information is formed on an ordinary camera, but the Mariner pictures all seem to be monochromatic, and I'd think they were each taken too far apart to be composited later into a color image. Hence the question.

It's also interesting that there were only two filters. Maybe someone here knows what they were for?

58 minutes ago, PakledHostage said:

Each "pixel" on my camera's sensor is really three individual sensors.

Not quite so - there's one sensing element for each output pixel; values for the other two color channels are constructed from the values of surrounding sensels. If you're interested, the PixInsight devs explain some of the possible methods here; it's not too long a read: https://pixinsight.com/doc/tools/Debayer/Debayer.html#usage_002
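For illustration, here is a minimal sketch of the simplest approach the linked page describes (bilinear-style averaging), reconstructing just the red channel of an assumed RGGB mosaic from neighbouring sensels; the sensel values are made up:

```python
import numpy as np

rng = np.random.default_rng(0)
raw = rng.integers(0, 4096, size=(8, 8)).astype(float)   # made-up 12-bit sensel values

# In an RGGB mosaic the red sensels sit at even rows and even columns.
red_mask = np.zeros_like(raw, dtype=bool)
red_mask[0::2, 0::2] = True

red_only = np.where(red_mask, raw, np.nan)

# Estimate red at every other position as the mean of the neighbouring red sensels.
red_full = raw.copy()
for y in range(raw.shape[0]):
    for x in range(raw.shape[1]):
        if not red_mask[y, x]:
            window = red_only[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2]
            red_full[y, x] = np.nanmean(window)

print(red_full.round(1))   # a full-resolution estimate of the red channel
```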

Edited by kurja

1 hour ago, kurja said:

Not quite so - there's one sensing element for each output pixel; values for the other two color channels are constructed from the values of surrounding sensels.

How it is done depends on the type of sensor, and I was referring to my own camera.

Edited by PakledHostage

The slow hardware and the rapidity of Mariner's flyby didn't allow it, but you can create a full-color image by combining any two primary filters (or even, as demonstrated by Edwin Land, one color and an unfiltered image).  This was done commercially for some years prior to about 1940 with "two-color Technicolor", which used only two color-filtered films (usually red- and green-filtered), sometimes with a grayscale strip, to project an image that the eye would interpret as "color".  Comparing two-color with the more commonly seen three-color Technicolor, you can tell which is which, but the two-color version is clearly still "color".  Land Color, using one filter and one unfiltered image, always lacks some hues, but it still reads as a "color" image when viewed (BTW, Land Color even works if the eyes see the two images, filtered and unfiltered, separately).
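A minimal sketch of the two-color idea in digital form, assuming a red-filtered and a green-filtered grayscale frame; the missing blue channel is faked as a blend of the other two (real two-color Technicolor did this optically with dyes, so this is only an analogy):

```python
import numpy as np
from PIL import Image

rng = np.random.default_rng(0)
red_frame   = rng.integers(0, 256, size=(200, 200), dtype=np.uint8)   # red-filtered exposure
green_frame = rng.integers(0, 256, size=(200, 200), dtype=np.uint8)   # green-filtered exposure

# Fake the missing blue channel as a blend of the two real channels; the result
# still reads as "color" even though some hues can never appear.
blue_frame = ((red_frame.astype(np.uint16) + green_frame) // 2).astype(np.uint8)

two_color = np.dstack([red_frame, green_frame, blue_frame])
Image.fromarray(two_color, mode="RGB").save("two_color_composite.png")
```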
