Too many tricky questions to answer, I guess, and now we'll never know. We will just have to speculate until someone knowledgeable comes along who is willing to answer these queries of ours.
If the rovers were operating under wildly varying temperatures, then contraction and expansion would probably be an issue.
Dust would also be an issue, and the dust storms cannot be relied upon to come along at the right moment to clean the front of the cameras. The dust may also be abrasive to the lenses, particularly when we are told that there are so many dust devils moving across the surface of Mars. Not only the lenses either: the solar panels have been shown to be very dusty in some images and extremely clean and new-looking in others.
If the rovers did have an RTG (and it makes perfect sense to me for them to have this as well as solar panels), then Spirit can hardly have run out of power, can it?
Yes, I will be interested too. From experience with a military contractor, camera housings are usually based on commercial housings used on commercial robots working in hazardous environments, plus strengthening. The link below shows the lengths gone to in order to protect a vision system from harsh elements, with nice images of the filter housing and other associated components.
The biggest enemies for normal camera equipment are water, temperature, and dust. Tolerance levels are of course increased, but I'm curious about the operating voltages. Battery leakage is higher in colder conditions, but circuitry performance is quicker. I'm not sure what clear covering is used for the solar panels, but I'm pretty sure self-cleaning glass (Pilkington) could have been looked into. Maybe there was a weight issue, or maybe there isn't enough sunlight to activate the cleaning process?
I've always wondered why a modified version of an RTG nuclear power plant wasn't adopted. More info on RTGs here. A considerable number of spacecraft have successfully used this nuclear solution. It could have been used to extend the rovers' range and operating period, especially for night-time reconnaissance. NASA has used these mini nuclear plants with great success.
Voyager doesn't have any solar panels; they wouldn't do any good so far from the Sun. The probe stays in touch by carrying its own power source, an early radioisotope thermoelectric generator (RTG), which converts the heat generated from the natural decay of its radioactive fuel into electricity. Its RTG will supply Voyager with electricity at least until 2020.
Space probes that travel much beyond Mars need more power than solar cells can provide. Another example is the Ulysses spacecraft. It was launched in October 1990 from the space shuttle on a mission to study the Sun's poles. To get above the Sun, Ulysses had to fly around Jupiter and slingshot out of the plane of the planets. Near Jupiter, the Sun's rays are 25 times weaker than near Earth. Solar panels large enough to catch this weak energy would have weighed 1,200 pounds, doubling the weight of the spacecraft and making it too heavy for booster rockets from the shuttle. Instead, Ulysses was equipped with an RTG weighing only 124 pounds. It easily powers all the probe's on board systems, including navigation, communication and scientific instruments.
It's always been a pet thought of mine that one or more of these robots must have such a power plant, to say nothing of the MOC.
As far as temperatures go, I believe these rovers are near the equator, which OBrien has said is not that cold, although it is still cold. I thought that most of the more temperature-sensitive equipment is in the deck, which is heated by the electrical components, including large resistors they can turn on if they have enough power. I think there is also some kind of radioactive element that provides heat too. Maybe I have remembered wrongly.
I wonder if the rovers have heating on board, especially for the cameras, and how the problem of expansion and contraction of the different materials and the special glass of the lenses has been solved. Even digital cameras need lenses, and these lenses should always be clean and protected from abrasion by suddenly arising dust storms, for example. How is this problem solved, given the communication time needed to send orders to the rover(s)? By some sort of internal software program?
It is true; I wonder how they deal with the expansion and contraction of the glass. I read in the chapter-31 document that the designer was not allowed to use anything between the lenses due to the cold conditions, but even so, the glass itself, the filter mechanisms, and the lens mounts will be expanding and contracting a lot. As far as the dust goes, I think that must be a problem, and they do not have 'windscreen wipers', so I wonder how they deal with that and keep the dust out of the pictures.
OSD - as you say, OBrien has a better grasp of these things and may be able to shed some light on these issues.
Hi OBrien, thank you very much for your explanations. A lot of people wonder how these little guys (the rovers) survive the Martian conditions and keep functioning: the many winters with their Antarctic temperatures, the possibly aggressive chemicals in the soil, the electrical activity in the atmosphere, and the tribes of tiny people who try to take everything from them that is usable as a tool (that was a joke). I wonder if the rovers have heating on board, especially for the cameras, and how the problem of expansion and contraction of the different materials and the special glass of the lenses has been solved. Even digital cameras need lenses, and these lenses should always be clean and protected from abrasion by suddenly arising dust storms, for example. How is this problem solved, given the communication time needed to send orders to the rover(s)? By some sort of internal software program?
From Science Daily: Mars Exploration Rover Spirit skipped a planned communication session on March 30 and, as anticipated from recent power-supply projections, has probably entered a low-power hibernation mode.
"The temperature limit was for a new rover. We now have an older rover with thousands of thermal cycles on Mars, so the colder temperatures will be a further stress," Callas said. JPL, a division of the California Institute of Technology in Pasadena, manages the Mars Exploration Rover Project for NASA's Science Mission Directorate, Washington. For more information about the Mars rovers, visit FULL STORY LINK HERE
There have been a couple of comments and questions about why spacecraft photos are in black and white rather than color. I thought I'd do a very short tutorial and then go into detail if people have more questions. Some of the explanations are intentionally simplistic (and not rigorously correct) to keep it short.
The first question is, what IS a color photo?
In modern color film, there are minuscule areas of chemicals (grains) that are activated by light. Some are activated by red light, some by green, and some by blue, distributed all over the film. If you look at a processed color negative under high magnification, you'll see little areas, each of which transmits a different color. In printing a color picture from the negative, the areas that transmit red determine the amount of red that gets printed, the areas that transmit green determine the amount of green that gets printed, and so on.
But that's not the way it was always done. Early color photography used three black and white photographs that were taken in quick succession with a red, a green, and a blue filter in front. The pictures were then combined by projecting the three onto a screen, each projector having either a red, a green, or a blue filter. The red image, the green image, and the blue image would overlap to make a full color image.
By good fortune, there's a recent article here that shows some spectacular photos made this way in Russia 100 years ago.
In electronic sensors, things work a little differently. In a CCD (charge coupled device) there are little light sensitive areas called pixels usually arranged in a rectangular array. When light falls on a pixel, electric charge comes out proportional to the amount of light. It doesn't matter a lot if the light is red, green, or blue, so the resulting picture will be monochromatic. That is, one color, or black and white.
Black and white is a bit of a misnomer, though. The only reason it's black and white is that whoever displayed the data chose to map the voltage levels out of the CCD to greyscale values that go from black to white. They could have chosen a different colormap, one that goes from dark red to light red, or even a rainbow colormap that changes color based on intensity.
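To make the colormap idea concrete, here is a minimal NumPy sketch (not anything from an actual mission pipeline): the same raw intensity values can be displayed through a black-to-white ramp or a dark-red-to-light-red ramp, simply by choosing different RGB endpoints.

```python
import numpy as np

# Raw CCD readout: a tiny 2x2 frame of intensity values (0-255).
frame = np.array([[0, 85], [170, 255]], dtype=np.uint8)

def apply_colormap(frame, low_rgb, high_rgb):
    """Map each intensity linearly between two RGB endpoints."""
    t = frame.astype(float)[..., None] / 255.0       # shape (H, W, 1)
    low = np.array(low_rgb, dtype=float)
    high = np.array(high_rgb, dtype=float)
    return (low + t * (high - low)).astype(np.uint8)  # shape (H, W, 3)

# Greyscale colormap: black -> white
grey = apply_colormap(frame, (0, 0, 0), (255, 255, 255))
# Same data, different colormap: dark red -> light red
red = apply_colormap(frame, (64, 0, 0), (255, 200, 200))
```

The underlying numbers never change; only the mapping from intensity to display color does, which is the whole point.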
But the CCD can't distinguish one color from another without some outside help. There need to be some color filters in front of the CCD for it to know what's red, what's green, and what's blue.
In the cameras that you now buy, and that are in every cell phone, there are different tiny color filters in front of every pixel. See the detailed explanation here.
But it means that the monochromatic resolution of the CCD is reduced by a factor of 4, because you need 4 pixels (typically one red, two green, and one blue) to sample the full spectrum at each location. It also means that processing the image to get a single color per pixel is more complicated.
So the answer to the simple question "Why use a monochromatic camera chip instead of a color chip like I can get in any cellphone?" is that the resolution would be 1/4 as good.
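A toy NumPy sketch of that factor-of-4 trade-off, assuming the common RGGB mosaic layout (this is a deliberately naive demosaic, not what any real camera firmware does): each 2x2 block of filtered pixels collapses into a single full-color pixel.

```python
import numpy as np

# A 4x4 sensor behind an assumed RGGB Bayer mosaic.
# Each pixel records only the band its tiny filter passes.
mosaic = np.arange(16, dtype=float).reshape(4, 4)

def bayer_to_color(mosaic):
    """Naive demosaic: collapse each 2x2 RGGB block to one RGB pixel.
    The two green samples are averaged. The output is half the width
    and half the height -- 1/4 the pixel count of the raw sensor."""
    r = mosaic[0::2, 0::2]        # top-left of each 2x2 block
    g1 = mosaic[0::2, 1::2]       # top-right
    g2 = mosaic[1::2, 0::2]       # bottom-left
    b = mosaic[1::2, 1::2]        # bottom-right
    return np.stack([r, (g1 + g2) / 2, b], axis=-1)

color = bayer_to_color(mosaic)   # 16 mono pixels -> 4 color pixels
```

Real demosaicing algorithms interpolate rather than average down, but the resolution cost is the same: four filtered samples per full-color location.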
For the systems that do take color pictures, it's far more efficient to use a system like those Russian three-color photos. In the Mars Rover PanCam, for instance, each camera has only one sensor, but a filter wheel moves to determine which filter gets used. In this way, there's more than just red, green, and blue. There are many different filters that allow for looking at the geology in many different wavelengths. When a "true color" picture is required, several pictures with different filters need to be combined.
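In array terms, combining filtered exposures is just stacking them as channels. A minimal sketch with hypothetical 2x2 frames (real PanCam frames are of course much larger, and real processing accounts for each filter's spectral response):

```python
import numpy as np

# Three separate exposures of the same scene, taken through
# red, green, and blue filters (hypothetical toy values).
red = np.array([[200, 10], [10, 200]], dtype=np.uint8)
green = np.array([[10, 200], [10, 200]], dtype=np.uint8)
blue = np.array([[10, 10], [200, 200]], dtype=np.uint8)

# Stacking along a third axis makes each filtered frame one
# channel of a single color image.
rgb = np.dstack([red, green, blue])
```

The top-left pixel ends up reddish and the bottom-right near-white, exactly as the three projectors overlapping on a screen would render them.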
In HiRISE, there's one set of CCDs that takes the most data. They have a filter (called "red" but actually a moderately wide spectral range centered around red) because most of the surface is reddish. There are also two smaller sets that take the other color data: one is "blue-green" and the other is in the near infrared. So to make a color picture, the images from each set are combined. Because of this, there is no such thing as a perfectly faithful color reconstruction ... but the same is true of a cellphone or commercial camera. The reconstruction is only an approximation of true color.
Because there are three different images, the brightness and contrast of each image can be adjusted. The color used to display each image can be adjusted. In this way the "false color" images that highlight particular features like the blue interior of the Cerberus Fossae channels can be created.
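The simplest version of that per-channel adjustment is a linear contrast stretch. A sketch of the idea (one of many possible stretches; the values here are made up for illustration):

```python
import numpy as np

def stretch(channel):
    """Linearly rescale one channel so its darkest pixel maps to 0
    and its brightest to 255 -- a simple per-channel contrast stretch."""
    c = channel.astype(float)
    lo, hi = c.min(), c.max()
    return np.clip((c - lo) / (hi - lo) * 255, 0, 255).astype(np.uint8)

# A low-contrast channel, as might come back from one filter.
raw = np.array([[100, 110], [120, 130]], dtype=np.uint8)
stretched = stretch(raw)
```

Applying a different stretch to each channel is what exaggerates subtle color differences and produces "false color" products that make features like those blue channel interiors pop out.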
So when you look at the data that comes back from a spacecraft it is never "in color" in its original form. When color is available it is in the form of separate images each of which is assigned to a color and must be reconstructed knowing the details of the filter it was taken with.