It may be that NASA does not hide the true colors of Mars...
There is a tricky formula: take the photo with a low horizon, at most 10 degrees (this condition is satisfied by the Pancam...), and do not show sky photos captured at higher angles, so everybody believes that the sky has the same color as the horizon.
Look at this image:
I have found its part-images on Spirit's sol 120. The lower part of this image has the identifier 2p137011308 (L2) plus the consecutive filter images; the upper part-image is 2p137032251 (L2) plus the consecutive filter images.
The lower image has a dotted contour. Its horizon at the top of the image is only at 3 degrees relative to the horizontal. Of course, the atmosphere over the surface is polluted with dust aerosol, so its color is yellowish, a bit reddish. But the bottom of the upper part-image is at 10 degrees, because the angle of its optical axis is 18 degrees, so between the two images there is a gap. The sky on the upper part-image is basically bluish, so we can fill the gap with a color transition from the yellowish of the lower image to the bluish of the upper image.
It is apparent that the real view on Mars can be absolutely different from the official NASA presentation, because they mostly show us the lower part of the whole view.
Of course, when there is a dust storm or other heavy dust pollution in the atmosphere, the sky from the horizon up to the zenith is reddish-orange, but this is not a frequent event.
Remark: this image was rendered using the original PDS filter images.
The images I posted above of the calibration dial show the differences in the hue of the four colour 'chips' when different filters are used to produce a colour-composite.
The image that is close to true colour is the L456 version.
What is also interesting about this particular image is the colour of the terrain. At this particular location it is showing up as a 'rusty' or reddish-brown colour, but it has to be noted that at other locations the terrain may be a different colour.
__________________
"All truth passes through three stages. First, it is ridiculed; Second, it is violently opposed; Third, it is accepted as being self-evident."
OK OBrien, thanks. It would be useful to read an official document about the post-stretching of rover raw images, because the difference between auto-stretching and auto-exposure is huge. Look at the first picture; it is an image with a narrow contrast range (with offset and limited brightness).
The next one is an auto-exposed variant (by the rovers' procedure):
And the third is an auto-stretched variant:
It is clearly observable that the auto-exposure has increased the upper limit of lighting, while the stretching has filled out the whole range of lighting. The auto-exposure is only a simple "amplification", so it does not change the brightness ratios between the pixels; an auto-exposed image is therefore usable in color image composition, unlike the stretching, which changes the ratios and so will distort the colors.

There is another disturbing circumstance with the raw images found in the rovers' gallery: the nonlinearity of the images. The original pixel values are 12-bit values, but the rover sends them in 8-bit format, in accordance with the usual image display on computers. The rover's on-board computer therefore encodes them into 8-bit format via a look-up table. The encoding result is peculiar: pixel values in the low range are divided by smaller numbers than in the higher range, and the end value is divided by 16 (corresponding to the 4-bit length difference). Thus the darker pixels appear lighter than in reality, i.e. the nonlinearized image is unusable for composing true color images. Before use, the images should be linearized.
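To make the linearization step concrete, here is a rough Python sketch. The real on-board look-up table is mission-specific and I don't have it, so a square-root-style companding curve is assumed purely for illustration; it reproduces the behavior described above (finer steps for dark values, coarser steps for bright ones).

import numpy as np

# Assumed square-root companding: NOT the real flight LUT, just a curve
# with the properties described above.

def encode_12_to_8(dn12):
    """Compress linear 12-bit values (0..4095) into 8 bits (0..255)."""
    return np.round(np.sqrt(dn12 / 4095.0) * 255.0).astype(np.uint8)

def linearize_8_to_12(dn8):
    """Invert the companding to recover (approximately) linear values
    before composing color images."""
    return (dn8 / 255.0) ** 2 * 4095.0

# Example: a dark pixel at DN 64 (1.6% of full scale) encodes to ~32,
# i.e. 12.5% of the 8-bit scale -- 'lighter than in reality' until decoded.
dark = np.array([64.0])
print(encode_12_to_8(dark), linearize_8_to_12(encode_12_to_8(dark)))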
OBrien, I don't believe that the two posted images are the same (it is not your fault). My opinion is that there is no auto-correct process that can create or remove a shadow, and the shadow of the pole is in a different position in the two images. Can you cite any document that describes the post-processing of rover raw images? I have found information about the on-board auto-exposure feature that causes quasi-stretching. If an auto-stretch process had been applied to all raw images, then we would not see wrongly exposed images. But we can find a lot of them.
You're one step closer to generating accurate color representations.
An important modification to the process is to avoid using the images on the marsrover.nasa.gov web site. Because they're designed for general public consumption, the images presented there have an autostretch algorithm applied to them so that they have consistent contrast. I have shown below an example of the differences. For instance, notice the difference in details in the shadow of the gnomon and the brightness of the deck.
For actual color image manipulation, the images in the Planetary Data System should be used in conjunction with the listed exposure times in order to get the correct intensity. Every image has a different exposure time. For instance, the images used to generate the first color picture (L256) have exposure times of:
2P130799733ESF09BVP2111L2M1 - 169 ms
2P130799823ESF09BVP2111L5M1 - 512 ms
2P130799853ESF09BVP2111L6M1 - 686 ms
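To illustrate what using those exposure times looks like in practice, here is a minimal Python sketch (my own illustration with placeholder variable names, not a JPL tool): divide each linearized frame by its exposure time so the three filters land on a common intensity scale before stacking.

import numpy as np

# Exposure times quoted from the PDS labels above (L2/L5/L6 frames).
exposure_ms = {"L2": 169.0, "L5": 512.0, "L6": 686.0}

def normalize(frame, t_ms):
    """Put a linearized frame on a per-millisecond intensity scale so
    frames with different exposure times become comparable."""
    return frame.astype(np.float64) / t_ms

# img_L2, img_L5, img_L6 would be the linearized PDS frames:
# rgb = np.dstack([normalize(img_L2, exposure_ms["L2"]),
#                  normalize(img_L5, exposure_ms["L5"]),
#                  normalize(img_L6, exposure_ms["L6"])])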
Good luck with the color calibration. It's a tricky prospect with so many different factors to consider and it's easy to miss some of the more subtle ones.
In order to understand how the different filters affect the display of the coloured 'chips' on the calibration dial, I have produced a series of images to demonstrate the effect when different filter combinations are used.
Images L2, L3, L4, L5, L6 and L7 from Sol 50 (Spirit) were selected.
Adobe Photoshop v7.0 was used to produce the colour images.
Thanks iceman, this is useful information about general exposure settings on photographs, but I meant the ones for the rover images. So if I wanted to find out what the exposure settings for a particular picture were, I could go to some site/page and find out. I think it was posted by OBrien, so I will have to go and search through some old posts. I just wondered if anyone had a bookmark to the rover exposure-settings page, that's all.
Also, although everyone probably knows this one, I found a good example demonstrating why the blue chip appears red, with an explanation plus colour formula etc.
Dr. Bell of Cornell has explained this quirk in the panorama pictures. His email response is below:
Thanks for writing. The answer is that the color chips on the sundial have different colors in the near-infrared range of Pancam filters. For example, the blue chip is dark near 600 nm, where humans see red light, but is especially bright at 750 nm, which is used as "red" for many Pancam images. So it appears pink in RGB composites. We chose the pigments for the chips on purpose this way, so they could provide different patterns of brightnesses regardless of which filters we used. The details of the colors of the pigments are published in a paper I wrote in the December issue of the Journal of Geophysical Research (Planets), in case you want more details...
All of us tired folks on the team are really happy that so many people around the world are following the mission and sending their support and encouragement...
Thanks,
Jim Bell, Cornell U.
Typical RGB values for recording and display are Red 600 nm, Green 530 nm and Blue 480 nm. As we can see, these coincide with the L4, L5 and L6 filters on the PanCam. The difference is that in this panorama image, and in most images taken by the rover, the L2 is used for the red channel instead of the L4. The L2 is at 750 nm, right at the extreme end of the visible spectrum, in the near-infrared range. This increases the range of the spectrum that the PanCam can record, allowing higher definition to be captured and making it easier to see into the shadows and so forth.
As Dr. Bell explained in his email, and as is visible by viewing the raw images hosted by NASA, the color chips are not as simple as they appear. The pigments are designed to have different brightnesses at a variety of wavelengths, not just at RGB values, so as to "provide different patterns of brightnesses regardless of which filters we used", as Dr. Bell points out. The blue pigment is very bright in the near-IR range, so the L2 plate has a very bright recording of the blue pigment.
The crux of the problem, I think, is the L2 filter (the L4, L5 and L6 filters correspond to the points R, G and B) replacing the L4 filter, thereby shifting the red point by 150 nm to the very edge of the infrared.
There was a link posted somewhere on this forum to the site which gives exposure times and suchlike for these rover pictures. Does anyone have it saved somewhere, as it may add more information to the discussion?
In my last post I wrote erroneous information about the auto-exposure of the Mars rover cameras. The proper information is: the exposure continues until a certain number of the CCD pixels exceed a certain threshold value. (Both the number and the threshold are adjustable.)
The differences between the three composed images are very simply explainable. The auto-exposure feature of the rover works in the following way: the exposure continues until a certain percentage of the CCD pixels are saturated. This means that the exposure time depends on the filter wavelength and on the color of the object to be photographed.

In the case of your images: in the first image the sky contains a lot of greenish and bluish color, so the exposure times of the L5 and L6 filter images will be similar to the exposure time of the L4's. In these images the values of the soil pixels are relatively low, so if you compose an RGB image directly you will obtain a bearable color image, but you should correct it because the color of the soil is false.

In the second and third images the main part of the image field is soil, whose color has less blue and green content. Of course the rover exposed these L5 and L6 images with much longer exposure times, because the number of saturated pixels was very small relative to the auto-exposure threshold. In these images the values of the soil pixels are relatively high, so if you compose a color image directly from them, you will obtain an absolutely false colorized image with bluish-gray colors. This is why color balancing is necessary.

The caltarget was usable only in the first 20-30 sols; later it became too contaminated. Now, after 2300 sols, it is totally unusable for this purpose. Instead of it, use the color of the soil.
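Here is a toy Python model of that auto-exposure rule as I understand it (my reading of the description, not flight code): exposure grows until enough pixels cross the saturation threshold, so a scene that is dim through a given filter gets a much longer exposure.

import numpy as np

def auto_exposure_time(rate_dn_per_ms, threshold=3900, n_needed=500,
                       step_ms=1.0, max_ms=2000.0):
    """rate_dn_per_ms: per-pixel signal rate through one filter.
    Increase exposure until n_needed pixels reach the threshold."""
    t = 0.0
    while t < max_ms:
        t += step_ms
        dn = np.minimum(rate_dn_per_ms * t, 4095)
        if np.count_nonzero(dn >= threshold) >= n_needed:
            return t
    return max_ms

# Soil that is bright in red but dim in blue: the blue-filter exposure
# ends up far longer, so its raw soil pixels look deceptively bright.
red_rate = np.full(10000, 40.0)   # hypothetical DN/ms through L4
blue_rate = np.full(10000, 8.0)   # hypothetical DN/ms through L6
print(auto_exposure_time(red_rate), auto_exposure_time(blue_rate))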
The picture below just illustrates my earlier point. Each of the images on the left was processed by me in a very standard way (I have a video up of my process on YouTube for anyone who wants to see my method). All three were processed identically, with no enhancement or filters added whatsoever. The three images acquired to create the mosaic are all different - the soil color is not the same from one image to the next. In the PIA all the combined images ~appear~ to have different lighting, but as we know they don't - they were taken minutes apart. The differences are probably due to different settings on the camera for each picture, or changes made in the rover software before they were returned. The bad thing about this is that even if we are able to get a good fix on the calibration target in its image, the changes we make cannot be equally applied to the sky image - as its settings were different. Thus, we must have a single snapshot that contains the sky and the calibration target, rather than a mosaic, to be sure that the settings were the same. Notice also how the NASA PIA has lost its color content - and seems to have a brown overlay on top of the color. These brown shots should carry a disclaimer stating exactly which processes were applied to them - as they are not representative of the original data we are given.
The next image is my attempt to get correct colors on just the calibration-target image. Notice how, when my colors start becoming intense enough to equal what we should hope to see, the landscape becomes too dark to make out any detail. This seems to corroborate the opinion that the rover software has at least auto-equalized the images before they are transmitted back to Earth.
Here is the answer, and the solution for your false caltarget-strip color problem:
Consider that the blue and green color strips show extreme reflectivity beyond the 650 nm wavelength. This means that the L3- and L2-filtered images contain unnaturally high pixel values for the green or blue colors, although the natural value would be low. If you apply the L2 filter image as the red layer in an RGB image, you will obtain an unnatural color balance, and in special cases, like the caltarget, the colors will be totally false. Regarding the calibration of the Martian colors, there is a strange thing published by Landis from the NASA Glenn Research Center (Cleveland): in the early sols, the Martian global spectrum at the Opportunity landing site was very close to terrestrial sunlight at noon. See the following diagram:
The colored diagram is the terrestrial sunlight spectrum at the surface, and the dotted curves show the spectral intensity on Meridiani Planum. They are based on the caltarget white-ring reflectivity measurements in the L2...L7 and R3...R7 images. We can say that at an optical depth of 0.94 (!!), at noon, the two spectra are very close; there is only a small peak at the 601 nm wavelength (L4 filter), while at the L3 filter wavelength the Martian spectrum is a bit less than the terrestrial. In this case white consists of R/G/B = 100/98/95 instead of 100/100/100. The difference is very small, inappreciable. Therefore, if someone creates a good color-rendering method that balances the colors of the Martian caltarget images to the terrestrial colors of the caltarget, good color images can be rendered later. This is valid for the Opportunity images.

If the Martian atmosphere is clear (at low optical depth values such as 0.2-0.3 or less), the Martian sunlight spectrum will be more bluish than the terrestrial one because of the lack of filtering particles. In this case the colors will contain more blue component, i.e. white will be bluish-white, the Martian soil will have a different hue than in the tau = 0.9 case, etc.

I think we (and NASA) "over-mysticize" the problem of true color Mars images. The human eye and brain are "developed" to get information in terrestrial spectral light. If you see anything in darkness or in disco light, you will examine it again in daylight, because the other illumination gave false or little information. The Martian true color is a similar problem: it would be more important to see Martian objects in the colors belonging to the terrestrial sunlight spectrum instead of the Martian reddish spectrum. If the Martian atmosphere is clear, this condition is satisfied; in other cases white balancing can be used if necessary.
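For what it's worth, the correction implied by that 100/98/95 white is tiny. A quick Python sketch of the per-channel gains:

# If the caltarget white ring composites to R/G/B = 100/98/95,
# the gains that restore a neutral white are:
r_ref, g_ref, b_ref = 100.0, 98.0, 95.0
gains = (r_ref / r_ref, r_ref / g_ref, r_ref / b_ref)
print(gains)  # (1.0, ~1.0204, ~1.0526) -- about a 2% and 5% boost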
Forgot to say.. Cheers xiriux. I have a sneaky feeling that your images are pretty close, but without that caltarget data I just can't be sure.
This means that in the Martian atmosphere there are also particles under 1.3 microns in size, causing direction-dependent Mie scattering, and not only particles over 1.3 microns, which cause direction-independent "white scattering". It seems that the Rayleigh scattering of the Martian atmosphere can be higher than expected.
I think you're on the money. The sky is a bit like a diffuser/reflector in certain conditions, scattering a huge amount of monochromatic light.. Just amazing to see.
TW, you have missed the image of the calibration target. Voila:
Cheers xiriux.. I posted that image in my earlier posts. I calibrated as best I could to the target, but got no colors due to the surrounding light. Check near the bottom. Also, when you attempt to calibrate that image you compromise the true time-of-day lighting: it's darker. You see, the problem is that their reflective lighting readings are correct (probably), but because the caltarget is inaccurate they can't get the color balance. It's not the brightness, it's the color. It's supposed to be dusk or dawn; look at the shadow on the caltarget. The original image is much darker. The saturation of reds and browns is correct due to the position of the sun (near the horizon) and the surrounding terrain. The air contains quite a bit of airborne dust (being in a desert environment). The solar panels are supposed to be much, much darker, not sandy colored. They're not absorbing enough light to drive the rover, let alone operate the processing systems..
The faintness in the caltarget is not good enough! You need good strong feedback to get anywhere close to the true colors. If it's covered in dust and partially exposed it's useless: the hue has changed, resulting in what we have now - unreliable, suspect, frustrating color images with absolutely no calibrated source. It's a bit like a dog chasing its tail. You think you can see the colors, but with an over-elaborate filtration system (completely laughable), surprisingly on board (shaking my head with disbelief), and a caltarget dial that's just waiting for failure. Why on God's earth would one design a robot that tries to be a processing lab at the same time, running on solar panels? Let us on Earth do the filtration; just transmit back your default CCD data.
Here's an example of Viking mission problems with this badly designed system...
Solving the color calibration problem of Martian lander images (Proceedings Paper)
Author(s): Ron L. Levin; Gilbert V. Levin
Paper Abstract:
The color of published Viking and Pathfinder images varies greatly in hue, saturation and chromaticity. True color is important for interpretation of physical, chemical, geological and, possibly, biological information about Mars. The weak link in the imaging process for both missions was the reliance on imaging color charts reflecting Martian ambient light. While the reflectivity of the charts is well known, the spectrum of their illumination on Mars is not. “Calibrated” images are usually reddish, attributed to atmospheric dust, but hues range widely because of the great uncertainty in the illumination spectrum. Solar black body radiation, the same on Mars as on Earth, is minimally modified by the atmosphere of either planet. For red dust to change the spectrum significantly, reflected light must exceed the transmitted light. Were this the case, shadows would be virtually eliminated. Viking images show prominent shadows. Also, Pathfinder’s solar cells, activated by blue light, would have failed under the predominately red spectrum generally attributed to Mars. Accordingly, no consensus has emerged on the colors of the soil, rocks and sky of Mars. This paper proposes two techniques to eliminate color uncertainty from future images, and also to allow recalibration of past images: 1. Calibration of cameras at night through minimal atmospheric paths using light sources brought from Earth, which, used during the day, would permit calculation of red, green and blue intensities independent of scene illumination; 2. Use of hyperspectral imaging to measure the complete spectrum of each pixel. This paper includes a calibration of a NASA Viking lander image based on its color chart as it appears on Earth. The more realistic Martian colors become far more interesting, showing blue skies, brownish soil and rocks, both with yellow, olive, and greenish areas.
=========
If JPL paid me half what they spent on this appalling system I'd have designed something soooo much better! Four LEDs. Readings at night. LEDs on short stalks to avoid dust build-up. They're low voltage; even if there's any build-up on the surface, they will transmit a more reliable calibration reference signal through the dust. A system of replacements can be factored in: when one fails, the next LED takes over in its circular sector. Total cost $3,500. It will last around 15 years. The whole system would sit on a 4-5 sectored disc 30 cm across, with RGB and Y in silver-oxide-impregnated pigments (stops radiation bleaching).
"This view combines separate images taken through Pancam filters centered on wavelengths of 753 nanometers [note: a non-visible wavelength], 535 nanometers and 432 nanometers. It is presented in a false-color stretch to bring out subtle color differences in the scene."
Cheers.. FALSE color in big letters lol.. There's still just one problem: the target panel does not match the colors available in the scene - the greens, the blues. The target has just a hyper-saturated hue from blue to red????? It really cannot be computed into the mosaic. If they input false colors (to the raw data), then why is that not reflected in the caltarget?
But why would they do this now, and what subtle colour differences in the scene?
I mean, if they referred to something interesting that needed this processing to show what they were talking about, then I can understand their motivation.
The only thing in the description which could possibly, possibly, possibly maybe be what they mean is this: The western edge of Home Plate is in the foreground, generally lighter in tone than the more distant parts of the scene. (my bolding)
Even so, if there was some reason like that, they probably would not do it in a panorama but rather do it in an individual picture.
Maybe you can tell us what subtle colour differences they are revealing to us?
TW, you have missed the image of the calibration target. Voila:
You can see that it is fairly contaminated with dust, but the solar cells are relatively clean. It seems that they are just after a cleaning cycle. I should like to draw your attention to the caltarget: it is much less contaminated than Spirit's.
Mike, the "blue patch" between the Husband hill and the Home Plate is an sand-patch named as Eldorado. Its color is not blue but sand-color with some bluish and greenish hue in shadowed parts.
Look at it:
OBrien, you wrote: "While the four images I posted were taken at different times of day, they were taken a few minutes apart, not a few hours apart as I originally thought.
Therefore there are four images taken within a few minutes of each other, all taken with the same camera through the same filter, that have significantly different sky brightnesses at 753 nm.
I don't have an explanation for that, but it is not because of different illumination at different times of day."
You are right, the illumination and the optical depth do not change in 3 minutes (the time difference from the first L2 image to the last one is about 190 seconds).
If you examine the images you can find that the pointing of the Pancam turns, image by image, toward the direction of the Sun. In the first position the illumination comes from the right and a bit from behind; in the fourth image it is about 50 degrees less (the FOV of the Pancam is 16 degrees), so the sunlight illuminates the atmospheric aerosol from a different angle. This is the cause why in the fourth image the sky is much brighter than in the first image.
Everyone knows this phenomenon here on Earth: in clear weather, the sky is a very nice deep blue if we look at it with our backs to the Sun. If we turn toward the Sun, the sky becomes brighter and whiter because of the light scattering of the atmospheric aerosol. (When the sky is blue, its color contains less red component than green and blue, and when it becomes whiter, the intensity of the red component (and also the green's) increases.)
My opinion is that in this case we can see this phenomenon in the mentioned Opportunity images.
This means that in the Martian atmosphere there are also particles under 1.3 microns in size, causing direction-dependent Mie scattering, and not only particles over 1.3 microns, which cause direction-independent "white scattering". It seems that the Rayleigh scattering of the Martian atmosphere can be higher than expected.
Unfortunately the droid is color blind. This explains the varying palettes of color in some images. JPL is in a pickle over this and they need to come clean (at least) about the problem.
"This view combines separate images taken through Pancam filters centered on wavelengths of 753 nanometers [note: a non-visible wavelength], 535 nanometers and 432 nanometers. It is presented in a false-color stretch to bring out subtle color differences in the scene."
Cheers Mike for an important image... Unfortunately the calibration is completely out. Thankfully the target (in the yellow box) tells the whole story..
Interestingly, the 3 other colors are washed out. The hue, saturation etc. are out of sync, wildly. Now maybe I'm wrong.. if so, lol, then the genius who (painted) in the red/magenta color put it on the wrong side.
I've said before, and I'm pretty sure, thanks to getting another great image from Mike with the color target in shot, that the calibration target is badly compromised. Radiation, dust and other factors have reduced the ability to color-match the surroundings - basically bleached out the target.
Unfortunately the droid is color blind. This explains the varying palettes of color in some images. JPL is in a pickle over this and they need to come clean (at least) about the problem.
Those solar panels are not powering the robot.. If JPL admit to the dust problem, then it stands to reason that the dust has been landing elsewhere on the robot.
Now this image is a better calibrated one from JPL that shows plenty of greens and blues! And a pale blue sky to boot! This was probably taken when the sky was clear, with no red dust in the atmosphere. You can see the calib wheel in the foreground but, alas, no colors can be made out. If you download the 133 MB version, which I don't have the time or inclination to do, you can perhaps get more wheel clarity. Can anyone check this out?
In other words, the true color of Mars is more like this one; the others with the reddish tinge are perhaps due to dust storms. In other words, Mars is just not only red!
Cheers!
P.S. Check out the blue patch below the hill! Any guesses what that might be?
The crunch point here, as mentioned on the web site:
"It has been calibrated and processed to approximate the colors that would be seen by humans if they could be present for this lovely Martian view. "
As seen by humans? Lovely Martian view? That's the worst calibration I've ever seen! If that's the nearest approximation they've managed, then this guy working for JPL's image processing lab needs to be fired pronto!
And add to that the fact that the PanCam returns digital data directly to the rover computer, which has the capability to perform a limited set of image-processing tasks on PanCam data prior to transmission. Some of these tasks include:
Bias and dark current subtraction.
Electronic shutter effect correction.
Bad pixel replacement.
Rudimentary automatic exposure control (to maximize the signal-to-noise ratio (SNR) of downlinked data while preventing data saturation).
Image subsampling.
Image compression (using a JPL-developed wavelet compression algorithm called ICER, I think).
Considering these multifarious functions being performed by a computer on board the rover before transmission, is it any surprise that what we got is not what we were actually supposed to get?
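To put the first two items in that list in concrete terms, here is a generic Python sketch of bias and dark-current subtraction (standard CCD practice, not the actual Pancam flight code; the calibration inputs here are hypothetical):

import numpy as np

def bias_dark_correct(raw, bias_frame, dark_rate_dn_per_ms, t_exp_ms):
    """Subtract the fixed electronic offset (bias), then the thermal
    signal that accumulates in proportion to exposure time (dark)."""
    out = raw.astype(np.float64) - bias_frame
    out -= dark_rate_dn_per_ms * t_exp_ms
    return np.clip(out, 0.0, None)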
Well, at this point I am convinced we won't get meaningful results from the JPL site images. As the next step, one of us needs to pull in the PDS raw images and see if we are having the same problem working with those.
If it is a result of different auto-exposures on the rover - with different exposures on each of the three shots - the same problem will exist in the PDS images, and the mosaic will tell us little or nothing. If that's the case, we'll need a single image (non-mosaic) that contains the calibration target and the sky to truly resolve the issue. Otherwise we're still just guessing at some of the data - though my experiments indicate xiriux's guesses are probably somewhat accurate - I'd also like to see his complete shot containing the calibration target and the sky in the same image.
The image description says this was taken around 3:00 P.M. local Mars time, which I suspect is an early dusk - with the sky darkening earlier in the afternoon than it would on Earth, due to the distance from the Sun. The auto-exposure settings are therefore brightening the image to show us the full scene - and, in effect, washing out the color.
Even so, none of this explains the muddy pictures NASA is giving us, since the color can be derived if they are properly using the calibration target.
Even now, after a couple of months, I am not totally sure of my calculations on the website. I THINK they are correct, but there are always bugs in programs; the deeper they are, the longer they take to come out. It is quite possible that I have miscalculated something, although I was careful when I wrote it.
However, the time differences are likely to be correct because the timestamp is in seconds.
The calculated sols, I know, are not exactly correct, but they should be within a day either side, as I have averaged out the number of hours/minutes in each Martian day according to the info I found on official sites. Otherwise the information is in the database, and the sol is taken from the URL.
Hi, the variation in exposures is probably down to exposure bracketing. It saved my ass on certain difficult jobs in the past. I found this nice simple explanation:
When you expose for a scene, your camera's light meter will select an aperture / shutter speed combination that it believes will give a properly exposed picture.
Exposure bracketing means that you take two more pictures: one slightly under-exposed (usually by dialing in a negative exposure compensation, say -1/3EV), and the second one slightly over-exposed (usually by dialing in a positive exposure compensation, say +1/3EV), again according to your camera's light meter.
The reason you do this is because the camera might have been 'deceived' by the light (too much or too little) available and your main subject may be over- or under-exposed. By taking these three shots, you are making sure that if this were ever the case, then you would have properly compensated for it.
It's a simple solution to just step through exposures on either side of the metered exposure. This gives a wider variation for printing the final scene.
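The arithmetic behind those EV steps, for anyone who wants to play with it, uses the standard photographic relation that each EV is a factor of two in exposure. A quick Python sketch:

def bracketed_times(metered_ms, steps=(-1/3, 0.0, +1/3)):
    """Shutter times for a simple exposure bracket around the metered
    value; +1 EV doubles the time, -1 EV halves it."""
    return [metered_ms * 2.0 ** ev for ev in steps]

print(bracketed_times(100.0))  # [~79.4, 100.0, ~126.0] ms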
OBrien wrote: That is not the only image taken through the L2 filter on that sol. There are 4 taken at different times of day, all through the L2 (753 nm) filter.
...
The sky illumination changes throughout the day. The last photo in the series, 1P326071499EFFAGEIP2423L2M1, the one with the brightest sky, was taken at a time nearest the times of the images used in the color mosaic. It's not valid to choose the first one with the darkest sky (as you have done) to estimate the red/infrared content of the sky illumination and then apply it to a color representation taken at a different time of day, especially when it can be seen that the sky has lightened tremendously.
While the four images I posted were taken at different times of day, they were taken a few minutes apart, not a few hours apart as I originally thought.
Therefore there are four images taken within a few minutes of each other, all taken with the same camera through the same filter, that have significantly different sky brightnesses at 753 nm.
I don't have an explanation for that, but it is not because of different illumination at different times of day.
Until a reasonable explanation is found, it's still questionable to use the darkest sky value through the L2 filter as a calibration for an image created from the L4, L5, L6 filters.
Thanks, Mike and the watcher! Even though it is a mosaic, it should suit our purposes perfectly - if the caption is correct. The caption indicates the frames were taken at or near the same moment in time - which means that the lighting will be the same for all, just as if it were a single shot. I'll work on this one and see what it reveals when I get a chance. (Opportunity sol 2229.)
This is the caption: "PIA13104: Two Worlds, One Sun
"While humans' lives unfolded on Earth, NASA's Mars Exploration Rover Opportunity paused in its southward trek and captured this photomosaic around 15:00 local Mars time on May 2, 2010. The timing for this photography with Opportunity's panoramic camera (Pancam) was coordinated with a "moment in time" simultaneous photographic event in thousands of locations on Earth, organized through New York Times photography blog, Lens. (See http://lens.blogs.nytimes.com/2010/05/04/readers-14/.)
Dusty, reddish brown dunes stretch southward to the horizon along the rover's route ahead.
The "Two Worlds, One Sun" theme is a reference to the motto inscribed on the Pancam calibration target, seen on the back of the rover deck at the bottom of this view. The target is used to properly calibrate and color-balance the Pancam images, and with its artistically styled shadow post, or gnomon, it doubles as a sundial (also known as a "Marsdial") for educational purposes. (See PIA05018.)
This scene is a three-tall by one-wide mosaic of Pancam images taken through the camera's red (602 nanometer), green (530 nanometer) and blue (480 nanometer) filters. It has been calibrated and processed to approximate the colors that would be seen by humans if they could be present for this lovely Martian view. The camera took the images during the 2,229th Martian day, or sol, of Opportunity's mission on Mars.
Some considerations regarding the image PIA13104 linked by Mike:
My opinion is that the NASA-published color image has false colors. Why?
See the following image, captured on the same sol with the L2 filter:
It is obvious that in the photo the sky is very dark ...
That is not the only image taken through the L2 filter on that sol. There are 4 taken at different times of day, all through the L2 (753 nm) filter.
The sky illumination changes throughout the day. The last photo in the series, 1P326071499EFFAGEIP2423L2M1, the one with the brightest sky, was taken at a time nearest the times of the images used in the color mosaic. It's not valid to choose the first one with the darkest sky (as you have done) to estimate the red/infrared content of the sky illumination and then apply it to a color representation taken at a different time of day, especially when it can be seen that the sky has lightened tremendously.
Also, if you're using the images from the marsrovers.jpl.nasa.gov website, be aware that those have all been stretched to provide relatively uniform contrast for consumption on a public information web site. The brightness values are relative, not absolute. You cannot assume that the intensity values from filter to filter are correct with respect to each other, so any color image created from them will potentially have very large errors. You must use original calibrated images from the PDS to assure some measure of color fidelity.
I'm afraid you have misunderstood me, and you are wrong.
1. This color image was rendered from the original L4-, L5- and L6-filtered raw images found in the Opportunity gallery. NASA's PIA13104 is composed from 2 part-images, and this image is the original of the upper part. Ergo, I did not crop the calibration target from the photo; it is visible on another raw image.

2. The raw image is called a raw image because its pixels contain the original lighting values derived from the CCD of the rover's camera. So it is originally black and white: a CCD pixel has no color.

3. You are wrong regarding the L2 red-filtered image. If you make a photo with a red filter, the value of a CCD pixel will be higher (whiter on the raw image) if the object is white or red, and it will be lower (darker on the raw image) if the reflected or emitted light of the object contains less red light, i.e. it is green or blue or has low intensity. Look carefully at the L2 image: the sky is very dark, but the soil and the stone in the foreground are much lighter. This means that there is enough sunlight, but there is little red in the sky. If you examine red-filtered rover images from other sols you can see that the sky is as light as the soil, i.e. its color contains a fair red component (this occurs, for example, during dust storms).

4. I used as the color reference the color of the "average soil". The "average soil" can be obtained by extremely blurring a large part of the soil in the image. I think it gives a good result because Meridiani Planum has a fairly homogeneous surface colorization, except for the stones. The color of the soil is known from articles by Jim Bell (Cornell University) and other authors.

5. You are right that I don't know whether the raw images are processed or not. I believe that they are original, without changes. The tone correction was processed in the rover using auto-exposure, although you can find a fair number of wrongly exposed photos.
Here's the image you wanted - the Martian sky and the color calib wheel both in one frame!
To be strictly accurate, the two are not "in one frame" since the picture is a mosaic.
"This scene is a three-tall by one-wide mosaic of Pancam images taken through the camera's red (602 nanometer), green (530 nanometer) and blue (480 nanometer) filters. It has been calibrated and processed to approximate the colors that would be seen by humans if they could be present for this lovely Martian view. The camera took the images during the 2,229th Martian day, or sol, of Opportunity's mission on Mars."
Some considerations regarding the image PIA13104 linked by Mike:
My opinion is that the NASA-published color image has false colors. Why?
See the following image, captured on the same sol with the L2 filter:
It is obvious that in the photo the sky is very dark, i.e. its color contains few red components because the atmosphere is clear. If the Martian atmosphere is clear (the optical depth, tau, is about 0.3 or below), it contains very few particles that filter the sunlight, so the color of the sunlight is very similar to the terrestrial one (a bit more bluish, because of the lesser Rayleigh scattering of the atmosphere). This circumstance eliminates the color shifting. You can see in the NASA image a permanent orange mist. That is impossible, because the original raw images do not show any mist; it seems that this orange mist is artificial. I have linearized the existing L4, L5 and L6 images, composed the RGB color image, then balanced the R/G/B ratios according to the R/G/B ratios of the "average soil" (approximately 1/0.66/0.35 on Meridiani Planum, where Opportunity landed). After these operations I calculated the tristimulus values of the "average soil" (interpolating the missing filtered-image values). Finally, I calculated the tristimulus values of some sky areas, derived their RGB colors from the XYZ values, and then corrected the color image composed from the L4, L5, L6 raw photos. Voila, the result:
Without the orange mist and with a fairly clear sky. I am not sure that the turquoise sky is the real color, but let us agree on bluish instead of reddish.
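For anyone who wants to try the soil-balancing step on their own composites, here is a minimal Python sketch of the idea (my reconstruction, not xiriux's actual code; the 1/0.66/0.35 ratios are the ones quoted above):

import numpy as np

SOIL_RGB_RATIO = np.array([1.0, 0.66, 0.35])  # Meridiani 'average soil'

def balance_to_soil(rgb, soil_mask):
    """Scale G and B so that the heavily blurred (here: averaged) soil
    region matches the reference R/G/B ratios, keeping R fixed."""
    soil = rgb[soil_mask].mean(axis=0)             # measured average soil
    target = SOIL_RGB_RATIO / SOIL_RGB_RATIO[0] * soil[0]
    return rgb * (target / soil)                   # per-channel gains

# rgb: float HxWx3 composite from linearized, exposure-normalized frames;
# soil_mask: boolean HxW array selecting a large patch of plain soil.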
Interesting.. Just one question, xiriux: why did you crop out the calibration disk? I think you should include it in your enhanced images, as then we can see the disk's full state. Without external calibration it's just a guessing game.. That's why photographers and printers need a color reference... or a meter reading.. Filters won't hack it if you don't know the true readings from the surface and from the sky, plus a color chart in the said environment giving feedback (at source) of what the color balance is for future printing.
It is obvious that in the photo the sky is very dark, i.e. its color contains few red components because the atmosphere is clear.
I think you're wrong here, xiriux. What does red do in black and white??? It darkens tones. There are reddish tones (more tanned) in the sky. That's why you have dark tones.
The color of an object depends on the wavelengths of the light reflected from the object. A red apple is red because red wavelengths in white light are reflected and other wavelengths are absorbed. In fact, if a red apple were illuminated by light that had no red wavelengths, the apple would appear almost black. What your black-and-white image has done is remove the color, but thankfully red has a signature in B&W. As a photographer (many, many years ago) I used red filters on black-and-white film to give me those black skies with white clouds.
You can see in the NASA image a permanent orange mist. That is impossible, because the original raw images do not show any mist; it seems that this orange mist is artificial. I have linearized the existing L4, L5 and L6 images, composed the RGB color image, then balanced the R/G/B ratios according to the R/G/B ratios of the "average soil" (approximately 1/0.66/0.35 on Meridiani Planum, where Opportunity landed). After these operations I calculated the tristimulus values of the "average soil" (interpolating the missing filtered-image values). Finally, I calculated the tristimulus values of some sky areas, derived their RGB colors from the XYZ values, and then corrected the color image composed from the L4, L5, L6 raw photos. Voila, the result:
Printing a color image is purely down to averages and aesthetic judgments; the metering is for the first-generation capture.
You say: You can see in the NASA image a permanent orange mist. That is impossible, because the original raw images do not show any mist; it seems that this orange mist is artificial.
This is my point about processing. You have a choice of 2 images from JPL: the RAW (which was processed by a HUMAN!), or the image here in this thread, processed by a human. Because an image has a "raw" identity, are you saying it hasn't been tonally corrected? Printing will always throw up a wide variety of results if they don't have the CALIBRATION settings. Show me the light readings of the incident and reflected light from all zones of the image and I will agree with you. Unfortunately there's no way you will know that, as I think in some cases JPL don't even fully know.
I think what you've attempted is important, but unfortunately fruitless, as you don't have the essential variables.
I'm not JPL's hugest fan, but I have worked with the nightmare qualities of light for a number of years. Sometimes complicated results mask a truly simple problem.
Some considerations regarding the image PIA13104 linked by Mike:
My opinion is that the NASA-published color image has false colors. Why?
See the following image, captured on the same sol with the L2 filter:
It is obvious that in the photo the sky is very dark, i.e. its color contains few red components because the atmosphere is clear. If the Martian atmosphere is clear (the optical depth, tau, is about 0.3 or below), it contains very few particles that filter the sunlight, so the color of the sunlight is very similar to the terrestrial one (a bit more bluish, because of the lesser Rayleigh scattering of the atmosphere). This circumstance eliminates the color shifting. You can see in the NASA image a permanent orange mist. That is impossible, because the original raw images do not show any mist; it seems that this orange mist is artificial. I have linearized the existing L4, L5 and L6 images, composed the RGB color image, then balanced the R/G/B ratios according to the R/G/B ratios of the "average soil" (approximately 1/0.66/0.35 on Meridiani Planum, where Opportunity landed). After these operations I calculated the tristimulus values of the "average soil" (interpolating the missing filtered-image values). Finally, I calculated the tristimulus values of some sky areas, derived their RGB colors from the XYZ values, and then corrected the color image composed from the L4, L5, L6 raw photos. Voila, the result:
Without the orange mist and with a fairly clear sky. I am not sure that the turquoise sky is the real color, but let us agree on bluish instead of reddish.
As you can see, taking into account color and lighting levels (both incident and ambient, etc.), there is a definite build-up of material on the disk, but more tellingly on the darker surround (which is a nice gauge for dust levels between the 2 images). The colors can just be made out, but really it's a bit of a disaster! What's a little more concerning is what's happening to the rest of the robot, especially the solar panels? TW
Just a few other things.. Time of day? Light falling due to sunset or sunrise? Trying to calibrate during these times of day (sol?) will force brighter conditions, which would be false. It's most probably dusk and, as you said Mike, 40% less light will factor into poor info from the calib disk.
What concerns me more is what backup JPL factored in for dusty calib disks. Images can't rely on just reflected metering all the time.
Here's the image you wanted - the Martian sky and the color calib wheel both in one frame! Heck, this really sucks! No color, just red everywhere! What happened to the colors in the calibration wheel? Why weren't the color channels adjusted to produce the colors on the wheel? Otherwise what's the point in sending it up all the way to Mars?
NASA/JPL
And thanks to all the other posters who have brought out some interesting points!
Cheers! Mike.
Hi guys, I calibrated as best I could to the color wheel. Unfortunately, in the current light in the shot, it's washed out by the color temperature of the incident light. When the calib disc is compromised by dust or extreme monochromatic light, an incident-light dome would help a great deal.. The rover's metering is a bit hit and miss in these conditions. They shoulda got a pro photographer to help set up the metering system.. Or the guy they used was hopeless!
In this shot it looks like the calib disc is covered in dust? I've tried to get color from it but it's completely washed out!
O'Brien, I can see what you are saying, and it appears to make sense; however, I believe it was pointed out previously that the colour patches have been removed, or are not there, or are not shown now. This was the original way that the colours could be matched, because we should have the images of the patches in Earth light and the images on Mars. As I understand it, the idea was to give us a benchmark to go by.
The red filter you used to illustrate the yellow glove is probably extremely red, and is not a very good comparison to the red dust either in the air on Mars or to a general reddish reflection from the surface. You could also tell us just how 'red' that filter is (although it probably would not mean anything to me, at least).
The last point is this: "Measurements are frequently made of the sun with the Pancam and solar filter, to determine the Martian atmosphere's optical thickness. Knowing the amount of dust in the air can be used to calculate the spectral attenuation for 'correcting' images..." I have asked before, but maybe you did not see it - where can I find information on these sun measurements, and how do I interpret the tiny pictures that we get back? I assume that the pictures are, in reality, still the 1024 x 1024 pixels that the other pictures are? So why are we getting these tiny representations of the sun as a white disc in the middle of a black background?
mikesingh wrote: xiriux, I agree to a certain extent. But you have not answered the question of how that yellow-colored cable on the rover became red on Mars?
...
So the question is, how has the yellow cable on the rover turned red in the NASA image from Mars? Is it the red dust? Obviously not!
mikesingh,
In color images, the color of the illumination is as important as the color of the object.
The cable in the cleanroom is bright yellow under typical fluorescent lighting. It won't necessarily appear that way under dim or colored illumination.
Here's a series of photos I took that shows the effects of illumination. I'd encourage anyone else to do this exercise at home to prove it to themselves. The first photo is of a yellow glove taken under bright fluorescent lighting. The glove is what anyone would consider quite a bright yellow, and the first photo is a fair representation of its hue.
The second photo is of exactly the same glove, but the fluorescent light has been turned off and the illumination is a distant 60W incandescent light. The apparent change from bright yellow to orange is obvious.
The third picture is an example to show what happens in the limit in an unrealistically extreme case. The yellow glove is being illuminated only by an incandescent light shining through a red gel. The illumination is pure red, and the glove no longer appears yellow.
The camera settings for white balance and exposure were the same for all three photos, and no post-processing of any kind was done. The color changes are due solely to the illumination of the scene.
The glove is still yellow. It hasn't changed color. How it appears in a color image has.
IF the illumination has changed from the cleanroom to the surface of Mars, then the apparent color of the cable can change.
Just from the distance difference, Mars sunlight is less than half as bright as Earth sunlight. And if there is dust in the air, the illumination will be redder, not just from the redness of the dust, but from increased scattering, like in a sunset when there's smoke or haze.
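For reference, the inverse-square arithmetic behind that "less than half": Mars orbits at roughly 1.52 AU, so sunlight there is about (1/1.52)^2 ≈ 0.43 of Earth's, before any atmospheric dust is taken into account.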
So the cable is not reddened from a coating of dust, but it may appear redder because the dust changed the illumination in that photo.
Measurements are frequently made of the sun with the Pancam and solar filter, to determine the Martian atmosphere's optical thickness. Knowing the amount of dust in the air can be used to calculate the spectral attenuation for "correcting" images. But you have to make sure you distinguish between two cases that sound similar, but aren't.
Often people talk about "What would it look like if I was standing there?" which is a combination of the color of objects there and the intensity and color of the illumination. That is different from "What color are the rocks?" which has the tacit implication "What color would these rocks look like under illumination I'm used to." Those two colors will not necessarily be the same.
xiriux, I agree to a certain extent. But you have not answered the question of how that yellow-colored cable on the rover became red on Mars. I have not 'auto' color-corrected the second image; I have adjusted the colors globally on that Mars image to match the cable to yellow. Doing that brought out a set of colors different from the NASA/JPL image. The corrected one looks more 'natural' compared to the NASA image.
So the question is, how has the yellow cable on the rover turned red in the NASA image from Mars? Is it the red dust? Obviously not! Or, the more pertinent question would be: why?
Sorry, but I have to tell you that your procedure for getting the proper colors of Martian color images is wrong. You have used the RGB color-level auto correction, but this results in false color whenever the original image lacks an approximate color balance. If any basic color is in the majority (and this is the real situation), the auto level correction will stretch the other two color components to the highest possible value. As examples I have inserted some photos taken on Earth. In the first image you can see a detail of the Atacama desert.
The second one is its auto-corrected variant:
(like your corrected products)
The third image shows the obvious error of the auto correction. You can see that in the original image a caravan wanders in the Sahara on camels. After selecting different areas to be corrected, we get the following ridiculous results:
You can see that the color of the sand strongly depends on the color balance of the selected area. This is the main problem of automatic color correction; therefore its result is unacceptable as a true color image.
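To see why per-channel auto levels does this, here is a small Python sketch of the naive stretch (the kind of thing "auto levels" does under the hood, in simplified form):

import numpy as np

def auto_levels(rgb):
    """Stretch EACH channel independently to the full 0..1 range --
    this destroys the ratios between channels and thus the color cast."""
    out = np.empty_like(rgb, dtype=np.float64)
    for c in range(3):
        ch = rgb[..., c].astype(np.float64)
        out[..., c] = (ch - ch.min()) / max(ch.max() - ch.min(), 1e-9)
    return out

# A uniformly sandy scene (R dominant, little B) comes out neutral gray,
# because the tiny blue range is stretched as far as the big red range --
# and the result depends entirely on which area you selected.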
One of the most compelling points Sir Charles makes is the removal of the mirrors from the color calibration target. Careful examination reveals that the mirrors - which would have shown us the true color of the sky as compared with the color chips - have been replaced by pink or salmon colored rectangles. Why did they do that? And why has there been no announcement telling us the mirrors were removed? And further, why is there no picture from either rover showing the sky and the color calibration target in the same frame?
The rover picture above shows yellow wheel trims, which I am not sure the actual ones on Mars have, do they? I suspect it might be the one they use to try out stuff on Earth - such as getting Spirit out of the sand etc. If I am correct and this IS the picture of the Earth-bound one, then this yellow cable does not have to be the same colour on the rovers on Mars. I may be wrong, though.
Good point! However, check out this image below that I'm reproducing. Notice the area is sanitized, with the scientists wearing masks. This clearly means that it is the actual Mars rover:
Now take a look at the rover that was used for the sandbox simulation. The room isn't sanitized and those scientists aren't wearing any masks:
It can therefore be safely concluded that the rover shown in the first image is the actual one, with a yellow cable that magically turned red. That, needless to say, is due to NASA's over-saturation!
The rover picture above shows yellow wheel trims, which I am not sure the actual ones on Mars have, do they? I suspect it might be the one they use to try out stuff on Earth - such as getting Spirit out of the sand etc. If I am correct and this IS the picture of the Earth-bound one, then this yellow cable does not have to be the same colour on the rovers on Mars. I may be wrong, though.
On a different subject: it is all very well posting pictures which demonstrate colour differences, but some of us (and skeptical others) need the links to the originals so that we can see for ourselves.
I know it is a pain to do this all the time, but it backs up what we say with 'official' references to the originals, which people seem to like..