So, image resolution affects colour rendition too? Surely not so much that one area appears grey and another pink, although I can see how the edges of one area may be merged into the edges of the next and so the colours may become a mixture. I suppose they could have sampled a pixel as pink and then coloured all similar pixels the same pink.
The reason you are seeing two types of imaging in GE is due to image resolution.
The imaging on the left is high resolution whereas the imaging on the right is at a much lower resolution. Note that image strips may have been captured at different times.
Hope this helps.
__________________
"All truth passes through three stages. First, it is ridiculed; Second, it is violently opposed; Third, it is accepted as being self-evident."
I recently came across an odd color change while looking at Peru on Google Earth, where one side of the image is rather clear and natural while the area just next to it (dense jungle with a river) is bright pinks and blues and heavily obfuscated. However, within those odd 'neon' areas, where photo links are present they often show images of small towns situated along the waterways. I've noticed this quite frequently on Google Earth / Moon / Mars and wish I had a better understanding of what is going on with it. Thanks.
<< No, you have changed the color to be that which you prefer. Your process of color correction has no merit. >>
The colour correction applied to the image does have merit. The correction brings the chromatic values close to what they should be for the filters used and the RGB LED illumination of the content.
O'Brien wrote:
<< If you disagree, please detail (and I mean detail with the actual reproducible procedure, not just vague generalizations) the process used and the provenance of your calibration data. >>
The details of the procedure used, relevant to the amount of RGB correction, have been recorded, but this thread is not the place to discuss the colour of a particular image or any colour correction techniques employed.
Yes, reduction of jpeg artifacts can be achieved using the 'reduce noise' facility in Photoshop. I only use this particular program for specialist work such as producing 3D images and colour productions. I also use it for some specialist enhancement procedures.
The main program I use for most of my research is PaintShop Pro v6.0. The reason I prefer this program is the accessibility of its menus and its smooth operation. This is the program I use to manually 'clean' jpeg images of artifacts, and I have had continued success with the procedure.
The theory that the artifacts in an image saved in jpeg format, or another format with a compression algorithm incorporated, cannot be cleaned up is out of date, as I explained previously in my image enhancing thread. As I told timewarp in private chat on this forum, it can be done in Photoshop at least, which has a special feature for it: Reduce Noise.
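For illustration only, the general idea behind such a 'reduce noise' step can be sketched with a simple median filter. This is a generic denoising technique, not the actual Photoshop algorithm, and all the test data below is synthetic:

```python
import numpy as np

def median3(img):
    # Pad by edge replication, then take the median of each 3x3 neighbourhood.
    p = np.pad(img, 1, mode='edge')
    h, w = img.shape
    stack = np.stack([p[dy:dy + h, dx:dx + w]
                      for dy in range(3) for dx in range(3)])
    return np.median(stack, axis=0).astype(img.dtype)

# Synthetic 8-bit test image: a smooth ramp plus salt-and-pepper speckle.
rng = np.random.default_rng(0)
img = np.tile(np.arange(64, dtype=np.uint8) * 4, (64, 1))
noisy = img.copy()
coords = rng.integers(0, 64, size=(200, 2))
noisy[coords[:, 0], coords[:, 1]] = rng.choice(np.array([0, 255], dtype=np.uint8), size=200)

# The filter suppresses isolated outlier pixels while largely preserving edges,
# so the cleaned image is closer to the artifact-free original.
cleaned = median3(noisy)
err_noisy = np.abs(noisy.astype(int) - img.astype(int)).mean()
err_clean = np.abs(cleaned.astype(int) - img.astype(int)).mean()
```

Worth noting: a filter like this smooths speckle, but it cannot restore information already lost to compression, which is precisely the point being argued over in this thread.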
<< The pixellation you show is by no means unknown. It's a jpg. You're looking at jpg artifacts. >>
I'm sorry O'Brien, but there are no jpg compression artifacts showing in the original image as shown above. They have been removed. In fact, there are no compression artifacts showing in the sectional view either. Now, you being a digital imaging expert, will probably come back and say that removing the artifacts cannot be achieved. If you do, you're incorrect.
All the images I post on the forum have had the compression artifacts removed, in other words 'cleaned', prior to uploading. This procedure I use does not distort the image in any way and maintains the integrity of the content contained in the original. That is why the jpg images are suitable for further examination, whereas many of the PDS images are too large and the quality of the content is not suitable for further analysis.
Obviously there will be no requirement to tell you how to 'clean' an image as you will probably be fully aware and conversant with the procedure already.
Here is some information that may be of interest to members and visitors.
The Max Planck Institute for Solar System Research (MPS), Katlenburg-Lindau, has developed the Robotic Arm Camera (RAC) (figure) in collaboration with the University of Arizona. It also provided the focal plane with the Charge Coupled Device (CCD) detector for the Optical Microscope (MECA-OM). RAC and MECA-OM have the same CCD and share common readout electronics, located elsewhere, which can serve one detector (RAC or MECA-OM) at a given time.
The RAC was originally built for the Mars Surveyor Lander 2001. After the cancellation of that mission the RAC was stored in a clean room and re-activated in 2005. Its development and integration were largely covered by funds from the Max Planck Society. Since 2007 the German Space Agency (DLR) has supported the operation of the instrument during Phoenix surface operations. This support also includes participation in Operational Readiness Tests (ORT) prior to landing.
The RAC is a camera with a double Gauss lens system and a CCD (Charge Coupled Device) detector. The focus can be adjusted by moving the lens system back and forth with respect to the fixed CCD. In that way the object to be imaged can be placed at any distance (> 13 mm) from the front window of the camera. In extreme close-up mode the imaging scale is 1:1 and each pixel corresponds to 23 microns at the location of the target.
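As a rough worked example of what that 1:1 scale implies (the 23-micron figure is from the description above; the 512 x 256 pixel CCD format is my assumption for illustration, not stated in the text):

```python
# At 1:1 magnification, one sensor pixel maps to one pixel-width on the target.
pixel_pitch_um = 23.0            # pixel size stated above, in microns
width_px, height_px = 512, 256   # assumed CCD format (illustrative only)

fov_w_mm = width_px * pixel_pitch_um / 1000.0
fov_h_mm = height_px * pixel_pitch_um / 1000.0
# With these assumed numbers the close-up field of view works out to
# roughly 11.8 mm x 5.9 mm at closest focus.
```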
The RAC also has a transparent sapphire cover that protects the front window of the camera against dust contamination. During normal image acquisition the cover is open.
The RAC has two lighting units that are composed of red, green and blue Light Emitting Diodes (LEDs) and are mounted, respectively, above and below the camera chassis (figure). These LED assemblies allow lighting of targets in all three wavelength ranges. Acquiring images under these three lighting conditions provides color information of the targets and allows for generation of true-color images of these targets.
O'Brien wrote: << Don't blow up a jpg of a highly processed multiple-image image 500x and then try to draw conclusions about the conspiratorial processing done on it, especially when anyone has access to minimally processed source data (or at least far closer to source data). >>
Yes, I am afraid that was me. My next project is how to extract the PDS img data into information and an image.
Is there a way we can learn of such things as the calibration levels of the illumination diodes, the original brightness levels, the responses of the filters, and the illumination levels of the LEDs? More to the point, how do we use that information? As a guess, I would think it is probably in the PDS file along with the image.
If I were a scientist just starting out, how would I go about putting together all this information and using it effectively?
I realise this is a huge subject which you have probably spent years learning about. It is also probably one where I need to read all kinds of text and reference books, but I was wondering if there was something on the web that you had come across which gave a basic understanding of it all, for the reasonably educated person?
Timewarp wrote: I believe the query relating to quality refers to the far left section of an image captured by the Robotic Arm Camera (RAC) onboard the Phoenix lander - image #lg_7851.
First, let's get some terms correct.
This is not "an image" captured by the RAC. It is a highly processed image derived from at least three (and almost certainly more) individual images captured by the RAC. A number of different images (several dozens) were taken at different focus positions, different exposures, and different filters. I am almost sure that different sections of the different focus position images are composited together, because of the two clumps that are 25% in from the right edge. The clump against the backstop ledge is in focus, and so is the clump that is considerably more towards the camera. Looking at the individual images available on the PDS, only one should be in focus while the other is out of focus. Therefore I suspect the best available area from each was used. I also suspect that finer resolution is achieved by pixel averaging between multiple images. Then specific colors are applied based on the filters used. The entire image is then cropped to just show the region of interest at the tip of the scoop.
The photo credit shows how many hands have had responsibility for producing it: NASA/JPL-Caltech/University of Arizona/Max Planck Institute
The image, a highly processed press release image designed to look good for release to media outlets, is then posted on the University of Arizona web site. Not a NASA data retrieval web site, but the UofA's Phoenix Mission PR web site. The image is released in .jpg format for simple download.
I have had to repeat many, many times the following. It pains me that I have to do so again. DO NOT USE JPG IMAGES DOWNLOADED FROM PUBLIC RELATIONS WEB SITES AND ATTEMPT TO DERIVE ACCURATE SCIENTIFIC DATA FROM THEM.
They are not valid as primary science data. Unless you're familiar with the processing done (and *all* images have some level of processing or else you'd be looking at a stream of numbers) you can't know what was done to the image. We know that contrast and brightness are automatically stretched on some web sites. Any color image has had to be hand processed at some level.
Go look at the original monochrome images in the PDS and it's obvious that the region in question (and another in the top middle of the cropped image) has a large bright out-of-focus saturated area that detracts from the pretty media picture the PR people wanted to show. The bright area was completely saturated, and so contained no discernible information. So it was processed out as a part of the extensive processing for this particular image.
Here's a section of the PDS image where you can see each pixel (except where the image saturates and then all the pixels are pegged at pure white).
Source: Zoomed and cropped image from PDS, Phoenix RAC file ID rs026eff898515642_13200mdm1
The take home message? Don't blow up a jpg of a highly processed multiple-image image 500x and then try to draw conclusions about the conspiratorial processing done on it, especially when anyone has access to minimally processed source data (or at least far closer to source data).
Timewarp wrote: whereas the rest of the section is grossly pixellated as though, for some unknown reason, it has undergone some form of severe digital modification.
The pixellation you show is by no means unknown. It's a jpg. You're looking at jpg artifacts. I can't believe that by this time you're still fooled by jpg artifacts. How many times do I have to say DON'T ZOOM IN ON A JPG AND EXPECT TO RETRIEVE ACCURATE INFORMATION.
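The block-artifact effect is easy to demonstrate: round-trip even a perfectly smooth synthetic image through heavy JPEG compression and the decoded pixels no longer match the source. A sketch using Pillow and NumPy (the quality setting is arbitrary):

```python
import io

import numpy as np
from PIL import Image

# A smooth horizontal gradient: no blocky structure in the source at all.
ramp = np.tile(np.arange(256, dtype=np.uint8), (256, 1))
original = Image.fromarray(ramp, mode='L')

# Round-trip through aggressive JPEG compression.
buf = io.BytesIO()
original.save(buf, format='JPEG', quality=5)
decoded = np.asarray(Image.open(io.BytesIO(buf.getvalue())), dtype=int)

# The decoded image now deviates from the source, and the errors cluster
# on the 8x8 DCT block grid: the "pixellation" seen when zooming in.
err = np.abs(decoded - ramp.astype(int))
```

Zooming such a decoded image 500x shows square blocks that were never in the scene, which is exactly the caution being made here.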
Timewarp wrote: NASA has said that the images taken by the RAC are true RGB quality. The images shown above are not displaying the original RGB colours and the original image has been modified to show the content with an overall reddish hue.
This statement is not correct. The images taken through different color filters *can* be converted to true RGB, but only if you have knowledge of the calibration levels of the illumination diodes. If you make the claim that the image is not true color, then you must have knowledge of the original brightness levels of the separate images, the responses of the filters, the illumination levels of the LEDs at a minimum. Since you appear to have never even looked at the separate images, I doubt you have the rest of this information. You claim that the image is too reddish. I claim that you do not have the necessary information to make that determination.
Timewarp wrote: Many of the released images have been treated in this manner and I am sure that many of the RAW originals had the correct RGB colours.
Your statement continues to give the impression that you do not understand the process by which a color image is generated. There are no "raw images with the correct RGB color". There are multiple images that must be processed together knowing image intensity, filter transmission, and illumination calibration.
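As a toy sketch of that multi-image process (all numbers here are invented for illustration; real calibration factors would come from instrument calibration data, not from this example):

```python
import numpy as np

rng = np.random.default_rng(1)

# Three hypothetical monochrome exposures of the same scene, taken under
# red, green and blue LED illumination respectively.
red_exp = rng.uniform(0.0, 0.8, (4, 4))
green_exp = rng.uniform(0.0, 0.8, (4, 4))
blue_exp = rng.uniform(0.0, 0.8, (4, 4))

# Assumed per-channel calibration factors (relative LED brightness times
# filter transmission). These are placeholders, not Phoenix RAC values.
calib = {'r': 0.9, 'g': 1.0, 'b': 0.7}

# Divide each exposure by its channel's factor so that equal surface
# reflectance yields equal channel values, then stack into one RGB image.
rgb = np.dstack([red_exp / calib['r'],
                 green_exp / calib['g'],
                 blue_exp / calib['b']])
rgb = np.clip(rgb, 0.0, 1.0)
```

Without the calibration factors, any choice of channel weights is a guess, which is the objection being raised here.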
Timewarp wrote: Shown below is the original image, which I have colour corrected.
No, you have changed the color to be that which you prefer. Your process of color correction has no merit.
If you disagree, please detail (and I mean detail with the actual reproducible procedure, not just vague generalizations) the process used and the provenance of your calibration data.
In the magnified and cropped image shown above, the upper left part of the image displays very fine resolution whereas the rest of the image appears 'rough' and severely pixellated.
Why should these two different standards of resolution be seen in the same image?
I believe the query relating to quality refers to the far left section of an image captured by the Robotic Arm Camera (RAC) onboard the Phoenix lander - image #lg_7851.
The upper left quadrant of the section is of finer quality, just like an analogue photograph, whereas the rest of the section is grossly pixellated as though, for some unknown reason, it has undergone some form of severe digital modification.
Shown below is the released original image followed by the sectional crop.
Original image #lg_7851
Close-up sectional crop from image #lg_7851 showing the differences in quality.
NASA has said that the images taken by the RAC are true RGB quality. The images shown above are not displaying the original RGB colours and the original image has been modified to show the content with an overall reddish hue.
Many of the released images have been treated in this manner and I am sure that many of the RAW originals had the correct RGB colours.
Shown below is the original image, which I have colour corrected.
Image #lg_7851 - Colour corrected version
Image credit: NASA/JPL
O'Brien wrote: << I have no interest in secondhand questions. If you have specific image quality questions, go ahead and ask them yourself. >>
This is unacceptable behaviour.
A forum is a place where we can all interact. If you want to hold a private conversation then you need to do it by private means. If you are not prepared for a discussion on the subject, why did you start the thread? The choice is yours but do not expect to continue posting in this way with this high-handed attitude.
Timewarp wrote: With reference to the orbiter images, some of the images are very reasonable considering the age of the equipment used. There are image strips displaying a resolution of <1.5m per pixel, but others, displaying a resolution of >2.5m per pixel, are practically useless for close visual research of the surface features.
So, the higher resolution images are used for examining surface features, and the lower resolution images are used for context, wider area geology, etc. Yes, that's how many missions are conducted. The MOC is capable of 1.5m resolution, but does not take all images at that resolution. It only takes selected targets at that resolution. All depends on the mission and science requirements. I'm glad to hear you say "some of the images are very reasonable considering the age of the equipment used."
Timewarp wrote:
In the case of the images captured by the SSI onboard Phoenix, a variety of images with different resolutions were released. They are 512, 1024 and 2048 pixels square. The 2048 pixels versions are ideal for visual exploration but the 512 pixels versions produce results so mediocre that it's just not worth bothering with them unless one is prepared to view an image so small that none of the surface features are recognizable.
Again, selected images at high resolution, depending on mission requirements. Not a measure of the camera system's limitations, but a question of determining what's important to look at with high resolution and what isn't. Again, I'm glad to hear you say "The 2048 pixels versions are ideal for visual exploration ..."
Timewarp wrote:
As this thread is dealing with image quality, do you have an answer to the query that qmantoo is referring to?
I have no interest in secondhand questions. If you have specific image quality questions, go ahead and ask them yourself.
With reference to the orbiter images, some of the images are very reasonable considering the age of the equipment used. There are image strips displaying a resolution of <1.5m per pixel, but others, displaying a resolution of >2.5m per pixel, are practically useless for close visual research of the surface features.
In the case of the images captured by the SSI onboard Phoenix, a variety of images with different resolutions were released. They are 512, 1024 and 2048 pixels square. The 2048 pixels versions are ideal for visual exploration but the 512 pixels versions produce results so mediocre that it's just not worth bothering with them unless one is prepared to view an image so small that none of the surface features are recognizable.
O'Brien,
As this thread is dealing with image quality, do you have an answer to the query that qmantoo is referring to?
Timewarp wrote: I was referring to the quality of the narrow angle images captured by the Mars Orbital Camera and some of the images of the terrain captured by the SSI during the Phoenix mission.
I can see why you're disappointed in the first case.
You're comparing the images from MOC, a camera launched in 1996 (so conservatively the technology is 15-17 years old), designed to have selectable GSR of 1.5m-12m, to images from Google Earth. Images from Google Earth come from a variety of sources; if you're looking at urban areas the images are almost certainly from aerial photography (airplanes, not satellites), while more rural and inaccessible areas are covered by satellite. DigitalGlobe and GeoEye satellites have GSR of <0.5m. (In fact, if you're concerned about "tampering", images of the Earth from these satellites are required by law to be resampled to 0.5m resolution for public distribution.)
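To put those ground sample resolution (GSR) figures in concrete terms (the 4.5 m object length is an arbitrary illustrative value, roughly a car):

```python
# Number of pixels spanned by an object of a given size at a given GSR.
def pixels_across(object_m: float, gsr_m: float) -> float:
    return object_m / gsr_m

car_length_m = 4.5                               # illustrative object size
px_moc = pixels_across(car_length_m, 1.5)        # MOC best case: 3 pixels
px_earth_sat = pixels_across(car_length_m, 0.5)  # 0.5 m satellite: 9 pixels
```

A three-pixel blob versus a nine-pixel shape is roughly the difference between detecting that something is there and actually recognising it.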
So if you're comparing MOC to Google Earth you should not expect the same level of resolution. Is it too much to ask that a 15-year-old camera around Mars give the same kind of results as a plane flying over your house? Yes, it is.
When it comes to the SSI, I still don't see what you're talking about. Here's a picture of a nice country road from Google Earth (Street View) with some rocks by the side of the road ---
And here's as good as Google Earth can see the rocks right next to the camera ---
And here's an image from the Phoenix SSI of some rocks right by the lander.
Ref: SS056EFF901136427_16350RAM1
Google Earth has better imaging than the Phoenix SSI? You must be joking.
If you have particular pictures of terrain that you'd be interested in comparing, post them and then let's discuss those images.
O'Brien wrote: << When you said that you expected the images to be as good as from Google Earth, were you referring to orbital images of Mars being as good as overhead views in Google Earth... >>
I was referring to the quality of the narrow angle images captured by the Mars Orbital Camera and some of the images of the terrain captured by the SSI during the Phoenix mission.
O'Brien wrote: << Timewarp: Let's please keep on topic and not divert the thread to theories about Oiléan Ruaidh. >>
O'Brien, what my research reveals is not theoretical; any revelations are based on the original source material.
O'Brien wrote: << Timewarp: Let's please keep on topic and not divert the thread to theories about Oiléan Ruaidh. >>
?? I must have missed some post somewhere. Is it in this thread?
O'Brien: How about addressing the issues about image quality in the photographs returned from the Mars mission, since I have given you an example which you could easily use to illustrate your points.
Then there is the issue about censorship and blacking-out of certain areas too.
Marsrocks: I grabbed the image from the several available at http://marsrovers.jpl.nasa.gov/gallery/press/opportunity/20101005a.html as an example. The image does not have a "brown overlay." The color levels can be manipulated without loss of "color contrast and detail" as much as you want. It's just a matter of selecting the appropriate weighting.
Timewarp: Let's please keep on topic and not divert the thread to theories about Oiléan Ruaidh. When you said that you expected the images to be as good as from Google Earth, were you referring to orbital images of Mars being as good as overhead views in Google Earth, or that MER and Phoenix images should be as good as the ground-level "Street View" images of houses, people, trees, cars, etc. taken from Google's fleet of camera vans?
The content of what can be seen in my avatar to the left was only found after I had rotated the original image and darkened it considerably. It is a crop from the original colour composite of a meteorite that Opportunity discovered in August 2009.
The image shown below is the same image as the L456 colour composite seen above. The only adjustments that have been made are to the brightness and contrast levels. No colour correction has been applied.
Take a look and see if you can see anything special on the surface.
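For what it's worth, a brightness/contrast-only adjustment of the kind described is just a linear map applied identically to every channel, so it does not recolour the image. A minimal sketch (the gain and offset values are arbitrary, not the ones actually used):

```python
import numpy as np

def adjust(img, contrast=1.5, brightness=-20.0):
    # Per-pixel linear map, the same for all channels; hue ratios are
    # preserved up to clipping at the 0..255 limits.
    out = img.astype(float) * contrast + brightness
    return np.clip(out, 0, 255).astype(np.uint8)

pixel = np.array([[[120, 80, 60]]], dtype=np.uint8)  # one RGB pixel
adjusted = adjust(pixel)  # -> [[[160, 100, 70]]]
```

Because all three channels get the same gain and offset, calling this "no colour correction" is fair: only tone, not hue, is changed (unless channels clip).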
O'Brien, what was your source for the image you posted? Is it NASA-processed or did you process it yourself?
It's got that brown overlay in it that obscures so much color contrast and detail.
I'm just wondering if any individuals are coming up with that brown overlay stuff on their own or if the brown processed images are unique to NASA releases.
Call it an "international homogeneity of silence" or an "international homogeneity of fakery". It is not a problem of technology; it was and always is a social problem. The differences between the leaderships of different systems are of a quantitative kind, not of a qualitative one. It is just the masquerade of the oppression which makes the difference between the American and the Chinese "way of life / way of death", to choose some of the prominent ways. There are a lot of other ways, but always the same sort of people are doing the same kind of "job", which turns the highly praised liberty of choosing into an illusion. The same type (profile) holds the (pseudo-)power positions all over the world. The chief of a small tribe in Swaziland may be more or less honest than the president of Real Madrid or the mufti of Jerusalem or the king of whatever, but their characters show some similarities.
50 years of space exploration:
CHINA (Hey China, proud nation, hopefully you're not that eager to repeat the mistakes of your friends and the mistakes of your enemies. Please show us some results of your triumphs. How about some images of the Moon?)
RUSSIA (Hey Russia, anything new in space?)
JAPAN (Hey Japan, inventor of so many groundbreaking technologies, surprise the world with some emancipation and lift the blanket for one moment.)
INDIA (Dreaming mother of wisdom. To close the circle from the past to the future is your chance. Do not miss it.)
US (Nothing more to say for the moment.)
EUROPEAN UNION (ESA) (This goes to the young ones in Europe: ignore the petrified ones. They have no life, just blind function. Those who show courage are those who never will be forgotten, not only in the history of planet Earth, not only in the history of the solar system, but in the history of this galaxy. Start by presenting some truth to mankind. What better could be done?)
O'Brien wrote: << For instance, I don't see any Google Earth images that can produce an image even remotely as detailed as ... >>
As you are probably aware, you are comparing apples with pears. The picture you post is taken from the Rover (I assume) and the normal Google Earth picture is taken by satellite.
Yes, there are Street View pictures taken by many different cameras, but one taken from a satellite in which we could see a Martian sunbathing, as we can in GE, would probably be OK (as long as the Martian sunbather was of a similar size).
Like a tricky politician, you are picking and choosing the points you answer. How about addressing the issues about image quality in the photographs returned from the Mars mission since I have given you an example which you could easily use to illustrate your points.
Then there is the issue about censorship and blacking-out of certain areas too.
Timewarp wrote:As far as specifics is concerned, I would have expected the images to be as good as can be seen in Google Earth. Is this too much to ask?
Your specifics need to be more specific.
Are you saying that the MER and Phoenix images should be as good as images in Street View? Which Street View? There are a myriad of different resolutions in Street View taken with thousands of different cameras all over the world.
Or are you saying that the orbital images should be as good as the overhead images in Google Earth?
In either case, you should provide a specific pair of images that you want to compare, and then look at the hardware that produced each image. For instance, I don't see any Google Earth images that can produce an image even remotely as detailed as
It's all very well to insist that taking an Earth-bound technology and sending it to another planet should be a simple and trivial matter. (deleted sentence by qmantoo)
Is it too much to ask that a probe several hundred million miles away perform the same as a piece of hardware that has completely different requirements and can be sent to the repair shop if something goes wrong? Yes, it is too much to ask.
-- Edited by qmantoo on Monday 11th of October 2010 07:20:56 AM
I would like to make a comment on this one too.
O'Brien wrote: << You clearly have some expectations of a particular level of respectability and superiority that the images should have. >>
Of course. I think everyone would expect to be able to see the structures and features in all space mission photographs. However, as you and we are well aware, there are certain aspects of the terrain which are censored and not available to us in the images as released to the public, whether they are from the PDS or from the web portal. Our expectations, which are perfectly reasonable, are to have images that do not have parts censored. You would expect that from a dictatorship or police state, but not from a nation with freedom and liberty in its constitution.
O'Brien wrote: << What do you expect the images to look like, and how does that fulfill the mission requirements? (And which images are you looking at? If you're complaining about compression you may be ascribing faults to the particular web image version you've downloaded rather than the intrinsic limitations of the hardware.) >>
You yourself have explained the camera and lens specs and you understand the image quality issue. Why ask us, when you know yourself that the images released to the public are not of the best quality available?
We have shown our expectations in this thread, and it is obvious from this what is possible and what is 'fed' to us. Rather than challenging these kinds of statements, why not explain why the quality is NOT as it should be, according to what you know and according to the link I have just given. This is an excellent example of what is possible.
Please do not get sidetracked into addressing the subject of the image, but look at the quality since that is the point of this thread. There are pixellated areas and there are areas of the example image where there are no pixels - even at 500x. So, please explain that.
Timewarp wrote in another thread: I am also conversant with the majority of technical details that relate to the Phoenix lander and the MERs.
From the photographic point of view, what I consider is rather a let down is the quality of the imaging returned to Earth. I understand why a degree of compression of the captured image data is necessary before it is multiplexed with all the other instrumental scientific data and that the window of opportunity is limited for uploading the data to the orbiting satellite prior to transmission back to Earth.
But honestly, would you not agree with me that in this day and age, with all the advanced and sophisticated imaging equipment and resources that are available, the quality of the imaging returned from Mars, and for that matter the Moon, should be far more respectable and superior to what we have currently had to cope with?
You clearly have some expectations of a particular level of respectability and superiority that the images should have.
Can you be specific?
What do you expect the images to look like, and how does that fulfill the mission requirements? (And which images are you looking at? If you're complaining about compression you may be ascribing faults to the particular web image version you've downloaded rather than the intrinsic limitations of the hardware.)