Originally posted by mysticcowboy:
There will be some degradation in quality. ...
The question was: "While higher quality settings do not produce visible differences for the first generation JPEG, are they perhaps beneficial for 2nd, 3rd generation versions of a JPEG?" Good question, and one you didn't address in your response.
Originally posted by mysticcowboy:
By using jpeg you have already degraded the image from what's possible.
JPEGs allow less latitude in post-processing than RAW or 16-bit TIFF files, but I'd be surprised if you could visually distinguish prints made from high-quality JPEGs from prints made from RAW files.
Why do you think it becomes next to impossible to distinguish the topmost quality settings for JPEGs? Is it because a certain amount of degradation can never be removed, no matter how high the quality setting, or rather because beyond a certain point there is simply nothing left to improve? I don't know about existing implementations, but in theory the principle behind JPEG compression allows a 100% reconstruction of the image as long as you store enough data. Progressive levels of compression are achieved by quantizing the DCT coefficients ever more coarsely, which amounts to increasingly throwing away the data for the highest remaining spatial frequencies in the image.
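To make the generational question concrete, here is a minimal sketch, assuming Python with the Pillow and NumPy libraries; the file name "original.png", the quality values, and the helper name generation_error are just placeholders for illustration. It round-trips an image through several JPEG "generations" at a chosen quality setting and reports how far each generation drifts from the original.

```python
import io

import numpy as np
from PIL import Image


def generation_error(src_path: str, quality: int, generations: int = 3) -> list[float]:
    """Mean absolute pixel error of each JPEG generation vs. the original image."""
    original = Image.open(src_path).convert("RGB")
    reference = np.asarray(original, dtype=np.float64)

    current = original
    errors = []
    for _ in range(generations):
        # Round-trip through an in-memory JPEG at the chosen quality setting.
        buffer = io.BytesIO()
        current.save(buffer, format="JPEG", quality=quality)
        buffer.seek(0)
        current = Image.open(buffer).convert("RGB")
        # Compare this generation against the uncompressed source.
        diff = np.abs(np.asarray(current, dtype=np.float64) - reference)
        errors.append(float(diff.mean()))
    return errors


if __name__ == "__main__":
    # "original.png" is a placeholder; substitute any lossless source image.
    for q in (75, 90, 95):
        print(q, generation_error("original.png", quality=q))
```

Whether higher quality settings actually help 2nd- and 3rd-generation copies should show up directly in those numbers: if the error stops growing after the first generation, the encoder has settled into a stable state; if it keeps climbing at lower quality settings but not at higher ones, the higher settings really are buying something for later generations.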