In JPEG, colours have half the linear resolution (in a 640x480 image the colour information is effectively 320x240), but you won't see it, because the human eye also has much lower resolution for colours - unless you're viewing it much closer or it's a big print.
Then some artifacts may appear just because of JPEG, where you can observe colour information stretching beyond the original "subject" - a star, for example, that is 1 pixel big will get a 2x2 pixel colour block.
So if some part of an image consists mostly of contrasty colour information, it might get messed up. See the samples: brightness (black/white) is intact but the colours get damaged, even with a 100% quality JPEG. (Note that a 100% JPEG doesn't even get actual lossy compression; it only gets reduced colour resolution and basic lossless compression.)
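Here's a toy sketch of what the chroma subsampling does (hypothetical helper names, not real JPEG encoder code): the colour plane is stored at half resolution by averaging 2x2 blocks, then scaled back up on decode, so a single strongly coloured pixel smears into a washed-out 2x2 block - exactly the star example above.

```python
def subsample(plane):
    """Average each 2x2 block -> half-resolution chroma plane."""
    h, w = len(plane), len(plane[0])
    return [[(plane[y][x] + plane[y][x + 1] +
              plane[y + 1][x] + plane[y + 1][x + 1]) // 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

def upsample(plane):
    """Nearest-neighbour upscale back to full resolution (what a simple decoder does)."""
    return [[plane[y // 2][x // 2]
             for x in range(2 * len(plane[0]))]
            for y in range(2 * len(plane))]

# A neutral chroma plane (128) with one strongly coloured pixel (255),
# e.g. a 1-pixel red star on a grey sky:
cb = [[128] * 4 for _ in range(4)]
cb[1][1] = 255

round_trip = upsample(subsample(cb))
for row in round_trip:
    print(row)
# The single coloured pixel is now a diluted 2x2 block (value 159),
# while the rest of the plane stays at 128.
```

Real JPEG (4:2:0) additionally runs the DCT and quantisation on each plane, but the subsampling alone is enough to produce the 2x2 colour blocks described above.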
Attached are the original bitmap file, a 100% quality JPEG, an upscaled BMP and an upscaled JPEG - can you tell the difference? I can, but I doubt I could without knowing, and in a real image where you don't see clear RGB lines. Sometimes I only see this in astrophotographs.
Edit: I forgot to mention that JPEG isn't simply 8-bit - 256 shades of grey. That's true for grey (lightness) but not for colour information. It processes images as YCbCr, where Y (lightness) usually keeps its 256 levels, but colour, depending on the compression level, may get anything below that - like 2, 16 or 64 levels. This might offset the captured colour value or cause visible steps/noise in colour-only gradients (Red->Blue, Green->Yellow etc., not Black->White or Dark->Light).
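A rough sketch of why colour-only gradients band: the RGB->YCbCr formulas below are the standard JPEG/BT.601 full-range ones, but the coarse "levels" quantiser is just an illustration I made up - a real JPEG uses per-frequency quantisation tables on DCT coefficients, not a flat level count. Still, it shows the effect: in a Red->Blue gradient the Y channel barely moves, so all the visible change lives in Cb/Cr, and squeezing those to a few levels produces visible steps.

```python
def rgb_to_ycbcr(r, g, b):
    """Standard JPEG (BT.601 full-range) RGB -> YCbCr conversion."""
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def quantise(value, levels):
    """Illustrative only: snap a 0-255 value to `levels` evenly spaced steps."""
    step = 255 / (levels - 1)
    return round(round(value / step) * step)

# A Red->Blue gradient: luma stays in a narrow band while chroma sweeps widely.
for i in range(0, 256, 32):
    r, g, b = 255 - i, 0, i
    y, cb, cr = rgb_to_ycbcr(r, g, b)
    print(f"RGB({r:3d},0,{b:3d})  Y={y:6.1f}  "
          f"Cb={cb:6.1f}->{quantise(cb, 4):3d}  "
          f"Cr={cr:6.1f}->{quantise(cr, 4):3d}")
```

With only 4 chroma levels, many distinct points of the gradient collapse onto the same Cb/Cr step, which is the stepping/banding you can see in Red->Blue or Green->Yellow ramps while a Black->White ramp (pure Y) stays smooth.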
Last edited by ytterbium; 08-01-2009 at 05:23 AM.