Pentax Camera Forums Home
03-06-2010, 10:30 AM   #16
Site Supporter
Lowell Goudge's Avatar

Join Date: Jan 2007
Location: Toronto
Photos: Gallery | Albums
Posts: 17,892
Originally posted by ProfHankD:
Ideally, the exposures would be calibrated raw images and the histogram spike would be as close to the maximum as possible without clipping a channel. However, I'm not expecting perfection, just characterization good enough to drive a recognition and/or synthesis algorithm. The same goes for the contrast issues. Frankly, as soon as one uses JPEGs, it's all very approximate anyway....
I guess that is a matter of opinion, but my experience in plotting exposure accuracy and range on a DSLR suggests you don't want to be too close to the top end; even within the top 2 EV the range is compressed and not linear. But it is your choice. I would have expected that if you are modeling the performance you would want accuracy; otherwise it is a waste of effort.
Quote:


Actually, I LOVE this little comparison because it is consistent with a guess I had made as to the cause. I knew it wasn't a light source defect because, when the camera is rotated about the optical axis, the pattern stays oriented with the sensor. In short, I believe the horizontal bias isn't a flaw in the light source, but a sensor reflection / mirror chamber masking artifact from my Sony A350 body -- different body, significantly different bokeh! The masking is probably the dominant effect, because Sony is quite aggressive about blocking stray light (i.e., from a full-frame lens). The 50mm f/1.4 Takumars seem to be exceptionally susceptible to this; for example, I don't see this artifact with my Opteka 85mm f/1.4.
Then by this admission you are testing a system, not just the lens, and the models may be skewed by the camera you use.
Quote:


I have been initially concentrating on wide open PSFs, however, the variations don't shock me. The little surprise for me is how evident diffraction effects are in some of the images... you don't see that much wide open.
No, but that is widely discussed in the forums.

It would be interesting to see if the amount of diffraction can be traced to the physical size of the aperture, i.e. whether shorter lenses at any given f-stop are worse than longer lenses.
Quote:

As for exactly how I'm evaluating bokeh, it's pretty complicated, but here's how I got started.

I've been using CHDK to put various functions in cameras for research purposes for some time, and last year I supervised an undergrad senior project team at the University of Kentucky implementing depth map capture within an unmodified Canon PowerShot A620. The method used was simple depth-from-focus, capturing multiple images with different focus distances, determining which image was sharpest for each pixel location, and then creating a depth map image. It worked quite well -- where there was an edge or texture. Where there wasn't, it often hallucinated sharp edges at disturbingly incorrect distances. That's when I realized that the Gaussian blur model of defocus underlying all these algorithms was fundamentally and seriously wrong. So, last Summer I set out to develop a model of what lenses really do.
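The depth-from-focus loop described above can be sketched in a few lines. This is only a hedged illustration, not the student project's actual code; the sharpness metric (windowed squared Laplacian) and the window size are assumptions:

```python
import numpy as np
from scipy import ndimage

def depth_from_focus(stack, focus_dists):
    """Assign each pixel the focus distance of the frame where it is sharpest.

    stack: list of 2-D grayscale arrays, one per focus setting.
    focus_dists: focus distance used for each frame, same order as stack.
    """
    sharpness = []
    for img in stack:
        # Local sharpness: squared Laplacian, averaged over a small window.
        lap = ndimage.laplace(np.asarray(img, dtype=float))
        sharpness.append(ndimage.uniform_filter(lap * lap, size=9))
    sharpness = np.stack(sharpness)        # shape: (n_frames, H, W)
    best = np.argmax(sharpness, axis=0)    # index of the sharpest frame per pixel
    return np.asarray(focus_dists)[best]   # depth map, in focus-distance units
```

In textureless regions every frame scores near zero, so the argmax is essentially arbitrary there -- which is exactly the hallucinated-edges failure mode described above.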

My initial models were mathematically perfect, noiseless, synthetic creations that I could use to test my matching algorithms. However, that doesn't tell me what real lenses do, nor what variations I can expect due to noise, JPEG processing, etc. That's what I've been testing lenses to determine... so far, I've measured PSFs for more than 3 dozen lenses on my Sony A350.... I've also tested quite a few compact digital cameras (most have small, but distinctive, PSFs).

My current best matching method involves a rather expensive Genetic Algorithm (GA), but it works quite well on perfect test images given enough time. The question is how well it can work with real lens PSFs, noise, etc. Using this approach, high quality depth-from-defocus should be possible from a single image... and then there are a bunch of other cool things one can do.

Here's a rather crude example of the kinds of other things one can do -- removal of obnoxious bokeh, a sample image pair from my "Technology Enabling Art" challenge series at dpreview:
http://c.img-dpreview.com/0205120-01.jpg
One other interesting aside is that this gives a method for testing lens imperfections; as you and Matt have pointed out, element separation, fungus and dust are very visible.


Last edited by Lowell Goudge; 03-06-2010 at 10:43 AM.
03-06-2010, 01:00 PM   #17
Veteran Member
MattGunn's Avatar

Join Date: Jun 2009
Location: Wales
Photos: Gallery
Posts: 347
Originally posted by Lowell Goudge:
I guess that is a matter of opinion, but my experience in plotting exposure accuracy and range on a DSLR suggests you don't want to be too close to the top end; even within the top 2 EV the range is compressed and not linear. But it is your choice. I would have expected that if you are modeling the performance you would want accuracy; otherwise it is a waste of effort.
CCDs are very linear; this is one of their greatest strengths. The newer Pentax and Samsung DSLRs use CMOS sensors (all of the 14 Mpixel cameras do), which are less linear, but the non-linearities are calibrated out in processing. The non-linearity you have measured is added deliberately by the gamma correction (have a look here: Gamma correction - Wikipedia, the free encyclopedia). If the JPEG conversion settings are known, then this can be removed by further processing. By exposing with the peak in the centre of the histogram you see a lot less non-linearity, because you are using a small part of the available dynamic range and so the curve appears nearly linear.

Originally posted by Lowell Goudge:
It would be interesting to see if the amount of diffraction can be traced to the physical size of the aperture, i.e. whether shorter lenses at any given f-stop are worse than longer lenses.
The diffraction effects are independent of the focal length; they are only affected by the f-number. The diffraction-limited resolution of a lens is given by:

d = 1.22 x (1 + 1/M) x F x L

where:
d is the smallest separation of point sources in the object which can be resolved
M is the magnification of the optical system
F is the F number
L is the wavelength of the light
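For concreteness, the object-space formula above can be wrapped as a quick calculator. The unit choices (wavelength in nanometres in, separation in micrometres out) are an assumption for convenience:

```python
def diffraction_limit_um(f_number, magnification, wavelength_nm=550.0):
    """Object-space diffraction-limited resolution, d = 1.22 (1 + 1/M) F L.

    Returns the smallest resolvable point separation on the object, in
    micrometres, for a wavelength given in nanometres.
    """
    wavelength_um = wavelength_nm / 1000.0
    return 1.22 * (1.0 + 1.0 / magnification) * f_number * wavelength_um

# At f/8 and 1:10 magnification in green light (550 nm):
# 1.22 * 11 * 8 * 0.55 ≈ 59 µm on the object.
```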

Originally posted by Lowell Goudge:
One other interesting aside is that this gives a method for testing lens imperfections; as you and Matt have pointed out, element separation, fungus and dust are very visible.
This does appear to be a really good way of testing how clean a lens is; I may well end up using it to test lenses I have cleaned / repaired.

Hank:
I have been thinking about the uniformity / shape / spread of the light source. Presumably if I were to take a focused image of the light source, it would be possible to do a Fourier deconvolution on all of the bokeh images to simulate the effects of an almost ideal point source? Alternatively I could focus the light into a 0.5mm plastic fibre, so the light source would only subtend an angular spread of 0.1 mrad with a top-hat distribution.
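That Fourier deconvolution could be sketched as follows. This is a hedged illustration, not a worked-out pipeline; the Wiener-style regularisation constant k is an assumption, because plain inverse filtering divides by near-zero source frequencies and amplifies noise without bound:

```python
import numpy as np

def deconvolve_by_source(bokeh_img, source_img, k=1e-3):
    """Remove the finite extent of the light source from a bokeh image.

    Divides the spectra in the Fourier domain; the small constant k is a
    Wiener-style regulariser that keeps the division stable where the
    source spectrum is near zero.
    """
    B = np.fft.fft2(bokeh_img)
    S = np.fft.fft2(source_img, s=bokeh_img.shape)
    H = np.conj(S) / (np.abs(S) ** 2 + k)  # regularised inverse of the source
    return np.real(np.fft.ifft2(B * H))
```

With a truly point-like source image this degenerates to (almost) the identity, so the closer the source is to a point, the less this step matters.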
03-06-2010, 04:23 PM   #18
Junior Member




Join Date: Aug 2009
Location: Lexington, KY
Photos: Gallery | Albums
Posts: 30
Original Poster
Originally posted by MattGunn:
If the JPEG conversion settings are known then this can be removed by further processing.
Actually I tried that a few years ago. Basically the issue was that a colleague of mine was doing 3D capture using structured light, and he hadn't realized just how much badness the gamma and DCT transformations were doing, but he was using cameras that only produced JPEGs. I basically took Dave Coffin's sample DNG encoding program and tweaked it using evolutionary computing to re-derive the single-sample-per-pixel Bayer-pattern raw data. (Yeah, I use a lot of GAs.) It worked better than I expected, especially in cases of partial saturation, but wasn't really great.

Originally posted by MattGunn:
The diffraction effects are independant of the focal length, they are only affected by the F no. ...
The interesting thing I saw was how the diffraction pattern was affected by the shape of the aperture. Perfectly understandable, but somewhat unexpected -- in the same sense that everybody intuitively expects a Gaussian blur for out of focus stuff despite the physics saying that doesn't happen....

Originally posted by MattGunn:
This does appear to be a really good way of testing how clean a lens is; I may well end up using it to test lenses I have cleaned / repaired.
Tip of the iceberg. There's a rather long list of cool things I can do once I have a good enough model, but I can't talk about everything yet because the University of Kentucky has expressed an interest in patenting some of them. I generally put most of my work in the Public Domain as open source, and much of this will be too, but some things are more likely to get used if they are patented....

Originally posted by MattGunn:
Hank:
I have been thinking about the uniformity / shape / spread of the light source. Presumably if I were to take a focused image of the light source, it would be possible to do a Fourier deconvolution on all of the bokeh images to simulate the effects of an almost ideal point source? Alternatively I could focus the light into a 0.5mm plastic fibre, so the light source would only subtend an angular spread of 0.1 mrad with a top-hat distribution.
I don't think this is a big issue unless testing distant focus with a nearby light source, in which case a fibre optic source sounds like a really good idea. I've also had some concerns about the spectrum of the light source. However, my biggest problem has been distinguishing sensor dust from lens artifacts, because both maintain their orientation relative to the sensor when I do my rotation test... and testing dozens of lenses quickly reveals how mediocre current dust removal systems are.
03-07-2010, 02:47 AM   #19
Veteran Member
MattGunn's Avatar

Join Date: Jun 2009
Location: Wales
Photos: Gallery
Posts: 347
Originally posted by ProfHankD:
The interesting thing I saw was how the diffraction pattern was affected by the shape of the aperture. Perfectly understandable, but somewhat unexpected -- in the same sense that everybody intuitively expects a Gaussian blur for out of focus stuff despite the physics saying that doesn't happen....
Strictly, the multiplier of 1.22 in the equation for diffraction-limited resolution is only valid for circular apertures. However, the derivation of this is not straightforward and I have no intention of trying to work through it for a pentagonal / hexagonal etc. aperture.

Originally posted by ProfHankD:
I don't think this is a big issue unless testing distant focus with a nearby light source, in which case a fibre optic source sounds like a really good idea. I've also had some concerns about the spectrum of the light source. However, my biggest problem has been distinguishing sensor dust from lens artifacts, because both maintain their orientation relative to the sensor when I do my rotation test... and testing dozens of lenses quickly reveals how mediocre current dust removal systems are.
I don't know about the Alpha series, but the Pentax / Samsung 20-series cameras have a very effective dust detection system to tell you when your sensor needs cleaning. I think mine is OK at the moment.
I may try a fibre; I think I have some scraps of 105 µm fibre somewhere, which should be a reasonable point source.
If it will help I can measure the spectrum of my LED torch at work. However, I don't have a standard lamp, so the intensity will not be accurately calibrated to take the sensitivity of the detector etc. into account.

03-07-2010, 06:26 AM   #20
Site Supporter
Lowell Goudge's Avatar

Join Date: Jan 2007
Location: Toronto
Photos: Gallery | Albums
Posts: 17,892
Originally posted by MattGunn:
CCDs are very linear; this is one of their greatest strengths. The newer Pentax and Samsung DSLRs use CMOS sensors (all of the 14 Mpixel cameras do), which are less linear, but the non-linearities are calibrated out in processing. The non-linearity you have measured is added deliberately by the gamma correction (have a look here: Gamma correction - Wikipedia, the free encyclopedia). If the JPEG conversion settings are known, then this can be removed by further processing. By exposing with the peak in the centre of the histogram you see a lot less non-linearity, because you are using a small part of the available dynamic range and so the curve appears nearly linear.
I may have misinterpreted his planned evaluation, specifically the light fall-off in the out-of-focus (OOF) circle as a function of location. If this is all he is doing, then regardless of raw vs. JPEG he should be shooting in the middle of the range, not at the extreme. I doubt the fall-off is more than 3 stops, so why not use the most linear point as opposed to close to maximum? Most of the photos shown have less than 1 stop of fall-off in the OOF circle.
Quote:

The diffraction effects are independent of the focal length; they are only affected by the f-number. The diffraction-limited resolution of a lens is given by:

d = 1.22 x (1 + 1/M) x F x L

where:
d is the smallest separation of point sources in the object which can be resolved
M is the magnification of the optical system
F is the F number
L is the wavelength of the light
But magnification of the optical system is based on focal length. The issue is that the impact of diffraction is a function of the ratio of the area showing diffraction effects to the circle area. Shorter focal length lenses have smaller aperture diameters at any F number, because F number is focal length over diameter. I would therefore expect a higher percentage influence of diffraction for shorter focal lengths at any F number.
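The geometric part of this argument comes straight from the definition of the f-number: the physical pupil diameter is D = f / N, so at a fixed N a shorter lens has a smaller pupil. A trivial sketch:

```python
def pupil_diameter_mm(focal_length_mm, f_number):
    """Physical aperture (entrance pupil) diameter: D = f / N."""
    return focal_length_mm / f_number

# At f/8: a 50 mm lens has a 6.25 mm pupil, a 200 mm lens a 25 mm pupil,
# so any diffraction-affected rim of fixed width is a larger fraction of
# the shorter lens's aperture area.
```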
Quote:

This does appear to be a really good way of testing how clean a lens is; I may well end up using it to test lenses I have cleaned / repaired.
It may become something to ask for when buying a lens.
Quote:
Hank:
I have been thinking about the uniformity / shape / spread of the light source. Presumably if I were to take a focused image of the light source, it would be possible to do a Fourier deconvolution on all of the bokeh images to simulate the effects of an almost ideal point source? Alternatively I could focus the light into a 0.5mm plastic fibre, so the light source would only subtend an angular spread of 0.1 mrad with a top-hat distribution.
This adds another variable, specifically the spectrum of the source.
03-07-2010, 09:15 AM   #21
Senior Member
summonbaka's Avatar

Join Date: Dec 2009
Location: Kagoshima, Japan
Photos: Gallery | Albums
Posts: 237
I have a Limited 77 and a Takumar 200mm preset that beg to be used. Going to give it a shot this week.
03-07-2010, 09:38 AM   #22
Veteran Member




Join Date: Aug 2007
Location: Dayton, OH
Posts: 365
I'm very interested in helping out. I'll try to set up some shots later on. I just have a few questions, though:

Do I understand correctly that you're only really looking for wide-open shots? Would you gain anything by seeing stopped-down aperture blades as well?

Are you only interested in prime lenses, or would you like images from zooms? If so, would it make sense to submit a sample from three focal lengths: min, mid, and max?

Is your preferred size for the JPEGs just whatever a 100% crop works out to be? And what's the best way to go from RAW to JPEG for you to ensure minimal corruption of the data? I'm imagining a linear tone curve, zeroed-out Adobe Camera Raw settings, and exposure adjusted to the right to just below highlight clipping. Sound good?

03-07-2010, 05:11 PM   #23
Veteran Member
falconeye's Avatar

Join Date: Jan 2008
Location: Munich, Alps, Germany
Photos: Gallery
Posts: 6,871
To provide an out-of-focus PSF is hard work, typically not even done in regular lens tests.

I know of some tests which have published the radial power density distribution of the out-of-focus PSF (a chart which theoretically is a step function). And not everybody would intuitively expect a Gaussian blur.

It could be motivating if you published the radial power density distribution chart for the three RGB channels in exchange for the PSFs you receive in this thread.

As far as photography is concerned, these are the most relevant parameters in classifying an out-of-focus PSF:
1. relative difference of the radii of the 50% PSF value in the RGB channels.
2. moments of the power distribution relative to the step function.

1. is responsible for out-of-focus fringing (bokeh fringing, or bokeh color aberration, though aberration is not really an applicable term in this situation).
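A minimal sketch of how the 50% radius in parameter 1 could be measured from a PSF crop (assumptions of this illustration: a single-channel array, radii measured from the energy centroid; run once per RGB channel to compare their radii):

```python
import numpy as np

def half_energy_radius(psf):
    """Radius (in pixels) enclosing 50% of the PSF energy, from the centroid."""
    psf = np.asarray(psf, dtype=float)
    ys, xs = np.indices(psf.shape)
    total = psf.sum()
    cy = (ys * psf).sum() / total          # energy centroid
    cx = (xs * psf).sum() / total
    r = np.hypot(ys - cy, xs - cx).ravel()
    order = np.argsort(r)                  # pixels sorted by distance from centroid
    cum = np.cumsum(psf.ravel()[order]) / total
    return r[order][np.searchsorted(cum, 0.5)]
```

For an ideal uniform disk of radius R, half the energy lies within R/sqrt(2), so the step-function case gives a handy sanity check.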



With respect to restoring a linear response curve from JPG or RAW data:

sRGB doesn't follow a gamma curve. The exact formula is more complex and gamma=2.2 is a bad approximation for low luminosities.

Panorama software like PTGui will compute the exact response curve from the overlap of a few images and additionally correct for vignetting and distortion. You may want to use this feature for a calibration of your own measurements.
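The exact sRGB transfer function mentioned above is piecewise: a linear segment near black and an offset power law above it (per IEC 61966-2-1). A small sketch showing why gamma = 2.2 is a bad approximation at low luminosities:

```python
def srgb_to_linear(v):
    """Exact sRGB decoding (IEC 61966-2-1) for a normalised value in [0, 1]:
    a linear segment near black, an offset power law above it."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

# Near black the pure-gamma model diverges badly:
# srgb_to_linear(0.02) ≈ 0.00155, but 0.02 ** 2.2 ≈ 0.00018 — off by ~8x.
```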


My contribution is a crop from a club scene shot with a blue power LED in the background. The attachment shows a 100% crop from a 14.6 MP camera at ISO 1600. The lens is a Zeiss 50mm/1.4 shot at f/1.4. The position of the blue disk is about halfway between the center and the left horizontal border. Definitely not an intuitively Gaussian blur. (I know you can't use this shot, but it may entertain other readers of this thread.)

Last edited by falconeye; 06-15-2011 at 05:29 AM.
03-07-2010, 05:30 PM   #24
Junior Member




Join Date: Aug 2009
Location: Lexington, KY
Photos: Gallery | Albums
Posts: 30
Original Poster
Originally posted by aerodave:
Do I understand correctly that you're only really looking for wide-open shots? Would you gain anything by seeing stopped-down aperture blades as well?

Are you only interested in prime lenses, or would you like images from zooms? If so, would it make sense to submit a sample from three focal lengths: min, mid, and max?

Is your preferred size for the JPEGs just whatever a 100% crop works out to be? And what's the best way to go from RAW to JPEG for you to ensure minimal corruption of the data? I'm imagining a linear tone curve, zeroed-out Adobe Camera Raw settings, and exposure adjusted to the right to just below highlight clipping. Sound good?
For now, I'm just trying to get a sampling of the diversity of PSFs. I'm most concerned with wide open, but more data never hurts. I guess I should also mention that many auto-aperture lenses are not particularly repeatable about the precise aperture shape and size when studied at this level of detail (friction can cause a blade to miss its mark slightly). Primes are more interesting to me, but min+max on zooms would be fine, and you will see many more PSF defects on zooms. I've noticed slight decentering is particularly common in autofocus kit zooms. Aspherical elements tend to make the PSF touchier too.

Crop to the PSF is fine, but try not to scale. I didn't expect I could ask people to go to the trouble of calibrated raws (and you can't easily crop a raw), but 16-bit linear PPM, as generated by Dave Coffin's dcraw, is way more accurate than unclipped JPEG. Anyway, I'll happily take whatever I can get.

BTW, if you're looking at these to find defects, autolevels or even retinex will make things a whole lot more visually obvious. Just please don't autolevel, apply shadow enhancement, etc. to what you're sending to me.
03-07-2010, 05:37 PM   #25
Junior Member




Join Date: Aug 2009
Location: Lexington, KY
Photos: Gallery | Albums
Posts: 30
Original Poster
For what it's worth, the blue LED shot shows classical geometric vignetting... if you see that, the image isn't in the center.
03-08-2010, 04:23 AM   #26
Veteran Member
falconeye's Avatar

Join Date: Jan 2008
Location: Munich, Alps, Germany
Photos: Gallery
Posts: 6,871
Originally posted by ProfHankD:
For what it's worth, the blue LED shot shows classical geometric vignetting... if you see that, the image isn't in the center.
Thanks, good point.

Note that this is at about 12 mm from the center for a lens supposed to cover a 21.6 mm image radius. Short distance focussing at wide aperture must intensify vignetting a lot ...
03-08-2010, 08:29 AM   #27
Inactive Account




Join Date: Dec 2008
Location: Ames, Iowa, USA
Photos: Albums
Posts: 2,965
Prof Hank,

How does the PSF obtained by this method relate to an almost in-focus PSF, which I suppose should approach the Airy Function?

As the image of the aperture gets smaller and smaller, doesn't interference between the interference fringes at the aperture's edges become important?

Or are you saying that the PSF's found by this method are suitable for correcting focus on an almost-in-focus image?
03-08-2010, 08:49 AM   #28
Inactive Account




Join Date: Dec 2008
Location: Ames, Iowa, USA
Photos: Albums
Posts: 2,965
Originally posted by MattGunn:
....
The diffraction effects are independent of the focal length; they are only affected by the f-number. The diffraction-limited resolution of a lens is given by:

d = 1.22 x (1 + 1/M) x F x L

where:
d is the smallest separation of point sources in the object which can be resolved
M is the magnification of the optical system
F is the F number
L is the wavelength of the light

This does appear to be a really good way of testing how clean a lens is; I may well end up using it to test lenses I have cleaned / repaired.
That should be:

d=1.22x(1+M)xFxL

(1+M)xF is about the distance from the aperture to the image plane.

Quote:
Hank:
I have been thinking about the uniformity / shape / spread of the light source. Presumably if I were to take a focused image of the light source, it would be possible to do a Fourier deconvolution on all of the bokeh images to simulate the effects of an almost ideal point source? Alternatively I could focus the light into a 0.5mm plastic fibre, so the light source would only subtend an angular spread of 0.1 mrad with a top-hat distribution.
Is the output beam from an optical fiber almost a top-hat distribution?

Dave
03-08-2010, 12:40 PM   #29
Veteran Member
MattGunn's Avatar

Join Date: Jun 2009
Location: Wales
Photos: Gallery
Posts: 347
Originally posted by newarts:
That should be :

d=1.22x(1+M)xFxL

(1+M)xF is about the distance from the aperture to the image plane.
d = 1.22 x (1 + M) x F x L in image space, where d is the separation of resolvable points in the image;

d = 1.22 x (1 + 1/M) x F x L in object space, where d is the separation of resolvable points on the object.

However, now that I see what Lowell was getting at, neither of these is relevant. As the object distance is constant, the magnification varies, and so shorter focal length lenses will show more diffraction at the same aperture than longer focal lengths. Also, we are not really sampling the Airy pattern, are we?
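The two forms differ by exactly the magnification, which is easy to sanity-check numerically (a small sketch; symbols as defined above, units left to the caller):

```python
def d_image(F, M, L):
    """Resolvable separation in the image plane: d = 1.22 (1 + M) F L."""
    return 1.22 * (1.0 + M) * F * L

def d_object(F, M, L):
    """Resolvable separation on the object: d = 1.22 (1 + 1/M) F L."""
    return 1.22 * (1.0 + 1.0 / M) * F * L

# The two are related by d_image = M * d_object, consistent with the
# image being the object scaled by the magnification M.
```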

Originally posted by newarts:
Is the output beam from an optical fiber almost a top-hat distribution?

Dave
My understanding is that in the near field the output of a multimode fibre is a good approximation to a top-hat function. In the far field the approximation is not so good, but it should still be better than an LED.
03-08-2010, 12:58 PM   #30
Junior Member




Join Date: Aug 2009
Location: Lexington, KY
Photos: Gallery | Albums
Posts: 30
Original Poster
Originally posted by newarts:
How does the PSF obtained by this method relate to an almost in-focus PSF, which I suppose should approach the Airy Function?

As the image of the aperture gets smaller and smaller, doesn't interference between the interference fringes at the aperture's edges become important?
First, let me state that I'm a computer engineer trying to find new algorithms for image processing, not an optical engineer. Any optics people out there are encouraged to correct the multitude of errors I make.

What I've been calling the PSF is closely related to the Airy function, and it is common to see that type of patterning in the PSFs of compact cameras. However, to a first-order approximation, the PSF just gets smaller as you near the focus distance. Once the PSF gets small enough, things get much more complex. For example, some colors may be before the focus point while others are after. By then you've also got all sorts of artifacting due to the antialias filter, microlens array, Bayer filter pattern, etc. At that point, you don't reliably get much more than an estimate of the spread diameter for x% of the energy, so the PSF's detailed structure is essentially moot.

Originally posted by newarts:
Or are you saying that the PSF's found by this method are suitable for correcting focus on an almost-in-focus image?
Simple diameter measurements, usually made as power in the frequency domain, are commonly used for refocus of nearly-in-focus scenes -- with varying degrees of success. The real value of the PSFs I'm researching is for stuff that's not very close to being in focus, so that recognizing PSF internal structure can help disambiguate the scene.