11-22-2010, 06:24 AM   #16
Senior Member




Join Date: May 2010
Posts: 113
Original Poster
QuoteOriginally posted by newarts Quote
You want to map image plane distances to object space distances and convert those to the angle between the objects with the camera as the angle's vertex right? I think maybe knowing the precise center of the sensor might not matter much for the following reasons.

I think a small shift of the center of the sensor perpendicular to the optic axis must have an extremely small effect on the relation between distances in the image and object planes. If this were not the case, there would be obvious distortion of an image from the center to the edge of the frame.
Maybe so, but I am trying to eliminate every possible source of error so that I need fewer images to acquire a good mapping.

I can measure real angles with about 5 arc-second precision, so those few pixels would add quite a lot of error; in fact, it would be the only significant source, since I can usually locate objects in the image to better than 3 pixels.

QuoteQuote:
The lenses we use are designed to be rectilinear - a spacing of Y in the object plane is mapped into a particular distance Z in the image plane (Y times the optical magnification) and this image distance is constant across the image plane, ie, shifting the sensor's center a bit one way or the other has no effect on the ratio M between Y and Z.

Z/Y = M

There is a well established relationship between distance from the image plane to object plane, D, focal length, F, and magnification, M.

D=F(1+M)^2/M

This can be solved for M, ie, Z and Y.

Isn't that enough? Can you give us an idea of the distances involved so some quantitative error estimates are practical?

Dave
Ideally they are rectilinear, but in fact any lens has some distortion, especially wide-angle ones. These distortions are usually circularly symmetric, that is, the mapping should be symmetrical around the lens's axis, so the function only needs to relate distance from the center to the angle between the axis and the ray from the "NPP", as they call it...

Also, you can't just take the focal length the manufacturer claims, since that figure is often rounded or otherwise inexact.

Also, who said I use only rectilinear lenses? I've just acquired a Samyang 8mm and so far I like it quite a lot: it does not compress objects near the edge like other fisheyes do, and does not stretch them as much as rectilinear wide angles do.

However, although the manufacturer claims it is "stereographic", it is really far from that, so the only way to use images from this lens is to map its function properly.
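For reference, the "ideal" projections being contrasted here have simple closed forms. A minimal sketch comparing them (standard textbook formulas; the 8mm focal length, the 45-degree field angle, and the function names are my own illustration, not from this thread):

```python
import math

def rectilinear(f, theta):
    """Ideal rectilinear mapping: image radius r = f * tan(theta)."""
    return f * math.tan(theta)

def stereographic(f, theta):
    """Ideal stereographic fisheye: r = 2f * tan(theta / 2)."""
    return 2 * f * math.tan(theta / 2)

def equidistant(f, theta):
    """Ideal equidistant fisheye: r = f * theta (theta in radians)."""
    return f * theta

# Compare the three models for a nominal 8 mm lens at 45 degrees off-axis.
# Real lenses deviate from all three, which is why a measured mapping is needed.
f = 8.0  # focal length in mm (nominal)
theta = math.radians(45)
for name, fn in [("rectilinear", rectilinear),
                 ("stereographic", stereographic),
                 ("equidistant", equidistant)]:
    print(f"{name:14s} r = {fn(f, theta):.3f} mm")
```

The gap between the curves grows quickly with angle, which is why a fisheye that is "almost" stereographic still needs its own measured mapping.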

11-22-2010, 07:16 AM   #17
Inactive Account




Join Date: Dec 2008
Location: Ames, Iowa, USA
Photos: Albums
Posts: 2,965
QuoteOriginally posted by olenl Quote
Maybe so, but I am trying to eliminate every possible source of error so that I need fewer images to acquire a good mapping.

I can measure real angles with about 5 arc-second precision, so those few pixels would add quite a lot of error; in fact, it would be the only significant source, since I can usually locate objects in the image to better than 3 pixels.

Ideally they are rectilinear, but in fact any lens has some distortion, especially wide-angle ones. These distortions are usually circularly symmetric, that is, the mapping should be symmetrical around the lens's axis, so the function only needs to relate distance from the center to the angle between the axis and the ray from the "NPP", as they call it...

Also, you can't just take the focal length the manufacturer claims, since that figure is often rounded or otherwise inexact.

Also, who said I use only rectilinear lenses? I've just acquired a Samyang 8mm and so far I like it quite a lot: it does not compress objects near the edge like other fisheyes do, and does not stretch them as much as rectilinear wide angles do.

However, although the manufacturer claims it is "stereographic", it is really far from that, so the only way to use images from this lens is to map its function properly.
Why are you arguing? We are trying to help!

We cannot help if you do not further specify the problem and your realistic needs.

I think the angular error (radians) involved changes with something like:

(image_center_uncertainty) / (Focal_length × (1+M)^2), where M << 1 in these situations as far as I know.

I estimate a 5 micrometer error (~1 pixel) on the sensor comes to roughly 10^-4 radians, about 20 arc-seconds, for a 50mm lens. This seems large if you are indeed requiring a few seconds of precision. (1 arc-second is ~5E-6 radians; for a 100mm lens that corresponds to around 1/10 pixel, if my estimates are correct.)

Dave

PS, if you are concerned about in-body SR and are unable to experiment, maybe you should use a camera with no such SR.
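Dave's estimate is easy to check numerically. A quick sketch (the 5 µm pixel pitch is my assumption; the formula is the one he quotes, with M taken as negligible):

```python
import math

pixel_pitch = 5e-6               # meters per pixel (assumed)
ARCSEC = math.radians(1 / 3600)  # one arc-second in radians (~4.85e-6)

def angular_error(center_uncertainty_m, focal_length_m, M=0.0):
    """Angular error caused by a sensor-center uncertainty, per Dave's formula."""
    return center_uncertainty_m / (focal_length_m * (1 + M) ** 2)

# One-pixel uncertainty with a 50 mm lens:
err_50mm = angular_error(pixel_pitch, 0.050)
print(err_50mm, err_50mm / ARCSEC)   # ~1e-4 rad, roughly 20 arc-seconds

# Conversely: how large is a 1 arc-second error on the sensor at 100 mm?
shift_100mm = ARCSEC * 0.100         # ~0.5 micrometers, about 1/10 pixel
print(shift_100mm / pixel_pitch)
```

So a few-pixel center offset is indeed well above the 5 arc-second target unless it is calibrated out.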
11-22-2010, 08:09 AM   #18
Veteran Member
Manel Brand's Avatar

Join Date: Sep 2008
Location: Porto
Photos: Gallery | Albums
Posts: 853
QuoteOriginally posted by olenl Quote
Also, you can't just take the focal length the manufacturer claims, since that figure is often rounded or otherwise inexact.

Also, who said I use only rectilinear lenses? I've just acquired a Samyang 8mm and so far I like it quite a lot: it does not compress objects near the edge like other fisheyes do, and does not stretch them as much as rectilinear wide angles do.

However, although the manufacturer claims it is "stereographic", it is really far from that, so the only way to use images from this lens is to map its function properly.
Holy cow. I get it now. I'm really dumb. Pretty complex hi-tech stuff. But I still don't understand what you mean by "verty technical question about in-camera stabilization"?
11-22-2010, 08:33 AM - 1 Like   #19
Veteran Member
kh1234567890's Avatar

Join Date: Sep 2010
Location: Manchester, UK
Photos: Gallery
Posts: 2,653
In the patent (US Application 20090245768) Hoya claim that in the off state the sensor platform stabilises at the centre of its movement span. It should not be too difficult to check if this is true and how reproducible it is in practice.

11-22-2010, 08:39 AM   #20
Senior Member




Join Date: May 2010
Posts: 113
Original Poster
QuoteOriginally posted by Manel Brand Quote
Holy cow. I get it now. I'm really dumb. Pretty complex hi-tech stuff. But I still don't understand what you mean by "verty technical question about in-camera stabilization"?
My bad, that's a typo: I meant to write "very" instead of "verty".

And to answer your question: I want VR panoramas, but my specific needs require functions that are not present in existing commercial applications, so I am trying to do it myself. Nothing hi-tech, just a small hobby project.

Because I don't want to implement or use the complex algorithms needed for stitching arbitrary images into a panorama, I am trying to see whether I can minimize stitching errors by learning the lens mapping very precisely instead.
11-22-2010, 08:49 AM   #21
Veteran Member
Manel Brand's Avatar

Join Date: Sep 2008
Location: Porto
Photos: Gallery | Albums
Posts: 853
QuoteOriginally posted by olenl Quote
My bad, that's a typo: I meant to write "very" instead of "verty".

And to answer your question: I want VR panoramas, but my specific needs require functions that are not present in existing commercial applications, so I am trying to do it myself. Nothing hi-tech, just a small hobby project.

Because I don't want to implement or use the complex algorithms needed for stitching arbitrary images into a panorama, I am trying to see whether I can minimize stitching errors by learning the lens mapping very precisely instead.
Ok. But be careful with "typos" in all your calculations; that way you may jeopardize all your learning efforts. Please don't forget to let us know what you find out. Thanks again and good luck.
11-22-2010, 08:56 AM   #22
Veteran Member
MRRiley's Avatar

Join Date: Feb 2007
Location: Sterling, VA, USA
Photos: Gallery | Albums
Posts: 6,275
Typo in thread title corrected...

Mike
PF Moderation Team

11-22-2010, 08:56 AM   #23
Site Supporter
Site Supporter
Lowell Goudge's Avatar

Join Date: Jan 2007
Location: Toronto
Photos: Gallery | Albums
Posts: 17,891
I fail to see the whole point of this thread.

Since there is no way to know the optical center of a lens, or to align the camera perpendicular to the subject and on the optical axis to the degree needed to make such measurements, exactly what are we discussing here?

Please also note that to make any measurements you need to know the actual focal length, which probably carries something like a +/- 5% tolerance anyway.

I just don't see where this whole thing is headed.

A camera is not a calibrated measurement device traceable back to a national standard; therefore the calculations made have no relevance.
11-22-2010, 09:11 AM   #24
Pentaxian
Fogel70's Avatar

Join Date: Sep 2009
Location: Stockholm, Sweden
Photos: Albums
Posts: 3,062
My guess is that the mechanical precision of the camera and lenses is a bigger problem than sensor alignment. It might even be hard to get the same result just by removing the lens and putting it back.

So I definitely think you need to be able to calibrate for mechanical errors in cameras and lenses before worrying about a sensor misaligned by a few pixels.
11-22-2010, 09:41 AM   #25
Senior Member




Join Date: May 2010
Posts: 113
Original Poster
QuoteOriginally posted by Lowell Goudge Quote
I fail to see the whole point of this thread.

Since there is no way to know either the optical center of a lens, or the ability to allign the camera perpendicular to the subject and on the optical axis, the degree needed to make measurements, exactly what are we discussing here.
well, there ought to be an optical center, so we should be able to locate it somehow
and I don't align anything; I just solve a certain system of equations ...
QuoteQuote:
Please also note that to make any measurements you need to know the actual focal length, which is probably something like a +/- 5% tolorance any way.
focal length is just a parameter in the analytical formula for the "ideal" projection, which we know never exactly corresponds to the real lens... the real mapping is a table function with interpolation between measured values.
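Such a table-plus-interpolation mapping is straightforward to sketch. A minimal illustration (the sample radius/angle pairs below are invented for the example, not measured values from the thread):

```python
import bisect

# Hypothetical calibration samples: image radius (mm) -> off-axis angle (deg).
# In practice these would come from measured star or target positions.
radii  = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
angles = [0.0, 14.5, 29.3, 44.6, 60.5, 77.0]

def radius_to_angle(r):
    """Linearly interpolate the off-axis angle for a given image radius."""
    if not radii[0] <= r <= radii[-1]:
        raise ValueError("radius outside calibrated range")
    i = bisect.bisect_right(radii, r)
    if i == len(radii):          # r lands exactly on the last sample
        return angles[-1]
    r0, r1 = radii[i - 1], radii[i]
    a0, a1 = angles[i - 1], angles[i]
    return a0 + (a1 - a0) * (r - r0) / (r1 - r0)
```

With enough samples this table replaces any analytical projection formula, which is exactly the point being made: the nominal focal length drops out entirely.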
QuoteQuote:
I just don't see where this whole thing is headed.

a camera is not a calibreated measurement device tracable back to a national standard, therefore the calculations made have no relevance
sure, that is the point. Since there is no standard, we need to measure each particular instrument's properties.
11-22-2010, 09:58 AM   #26
Senior Member




Join Date: May 2010
Posts: 113
Original Poster
QuoteOriginally posted by kh1234567890 Quote
In the patent (US Application 20090245768) Hoya claim that in the off state the sensor platform stabilises at the centre of its movement span. It should not be too difficult to check if this is true and how reproducible it is in practice.
Thank you, this is the kind of answer I was hoping for.
11-22-2010, 10:46 AM   #27
Veteran Member
Manel Brand's Avatar

Join Date: Sep 2008
Location: Porto
Photos: Gallery | Albums
Posts: 853
QuoteOriginally posted by olenl Quote
Thank you, this is the kind of answer I was hoping for.
I could have told you that, damn. But I will give you another valuable tip. Call this guy, jeff@360cities.net. He surely knows your business:

London World Record Panoramic Photo: See Big Ben, London Eye, Tower Bridge, and more than you can imagine.
11-22-2010, 11:15 AM   #28
Site Supporter
Site Supporter
Lowell Goudge's Avatar

Join Date: Jan 2007
Location: Toronto
Photos: Gallery | Albums
Posts: 17,891
QuoteOriginally posted by olenl Quote
well there ought to be an optical center so we might spot it somehow
while that may be true, without a purpose this is doing the calculation for the sake of calculation
QuoteQuote:
and I don't align anything, I just solve certain system of equations ...
if you are not making any alignment and are only solving equations, then to me it sounds like there is no need to know the center, unless of course your equations approach infinity with a divide-by-zero error
QuoteQuote:

focal length is just a part of the analytical formula of "ideal" projection which we know is never actually corresponds to real lens...
and your point is?
QuoteQuote:
the real mapping is a table function with interpolation between measured values.
?????
QuoteQuote:

sure, that is the point. since there is no standard, we need to measure each particular instrument's properties.
No, you misunderstand. You are talking about using the camera as a system, where individual parts are made to tolerances known only to the manufacturers, except the lenses, which have some international standard associated with their testing and marking, but one with very wide tolerances.

When I say there is no standard, what I mean is that any measurements you make, at the kind of precision you are discussing, would require calibration to a known international test method (which does not exist) and calibration of the instrument against physical standards traceable to a nationally calibrated (length) measurement device.

However, having said that, if you want to locate the center of a sensor, I suggest you use star trails, or an array of relatively distant lights, and rotate the camera in a circular manner.

The trails left on the sensor should describe circles. Regardless of whether they are offset or not, you should be able to calculate the distance each circle's center is offset from the center of the frame, in X and Y, simply by looking at the coordinates of the circles.
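Recovering a trail's center from picked-off point coordinates is a standard least-squares circle fit. A sketch using the algebraic (Kasa) method, with synthetic points standing in for positions read off a star-trail frame (the frame-center coordinates here are invented for illustration):

```python
import numpy as np

def fit_circle(xs, ys):
    """Fit x^2 + y^2 = a*x + b*y + c by least squares; return (center, radius)."""
    A = np.column_stack([xs, ys, np.ones(len(xs))])
    rhs = xs ** 2 + ys ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = a / 2, b / 2
    r = np.sqrt(c + cx ** 2 + cy ** 2)
    return (cx, cy), r

# Synthetic trail: points on a circle whose center sits a few pixels away
# from the nominal frame center, as a misaligned sensor would produce.
t = np.linspace(0, 2 * np.pi, 50, endpoint=False)
xs = 2310.0 + 400.0 * np.cos(t)
ys = 1538.0 + 400.0 * np.sin(t)
(cx, cy), r = fit_circle(xs, ys)
print(cx, cy, r)   # recovers ~(2310, 1538) and radius ~400
```

All trails in one frame should fit circles sharing one center; that common center, compared with the geometric frame center, gives the offset being discussed.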
11-22-2010, 01:20 PM   #29
Veteran Member




Join Date: Mar 2009
Location: Perth Australia
Photos: Albums
Posts: 1,514
QuoteOriginally posted by olenl Quote
I want VR panoramas

You'll also need to know the exact distortion characteristics of your particular copy of whatever lens you're using, if you require this sort of precision (I'm assuming you do, not knowing exactly what you're trying to achieve).

When we stitch VRs together, we do it with off-the-shelf software that resolves any distortion or shifts by analyzing the result and making whatever adjustments are needed to make it fit together. It's totally automatic, though I do remember doing it by hand back in the day.

I suppose you're doing something totally different, or you'd have already done it the obvious way.
11-22-2010, 02:29 PM   #30
Veteran Member




Join Date: Jul 2010
Photos: Gallery
Posts: 2,395
I don't think anyone here could scientifically answer that question for you. In practical terms, yes, it is fixed. But in mathematical terms, it could be one pixel to the left or right. No way to tell.

Well, there is one way to tell that my feeble mind came up with, but it's probably not worth the effort. For the type of thing you are talking about, you could buy an old *ist D or an entry-level Canon/Nikon from yesteryear for 100 bucks.

Shine a low-powered laser or a very fine light source directly at the lens through a tiny hole in a sheet of black paper, in a pitch-dark room (you want the point to approach 1 px in size, which will be difficult). Aim it 100% perpendicular to the lens to minimize bending through the optical elements and internal reflections. Then shoot SR off, SR on, SR off, SR on, for many shots, and discard all of the SR-on shots. Finally, layer all of the SR-off shots and examine any movement of the light point. If the camera and laser are held firmly in place, the point should land on exactly the same spot in the raw data every time; any movement indicates that the sensor does not return to the exact same position 100% of the time.
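The comparison step of that test can be automated: compute the spot's intensity-weighted centroid in each SR-off frame and measure how far it wanders. A sketch (the tiny list-of-lists frames stand in for real raw captures; function names are mine):

```python
def spot_centroid(frame):
    """Intensity-weighted centroid (x, y) of a 2D list-of-lists frame."""
    total = sx = sy = 0.0
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            total += v
            sx += v * x
            sy += v * y
    return sx / total, sy / total

def max_drift(frames):
    """Largest centroid displacement between the first frame and any other."""
    cx0, cy0 = spot_centroid(frames[0])
    return max(((cx - cx0) ** 2 + (cy - cy0) ** 2) ** 0.5
               for cx, cy in map(spot_centroid, frames[1:]))
```

A centroid is sub-pixel accurate, so even though the spot itself is a few pixels wide, drifts well below one pixel would still show up in max_drift.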

I really doubt that it does, because that would just cost more money to build and is of no use to a photographer. Most technology operates within "tolerances". And I'm not sure every pixel on the sensor is used for a given shot (otherwise SR would produce black edges when the sensor shifts). Am I totally wrong here? If not, there is a tiny border of pixels on each side of the sensor that is not always exposed.



