08-05-2010, 07:20 AM   #166
Site Supporter

Join Date: Jul 2008
Location: Rankin Inlet, Nunavut
Photos: Albums
Posts: 3,948
Originally posted by falconeye:
BTW, another one who missed his quantum mechanics course was God. He tried your idea first (many pinhole lenses hooked up to a neural-network computer, aka an insect's compound eye). But after hundreds of millions of years of frustration over the bad image quality (as far as we know, his beard turned white because of this), he gave up and eventually gave the green light to the development of the lens (aka the lens-bearing eye, or normal eye). [Smiley intentionally left blank]
God has a lens of humour:


Barreleye - Wikipedia, the free encyclopedia

08-05-2010, 07:43 AM   #167
Veteran Member
Join Date: Jun 2009
Posts: 1,689
Originally posted by Aristophanes:
God has a lens of humour: Barreleye - Wikipedia, the free encyclopedia
Imagine if humans were like that... sometimes God has a sick sense of humor, lol.

Good find, actually; I never knew about these creatures.
08-05-2010, 10:24 AM   #168
Veteran Member

Join Date: Jan 2008
Location: Munich, Alps, Germany
Photos: Gallery
Posts: 6,871
Originally posted by Aristophanes:
God has a lens of humour
Indeed!

And by accident, it looks indistinguishable from the forthcoming K-5.
08-05-2010, 10:29 AM   #169
Veteran Member

Join Date: Feb 2007
Location: Boston, MA
Photos: Gallery
Posts: 2,948
See, I told you this was a whole 'nother thread.

Originally posted by falconeye:
Sorry Matt,

but this is entirely wrong. I guess you skipped the nastier parts of your quantum mechanics courses, and are therefore excused.
Well, you've got me there on my own education. However, in my job I am literally surrounded (no, really, they're on every side of me!) by physics PhDs. I floated the idea past several of them and was assured that I'm on pretty solid ground.

Originally posted by falconeye:
A lens is not a sort of analog computer, because there is no input data to work with. If you tried to obtain input data (by measuring the wavefront hitting the front lens), you would destroy any chance of obtaining a result. And if you insisted, let me refresh your memory: you simply cannot measure both the location and phase of a photon. Heisenberg and all the rest.
So, as I am given to understand it, frequency and direction are not complementary properties in the quantum sense, so this is not a problem from that point of view. There's some uncertainty on the order of nanoseconds about exactly when a photon is measured, but that's small enough that we don't care about it.

So the problem that remains is how exactly to gather that data. Specifically, you need to deal with making a coherent picture from incoherent light.

There are two basic approaches I can think of.

First, there's the pinhole aperture used in current applications of this sort of technology. That has some serious disadvantages: diffraction, and simply losing a whole lot of light because you're at f/200 or whatever. The composite result is brighter than f/200 would be because you have numerous pinholes, but, still, a problem. I'm not sure how far this approach can go with more advanced engineering, but I'll take your word for it that it's a dead end.
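(As a sanity check on that f/200 figure, here's a quick back-of-the-envelope sketch using Lord Rayleigh's optimal-pinhole formula -- the focal lengths are just illustrative values I picked, nothing Pentax-specific.)

Code:
# Sketch: Rayleigh's optimal pinhole diameter d ~ 1.9 * sqrt(f * lambda),
# and the working f-number it implies -- in the f/200 ballpark quoted above.
import math

wavelength = 550e-9                  # green light, in metres
for f_mm in (25.0, 50.0, 100.0):     # focal lengths to try
    f = f_mm * 1e-3
    d = 1.9 * math.sqrt(f * wavelength)
    print(f"f = {f_mm:5.1f} mm -> pinhole {d * 1e3:.2f} mm, about f/{f / d:.0f}")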

So, the second idea: the front-element sensor is an array of small tubes, each pointing in a different but known direction, and each connected to one photosite. These tubes could also serve as the color filters, so you have direction and frequency.

Then it's a "simple" matter of deconvolving the input data to produce a photographic image. Unlike current raw files, these "extra-raw" files are nothing like a bitmap image -- information for a small portion of the frame is distributed across the whole dataset. In selecting parameters for calculating your final image, you could generate something like what a traditional lens of any type would produce (as well as images that are impossible traditionally).
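To make the "extra-raw" idea concrete, here's a toy sketch of the kind of linear inversion I have in mind. All the names and sizes are hypothetical, and it's plain NumPy rather than any real file format:

Code:
# Sketch: each photosite reads a weighted sum over scene directions;
# recovering the image means solving that (ill-conditioned) linear system.
import numpy as np

rng = np.random.default_rng(0)
n_dirs, n_sites = 64, 256            # toy counts of directions and photosites

# Forward model A: row i = sensitivity of photosite i to each direction,
# approximating a directional tube as a narrow Gaussian over direction.
centers = rng.uniform(0, n_dirs, n_sites)
dirs = np.arange(n_dirs)
A = np.exp(-0.5 * ((dirs[None, :] - centers[:, None]) / 2.0) ** 2)

scene = rng.random(n_dirs)                                   # unknown luminosities
readings = A @ scene + 0.01 * rng.standard_normal(n_sites)   # noisy "extra-raw"

# Regularized least squares (ridge), since A is far from invertible:
lam = 1e-2
recovered = np.linalg.solve(A.T @ A + lam * np.eye(n_dirs), A.T @ readings)
print(np.corrcoef(scene, recovered)[0, 1])                   # close to 1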

Now, by having your photosites direction-specific, you're losing a bunch of light -- but since that light hasn't passed through an aperture, you've got a lot more to work with in the first place.

There may be yet-unthought-of ways to gather the information in an even less lossy way.


Originally posted by falconeye:
BTW, another one who missed his quantum mechanics course was God. He tried your idea first (many pinhole lenses hooked up to a neural-network computer, aka an insect's compound eye). But after hundreds of millions of years of frustration over the bad image quality (as far as we know, his beard turned white because of this), he gave up and eventually gave the green light to the development of the lens (aka the lens-bearing eye, or normal eye). [Smiley intentionally left blank]
It may be worth noting that although you characterize this branch as given up on, there are many orders of magnitude more creatures with compound eyes on Earth right now than there are with "normal" eyes.

But beyond that: doing this well requires considerable computing power. A lot of our brain is devoted to vision already (something like a quarter of the neocortex). Given the constraints, it makes sense to use an analog approach -- you don't want to have to think for a minute in order to see something. But that's not going to be a constraint on computing devices in the World Of The Future.

08-05-2010, 10:34 AM   #170
Senior Member

Join Date: Aug 2008
Location: Colorado
Photos: Albums
Posts: 250
Originally posted by nosnoop:
The flawed logic is that none of your digital obsolescence examples support your claim that FF would take over APS-C; and you ignore the evolution of the camera as a whole with your narrow obsession with sensor size.
What I'm expecting is something very similar to what happened with the first Digital Rebel. Inexpensive digital SLRs didn't really exist before it, but it only took one successful model to start the trend. As soon as we get the FF equivalent of the Rebel (cheap and compelling enough to get people to buy), we'll see a trend that way. There's some hinting even now that Canon and Nikon might be bringing out cheap FF cameras pretty soon.

The price they set that FF camera at is going to be the new maximum price for APS-C. It will be very hard to sell a cropped-sensor body for more than an FF one.
08-05-2010, 11:58 AM - 1 Like   #171
Veteran Member

Join Date: Jan 2008
Location: Munich, Alps, Germany
Photos: Gallery
Posts: 6,871
Originally posted by mattdm:
See, I told you this was a whole 'nother thread.
Yeah, I love this
Originally posted by mattdm:
I am literally surrounded (no, really, they're on every side of me!) by physics PhDs. I floated the idea past several of them and was assured that I'm on pretty solid ground.
Must be experimental physicists
Originally posted by mattdm:
So, as I am given to understand it, frequency and direction are not complementary properties in the quantum sense, so this is not a problem from that point of view.
They are not, but you miss the point. Location and momentum are complementary, and momentum includes direction. Because an image is nothing but luminosity as a function of direction, you want to know direction as precisely as possible for tack-sharp images, which means you have to give up on location as much as possible. This is why you need a wide-aperture lens: a wide aperture maximizes the uncertainty about where the photon entered the lens. It's really this simple.
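To put rough numbers on this, here is a quick sketch using the standard Rayleigh criterion (the aperture diameters are illustrative values only):

Code:
# Sketch: diffraction-limited angular blur theta ~ 1.22 * lambda / D.
# The wider the aperture D (more location uncertainty), the finer the direction.
wavelength = 550e-9                    # green light, in metres

for D_mm in (0.25, 5.0, 25.0):         # pinhole, compact lens, fast SLR lens
    D = D_mm * 1e-3
    theta = 1.22 * wavelength / D      # radians, Rayleigh criterion
    print(f"D = {D_mm:6.2f} mm -> blur ~ {theta * 1e6:8.1f} microradians")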

Note that a single microlens array would still be one lens in my terminology. The important point is that it remains uncertain which microlens any particular photon passed through, i.e., no electronics coupled to it. Fresnel lenses are similar in this respect.
Originally posted by mattdm:
So, the second idea: the front-element sensor is an array of small tubes, each pointing in a different but known direction, and each connected to one photosite. These tubes could also serve as the color filters, so you have direction and frequency.
[...]
Now, by having your photosites direction-specific, [...] you've got a lot more to work with in the first place.
That's exactly the point of failure in your thinking. A "microtube" detecting both direction and location is infeasible. Due to diffraction, each microtube would look a bit "around the corner" and you would end up with a fuzzy image. Therefore, each microtube would have to remain rather large compared to the wavelength of a photon, rendering the approach uncompetitive except for low-resolution imaging.
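Some illustrative numbers, using the crude small-angle estimate spread ~ lambda / d (the tube diameters are values I made up for the example):

Code:
# Sketch: diffraction spread out of a tube of diameter d is roughly lambda / d.
# Shrinking tubes toward pixel pitch blows up the "around the corner" angle.
import math

wavelength = 550e-9                            # metres
for d_um in (100.0, 10.0, 2.0):                # tube diameters, in microns
    d = d_um * 1e-6
    spread_deg = math.degrees(wavelength / d)  # small-angle estimate
    print(f"d = {d_um:5.1f} um -> spread ~ {spread_deg:5.2f} degrees")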
Originally posted by mattdm:
It may be worth noting that although you characterize this branch as given up on, there are many orders of magnitude more creatures with compound eyes on Earth right now than there are with "normal" eyes.
Oh, the branch hasn't been given up; it's just inferior. Sort of the entry-level imaging of nature.
Originally posted by mattdm:
But beyond that: doing this well requires considerable computing power. A lot of our brain is devoted to vision already.
That's unrelated to the rest of the discussion, but wrong again.

Most of the computation for the imaging part is done on the retina, i.e., outside the visual cortex. The visual cortex's task is pattern recognition.

Having written both deconvolution algorithms and pattern-recognition algorithms, I can assure you that deconvolution is trivial in comparison. Real-time deconvolution is easily achievable with modern signal processors (like a graphics card's GPU), while pattern recognition requires supercomputers (if it is possible at all) for seemingly trivial tasks like reliable car driving. Even character recognition (which is easy except for the training part) does not match human performance. And we read fast.
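For anyone curious why deconvolution is cheap: here is a minimal, generic Wiener-deconvolution sketch on toy data (plain NumPy; not any camera's actual pipeline). It is just a handful of FFTs, exactly the kind of workload a GPU streams through in real time.

Code:
# Sketch: Wiener deconvolution -- a handful of FFTs, hence cheap on a GPU.
import numpy as np

rng = np.random.default_rng(1)
img = rng.random((256, 256))                   # stand-in "true" image

# Blur with a known point-spread function (small box kernel), plus noise.
psf = np.zeros_like(img)
psf[:5, :5] = 1.0 / 25.0
H = np.fft.fft2(psf)
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * H))
blurred += 0.001 * rng.standard_normal(img.shape)

# Wiener filter: inverse filter, damped where the PSF response H is weak.
snr = 1e-3
G = np.conj(H) / (np.abs(H) ** 2 + snr)
restored = np.real(np.fft.ifft2(np.fft.fft2(blurred) * G))
print(float(np.mean((restored - img) ** 2)))   # small reconstruction error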

Last edited by falconeye; 08-05-2010 at 12:05 PM.
08-05-2010, 12:36 PM   #172
Veteran Member
Join Date: Apr 2010
Location: Tennessee
Posts: 6,617
You guys take all the fun out of photography.

08-05-2010, 12:54 PM   #173
Veteran Member
Join Date: Aug 2007
Location: Madison, Wis., USA
Posts: 1,506
Naw, this is just another kind of fun. And a very good kind, at that.

Since I seem to be a rather slow learner when it comes to taking decent photographs, I fall back some days on learning about these technical subjects.

God does play at dice - far better than I play with my camera.

Thanks, Falk and Matt.
08-05-2010, 08:17 PM   #174
Site Supporter

Join Date: Jul 2008
Location: Rankin Inlet, Nunavut
Photos: Albums
Posts: 3,948
Originally posted by Winder:
You guys take all the fun out of photography.
Makes one want to throw a camera and Heisenberg into the sun.
08-05-2010, 08:34 PM   #175
Senior Member

Join Date: Aug 2008
Location: Colorado
Photos: Albums
Posts: 250
Originally posted by falconeye:
Location and momentum are complementary, and momentum includes direction. Because an image is nothing but luminosity as a function of direction, you want to know direction as precisely as possible for tack-sharp images, which means you have to give up on location as much as possible. This is why you need a wide-aperture lens: a wide aperture maximizes the uncertainty about where the photon entered the lens. It's really this simple.
Fantastic! This is a great description, and as a physics undergrad right now, I always appreciate opportunities to see quantum uncertainty at work in day-to-day things like lens apertures.

It makes perfect sense, of course, but somehow it is always surprising to see the quirky rules of quantum mechanics impinge on our lives in noticeable ways.

Question, though: don't we know luminosity as a function of location (the specific spot on the film/sensor) rather than as a function of direction? In the double-slit experiment, for instance, we can know the final location of the photon, but not if we know which slit it went through -- location instead of direction. If I understand correctly, the smaller the aperture, the more certain we can be about the direction a photon comes from (and therefore its momentum), and the less certain we can be about the location on the sensor. Am I going wrong somewhere?
08-06-2010, 03:42 AM   #176
Veteran Member

Join Date: Jan 2008
Location: Munich, Alps, Germany
Photos: Gallery
Posts: 6,871
Originally posted by Eigengrau:
Question, though: don't we know luminosity as a function of location (the specific spot on the film/sensor) rather than as a function of direction?
That question had to come. It's a good one!

I'll give you a hint. Maybe you can work it out as a homework exercise in your current studies?

There are two measurement processes. The first is when the photon enters the lens (where we keep location as uncertain as possible). The second is when the photon hits the sensor (where we keep direction as uncertain as possible, by measuring the exact location in a way that we cannot know which part of the lens the photon came from).

So, theoretically, we should have lost all information. But the trick is the lens. After all, it is the lens's task to focus all parallel rays into a single point in the focal plane. So, a lens's task is to swap the complementary properties of direction and location.

My description applied to the first measurement, your question to the second.
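If you want to see the swap in numbers, here is a textbook paraxial ray-transfer sketch for an ideal thin lens (the focal length and angles are just illustrative): parallel rays entering at different heights all land on the same focal-plane point, so entry location is erased and direction becomes location.

Code:
# Sketch: paraxial ray-transfer matrices. Parallel rays entering at different
# heights y all hit the same focal-plane height -- direction becomes location.
import numpy as np

f = 0.05                                          # focal length: 50 mm
lens = np.array([[1.0, 0.0], [-1.0 / f, 1.0]])    # ideal thin lens
travel = np.array([[1.0, f], [0.0, 1.0]])         # propagate a distance f

angle = 0.01                                      # all rays share this direction (rad)
for y in (-0.01, 0.0, 0.01):                      # but enter at different heights
    y_out, _ = travel @ lens @ np.array([y, angle])
    print(f"entry y = {y:+.3f} m -> focal-plane y = {y_out:+.6f} m")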
08-06-2010, 10:39 AM   #177
Moderator
Site Supporter

Join Date: Jun 2008
Location: Florida Hill Country
Photos: Gallery | Albums
Posts: 17,377
Originally posted by falconeye:
. . .
BTW, another one who missed his quantum mechanics course was God. He tried your idea first (many pinhole lenses hooked up to a neural-network computer, aka an insect's compound eye). But after hundreds of millions of years of frustration over the bad image quality (as far as we know, his beard turned white because of this), he gave up and eventually gave the green light to the development of the lens (aka the lens-bearing eye, or normal eye). [Smiley intentionally left blank]
Except that is not how insect eyes work, unless you are talking about ocelli. The ommatidia that make up the compound eye are complex, and there is more than one type depending on the kind of insect, such as nocturnal versus diurnal species.
08-06-2010, 11:17 AM   #178
Veteran Member

Join Date: Jan 2008
Location: Munich, Alps, Germany
Photos: Gallery
Posts: 6,871
Originally posted by Blue:
Except that is not how insect eyes work, unless you are talking about ocelli. The ommatidia that make up the compound eye are complex, and there is more than one type depending on the kind of insect, such as nocturnal versus diurnal species.
Well, say a pinhole on one end of a tube and a one-pixel sensor at the other. Some insects have a 7-pixel sensor, but that is pretty much it. I think my simplifying statement was pretty much spot on.

I like the following sentence from Wikipedia a lot:
Quote:
To see with a resolution comparable to our simple eyes, humans would require compound eyes which would each reach the size of their head
08-06-2010, 11:44 AM   #179
Moderator
Site Supporter

Join Date: Jun 2008
Location: Florida Hill Country
Photos: Gallery | Albums
Posts: 17,377
Originally posted by falconeye:
Well, say a pinhole on one end of a tube and a one-pixel sensor at the other. Some insects have a 7-pixel sensor, but that is pretty much it. I think my simplifying statement was pretty much spot on.

I like the following sentence from Wikipedia a lot:

Actually, you are spot off. R cells aren't pixels.

If you want to pull sentences from Wikipedia:

Quote:
The size of the ommatidia varies according to species, but ranges from 5 to 50 microns. Naively, microlens arrays can be seen as a biomimetic analogy of ommatidia.
08-06-2010, 01:21 PM   #180
Senior Member

Join Date: Aug 2008
Location: Colorado
Photos: Albums
Posts: 250
Originally posted by falconeye:
That question had to come. It's a good one!
So, a lens's task is to swap the complementary properties of direction and location.

My description applied to the first measurement, your question to the second.
Ah, I see. Thanks for clearing that up.