Originally posted by pcrichmond I'll continue to dream too, even for the square sensor. I must admit one of my most enjoyed features on the K-1 is shooting in 1:1 crop mode again. It probably dates me to my favorite camera, the C330 TLR, which I actually preferred over my 6x7.
Please feel free to correct me if I'm wrong (this is a learning experience), but I thought sensors by nature are monochromatic: each pixel measures and records the intensity of light (or lack of it), and then a filtration system, either Bayer or Foveon, is used to transmit or block specific wavelengths, achieving the perception of color.
I also remember reading that sensors by nature are already full spectrum, so adding a hot mirror inside or externally shouldn't make a difference - especially in monochrome.
If this is so, then the hardware is already there, and writing the code would be the majority of the development.
And the technology is already on the market (RED, Leica, Phase One), plus a few others who manufacture mostly for microscope imaging.
Hopefully, a Ricoh rep looks at your thread.
I'll dream for a FF monochrome K-mount and can make do with the converted K-01 for IR.
But if someone is looking at this I'll dream big - a full frame, full spectrum monochrome with 1:1 crop ratio.
Thanks again for starting this thread, there has been a wonderful amount of knowledge shared as well as dreams.
Technically, the silicon sensor pixels are panchromatic: light of all wavelengths from near-infrared to near-UV induces a signal, although the sensor is most sensitive to middle wavelengths (somewhere in the yellows and oranges). Photons enter the silicon and excite electrons. The electrons themselves carry no color information. The readout circuits count the number of electrons and infer a color signal based on where those electrons came from (e.g., all electrons from a "red" pixel are assumed to have been created by red light).
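That panchromatic response can be pictured as a single sensitivity curve that is nonzero across the whole near-UV-to-near-IR range but peaks in the middle. Here is a toy sketch; the bell-shaped curve, its 590 nm peak, and its width are invented for illustration, not a real sensor's measured quantum efficiency:

```python
import numpy as np

def quantum_efficiency(wavelength_nm):
    """Invented bell-shaped sensitivity curve peaking near 590 nm (yellow/orange).

    A real silicon QE curve has a different shape, but the key point is the
    same: it is nonzero everywhere from near-UV to near-IR, so every pixel
    produces electrons for every color of light.
    """
    return np.exp(-((wavelength_nm - 590.0) / 180.0) ** 2)

# The same bare pixel responds to all of these wavelengths, just less
# efficiently far from the peak.
for nm in (380, 480, 590, 700, 850):  # near-UV through near-IR
    print(nm, "nm ->", round(float(quantum_efficiency(nm)), 2))
```

The electrons produced at 380 nm are indistinguishable from those produced at 850 nm, which is exactly why color has to be inferred from the pixel's location behind a filter rather than from the signal itself.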
Bayer uses a filtration system in front of the silicon chip. Each pixel has a color filter in front of it, and the software then assumes that whatever signal comes from that pixel must have been of the color implied by the filter.
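Because each Bayer site records only one of the three channels, the camera has to interpolate the two missing channels at every pixel (demosaicing). A minimal sketch of the simplest approach, bilinear averaging of neighboring same-color samples, assuming an RGGB layout (real in-camera demosaicing is far more sophisticated):

```python
import numpy as np
from scipy.signal import convolve2d

def demosaic_bilinear(raw):
    """Naive bilinear demosaic of an RGGB Bayer mosaic.

    raw: 2-D array of per-pixel intensities. Each pixel saw only the
    wavelengths its filter passed, so the other two channels at that
    location are estimated from neighbors.
    """
    h, w = raw.shape
    rgb = np.zeros((h, w, 3))
    # Masks marking which sites carry R, G, or B samples (RGGB layout assumed).
    r_mask = np.zeros((h, w), bool); r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), bool); b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)
    kernel = np.ones((3, 3))
    for ch, mask in enumerate((r_mask, g_mask, b_mask)):
        chan = np.where(mask, raw, 0.0)
        # Average of the available same-color samples in each 3x3 window.
        num = convolve2d(chan, kernel, mode="same")
        den = convolve2d(mask.astype(float), kernel, mode="same")
        rgb[..., ch] = num / np.maximum(den, 1e-9)
    return rgb
```

A quick sanity check: a uniformly lit gray scene should demosaic back to the same flat value in all three channels, e.g. `demosaic_bilinear(np.full((4, 4), 0.5))` is 0.5 everywhere.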
Foveon has no color filters but instead relies on an unusual physical property of silicon: blue photons tend to be absorbed near the upper surface of the sensor, green photons tend to get a little further into the silicon before being absorbed, and red photons tend to travel the furthest. Each Foveon pixel is actually a stack of three photodiodes, with the top one being the most blue-sensitive, the middle one catching more green photons than blue or red, and the deepest one getting most of the red photons. But the color separation is not very good (a fair number of red photons are also absorbed by the blue and green diodes, and some blue and green photons make it deeper into the supposedly red layer of the sensor). Some clever math enables the camera's CPU to statistically estimate the intensities of the red, green, and blue signals at each location (that statistical process explains Foveon's poor high-ISO performance).
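The "clever math" amounts to unmixing: each layer's signal is a blend of all three colors, so the camera solves a small linear system to recover the color intensities. A sketch with an invented absorption matrix (the numbers are made up to show the shape of the problem; real layer responses are broad and measured per sensor):

```python
import numpy as np

# Hypothetical absorption matrix: entry [layer, color] is the fraction of
# photons of that color absorbed in that layer. Columns are (B, G, R).
A = np.array([
    [0.6, 0.3, 0.1],   # top layer: mostly blue, but some green and red
    [0.2, 0.5, 0.3],   # middle layer: mostly green
    [0.1, 0.2, 0.6],   # bottom layer: mostly red
])

def unmix(layer_signals):
    """Estimate (blue, green, red) light intensities from the three
    stacked-photodiode signals by inverting the absorption matrix."""
    return np.linalg.solve(A, layer_signals)

# Pure red light still produces some signal in every layer...
signals = A @ np.array([0.0, 0.0, 1.0])
# ...but unmixing recovers approximately (0, 0, 1).
print(unmix(signals))
```

This also hints at the high-ISO problem: the inversion amplifies whatever noise is in the layer signals, because the layers' responses overlap so heavily.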
But the color system of these sensors is not perfect by any means. If you shine pure blue light on a Bayer or Foveon sensor, you'll see a strong signal from the "blue" read-out but also some signal from the green and red ones, too. That's caused by cross-talk in the color filters or Foveon photodiodes. And if you close the shutter, put on the lens cap, and take a very long exposure in a hot environment, you'll see a strong color-speckled output from the sensor. That color is not really there at all. It's the camera's software mistakenly assuming that any electrons measured in R, G, or B pixels or photodiodes must be coming from R, G, or B photons.
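Both effects can be sketched numerically. The cross-talk matrix and noise level below are invented for illustration; the point is only that pure blue light leaks into the other read-outs, and that thermally generated electrons get assigned colors they never had:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cross-talk matrix: row = readout channel, column = true light
# color (R, G, B). Off-diagonal entries are the leakage the text describes.
crosstalk = np.array([
    [0.85, 0.10, 0.05],  # "red" readout
    [0.08, 0.84, 0.08],  # "green" readout
    [0.05, 0.10, 0.85],  # "blue" readout
])

# Shine pure blue light: strongest response on the blue channel, but the
# red and green read-outs are not zero.
pure_blue = np.array([0.0, 0.0, 1.0])
print(crosstalk @ pure_blue)

# Lens cap on, long hot exposure: thermal electrons accumulate at random.
# The camera still tags each count with the color of the filter it sat
# under, so a colorless dark frame comes out as color speckle.
dark_frame = rng.poisson(lam=3.0, size=(4, 4, 3)).astype(float)
```

With real hardware the cleanup for the first effect is a measured color-correction matrix, and for the second it's dark-frame subtraction, which is exactly what long-exposure noise reduction does in-camera.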