Originally posted by Zav: "I want Ricoh to produce a monochrome module for the GXR! You want colors? Swap your modules! Oh, and it could also be a square sensor."
First, Ricoh needs people to buy their camera. Maybe that'll happen.
Oh yes, square sensors. Just like using Verichrome Pan in a Yashicamat.
Meanwhile...
The discussion (and much info) on B&W sensors is here:
Manual Focus Lenses :: View topic - Monochrome sensor; very interesting outlook
As for obtaining B&W sensors... It's a basic issue of sensor fabrication. A sensor consists of a big array of photodetectors (a megachip) topped with a color filter array (often Bayer, sometimes others), topped with a microlens array. The photodetectors are photodiodes read out by either CCD or CMOS circuitry (and that doesn't matter here). These are made by VERY CAREFULLY depositing layers of semiconductor stuff (technical term) onto a silicon substrate, aka a wafer. The wafers are cut into chips. Wires are attached, etc.
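To picture the filter layer: the classic Bayer array just repeats a 2x2 tile of one red, two green, and one blue filter over the whole photodetector grid. Here's a rough sketch of that layout (my own illustration, not any vendor's actual design):

```python
# Sketch of the Bayer color filter array: a repeating 2x2 tile
# (1 red, 2 green, 1 blue) laid over the photodetector grid.
def bayer_cfa(rows, cols):
    """Return the filter color sitting over each photodetector."""
    tile = [["G", "R"],   # even rows: G R G R ...
            ["B", "G"]]   # odd rows:  B G B G ...
    return [[tile[r % 2][c % 2] for c in range(cols)] for r in range(rows)]

for row in bayer_cfa(4, 4):
    print(" ".join(row))
# Every 2x2 block contains 2 greens, 1 red, 1 blue.
```

A monochrome sensor would simply replace every entry in that grid with "clear" and record luminance everywhere.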
Now it gets tricky. Sometimes the fabrication plant (which is rather expensive to build and run) sprays a color filter layer onto the wafer, then glues on a separate sheet of microlenses. And sometimes another factory takes a sheet of filters, glues them on, and then glues on the microlenses. Then the chips are packaged and shipped to the camera maker, and the rest of the job continues.
The project cited in the link above doesn't have a fab plant. They're trying to take off-the-shelf sensor chips and peel away the glued-on filter layer in order to convert them to monochrome, which ain't easy. Doing so necessarily removes the microlens layer, which reduces the sensor's sensitivity a bit. Does this result in a lower base ISO level? I dunno. Panatomic-X fans, take heed.
The project shows samples of pictures with and without Bayer filters. The WITHOUT shots are strikingly sharp. But I wonder about getting image data from each and every pixel. With CMOS at least, each photodetector sits atop a signal amplifier, whose output goes to a signal processor that writes it into a RAW file or whatever. Is each pixel addressed separately, or only as a cluster of four (1R+2G+1B)? To access each B&W pixel, must the underlying chip architecture be changed?
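Why the filter-free shots look sharper: under a Bayer array, each pixel records only one color channel, so the missing channels have to be interpolated from neighbors, and interpolation averages across edges. A toy one-dimensional comparison (my own sketch, not real camera firmware) shows the effect:

```python
# Toy sketch: a monochrome pixel reads the scene directly, while a
# Bayer-covered pixel records only one channel, so the others must be
# interpolated from neighbors. Interpolation smears a hard edge.
scene = [0, 0, 0, 255, 255, 255]  # a hard edge, 1-D for simplicity

# Monochrome: every pixel reads true luminance.
mono = list(scene)

# Bayer-like: pretend even pixels carry the channel we want, and odd
# pixels must be filled in by averaging their two neighbors.
bayer = []
for i, v in enumerate(scene):
    if i % 2 == 0:
        bayer.append(v)
    else:
        left = scene[i - 1]
        right = scene[min(i + 1, len(scene) - 1)]
        bayer.append((left + right) // 2)

print(mono)   # [0, 0, 0, 255, 255, 255] -- edge stays hard
print(bayer)  # [0, 0, 0, 127, 255, 255] -- edge gets smeared
```

Real demosaicing is two-dimensional and much smarter than this, but the principle is the same: every interpolated value trades a little edge sharpness away, which the monochrome conversion gets back.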
So, to get a commercial B&W sensor, we need a company that has its own fab plant(s), and that's willing to change their fabrication process (which ain't cheap), and maybe the chip architecture. Which is a cheaper change: spraying a clear layer instead of a color filter layer? Or gluing on a clear sheet instead of a filter sheet? I dunno. I'm a bit rusty on chip/sensor fabrication. It's not like you can just send the design off to a silicon foundry and get back a waferful of B&W sensors.
Who fabricates sensors? Canon, Kodak, Samsung, Nikon (I think), Sigma, and Sony all come to mind. Various others. Kodak made a high-res B&W sensor some years back, then stopped. Who is willing to pick up where they left off?
So maybe we'll see this happen, and maybe not. Start signing petitions.