Originally posted by chriswill Looking at that video and reading the comment that "if done skillfully, it doesn’t damage the sensor," I think only a small number of people would be willing to try it. They appear to be using a piece of plastic to scrape off the Bayer color filter layer, but it also leaves several scratches behind, possibly damaging the sensor's pixels?
Originally posted by ProfessorBuzz If you have a Bayer colour sensor, then that's your only option. The challenge is knowing the relative sensitivity of the RGGB pixels. Some software handles this automatically.
Shooting in true monochrome, you collect more photons in less time, but there is still a quantum efficiency curve, so the middle visible wavelengths around green tend to register more strongly. In colour, you've got this PLUS the Bayer filters, each of which has different transmission characteristics.
Give it a try and see what you get.
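ProfessorBuzz's point about the relative sensitivity of the RGGB pixels can be sketched in code. This is a hypothetical illustration, not any particular software's method: given a shot of a neutral grey card, estimate per-channel gains that equalise the R, G and B responses, then average the balanced channels into one monochrome plane. The function names and the synthetic demo data are assumptions for the sake of the example.

```python
import numpy as np

def channel_gains(gray_card_rgb):
    """Per-channel gains that equalise the mean response of R, G and B.

    Assumes a (H, W, 3) float array shot of a neutral grey target.
    """
    means = gray_card_rgb.reshape(-1, 3).mean(axis=0)  # mean response per channel
    return means.max() / means                          # scale weaker channels up

def to_mono(rgb, gains):
    """Apply the gains, then average the channels into one greyscale plane."""
    balanced = rgb * gains
    return balanced.mean(axis=-1)

# Demo: a synthetic "grey card" where green responds twice as strongly as red,
# standing in for the unequal Bayer-filter transmission described above.
card = np.ones((4, 4, 3)) * np.array([0.4, 0.8, 0.6])
gains = channel_gains(card)
mono = to_mono(card, gains)
```

After correction, the synthetic card comes out as a uniform grey plane, which is the point of knowing (or measuring) the relative channel sensitivities before converting to monochrome.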
I did a rather amateurish experiment today and now understand ProfessorBuzz's comment about the loss of pixels when using the B/W setting in camera. However, taking the photo in color and decoloring it in post gave better results.
I took a few photos as RAW files with my K-x and K-70, once with the in-camera B/W setting and once in the natural setting for color. Opening them in Camera RAW with PSE, I desaturated the color photo and compared it with the B/W version. The desaturated color photo had crisper, finer lines and appeared to be in better focus.
I am by no measure a professional, nor do I have much expertise in PSE. However, my experiment suggests the best and least expensive way to get good monochrome photos with a DSLR is to shoot in color and desaturate in post.
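The "shoot in color, desaturate in post" step can be sketched as a weighted channel mix. I'm not claiming this is what PSE does internally; the Rec. 601 luma weights below are just one common convention for RGB-to-greyscale conversion, and the input array is a stand-in for a demosaiced RAW file.

```python
import numpy as np

# Rec. 601 luma weights for R, G, B: green dominates, matching the eye's
# (and the sensor's) stronger response in the middle of the visible band.
REC601 = np.array([0.299, 0.587, 0.114])

def desaturate(rgb):
    """Weighted sum of the colour channels -> single greyscale plane."""
    return rgb @ REC601

# Two test pixels: pure red and pure green.
img = np.array([[[1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0]]])
grey = desaturate(img)
```

A weighted mix like this keeps every photosite contributing to the final image, which is consistent with the desaturated shot looking crisper than the in-camera B/W result.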