Originally posted by Autonerd Well... film is kind of digital at a very, very small level. Once developed, either a little bit of plastic has a speck of silver on it, or it's clear. Blocking light from the enlarger or not. On or off. The density of those silver specks gives you your shades of gray.
It's why I object to the term "analog" for film photography, but I could be wrong.
Aaron
Your analogy (no pun intended) between silver grains and pixels makes sense, especially if we accept the definition of analog as
"relating to or using signals or information represented by a continuously variable physical quantity such as spatial position, voltage, etc."
For me, the term digital applies once those signals or that information is encoded into binary bits. The basic units are off or on, zeros or ones, which together form a code for a variable physical quality or quantity.
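That on/off encoding can be sketched in a couple of lines (a minimal Python illustration; the brightness value is just a made-up example, not anything from a real camera):

```python
# A continuously variable quantity (e.g. brightness, 0.0 to 1.0)
# is sampled and encoded as a pattern of zeros and ones.
brightness = 0.5                    # hypothetical analog value
level = round(brightness * 255)     # quantize to one of 256 steps
print(format(level, "08b"))         # eight on/off units: '10000000'
```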
The beauty of digital is that copies are equal to the original, because it's just a code. In art, the paradigm shift of digital is that everything we perceive is decoded (or corrupted) by a third party, and that the art or photo itself is not special or unique as an object. We can appreciate the photographer and the photograph, but because anyone can have a copy as good as the original, the object is devalued.
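That claim that a digital copy is literally equal to the original can be demonstrated in a few lines (a minimal Python sketch; the byte string standing in for image data is hypothetical):

```python
import hashlib

# A digital "photo" is ultimately just a sequence of bytes (a code).
original = bytes(range(256)) * 4   # stand-in for real image data

# Copying digital data duplicates that code exactly.
copy = bytes(original)

# Both yield the same SHA-256 digest: at the level of the code,
# the copy is indistinguishable from the original.
print(hashlib.sha256(original).hexdigest() ==
      hashlib.sha256(copy).hexdigest())  # True
```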
With analog, even when printing from a single Ansel Adams negative, no two darkroom prints are truly identical, because each was shaped ever so slightly differently by Ansel Adams's own hand, not by an inkjet printer. That makes the object more precious to both the creator and the collector.