It's 2008 and thankfully the flame-wars that posts of this nature used to attract have stopped. By and large, we've all stopped arguing about film versus digital, realising that, at the end of the day, other people were getting on with taking great photographs no matter what.
But that's not to say that the medium you record your photograph on doesn't play an important role in the resulting image. Of course there are many other factors - lighting, subject, composition, your art, lens, and aperture/shutter speed settings, to name a few. But is it still possible to revisit the original digital versus film question?
There was never any doubt about the versatility of digital, or the fact that digital photography can produce very good results. In fact, a lot was said in the late nineties and early 00's to suggest that digital had surpassed film, based on a number of subjective and objective measurements. People thought digital images looked better. People scanned film, compared it to digital, and preferred the digital image. People measured MTF and calculated theoretical values for image quality, and this link (and the links contained within) was probably one of the more influential:
Link
But those comparisons were flawed. At the end of the day, image quality is a subjective measurement, just as any hi-fi aficionado will tell you that a good sound system is judged by listening to it.
Resolution is an important factor, particularly for enlargements. Good quality slide film seems to be able to reach an MTF 50 of nearly 50 line pairs/mm (as a measure of resolution), which is comparable to the best in digital. Good quality colour negative film is said to reach an MTF 50 of up to 70 line pairs/mm. Apparently the top full-frame digital cameras are reaching MTF 50s of around 50 lp/mm; however, I have not seen this formally tested, except by assuming that megapixels relate directly to resolution (not necessarily correct). To summarise resolution: at best, we can say it is very difficult to distinguish the best in digital from the best in film, and further tests are needed before one can say either is better. In the meantime, let's just conclude that both are capable of excellent resolution, and that both require the best lenses to make the most of it.
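To see why equating megapixels with resolution is "not necessarily correct", here is a back-of-the-envelope sketch in Python. The function name and the 21-megapixel figure are my own assumptions for illustration; all it computes is the theoretical Nyquist ceiling for a full-frame sensor, which is an upper bound, not a measured MTF 50.

```python
# Back-of-the-envelope Nyquist ceiling for a full-frame sensor.
# This is the naive megapixels-to-resolution conversion criticised
# above: a theoretical upper bound, not a measured MTF 50.

def nyquist_lp_per_mm(megapixels, width_mm=36.0, height_mm=24.0):
    """Theoretical resolution ceiling in line pairs/mm.

    One line pair needs at least two pixels, so the ceiling is
    (pixels per mm) / 2. Anti-alias filters, Bayer interpolation
    and lens limits push real-world figures well below this.
    """
    aspect = width_mm / height_mm
    pixels_high = (megapixels * 1e6 / aspect) ** 0.5
    return pixels_high / height_mm / 2.0

# A 21-megapixel full-frame sensor (illustrative figure):
print(round(nyquist_lp_per_mm(21), 1))  # ~78 lp/mm ceiling
```

The gap between this ~78 lp/mm theoretical ceiling and the ~50 lp/mm reportedly measured is exactly why megapixel arithmetic alone can't settle the resolution question.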
Noise is another important factor. In actual fact, it is the signal-to-noise ratio we are interested in, not noise per se. If noise per se were the issue, applying a blur in Photoshop to all our images would improve their quality! In the link I posted above, they mistakenly assume that signal-to-noise ratio can be easily measured by taking an even-toned area of a (random) photograph and measuring the variance of that spot. Sadly this does not account for the noise reduction applied to the digital image, and has little bearing on the true signal-to-noise ratio (I won't go into just how many ways this is a poor experiment - let's not mention that they don't know the actual variance of the scene; the variance measured from the film was probably correct!). A subjective comparison would rate digital as having less noise than film - in fact this is widely believed to be true. However, there is no evidence that the native noise of the two media differs; the difference you see may simply be down to processing, which directly harms image quality in ways I'll mention in the next section. In summary, we don't know the actual signal-to-noise ratio of digital as a medium, because it hasn't been measured in the studies I've looked at.
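To make the flat-patch objection concrete, here is a toy Python simulation (all parameters invented): a uniform scene tone plus known sensor noise, run through a crude averaging filter standing in for in-camera noise reduction. The variance measured afterwards tells you about the processing, not the sensor.

```python
# Toy simulation of why measuring variance in a "flat" patch is a poor
# proxy for a sensor's native signal-to-noise ratio: a simple averaging
# filter (standing in for in-camera noise reduction) slashes the
# measured variance without the sensor's true noise changing at all.
import random
import statistics

random.seed(42)

# A genuinely flat patch of scene tone 128, plus sensor noise (sigma = 8).
raw_patch = [128 + random.gauss(0, 8) for _ in range(10_000)]

def box_smooth(samples, radius=2):
    """Crude 'noise reduction': average each sample with its neighbours."""
    n = len(samples)
    out = []
    for i in range(n):
        window = samples[max(0, i - radius):min(n, i + radius + 1)]
        out.append(sum(window) / len(window))
    return out

smoothed_patch = box_smooth(raw_patch)

raw_sd = statistics.stdev(raw_patch)
smoothed_sd = statistics.stdev(smoothed_patch)

# The smoothed patch "measures" far less noise, yet the sensor noise
# (sigma = 8) has not changed - only the processing has.
print(f"raw stdev:      {raw_sd:.1f}")   # close to 8
print(f"smoothed stdev: {smoothed_sd:.1f}")  # much smaller
```

Measure the digital variance after this kind of processing and you'd conclude the sensor is far cleaner than it really is - which is the flaw in the experiment above.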
But is image quality all about resolution and noise? There's also linearity (true-to-life-ness), colour accuracy, and tonality.
Linearity - how truthful is the rendition? Film suffers from one main artefact: film grain. This is not easily removed, and is something you will have to live with in film. Digital apparently has less noise than film, but the trade-off is that 'noise reduction' is applied. Even on modern, top-of-the-line cameras? Have a look at the sample images from any review of a recent semi-pro/pro digital camera at high ISO and it can be seen clearly, once you train your eye to see it. Noise reduction works by trying to guess which areas of your image contain detail and which do not. It largely works this out by carefully sifting through the image, dividing it into high-contrast lines and edges on one hand, and areas of similar tone on the other. It then smoothes the areas of similar tone, effectively smoothing out the noise, and sharpens the areas of contrast, improving the overall sharpness of the image. One must admit these techniques work, because most photographers don't mind them. Personally, when I see these areas of noise reduction, I actually feel a little unwell - there's something disconcerting about believing something is a true representation at first, and then realising it isn't.

Where noise reduction really fails is in the zones that cannot easily be categorised - take any textured area - grass, sand, woodland, clouds etc. - and you'll find the noise reduction fails miserably to convey that texture. You'll notice what I call a 'salt and pepper' pattern instead, where the processing erratically sharpens then smoothes the image repeatedly, ruining much of digital's linearity. Give me a bit of grain any day - at least with grain it is a true representation, albeit with a soft veil over the image.
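The divide-smooth-and-sharpen behaviour described above can be sketched in a few lines of Python on a 1-D row of pixels. The threshold, window size and tone values are invented for illustration; real in-camera pipelines are far more elaborate, but the failure modes are the same.

```python
# Minimal sketch of classify-then-process noise reduction on a 1-D row:
# low-contrast neighbourhoods get smoothed, high-contrast ones get
# sharpened. Threshold and window size are invented for illustration.

def denoise_1d(signal, threshold=8, radius=2):
    """Smooth low-contrast regions, sharpen high-contrast ones."""
    n = len(signal)
    out = []
    for i in range(n):
        window = signal[max(0, i - radius):min(n, i + radius + 1)]
        local_mean = sum(window) / len(window)
        if max(window) - min(window) < threshold:
            out.append(local_mean)                  # "flat": smooth away noise
        else:
            out.append(2 * signal[i] - local_mean)  # "edge": sharpen
    return out

# A noisy grey patch followed by a hard edge up to tone 200.
row = [100, 103, 98, 101, 100, 97, 102, 99, 101, 100] + [200] * 10
result = denoise_1d(row)

print(result[2:6])  # flat interior: the wobble is averaged away
print(result[10])   # first bright pixel: overshoots past 200 (a halo)
```

Pixels whose window straddles the edge get the sharpening branch and overshoot or undershoot (the halo), while any texture whose local contrast hovers around the threshold flips unpredictably between the two branches - the 'salt and pepper' effect.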
I can continue with linearity. How about the way digital deals with highlights? Yes, they clip. What does this mean? For one thing, areas can become pure white, lacking any detail, particularly around light sources - e.g. the sun and highly reflective objects. This can often be avoided with appropriate exposure (but not always). More of a problem is the way the individual channels clip as white is approached. This causes a huge problem for all bright and colourful tones in an image: the abrupt clipping causes colours to shift suddenly as they approach white. Look at any digital sunrise photo and you can identify the distinct colour bands around the sun as each channel clips in turn - white, yellow, orange, red. Don't they look a bit unnatural? Film handles this much better.
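Here is a minimal Python sketch of per-channel clipping (the 'warm light' channel ratios are invented): as a light of fixed hue gets brighter, each channel saturates independently, so the recorded colour marches from orange through yellow to white even though the scene's hue never changes.

```python
# Per-channel highlight clipping. The channel ratios below describe a
# warm-toned light and are invented for illustration; the independent
# clipping of each channel is the point.

def expose(intensity, ratios=(1.0, 0.6, 0.3), full_well=255):
    """Record an RGB triple, clipping each channel independently."""
    return tuple(min(full_well, round(intensity * r)) for r in ratios)

# Same hue, increasing brightness: watch the recorded colour shift.
for intensity in (200, 400, 700, 1000):
    print(intensity, expose(intensity))
# 200  -> (200, 120, 60)   orange-red, nothing clipped yet
# 400  -> (255, 240, 120)  red clipped: hue shifts towards yellow
# 700  -> (255, 255, 210)  green clipped too: nearly white
# 1000 -> (255, 255, 255)  pure white, all colour and detail gone
```

In a linear medium, doubling the intensity would double all three channels and the hue would stay put; the sudden hue jumps appear only once channels start hitting full well - exactly the banding around a digital sunrise.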
It doesn't stop there! The Bayer arrangement of the sensor results in various artefacts. Striped objects can interfere with the repeating regularity of the photosites, resulting in moiré fringing - stripy colours overlaying the image. The Bayer arrangement requires processing to compute the actual colour tone, and this can produce the 'maze' artefact in areas of high detail. It also exacerbates chromatic aberrations in lenses, because the colour edges are further separated by the different colour sensors. And then the wells of the sensor can fill up and overflow, resulting in purple fringing and, worse, blooming (now apparently a much rarer occurrence than purple fringing).
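The false-colour mechanism behind moiré can be demonstrated with a toy Python model (the sensor size, mosaic layout and tones are invented for illustration): a purely monochrome stripe pattern at the pitch of the photosites, sampled through an RGGB mosaic, reconstructs with a strong colour cast because each colour filter only ever sees some of the stripes.

```python
# Toy illustration of Bayer false colour: a monochrome stripe pattern
# at the pitch of the photosites is sampled through an RGGB mosaic.
# Because each colour filter only samples some of the stripes, a naive
# reconstruction "sees" colour in a scene that contains none.

W = H = 8  # tiny illustrative sensor with an RGGB pattern

def bayer_colour(x, y):
    """Which colour filter sits over photosite (x, y) in an RGGB mosaic."""
    if y % 2 == 0:
        return 'R' if x % 2 == 0 else 'G'
    return 'G' if x % 2 == 0 else 'B'

def scene(x, y):
    """Monochrome vertical stripes: bright even columns, dark odd ones."""
    return 200 if x % 2 == 0 else 50

# Each photosite records scene luminance through its own filter.
samples = {'R': [], 'G': [], 'B': []}
for y in range(H):
    for x in range(W):
        samples[bayer_colour(x, y)].append(scene(x, y))

# Naive reconstruction: average each channel over the patch. The red
# sites only ever land on bright stripes and the blue sites only on
# dark ones, so a grey scene comes out strongly coloured.
avg = {c: sum(v) / len(v) for c, v in samples.items()}
print(avg)  # -> {'R': 200.0, 'G': 125.0, 'B': 50.0}
```

Film's grain has no such repeating grid for the scene to beat against, which is why it escapes this whole class of artefact.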
Film, by virtue of its randomness at a microscopic level, does not suffer from any of these problems.
As a result, tonality and colour accuracy are better maintained in film.
In conclusion, I believe that film has higher image quality than digital. And I also conclude that you cannot prove me wrong!
As a disclaimer: these are my opinions as they stand now. I've formed them from personal experience, others' experience, and the results of those who have tried to quantify and compare the resolution and noise of digital and film. I don't have vested interests in film or digital, except that I believe it would be a great shame for the film industry to wind down, as that would mean losing access to the supremely excellent medium that is film.
As a request: I know I may be preaching to the converted, but for those of you only shooting digital, I recommend you have a go at shooting film sometime. With your newly acquired knowledge from digital, you should be pretty good! To see the superior results, you need to use good lenses, and the film needs to be printed and/or scanned properly - the CDs that you can get done with a roll of film are invariably atrocious. Ideally, get hold of a dedicated 35mm scanner and learn how to use it (not easy), or print at a professional photographic outfit.