Originally posted by rrstuff where people waling in front of your camera don't disappear until a very long exposure has been applied
I assume the 'k' in walking didn't wait long enough to be exposed in your message, and I certainly don't want to get in the way of artistic expression. I do, however, know a little about how our minds see images. If, for artistic reasons, you want the blurred images of moving people to remain in the final image, multiple exposures will work, but if the interval between them is too long, the result won't look anything like smooth motion. Anything you can do to blur the moving subjects in real time helps compensate for the slow effective frame rate, such as shortening your depth of field and focusing in the distance. If you take a sequence of second-long exposures, each frame already contains a blurred subject; reduce the interval between those shots to a fraction of a second and the denser portions of those blurs should join up into smooth motion. The whole subject doesn't have to be equally blurred: as long as the densest part looks smooth, our eyes equalize the fringes automatically.
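If it helps to see the stacking idea in numbers, here is a minimal sketch in Python/NumPy with a synthetic scene (the frame data is made up for illustration, not from any real camera): averaging the frames leaves static detail untouched while a subject that moves between frames is smeared into a lower-intensity trail along its path.

```python
import numpy as np

def stack_exposures(frames):
    """Average a sequence of short exposures to approximate one long
    exposure: static detail stays sharp, while anything that moves
    between frames is smeared into a semi-transparent trail."""
    frames = np.asarray(frames, dtype=np.float64)
    return frames.mean(axis=0)

# Synthetic example: a static background with one bright pixel that
# moves one step per frame (a person walking through the scene).
h, w, n = 8, 8, 8
frames = np.full((n, h, w), 0.2)   # uniform static background
for i in range(n):
    frames[i, 4, i] = 1.0          # moving subject, one position per frame

stacked = stack_exposures(frames)
# Background pixels keep their original value (0.2), while each point
# on the subject's path holds the average (7*0.2 + 1.0)/8 = 0.3 --
# the whole path glows faintly, which reads as a smooth motion trail.
```

The shorter the gap between frames relative to the exposure length, the more the individual blurs overlap and the smoother the trail looks, which is the point made above about reducing the interval to a fraction of a second.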
When it comes to blurring drops of water or clouds, there is a line where too much blur removes any transparency and makes the subject completely opaque again, just spread over a larger area of the image. Transparency is how we perceive blurring in real life: our mind is stacking multiple stop-action images. I just looked at a photograph of steaming mud pots in Yellowstone, and with a 10-second exposure it is impossible to separate the vapour from the ground (and vice versa). That is the effect the photographer was trying to obtain, but it doesn't reflect what it is like to be there in person. To perceive placid water, you need to see sharp reflections in it; otherwise it looks out of focus, or dirty.
Originally posted by v5planet improve the intensity of the color and the smoothness of the tonalities
With film I would agree with you; with flat-plane digital sensors I can't. The sensor only gives the camera a voltage and a colour-filter identity for each pixel. Post-processing simulates luminosity and dynamic range by interpolating and adjusting the raw data. Basically, as long as enough photons hit the sensor to clearly distinguish the image from the noise, and not so many that the sensor saturates, computer algorithms will do the rest.
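The photons-versus-noise point can be sketched numerically. Below is a toy Python/NumPy demonstration (the photon counts are invented Poisson samples, not measurements from any sensor) of why "enough photons above the noise" is all the algorithms need: averaging N raw frames shrinks the shot noise by roughly the square root of N, which is the headroom post-processing exploits.

```python
import numpy as np

rng = np.random.default_rng(0)

true_signal = 100.0   # assumed mean photon count per pixel (illustrative)
n_frames = 64

# Photon arrival is Poisson-distributed, so every raw frame is a
# noisy sample of the same underlying scene.
frames = rng.poisson(true_signal, size=(n_frames, 1000)).astype(float)

single = frames[0]               # one raw exposure
averaged = frames.mean(axis=0)   # 64 exposures combined

# A single frame has noise on the order of sqrt(100) = 10 counts;
# averaging 64 frames cuts that by about sqrt(64) = 8, so the same
# signal stands much further above the noise floor.
print("single frame std:", single.std())
print("averaged std:   ", averaged.std())
```

This is the flat-sensor version of the argument above: once the signal is clearly separable from the noise and the sensor isn't saturated, interpolation and tone adjustment in software recover the rest.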
20 seconds is the longest exposure I've ever worked with, trying to get sharp nighttime images rather than to be creative, and even then wind and heat drove me nuts. From a technical standpoint, the kind of creative effects you describe seem better done with digital manipulation than by using optics to cut the light reaching the sensor to the point where the sensor is no longer reliable. But I certainly wouldn't describe my photography as artistic.