----- Original Message ----- From: "Tom C"
Subject: Re: Digital profligacy



> I'm jumping into the middle of the thread, not having read all the posts, and risking ridicule.

> I personally doubt that autoexposure, autofocus, and so on have led to a decline in the quality of photographs taken. Sure, some people, a lot of people, who take photographs never learn exposure, for example. Technology makes it easier for more people not to learn it, I agree. But so what? Would they have learned otherwise? Probably not. A lot of people just take pictures, and not because they consider photography a hobby or a passionate pursuit. They just snap the shutter, never mind composition. I would guess that, exposure for exposure, there are more good, correctly exposed photographs taken now than 30 years ago, 40 years ago, ad infinitum.


Before automation, you had no choice about learning the technical end of photography. It was part of the game. You learned how to adjust the aperture and shutter speed to match a needle in the viewfinder.
This may seem like a small thing in itself, but it isn't. In the process of learning how to set the camera for correct exposure, you were also, by default, learning much about the workings of light itself, which is what photography is about.
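To make that concrete, here is a sketch of the arithmetic that exercise drilled in, using the standard exposure-value relation with illustrative settings (nothing from this thread):

    EV = \log_2(N^2 / t)

where N is the f-number and t is the shutter time in seconds. For example, f/8 at 1/125 s gives \log_2(64 \times 125) \approx 13, and f/5.6 at 1/250 s gives \log_2(31.36 \times 250) \approx 13: the same exposure value, traded between depth of field and motion blur. Matching the needle meant internalizing exactly these trades.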
You would be guessing wrong, by the way, about properly exposed photographs taken three decades ago vs. today.
Automatic exposure does not necessarily give correct exposure; it gives a best-guess exposure, and that guess comes from a rather dim-witted brain.


What I find amusing is that over the past 30 years, the skill set required of a photographer has changed from learning the rather simple operation of a manual camera with a grand total of three controls to the more complicated operation of a device with sometimes a dozen or more buttons, a few dials, a rocker switch, and several hidden and often inscrutable modes.
You need to know far more about machine operation to run a modern camera, especially a digital one, than you needed to know about photography when you set everything yourself on a Spotmatic II.



> If it is a passionate pursuit, then they will learn. OK, take away the in-camera light meter: matrix, spot, center-weighted. Is anyone seriously stating that they would get more accurate exposure by not using the meter (don't think I'm stating that one should always believe the meter)? That they would get a better exposure more often by not using the meter? I find that pretty hard to believe. I agree that one may learn how to judge exposure better, having acquired a sixth sense after viewing many, many *poor* exposures.

People no longer use light meters. For the most part, they trust their camera to make the right judgement call, with little or no input. They walk around an inanimate object taking a dozen or more exposures in the hope that one may be the right one, and then, when they get lucky, they put up a PESO and say, look how wonderful I am. And it only took 16 tries to get it.


They may not be wasting film, but they are wasting something far more precious.
They are wasting their very lives.



> It's all pretty much a moot point, right? If the printer has the ability to compensate for exposure variations that fall anywhere in the range from acceptable to bang-on, and it has that ability *by design*, then that's just the other side of the coin, so to speak: exposure can be controlled in camera first and out of camera second (don't anyone think I'm saying exposure in camera doesn't count; I shoot transparencies almost exclusively when using film). From what I've read on this list, there's a huge shortage of printers/processors willing to spend the time and effort to produce properly exposed prints.

To be fair, there aren't enough people, in sufficient concentrations, willing to pay enough to make producing top-quality prints worthwhile.
Modern minilabs are not capable of producing 100% accurate colour anyway. You really have to be willing to settle for, at best, something closer to 94% accurate, and I doubt many labs even hit that mark.


William Robb



