Responding to my workflow video from last week, some great questions from Tom Leininger. He got me thinking and I came up with this:
Thanks for the e-mail with your thoughts and questions about ethics in regard to Adobe’s Lens Correction Profiles and the recent NPPA discussion about the Fill slider. You got me thinking so much about all this that I came up with a full post in response. This post is not about the ethics of toning, saturation, Hipstamatic, etc. It is about software and technology that makes your photographs more real, not less. Here goes…
Our goal in photojournalism is reality. The foundation of ethics in photojournalism is that our photographs of any situation should look the way our eyes saw it. Let’s use the human eye as our benchmark standard of reality. How the eye sees is our goal, and thus our reality. We forget that the human eye is not film or glass.
Since the beginning of photojournalism we have been recording reality with flawed physical materials. Glass lenses introduce distortion and vignetting of light, and film cannot record the full range of latitude that our eyes are capable of seeing. These limitations were accepted, so much so that we even exploited them for artistic purposes.
Take the famed Kodachrome slide film for example, which we loved for its unforgiving latitude that produced deep, dark shadows when we exposed for the highlights. Anyone researching our planet from Kodachrome slides alone would believe that the shade is as black as night even when the sun is shining. To the dismay of thieves everywhere, this is not the case. Despite the scenes our favorite photographers sometimes show us, our eyes can see detail in both the sunlight and the shadows simultaneously. There is shadow detail on Earth, even in Haiti. Kodachrome, with its narrow latitude, was highly inaccurate in contrasty situations. Romantically so.
To counter such defects and limitations in our film and lenses, photojournalists have used countless contraptions. Over the decades these have included such things as fill flash, tungsten film, color balancing gels on strobes, split neutral density filters, developers and processes that altered contrast, and on and on. These were accepted practices, and photographs using these techniques have won awards for decades. And what was the goal of these tricks? To overcome the physical limitations of the photographic process in the pursuit of photographs that were closer to the way we actually see.
The major difference between these time-honored practices and modern technologies like Adobe’s Lens Profile Correction and the Fill slider (note that it’s not called “Fill Flash”) is that today’s solutions are applied in post-processing after the shoot. That’s the sticking point for a lot of old-school photojournalists. Decisions in the film days were made at the moment of exposure and were mostly permanent. The exposed slide was considered a finished product and was meant to remain that way. You couldn’t make changes after exposure without altering a physical artifact.
Things have changed. With its advances in RAW conversion software (built into both Photoshop and Lightroom), Adobe has given photojournalists new ways to get closer to our goal—reality. We are less constrained by the limitations of film and glass.
This introduces a new dynamic to photojournalism. Decisions can now be made AFTER EXPOSURE that bring our images CLOSER TO REALITY. That is the situation, and most ethicists have not realized or accepted it.
When you shoot a photograph in RAW format you retain an amazing amount of control over that image. White balance, exposure, and (soon) focus are all things that you can decide on later while sitting in front of the computer. It’s like having a time machine that takes you back to your camera at the exact moment of exposure, letting you change any of your settings for each shot. This changes everything. Would it be unethical to use a time machine to go back and get a better photograph of a historic moment? Of course not, provided the photograph was true (and you didn’t step on any butterflies).
This time traveling ability in post-processing is an incredible power. Imagine being able to take your old film negatives and re-develop them in new, improved chemistry that eliminated the grain and improved the color. That is happening with digital right now. RAW conversion software improves every year, and your photographs improve with it. Your digital photographs from ten years ago can be reprocessed with today’s software for improvements in color, noise, etc.
This is the new reality: When you click the shutter today, you’re not baking your image permanently into Kodachrome emulsion. You’re bookmarking a moment for later attention. You’ll make decisions about exposure, white balance, and whether the photograph will be in color or black and white at the computer, not the camera. It may be different and new, and it may be scary, but it isn’t necessarily wrong.
Because even with these new tools, our goal is still reality.
The recent “Fill Flash” discussion that appeared online seemed to involve photographers who didn’t quite understand the tool. They even got the name wrong (it’s called Fill and it’s no longer a part of Camera Raw). Like you said, the tool simply opened up the midtones of an image. That’s nothing new to digital image processing (midtone controls were part of the original LeafDesk software circa 1990). Besides, no one in the film era would have questioned the ethics of using a low-contrast developer and some minor dodging and burning to achieve a similar result.
What is the difference, ethically, in the two following scenarios?
1. Using a battery-operated flash unit on your camera as fill flash to lighten up a backlit subject.
2. Using a slider in software that triggers a mathematical algorithm to lighten up a backlit subject, using the original capture data from the sensor.
There is no difference ethically. Both should be accepted practices. But if you had to pick one that was less natural, surely the strobed fill flash would be it, the way it artificially changes reality by casting unnatural (though pleasing) light and adding shadows. Using a slider to adjust contrast and give your image more latitude (similar to that seen by the human eye) preserves the natural look of the scene and doesn’t intrude on the moment with a bright flash of artificial light.
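To make scenario 2 concrete, here is a minimal sketch of what a “midtone-opening” adjustment can look like under the hood. This is an illustrative gamma curve, not Adobe’s actual algorithm; the function name and the `amount` parameter are hypothetical.

```python
# Hypothetical sketch of a "Fill"-style midtone lift: a gamma curve
# raises midtones while pinning pure black and pure white in place.
# This is NOT Adobe's algorithm -- just one common way to open midtones.

def fill_midtones(value, amount=0.5):
    """Lift a normalized pixel value (0.0-1.0) with a gamma curve.

    amount=0 leaves the tone curve untouched; larger values open
    the midtones more, as the Fill slider is described as doing.
    """
    gamma = 1.0 / (1.0 + amount)   # amount > 0 -> gamma < 1 -> brighter mids
    return value ** gamma

# Black and white points are unchanged; only the midtones brighten.
print(fill_midtones(0.0))              # shadows stay black
print(fill_midtones(1.0))              # highlights stay white
print(fill_midtones(0.25) > 0.25)      # a backlit subject's midtone is lifted
```

Note what the math does: nothing is painted in and nothing is cloned out. The same captured sensor values are remapped along a curve, which is exactly the digital analogue of a low-contrast developer.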
What we are seeing in photojournalism ethics today is a lack of understanding when it comes to technology. There are definitely things to be concerned with, notably those techniques that move our photos toward unreality, such as cloning, heavy-handed toning, and oversaturation and desaturation.
However, technology is providing us with great tools—new algorithms that bring our photographs closer to reality and overcome the physical limitations of lens design.
Adobe’s Lens Profile Correction is the perfect example. Here’s how it works:
Modern lenses are riddled with flaws like pincushion and barrel distortion, color fringing, and vignetting at wide apertures. Even the pricey zoom lenses that most photojournalists carry today have these flaws to some degree. With Lens Profile Correction, the flaws in your glass are corrected. Adobe has tested most of the lenses available today, from the Nikon 14-24 to the Canon 600, and built profiles that correct the distortion and vignetting in each lens design. In other words, Adobe’s Lens Profile Correction automatically corrects the UNREALITY that your lens is putting into your photographs. The scientists and mathematicians behind Lens Profile are bringing your photograph closer to what was seen with the human eye and away from the world as seen through glass elements.
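For the curious, the math behind this kind of correction is well understood. Below is a minimal sketch assuming the common Brown–Conrady radial distortion model and a cos⁴ vignetting falloff; the coefficients and function names are made up for illustration and are not taken from any real Adobe profile.

```python
import math

# Sketch of the two corrections a lens profile encodes, assuming
# standard optics models. Coefficients here are illustrative only.

def undistort_point(x, y, k1, k2):
    """Map a distorted image coordinate back toward the ideal one.

    (x, y) are normalized coordinates with the optical axis at (0, 0).
    The radial polynomial (Brown-Conrady model) counteracts barrel or
    pincushion distortion depending on the signs of k1 and k2.
    """
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def devignette_gain(x, y, alpha):
    """Brightness gain that undoes light falloff toward the corners.

    Uses the classic cos^4 falloff model: gain is 1.0 on the optical
    axis and grows with distance from the center of the frame.
    """
    r = math.hypot(x, y)
    theta = math.atan(r * alpha)   # alpha relates radius to field angle
    return 1.0 / math.cos(theta) ** 4

# The center of the frame is untouched; a corner point is pushed
# outward (undoing barrel squeeze) and brightened (undoing falloff).
print(undistort_point(0.0, 0.0, 0.05, 0.01))
print(undistort_point(0.5, 0.5, 0.05, 0.01))
print(devignette_gain(0.7, 0.7, 1.0) > 1.0)
```

The point of the sketch: a profile is just a set of measured coefficients per lens design. Apply the inverse of the measured flaw, and the glass’s fingerprint disappears from the photograph.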
It makes you wonder, should we keep our professional standards and ethics modeled on the physics of light passing through glass and onto film? Or should we base them on the superior quality and light gathering ability of the human eye?
If you could go back in time to the moment you took a favorite photograph, would you rather shoot it with a $400 Nikon/Canon 24mm lens, or a $6,500 Leica 24mm lens with its exceptional sharpness, contrast and lack of distortion? Would you have any ethical concerns with making that choice before you shot the photograph?
Those were easy questions. What about this: Would you have ethical concerns with using a mathematical algorithm designed by scientists to bring your photograph closer to reality by correcting the flaws inherent in your lens, with the understanding that the algorithm will not change the content of your photograph, only correct the distortion, vignetting and other unnatural flaws of your equipment?
These are the kinds of questions we face. We are at the moment where post-processing is as much a part of any photojournalist’s kit as their cameras and lenses. The physical limitations of the past are vanishing. We are able to make ethical decisions about technique retroactively. Don’t be afraid of advanced techniques that lead to reality, even as they astonish you.
Remember, reality is the goal.