Taking the plunge on multi-exposure software - HDR.

Digital image capture has opened up opportunities for photographers to capitalize on the tremendous advances the computer has brought to the world. The main things we benefit from are the rapid advances in technology and the software tools that follow. In the old film days, our ability to capture and present images was really under the control of the engineers at the camera companies and the film producers; very few individuals could develop and market anything other than gadgets. The world of "high tech" has known for many years that with a computer and a good knowledge of software engineering, just about anyone with a good idea can successfully develop and market great tools.

Today, with the extensive use of image processing software, relatively open operating system architectures (even Apple™ has opened theirs somewhat!) and ever more powerful computers, many very talented software entrepreneurs have flooded the market with really cool image processing capabilities. This month I wanted to give a brief overview of some software that allows photographers to go way beyond what was "doable" with film.

To record high contrast scenes you can turn to one of several HDR software applications. HDR, or high dynamic range, is a technique that in the "old days" we could only approximate with graduated ND filters or with sandwiched slides in the darkroom or projector. Even then the outcome was not all that great, or was limited to exposing for the foreground while trying to keep the sky above from burning out. Today there are several very nice programs that allow the photographer to take a series of images of the same scene at different exposures. At one end of the series, the exposure is optimized for the shadows, letting the highlights burn out. At the other end, the exposure is optimized for the highlights, letting the shadows block up. Add a mid-tone exposure and one each slightly overexposed and underexposed, and we have a sequence of five (for example) exposures of the same scene. Feed these to an HDR program and the result is a pretty decent image with properly exposed shadows, midtones and highlights.

There are a couple of things that will make this process much more manageable. First, lock your camera to a tripod and then compose your image. Although the software can adapt somewhat to slight variations between frames, that takes a lot of computer time and may not always give optimum results. Start with a proper exposure for the midtones and then move up and down in 1 stop increments. Shoot in RAW and check the histogram for each image to make sure you are capturing the entire range. If your camera has an auto bracket feature, now is the time to use it! You will need to experiment a bit to get a feel for how your software handles HDR images. Experience will help you decide how many exposures to make on either side of the midtone exposure and how many stops to put between each image.

You may now ask which HDR software package is best. Photoshop™ can be made to do HDR with many of the normal image tools, but there are products designed just for HDR work. The most widely used is Photomatix (see http://www.hdrsoft.com), which now offers a "lite" version for under $40.00 as a download. Most HDR programs include tone mapping and detail enhancement features. Like most features, they have good points and challenges. Be careful: taken to the extreme they can give the image a "comic book" appearance. Additionally, the final output after tone mapping can be a little flat, so plan on some time in Photoshop™ or another post-processing program to spark it up. As with any other powerful application, time and experience will make it one of your "go to" tools.
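
To give a rough feel for what these packages do with the bracketed frames, here is a toy sketch of a simplified "exposure fusion" in Python. This is not Photomatix's actual algorithm (real HDR tools do far more, including alignment and tone mapping); it simply favors, at each point in the image, whichever frame rendered that area closest to mid-tone.

    # A minimal exposure-fusion sketch (illustrative only, not a real HDR engine).
    # Assumes the frames are aligned, the same size, and scaled 0..1.
    import numpy as np

    def fuse_bracket(frames, sigma=0.2):
        """frames: list of HxWx3 arrays, e.g. -2, -1, 0, +1, +2 stop exposures."""
        weights = []
        for img in frames:
            gray = img.mean(axis=2)                              # rough luminance
            w = np.exp(-((gray - 0.5) ** 2) / (2 * sigma ** 2))  # favor mid-tones
            weights.append(w + 1e-6)                             # avoid divide-by-zero
        total = np.sum(weights, axis=0)
        fused = sum(w[..., None] * img for w, img in zip(weights, frames))
        return np.clip(fused / total[..., None], 0.0, 1.0)

    # Five synthetic exposures of the same scene, one stop apart:
    scene = np.random.rand(4, 6, 3) * 4.0                  # linear scene brightness
    brackets = [np.clip(scene * 2.0 ** ev, 0.0, 1.0) for ev in (-2, -1, 0, 1, 2)]
    print(fuse_bracket(brackets).shape)                    # (4, 6, 3)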



The changing face of ISO!

The newest digital cameras are touting ISO numbers that stretch the imagination. Back in the film days, the very idea of shooting a film with an ISO (or ASA for the seasoned photographers in the club) exceeding 10,000 was unheard of. Now a good many digital SLRs exceed that by staggering amounts. How can we properly utilize this new tool? Recent magazine articles have addressed the use of ISO as a creative tool in ways we've not seen before. The ability to get decent saturation in very low light is the first thing that comes to mind; second, we can crank up the shutter speed and freeze action with reasonable depth of field in ways we only dreamed of in years past. OK, what are the drawbacks to this great tool? As with everything in this field, there are tradeoffs. As I have mentioned before in articles and in the classes I teach, ISO does not change the sensitivity of the sensor; it changes the overall "system gain".

Let's get into the details! The digital sensor, by its very design, has an inherent gain which, simply put, defines the amount of electrical current generated by a given amount of light striking each pixel (photons to electrons). The engineers designing these sensors call this parameter "native sensitivity". The electrical circuits that take these extremely tiny signals and boost them to a usable level are called amplifiers. The signal coming from the amplifier (at this point it's an analog signal, related more or less linearly to the amount of light striking the sensor) is routed to another circuit called an analog to digital converter, which converts the signal into digital form. The number of bits you see referenced in a lot of literature comes from the design of this circuit: the more bits, the more information (light levels) conveyed to the processor. The ISO setting on a DSLR controls the gain of the amplifier. If the ISO is set to the "native sensitivity", the camera is pretty close in sensitivity to the equivalent ISO of film, and this is essentially the optimum gain setting. As ISO is increased, the gain goes up, and along with it any noise in the input signal as well as noise generated by the amplifier itself. If the ISO is reduced below the native sensitivity, the signal gets attenuated a bit and some information is lost, impacting the saturation of the image. The software in the image processor can do quite a bit to alleviate the negative impacts of noise and, to some degree, the reduced information, so deviating from native sensitivity is manageable.
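
As a rough illustration of that chain, here is a toy sketch in Python; the numbers are assumptions for illustration, not values from any real camera. Electrons from the sensor are multiplied by an analog gain set by the ISO dial, then quantized by the analog to digital converter. Notice how ISO above native pushes the same exposure toward clipping, while ISO below native leaves part of the converter's range unused.

    # A toy model of the signal chain described above (not any camera's pipeline).
    BITS = 14                          # ADC bit depth; more bits = finer light levels
    FULL_SCALE = 30000.0               # electrons that just reach ADC full scale at native ISO (assumed)

    def digital_number(electrons, iso, native_iso=100):
        gain = iso / native_iso                         # ISO above native = more amplification
        amplified = min(electrons * gain, FULL_SCALE)   # highlights clip sooner at high ISO
        return round(amplified / FULL_SCALE * (2 ** BITS - 1))

    # The same exposure (5,000 electrons in a pixel) at several ISO settings:
    for iso in (50, 100, 400, 1600):
        print(iso, digital_number(5000, iso))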

So how can we translate this into something usable? Well, first you need to find the ISO for your camera that corresponds to the native sensitivity. You can use that as a starting point for your shooting and increase or decrease as the situation dictates, but at least you'll know that starting point. How to find out? Good question. It took a while, but the conventional wisdom is that Canon's native sensitivity is at ISO 100, while Nikon's tends to be 200. I'll bet some internet searching can produce better as well as conflicting numbers, but such is life in the digital age. I searched on "native sensitivity + Canon + 7D" and got 342 hits on Google. Many had additional links, and a good many had wrong information!

Set your camera to its native value, move the ISO up and down, and see for yourself the variability in images.  Hint- You probably won't see a lot until you get to the extremes.  Once you know, you can then use ISO as a creative control with more understanding.


What’s all the noise about?

Back in the days of film photography, we worried about grain in our images. It was the wisdom of the time that the faster the film, the more pronounced the grain and the less sharp the image. Exposure times, film processing parameters and such had some impact but that was really a second order effect. The guys in the green and in the yellow boxes were constantly working to improve grain structures to reduce the impact of faster speeds on image quality. That was then, this is now.

The digital revolution has introduced the photographer to the wondrous technologies of electronics and digital signal processing. But one of the things we now must deal with in our photography is noise, or as some authors call it, digital grain. Let's discuss its sources and how to deal with it.

What is noise? It's all of the additional extraneous signals that come from the sensor, the analog electronics and the digital circuits in the camera and show up in our images (usually with a negative impact). Noise is particularly visible in large areas of a single color, such as the sky.

Noise has a number of sources. These range from dust on the sensor to the intrinsic errors in converting an analog signal to a digital data stream.

Dust on the sensor is the easiest to understand but can be quite difficult to control. Point and shoot cameras have a pretty well sealed environment around the sensor, so dust is normally not a problem, but if you do get dust, correcting the root cause can be tough. Digital SLRs are much more prone to sensor dust, but most have a "clean" setting that allows the photographer to (carefully!) blow away dust.

The other sources of noise are a bit more technical, but let's start with light striking the sensor. Light comes in little packets called photons. When these little guys strike the sensor they are converted to electrons, which are then used to generate the electrical signals containing image data. Since the sensor is warm (that is, at a temperature above absolute zero), other "thermal" electrons are generated just by the heat. These electrons create a baseline signal (noise) that contaminates the image information.

The signal coming off the sensor is very small and cannot be used effectively by the circuits that convert the analog data to digital information. To remedy this, analog amplifiers are used to boost the signal. The background noise is amplified along with the image data, and the amplifier adds its own noise contribution. The amount of amplification is controlled to a large degree by the ISO setting on the camera: the higher the ISO, the greater the noise contribution. Higher ISO means the camera will use less light to generate the image. Less light means less signal from the sensor (fewer photons converted to electrons), more amplification is needed, and the ratio of signal (image data) to noise drops. This shows up as more noise, or grain, in the image.
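
A little arithmetic shows why dimmer exposures look grainier. The sketch below uses assumed, illustrative numbers (a 10-electron noise floor standing in for thermal and amplifier noise) plus the fact that photon arrival is statistical, so its own noise grows as the square root of the signal.

    # Rough signal-to-noise illustration for the paragraph above (toy numbers).
    import math

    def snr(photons, noise_floor_e=10.0):
        shot_noise = math.sqrt(photons)          # photon arrival is statistical
        total_noise = math.sqrt(shot_noise ** 2 + noise_floor_e ** 2)
        return photons / total_noise

    # Each halving of the light is one stop less exposure (one stop more ISO
    # needed to reach the same final brightness):
    for photons in (16000, 8000, 4000, 2000, 1000):
        print(f"{photons:6d} photons -> signal-to-noise {snr(photons):5.1f}")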

Another noise source, one that's easier to relate to film grain, is inherent in the nature of digital imaging. Producing a digital image means taking an essentially continuous range of light levels and converting it to discrete pieces of information. The first place this happens is the sensor itself, made up of a fixed number of photosites, not unlike the grains in film. The analog data generated has, as we've seen, a noise component. This signal is then digitized into discrete pieces of data for processing. Each time this analog to digital conversion takes place, information is lost and edge detail is compromised. When the image is then compressed into a JPEG file, more information is lost.
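
Here is a tiny sketch of that quantization step (illustrative only): a smoothly varying light level is forced onto a fixed grid of digital values, and the coarser the grid (the fewer the bits), the more of the original variation is rounded away.

    # Quantizing a smooth gradient at different bit depths (illustration only).
    import numpy as np

    ramp = np.linspace(0.0, 1.0, 1001)           # a smoothly varying light level
    for bits in (14, 8, 4):
        levels = 2 ** bits - 1
        quantized = np.round(ramp * levels) / levels
        worst_error = np.abs(quantized - ramp).max()
        print(f"{bits:2d}-bit: worst rounding error = {worst_error:.6f}")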

So, we have a lot of noise sources. How can we deal with them? The component and camera manufacturers have done very good things to reduce the impact of noise on our images. The first is the development of larger and larger sensors. Larger sensors gather more light for roughly the same noise level, so a 6 megapixel APS-size sensor generates more effective noise than a larger full frame sensor of the same resolution. The second, and probably most important, improvement is the noise reduction software in the camera itself. This "post processing" activity in the camera (working on the RAW data) actually removes a lot of the noise from the signal, producing a cleaner, more grain-free image.

What can the photographer do to reduce noise? Well, first use a camera with the biggest sensor you can afford. Unfortunately for the wallet, this usually means a digital SLR. If that’s not in the budget, then shop for the camera with the biggest sensor you can get and do research on the quality of the camera manufacturer’s noise reduction software.

With any camera, the following tips will also help.

1) Shoot at the lowest ISO setting you can for the conditions. Remember, this means less amplification and more light converted to image data.

2) Try to avoid very long exposures. The longer the shutter is open, the more total sensor noise is generated.

3) Use the lowest compression setting you can (a high quality JPEG shows less compression noise than a heavily compressed one, but it takes more storage space).

When you do get your images on to the computer, there are several really good noise reduction software packages available. Do some research and see what you can find. More are coming out every day.

Dealing with digital grain is really not that much different from film grain. Learn the causes and the remedies and keep on shooting.


Color!

While my wife and I were coloring with our granddaughter the other day, it struck me that the range of colors available for children (of all ages) to create pictures with is incredible. Just looking at her collection of crayons was quite a revelation. Then when I considered the fact that the computer can manage over 16 million colors, my interest was really piqued.

When we are making photographic images, how can we use color to make them better, more dramatic, more moody, etc.? I started doing some research and came across a few tidbits that I wanted to share. First I looked at which colors are complementary and how that could be used. Most of us who have worked in a color darkroom remember the old color wheel.

[Diagram: the color wheel, with arrows linking complementary colors]

It turns out that artists have long known that complementary colors, those at opposite ends of the arrows on the wheel, look very good together. That doesn't mean green and yellow don't look good next to each other, but stronger images result from combining complementary colors. Take for instance a flower image where the blossom is a shade of magenta and the stem and leaves are green. That can make a strong image.

Another possibility, if you have an image with vibrant colors, is to go ahead and combine non-complementary colors. If an image has brilliant red and strong magenta tones, the "wow" factor can be quite striking. In that case it is important to have strong light producing saturated colors so the image doesn't look subdued.

On the other end of the spectrum (pun intended) is the technique of getting moody images by using subdued colors. Here the use of complementary colors is not really a concern. The idea with a subdued or moody image is to move away from saturated colors toward a more uniform, flatter light. Using fog or overcast skies to generate soft color is one way to get these moody images.

Some other color techniques you may want to try include making the main subject a bright color and keeping the rest of the image more subdued. This works well if your main subject is small relative to the size of the image.

Evening or nighttime photography offers a lot of color variations.  Artificial lights have varied and interesting colors that can be used to accentuate a subject in a new way.

Then of course, there is monochrome!  This does not have to be black and white.  Monochrome means one color!  This is something that the computer allows you to do in almost unlimited ways.

Next time you are out shooting, go beyond the composition and look at ways to use color to make your images stronger.


White Balance and Color Temperature

This month, as promised, we’ll look at white balance and color temperature. Most high end point and shoot cameras and virtually all digital SLRs have provisions for selecting white balance and color temperature. What this means is pretty simple: the photographer can select how the camera represents color in the recorded image.

White balance is simply the camera’s way of looking at all colors of the spectrum and generating true white. The basic setting is AWB or auto white balance. In this mode the camera looks at the entire scene, assumes the light is about midday and comes to a conclusion as to what white should look like. The most creative white balance control is the custom one. Depending on your camera (check the manual) you can find a white object and use it to set the white balance so the camera can accurately record white in the light that’s available. You can also use other neutral colors to cause a shift in white balance but be careful; you can get some “interesting” and perhaps not too pleasant results.

If you really want to be able to alter a scene however, try the color temperature adjustment on your camera. Light is measured in “color temperature” and there is a relation between this value and warmth or coolness of the light. It’s a little confusing but the lower the color temperature, the warmer the light. This is a collision between science and art. Some examples are:

Candle light is about 1,500 K, incandescent light bulbs about 3,000 K, morning and evening light between 3,400 and 4,000 K, and midday light 5,000 to 5,500 K. Flash is typically between 6,000 and 6,800 K, while a heavy overcast generates 6,000 to 7,500 K. Check out the final paragraph of this article for more details.

There are two ways to compensate for color temperature. One is in the post processing phase (on the computer with an image processing software tool) and the other is in the camera. Anything you can do in the camera is inherently better because you are not disturbing the digital image through recompression and manipulation. It does take some experimenting to use color temperature corrections, but consider these tips:

To warm up a scene, adjust the color temperature to a high (cooler) value. This is not as confusing as it first appears. The color temperature adjustment on the camera tells the software what color the light is on the subject. If you tell the camera that the light is very cool (high color temperature) the processor will “warm it up” to come to the midday neutral color. If you’re shooting at midday, try setting the color temp to 6500 if your camera uses degrees or to the “overcast” setting if it uses verbal settings. In both cases the camera will warm up the scene.
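
Here is a small sketch of that logic (the preset values are rough assumptions, not any manufacturer's actual numbers): the camera warms or cools the rendering based on how the white balance you told it compares with the light actually falling on the scene.

    # Illustrative only: how the chosen WB setting shifts the rendering.
    PRESET_KELVIN = {            # approximate color temperatures for common presets
        "tungsten": 3200, "fluorescent": 4000, "daylight": 5500,
        "flash": 6000, "cloudy": 6500, "shade": 7500,
    }

    def rendering(actual_light_k, wb_setting_k):
        if wb_setting_k > actual_light_k:
            return "warmer than neutral"   # camera assumes cooler light, so it adds warmth
        if wb_setting_k < actual_light_k:
            return "cooler than neutral"
        return "neutral"

    # Midday light (about 5500 K) shot with WB set to "cloudy" (about 6500 K):
    print(rendering(5500, PRESET_KELVIN["cloudy"]))    # -> warmer than neutral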

Don’t rely on the LCD screen on the camera to give you instant feedback on color. These little screens are great for checking to see if you recorded the image, if the exposure was relatively OK and if your composition was close to what you wanted. They are not good for checking color, focus or details of the image. You don’t see those until you view the image on a calibrated monitor or properly produced print.

Using color temperature creatively requires the old fashioned concept of taking notes and then reviewing the final images. From this you can start to learn how to most effectively use this creative control.

By the way, for those of you interested in how color temperature measurements are determined, here is a brief physics lesson. If you were to take an object physicists call a black body and heat it until it started to glow, the color of the object would be related to how hot it is. Lower temperatures generate a warm orange glow that becomes whiter, and then bluer, as the temperature increases. The temperature is measured on a scale known as the Kelvin scale. This is similar to the Celsius scale but is shifted so that 0 degrees Kelvin is the theoretical absolute zero.

White Balance (cont'd)

As we all look to migrating to the world of digital photography, we need to be prepared to learn new terms and new ways of applying the tools of this medium to image capture. Digital has brought with it a whole new set of terms that can be daunting. The other day I was in a discussion on one of these, white balance, and it became clear that this is a scary topic for a lot of people. I wanted to take this opportunity to address white balance for those of you who have questions about what it is and to address how to use it creatively.

To understand white balance, we need to look at an old friend and some basic human vision processes. First let's consider color temperature. This concept has always been part of film photography as we have used special films, flash units etc. Color temperature refers to the color of light. The concept is one we have borrowed from the world of physics and of course we have applied our own photo-related spin to it. Color temperature is the value assigned to the color of light that radiates from a theoretical object physicists call a black body. As the temperature is raised, this body starts to glow, first a dull red, then brighter red and finally a blue-white. Since it's a theoretical object, melting is not a concern. Color temperature is measured in degrees Kelvin. This is a temperature scale that is the same as the Celsius (or Centigrade) scale except it is offset so that 0 degrees Kelvin is the same as about -273 degrees C. (This value is referred to as absolute zero, the coldest temperature that can exist in nature.) In nature we have low color temperature (red) moving up to high color temperature (blue). As photographers, we look at this in the opposite way, red being warm and blue being cool but so what!

Now let's look at human vision and how a digital sensor tries to emulate it. The human eye has two types of light sensitive organs in the retina. One type is called a rod. The rods are special organs used to provide vision in very low light and do not differentiate color. The other organ is called a cone. The eye has three types of cones, which detect red, blue and green. The optic nerve conveys information from all of these sensors to the brain, where the information is processed to give us vision. The designers of the sensors in digital cameras (whether CCD or CMOS) have taken the structure of the human eye and emulated it in the camera. The sensor elements, or pixels, are arranged so that they detect light and produce a signal that represents the amount of light energy striking the surface. In order to produce color, an array of red, blue and green filters is placed in front of the sensor. This very closely emulates the concept in the human eye. The information is passed to the camera's "brain", where a software program assembles it into an image. This is where the analogy starts to break down. The human brain uses experience to recalibrate the image we see. For example, when we read characters on a sheet of paper, the paper is white regardless of the color temperature of the light illuminating it. Try reading a book under fluorescent lights, outside in the evening, or under incandescent lights; in each case the page appears white.

The digital camera tries to emulate the brain by setting a white reference, or balance. To do this, it must be calibrated to what is truly white. Most cameras offer three kinds of white balance setting. With auto white balance, the camera assumes that the scene has significant white content or, more correctly, a "normal" ratio of blue to red. The second kind is the preset white balances, where the camera allows the photographer to choose a white balance based on evaluating the scene. This is similar to choosing a special film or a compensating filter to adjust for the color temperature of the light source (tungsten film, fluorescent light filters, etc.). The third option is manual white balance, and it is the best creative tool. Manual white balance allows the photographer to set the white balance for the actual situation, by using a white card to let the camera establish the correct baseline for the conditions. (Hint: a white card can easily be overexposed, so draw a black line on the card with a felt pen, take a shot, and verify on the LCD screen that the image is properly exposed and shows the line accurately.)
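
For the curious, here is a minimal sketch of what that calibration amounts to (an illustration, not any camera's firmware): measure the average red, green and blue response off the white card, then derive per-channel gains that make the card come out neutral.

    # Manual white balance, in miniature (illustrative only).
    import numpy as np

    def white_balance_gains(white_card_pixels):
        """white_card_pixels: Nx3 array of linear RGB samples from the white card."""
        means = white_card_pixels.mean(axis=0)    # average R, G, B of the card
        return means[1] / means                   # scale R and B to match green

    def apply_gains(image, gains):
        return np.clip(image * gains, 0.0, 1.0)

    # A white card photographed under warm (tungsten-like) light reads reddish:
    rng = np.random.default_rng(1)
    card = np.array([[0.80, 0.60, 0.40]] * 50) + rng.normal(0, 0.01, (50, 3))
    gains = white_balance_gains(card)
    print(gains)                                  # roughly [0.75, 1.0, 1.5]
    print(apply_gains(card, gains).mean(axis=0))  # card now reads neutral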

The manual white balance sets the camera for "normal" conditions but it can be used creatively as well. By using a light blue card instead of a white one, you can achieve a warm tone to the image, similar to adding an 81 series filter to the camera. If you wish to cool the image, calibrate the white balance with a light pink card. There are several companies that make cards for this use. They have been around for quite a while supporting the television and film industries. One is www.warmcards.com. Also, since you're not burning film, you can experiment!

There is a tendency to set white balance to auto with the idea that you can compensate on the computer later. By all means fight this impulse! White balance is a very non-linear process, treating red, blue and green differently. When you correct on the computer, the adjustment shifts the effective color temperature uniformly across the image and will not give you the same color rendition as properly calibrating the white balance in the camera.

 
Winter Photography

I know that I said we’d look at additional exposure topics this month but since we are deep into winter, I thought that it might be timelier to look at some of the aspects of winter shooting.  There are two areas that come to mind when one thinks of winter photography.  The first is equipment and the second is selection of subjects.

With regard to equipment a whole series of things jump out.  These things include care of your equipment in harsher than normal conditions, and care of the primary piece of equipment, the photographer.  The subjects available in winter are more limited than other seasons but we’ll look into that as well. The use of the creative controls to achieve great images is also an important part of the process.

Let’s look at care of equipment. The two things that impact equipment in cold weather are moisture and, of course, temperature. Moisture comes mostly in the form of snow and condensation from your breath. The condensation can freeze, and it is a bit more insidious than snow because it builds up gradually. Keep a soft cloth available to remove condensation from your camera (especially the LCD screen and the viewfinder). Snow can become a problem if you shoot during a snowstorm, so the use of a commercial lens and camera body cover, or just a plastic bag, can be very helpful. Also consider that carrying your camera around, whether on or off a tripod, exposes it to a sudden “dunking” in a snow bank if you trip and fall. If you’ll be in areas with potential tripping hazards, consider putting your camera and lens in your pack, or at least putting a plastic bag around them, while moving from one location to another.

Batteries are very much “allergic” to cold weather. The typical digital (and most film) cameras use a rechargeable lithium-ion battery, which provides power for everything in the camera. You’ll find, if you are out shooting in weather significantly below about 20 degrees, that the battery will appear to lose charge after a while. Keep a spare battery or two in a warm place, like a pocket inside your cold weather jacket. You can take the battery that has lost charge and warm it up to regain usable charge. If you are on a multi-day shoot, be sure to bring adequate chargers so you can recharge your batteries each evening. Unlike the older NiCad batteries, newer lithium ones don’t exhibit the partial-charge “memory” effect, so charging a partially discharged battery is fine.

There is an ongoing debate about winterizing cameras by changing lubricant on the shutter mechanism.  I doubt that many of us will ever be shooting in such extreme temperatures where this will be an issue.  I suggest that if you do plan to go to the deep Arctic to shoot, that you contact the camera manufacturer for guidance.

Now that we’ve protected our equipment, what about us?  Modern fabrics and materials have given us a myriad of boots, gloves, coats, pants and other garments so that comfort is no longer a pipe dream when shooting in winter.  That being said, we still need to make the best usage of these items.  I could write volumes on the different clothing items available but suffice to say that dressing in layers plus wearing boots with insulation and moisture resistance are two basic requirements.  Your layered clothing should be appropriate for the conditions (moisture resistance, insulation value, etc.).  Gloves are another key piece of equipment for the photographer.  I like to wear a pretty heavy pair when I’m out wandering and then have a pair of glove liners to wear while shooting to allow the dexterity I need.  I’ve tried the “fingerless” gloves with liners but they are a bit awkward for me.  This is a personal choice; just make sure to have adequate hand protection.

Let’s now look at subjects and creative controls.  As I said before, subject matter is somewhat more limited in winter.  That doesn’t mean a lack of subjects.  Landscapes (both large and small) abound.  Look around and see if you can find “snowscapes”, shadows, ice formations and other winter images.  Make use of the shadows, shapes and patterns to get really interesting images.  When shooting snow or ice, remember that your camera meter is designed to expose for midtones and snow in bright sun will become a grey/blue color.  Use added exposure, your color temperature or white balance controls to get a proper color.  That saves time in front of the computer correcting the balance.

Wildlife in winter is a very valid topic, but more than in any other season we need to be careful not to stress these creatures. Getting through winter is tough enough without people adding extra stress. Watch for the classic signs of stress in an animal and back off if even an inkling of stress is present. Winter wildlife shots can be very good if you are careful; stressed animals don’t make great shots.

Take advantage of the season and get out shooting!

Winter (additional thoughts)

For this month I thought I'd do a short set of reminders on winter photography and how to make the most of your time outdoors during the cold weather.  First, some words of caution for you and your equipment. 

Cold weather is very hard on batteries.  Keep spares with you at all times and keep them warm; an inside jacket pocket is a good place.  When batteries appear to drain quickly because of cold, warming them up will typically restore them.  Keep rotating between the camera and the warm pocket!  When shooting in the snow, do your best to keep your camera and lenses out of prolonged contact with snow and ice.  They are, after all, water, and when they come in contact with a warm camera they melt and can cause a lot of problems.

When you go between cold and warm with a camera and lens, especially a big lens, remember condensation.  Bringing a cold item into a warm place can cause moisture to condense on its surfaces, including inside ones!  Tightly wrap your camera and lens in a plastic bag before coming in from the cold and leave them there until they come up to room temperature.

While taking precautions with your equipment, don't forget about yourself!  Hats, gloves, boots and layers of clothing are very important so we avoid problems and enjoy our photographic passion.  The latest in clothing technology has been a boon for photographers.  Check out some of the new light weight, weatherproof gear.

One final note: remember proper exposure for snow!  Your camera's metering system is designed to render images mid tone, or 18% grey.  With digital cameras we can still add exposure for snow scenes, or adjust white balance instead.  Re-read your camera manual and be prepared to get white snow instead of grey or blue.

Enjoy this marvelous winter season and get some great shots to share.

Metering Modes


We have all heard the comment, “well, I just fix it in Photoshop™”.  I’m not too sure about you, but I would rather spend my time in the field shooting than in front of the computer fixing images.  One of the best pieces of advice I ever heard, whether shooting film or digital, is “get the exposure correct when you take the picture”.  If you do this, you will not need to “fix” it later.  Over the next few months, the Digital Corner will go back to exposure and metering basics.  This month we’ll look at the metering modes available in today’s digital SLR camera bodies.

Most digital SLRs have followed their film predecessors in offering a variety of metering modes.  The three basic ones are matrix (or pattern), center weighted, and spot.  In the case of my Canon 40D, the engineers have added a fourth they call “partial”, which is really a variation on the spot metering capability.

The “meter” in the camera is really a combination of the light sensors and the camera’s microprocessor.  The processor takes input from the sensors and computes the exposure based on the light readings, the aperture setting, shutter speed, focal length, and a lot of data stored in “look up tables”.  Let’s look at each of the selectable metering modes and consider how to use them most effectively.

Matrix or pattern looks at the entire image divided into a number of segments.  In the case of the Canon 40D, there are 35 segments or zones evaluated.   Here is an example of the zones in a simpler, 13 segment matrix pattern:

[Diagram: a 13-segment matrix metering pattern]

As you can see, the zones are not uniform, but are designed to give the camera’s processor information according to a “best case” for normal compositions.  Normal is defined as average distribution of light, medium and dark tones.  Matrix is usually good for front lit scenes, or scenes with minimal contrast or a moderate mix of light and dark tones. 

Center weighted and partial modes look at the entire scene but give substantially more “weight” to the center.  Typically 70% of the information from the center area and 30% from the periphery are used to calculate exposure.  Here is an example of the center weighting approach.

[Diagram: the center-weighted metering pattern]

Center weighted exposure is probably one of the better tools we have.  This mode draws on a lot of the experience developed over the years of film shooting.  Center weighting, along with the exposure lock feature on the camera, provides a very good tool for getting the best exposure possible.  This mode is best for scenes with highly directional light, scenes with very bright and very dark sections (like a landscape with bright sky and dark foreground), and other high contrast scenes.  To use center weighting most effectively while controlling overexposure, include the brightest area in the center (biasing toward the highlights), take the reading in manual or any of the creative exposure modes (shutter or aperture priority), and lock the exposure with the AE lock.  Then recompose and shoot!  A word of caution: this mode, like any other, averages to middle grey, so you may need to compensate for very white or very black subject matter.
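
If you like to see the arithmetic, here is a simplified sketch of that 70/30 weighting; the center size and the split are illustrative assumptions, as real cameras use their own patterns and zone sizes.

    # Center-weighted metering, in miniature (assumed 70/30 split and center size).
    import numpy as np

    def center_weighted_reading(luminance, center_frac=0.5, center_weight=0.7):
        """luminance: HxW array of scene brightness values."""
        h, w = luminance.shape
        ch, cw = int(h * center_frac), int(w * center_frac)
        top, left = (h - ch) // 2, (w - cw) // 2
        center = luminance[top:top + ch, left:left + cw]
        periphery_mean = ((luminance.sum() - center.sum()) /
                          (luminance.size - center.size))
        return center_weight * center.mean() + (1 - center_weight) * periphery_mean

    # Bright sky all around a darker subject in the middle:
    scene = np.full((60, 90), 0.8)           # bright surround
    scene[15:45, 22:67] = 0.2                # darker center subject
    print(center_weighted_reading(scene))    # 0.38 -- pulled toward the center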

Spot and partial metering look at a small area in the center of the frame; all exposure information is calculated from that area.  This metering mode is very good for subjects that are in shadow, for situations where you need to control or saturate highlights, and for high contrast scenes.

[Diagram: the spot metering area]

Spot metering also averages to middle grey so you may need to compensate or look to use white balance to get a correct color rendition.

Next time you’re out shooting, try these different metering modes, consider the lighting of the scene and take a few notes on which mode you selected for each shot.  Consider adding some comments on why you chose that mode, and see how your images come out.  Do this a few times and maybe you’ll spend less time in front of the computer and more behind the lens!

Depth of field

I know depth of field is not just a digital topic, but in keeping with our plan to address some of the basics I thought I’d discuss the concept.  Depth of field is defined as the distance in front of and behind the main point of focus that is acceptably sharp.  If we look at how an image is “projected” onto the camera sensor, the minimum acceptable sharpness is set by something called the circle of confusion.  If you are really interested in the details check out http://en.wikipedia.org/wiki/Circle_of_confusion

We all know that the aperture setting impacts depth of field.  The reason is that light rays travel in straight lines: as we reduce the diameter of the lens opening, the cone of rays arriving from any point becomes narrower, so points somewhat in front of and behind the plane of focus still form blur circles smaller than the circle of confusion, and the depth of field is extended.  So what can we take away from this that we can apply to our everyday shooting?  Consider the following (there is a small worked example at the end of this article):

a)       The smaller the aperture (the larger the f-number), the greater the depth of field.

b)       The greater the magnification, the shallower the depth of field.

c)       Depth of field typically extends about 1/3 in front of the point of exact focus and 2/3 behind it.  This changes as we increase magnification; a good rule of thumb for macro shooting is 50/50.

Go out and play with depth of field.  Try images close and far, shoot the same subject with different f stops and different magnifications.  You’ll develop a whole new set of creative tools.
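
Worked example: a hedged depth-of-field calculator using the standard thin-lens, hyperfocal-distance approximations and an assumed circle of confusion of 0.03 mm (a value commonly quoted for full frame sensors; your camera and print size will vary). It shows the same 50 mm lens, focused at 3 meters, gaining depth of field as it is stopped down.

    # Depth of field from the standard hyperfocal-distance formulas (a sketch,
    # not a substitute for your lens's actual markings).
    def depth_of_field(focal_mm, f_number, subject_m, coc_mm=0.03):
        s = subject_m * 1000.0                                  # work in millimetres
        hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
        near = s * (hyperfocal - focal_mm) / (hyperfocal + s - 2 * focal_mm)
        if s < hyperfocal:
            far = s * (hyperfocal - focal_mm) / (hyperfocal - s)
        else:
            far = float("inf")                                  # sharpness extends to infinity
        return near / 1000.0, far / 1000.0                      # back to metres

    # A 50 mm lens focused at 3 m, stopped down one step at a time:
    for f in (2.8, 5.6, 11, 22):
        near, far = depth_of_field(50, f, 3.0)
        print(f"f/{f}: acceptably sharp from about {near:.2f} m to {far:.2f} m")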