This page is a growing collection of articles on digital image making that we write for various camera club newsletters and websites.  Feel free to send back comments and questions about these topics (see Contact Us).  Information included was accurate when the articles were written, but things change (such as websites, etc.)!  No guarantees on stuff beyond our control.

Contents:

Calibration of your Printing Process
Color Management
Color Management Part 2 - Calibration
Data Management
Depth of Field
Digital Image Organization
Digital Negatives
Digital Photo Resources
Digital Sensors and Lens Performance
HDR
How do Digital Sensors Work
Image Stabilization
ISO
Megapixels and Image Quality
Metering Modes
Noise
Preparing Images for the Internet or Projectors
Saving Your Digital Images
Sensor Cleaning
Sensor Magnification
Shooting with Color in Mind
White Balance and Color Temperature
Winter Photography


Calibration of your Printing Process


Ever wonder why your prints don’t always look like what is on your display?  Why do we need to calibrate our equipment to get the color renditions we want?  Let's take a look at the whys and the hows of getting better color prints at home on your existing equipment.

To start with, the processes used by the various pieces of equipment in personal computers differ in the way they handle color reproduction. We'll focus on monitors and printers; this doesn't even consider scanners!  Monitors use an additive RGB (Red-Green-Blue) process to reproduce color: red, green and blue light added together produce white.  The printing process, on the other hand, uses the subtractive CMYK (Cyan, Magenta, Yellow and Key (Black)) process.  Don't worry about the additive and subtractive terms; they refer to how light waves interact with your visual system.


To get your color printing to match what is on the screen, you need to calibrate both your monitor and your printer.  There are a myriad of ways to do both, but the real key is to calibrate your monitor before your printer.  You can calibrate your monitor visually or mechanically, depending on how critical you are with the color prints you're making.  I found a few websites that have links to some really valuable tools for visual calibration.  If these work for you, great; if not, you will need to invest in some of the hardware/software products available for monitor calibration.  I suggest you look at this website to start with visual calibration:

http://desktoppub.about.com/cs/colorcalibration/a/cal_monitor.htm

This site has some introductory text and links to some pretty good tools.  It also has links to similar sites for printer and scanner calibration.  Be warned, once you start on the process it can be captivating.  Remember the goal is to get printed images you like, not the absolute maximum calibration settings of your equipment.

 

 

Color Management

Digital photography has added a lot of new “buzz words” to our vocabulary. One of the most misused or perhaps least understood is “color management”. Discussions of color management lead to more terms like color space, rendering, sRGB, and so forth. We figured a brief foray into the realities of digital color is warranted, so we can utilize the new tools we have.

What does color management mean?

Essentially it’s the catch-all term that describes the process of making sure the colors of your subject are displayed the way you want them on the medium of your choosing, and that you can achieve that result consistently! First let’s define a few terms:

Workflow is the process or steps used to go from a captured digital image on a media card to a completed image ready for use or for storage.

Color space is the portion of the color spectrum that is available to the output device (monitor, printer, etc.) for display of the image. In other words, how many colors can you hope to reproduce.

Rendering is the software process of converting a RAW image into a usable file in a specific color space. This word comes from rendition. Another techie convolution of the language.

This month we’ll concentrate on color space. The first step in color management is determining the color space in which you want to work. Remember that RAW files are “rendered” to the color space by the camera’s internal software or the post processing software in your computer. If you shoot only RAW, the rendering must be done in the post processing on your computer. If you can select multiple output files, such as RAW plus jpeg, you can specify the color space for the jpeg files.

You should select your color space based on the planned use of the image. Most digital cameras will allow you to select Adobe RGB or sRGB. Adobe RGB has a slightly larger gamut of colors and is less saturated than sRGB; it is also more closely matched to ink jet printers. If you are more interested in viewing your images on a monitor or projecting them with an LCD projector, the slightly smaller color gamut and more saturated sRGB color space is better. Post processing programs also allow the CMYK color space, which is more suited to publication formats. It’s possible to create different files from the same RAW image in various color spaces using the computer’s RAW conversion program (Photoshop™ as an example). Next month we will continue the workflow and color management discussion. We’ll look into calibration of monitors and printers.




Color Management Part 2 - Calibration

Last month we discussed color space and how to select it based on the desired use of the image. This month we’ll look into calibration of your computer’s monitor.

What is calibration and why calibrate? The reason is very simple; it allows you to get prints out of your printer that look just like the image you saw on your monitor screen. OK, what if you don’t print your images but send them out to others for printing, use in magazines, contests, and so forth? Calibration has been compared to such mundane things as getting your car aligned or flossing your teeth: not required, but very good practice. When you send a digital file out for any purpose, you really don’t know if the next person down the line will do proper color management. If you know you sent out the best file possible, then the next user will start with a superior product and you’ll have a much greater probability that your image will stand out.

There has been a lot of talk about LCD monitors not being very good for calibration since they drift quite a bit over time and temperature; this was true a few years back, but newer models are much better and can be calibrated as well as the old CRT models.

The first step in calibration is looking at the monitor. There are whole books written on this subject. In order to calibrate your monitor, you might consider the purchase of a calibration product that includes software and a sensor. Many of these packages are available, priced from under $100 to several hundred dollars. A little bit of research on the internet will allow you to pick the right one for you. There are also more manual tools, some free, that allow a degree of calibration using the computer’s internal tools and specific light sources.

A really good website with lots of links to information, manufacturer’s sites and a good starting point for monitor calibration is:

http://www.normankoren.com/makingfineprints1A.html

Take a look at this site and get started doing better color management of your images!



 

Data Management

Data management is the industry buzz word for how you handle the images you have in digital format. Data management starts with selection of memory card size and goes all the way through the file structure and the backup process you develop. These decisions are really driven by the type of shooting you do. Consider the following notes and decide what fits your needs.

1) Select a memory card size that meets your needs. I personally feel that either 16 or 32 GB is a good size. These hold up to several hundred images in RAW format. Anything bigger could mean that your entire two-week trip is on one card that could be lost or damaged. Anything smaller means you have a lot of cards that may get lost.

2) If you’re on a long trip using more than one or two cards, once you fill a card you have the choice of transferring it to a laptop or other portable device OR carrying enough memory cards so you can bring home your images on the cards. These cards are very rugged and immune to things like airport x-ray machines. Prices are dropping, so this is a much more realistic approach than it was just 2 years ago. As with any decision the “pros” have a variety of opinions. Some just use the cards; others make two or three copies on DVDs or other more permanent memory devices. I like just using memory cards.

3) Once you do get home, you really do need to consider your storage plan. A key decision is backup. Backing up data can be summarized in two words: “DO IT”. Computer hard drives are electro-mechanical devices that will fail – someday. Now the question is, what is the right way to back up data?

4) A starting point is to do a quick edit of your images as we discussed last month. Then, with your RAW images edited to remove the true discards, store them in a folder on your hard drive in the most compact manner you can. I like to do this by trip. Then I copy that entire folder to a high quality write-once DVD, not a rewritable one; rewritables are much less archival. Make two copies if you’d like. Blank DVDs are cheap and two copies provide good insurance. Keep the two in separate places if the images are really special. Then you can rearrange your files on the hard drive. This is where a good batch processing program (like Lightroom™) can be helpful. You can annotate each image with the DVD identifier in one step.

5) Once you have started to do your image processing with Photoshop, you can also make copies of the resultant jpeg or psd files on other DVDs. That’s up to you, but you’ll have your original RAW images on a fairly archival medium. Do remember, however, that DVDs are not permanent like old Kodachrome™ slides. They tend to deteriorate, so it’s a good idea to write new copies every few years.

 


Depth of Field

I know depth of field is not just a digital topic, but in keeping with our plan to address some of the basics I thought I’d discuss the concept.  Depth of field is defined as the distance in front of and behind the main area of focus that is acceptably sharp.  The definition comes from looking at how an image is “projected” onto the camera sensor.

The limit of acceptable sharpness is defined by something called the circle of confusion.  If you are really interested in the details check out http://en.wikipedia.org/wiki/Circle_of_confusion

We all know that the aperture setting impacts depth of field.  The reason is simple geometry: the aperture sets the width of the cone of light that forms each point of the image, and the narrower the cone, the longer the zone over which a point stays acceptably sharp.
Light waves travel in straight lines, and as we reduce the diameter of the lens opening we really extend the depth of field.  So what can we take away from this that we can apply to our everyday shooting?  Well, consider the following:

a)       The smaller the aperture (the larger the f-number), the greater the depth of field

b)       The greater the magnification, the shallower the depth of field

c)       Depth of field typically extends 1/3 in front of the point of exact focus and 2/3 behind.  This changes as we increase magnification, and a good rule of thumb for macro shooting is 50/50.  (The formulas behind rules a and b are sketched just below.)
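For those in the club who like to tinker, here is a minimal Python sketch of the standard thin-lens depth of field formulas.  The 0.03 mm circle of confusion is the usual 35mm full-frame assumption, and the 50 mm lens focused at 3 m is just an example; swap in your own numbers.

    # Rough depth-of-field calculator: a sketch of the standard thin-lens
    # formulas. The 0.03 mm circle of confusion is a full-frame assumption;
    # adjust it for your own sensor size.
    def depth_of_field(focal_mm, f_number, subject_mm, coc_mm=0.03):
        hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
        near = subject_mm * (hyperfocal - focal_mm) / (hyperfocal + subject_mm - 2 * focal_mm)
        if hyperfocal <= subject_mm:
            far = float("inf")  # focused past the hyperfocal distance: sharp to infinity
        else:
            far = subject_mm * (hyperfocal - focal_mm) / (hyperfocal - subject_mm)
        return near, far

    # A 50 mm lens focused at 3 m: stopping down widens the sharp zone.
    for f_number in (4, 8, 16):
        near, far = depth_of_field(50, f_number, 3000)
        print(f"f/{f_number}: sharp from {near / 1000:.2f} m to {far / 1000:.2f} m")

Run it and you can watch the sharp zone grow as you stop down from f/4 to f/16.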

Go out and play with depth of field.  Try images close and far, shoot the same subject with different f stops and different magnifications.  You’ll develop a whole new set of creative tools.

 



Digital Image Organization

Life is good. No more film costs, processing fees, postage or trips to the lab, because digital photography enables us to shoot a lot of images essentially for free. With this great improvement in “image acquisition” comes added work after we come in from the field. In June the digital corner highlighted a new software tool for workflow. This month we take a look at some ideas on improving organization of all of those images we made for free. Organization and workflow are pretty closely tied together. If you know how your images will be organized, you can adapt the workflow process to make your computer time more efficient. Let’s consider some basic workflow and organization ideas.

The very first thing you need to do is an initial edit of your images to weed out the obvious bad ones. This first edit could be done with the LCD on the camera or a software package looking at the CF card before download. It should be really quick and weed out images that are grossly out of focus, poorly composed or badly exposed. This will then allow you to concentrate on the images that have a chance of making it to your permanent files.

After this initial edit, you are ready to download your “shoot”. But wait a minute! Have you decided how you want to file your images? The concept of file organization is boring, but with a little planning you’ll be able to locate images in the future. One good way to set up (or convert your existing file structure) is to look back to how you stored images when they were on 35mm film. If what you did back then was good enough for your needs, you could create a file structure quite similar to that for your digital images. I used 3 ring binders and archival sleeves for my images and labeled the binders according to the content. Some examples are large mammals, small mammals, insects, wildflowers, etc. This worked pretty well, as I had a list on my computer with highlights of what was in each book. When I went to digital photography in a big way, I kept that same basic concept but decided to use the ability of Windows™ to build multi-level file structures, which provides what amounts to binders with tabs and then sections within each tab. Windows™ allows file names up to 256 characters, so you can get a lot of descriptive information in the file name.

My structure follows that binder idea: the top-level folders mirror the old binder labels (large mammals, small mammals, insects, wildflowers, etc.), with progressively more specific folders nested inside each, down to individual subjects and trips.
With just 5 levels you can really get some detail, and Windows™ allows nesting of folders well beyond 5 levels. 

Once you have your files organized, you can add additional folders at any level. Say you have never photographed wild horses before, but on a trip to western Colorado you got some great shots. Simply add a new folder under the large mammal folder for wild horses. Note that Windows™ will arrange your folders in alphabetical order, so take that into account when you set up your files.
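If you like to script things, creating a new folder ahead of a download takes only a couple of lines of Python; the folder names below are just illustrative, not a prescription.

    # A minimal sketch of creating the new folder before a download, using
    # Python's pathlib. The path is hypothetical - use your own structure.
    from pathlib import Path

    new_folder = Path("Images/Large Mammals/Wild Horses/Western Colorado")
    new_folder.mkdir(parents=True, exist_ok=True)  # creates every missing level at once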

I’ve found it best to create new file folders before downloading from a CF card. I can then specify the location to store the files when I do my download. Since I use Lightroom™, I can specify the import location, then bring my images in from the memory card and do a second edit on a larger screen, add location information, key words and metadata, and group images by quality.

Before your next big shoot, take some time to organize your files. The longer you wait, the more you’ll need to move around. In a future “Digital Corner” we’ll discuss backing up images and how that fits into file management.

 

 

Digital Negatives

When we moved to digital SLRs we were confronted with the dilemma of which format to use for recording our images.  There was JPEG and RAW, or a combination of both.  Many of us shoot just RAW and use post processing software to import and view the RAW images.  We then export other file formats, like JPEG or TIFF!  Now we have yet another format to consider - the digital negative, which shows up on our computer as a .dng file. 

Why has the digital negative come to be?  The main reason is that every camera maker has developed their own RAW file format.  A Nikon RAW file is different from a Canon RAW file, and both are different from any other camera maker’s RAW format.  Combine that with the distinct possibility that, as we move forward, RAW formats will change even within a brand.  As time goes on and more features are introduced on cameras, the companies will drop support for older formats.   Camera manufacturers do supply software with their products that allows us to view and do some manipulation of the captured RAW files, but most photographers use standard software tools like Photoshop™ and Lightroom™.  These programs are very capable of reading individual RAW files from all of the different cameras---today!  What about tomorrow?  When Photoshop CS12 comes out in 2015 (I’m being a little facetious here) will it still support today’s RAW images?  Probably not, especially if the camera companies have dropped support for that particular format.  Adobe will only provide internal software for supported formats; they can’t afford to maintain software for obsolete versions.

What are the options?  First of all, you can stop shooting RAW and do only JPEG.  That will limit you as RAW has many advantages over JPEG.  That’s a topic for another column.  Alternatively you can use one of the software tools you have to save your images as .dng (Digital Negative) files.    Adobe products all provide that capability as do several other tools. 

The whole premise behind digital negatives is that it’s an open standard developed by Adobe that allows image files to be saved with all of the data that is captured by RAW.  The key difference is that while all RAW formats are proprietary to individual camera companies, digital negative is a public (open) format.  Open formats allow software developers full access to the information.  As new camera RAW formats come out, software products will be able to read the new format and create the .dng files right away.   The only requirement is that the RAW formats comply with the digital negative format requirements.  I can’t imagine any camera manufacturer not doing that.  It would be such a competitive disadvantage that the products would not sell well at all.

So, is it time to panic and spend the next 3 weeks converting all of your RAW files to .dng?  No, not really but you may want to consider altering your workflow to include copies of your images in .dng.  You can then periodically convert your best images as time permits.  It’s only when you upgrade your image processing software that you may eventually have a problem.



 

Digital Photo Resources


We are deeply into the digital age in photography. If anyone is still skeptical, then just do an internet search on DIGITAL PHOTOGRAPHY LEARNING and a search engine like Google will provide over 28 million hits, and that is just the filtered English language sites. Well over twenty million digital cameras were sold in the US in 2005 and the growth in sales is slowing, indicating that the digital camera market is maturing – meaning digital is the standard and film cameras have pretty much been replaced. We are all looking for resources to help with ‘going digital’ or improving our skills in digital photography. This month we wanted to take a look at how to dig out the good resources, both on line and in print, which can help.

First, let’s address the basics. The craft of photography has not changed, just the tools we use. The concepts of composition, exposure, and creativity are as important today as when we used film as a medium. The classic books by the best teachers are still valuable. But where can we go to get help for digital issues such as white balance, noise reduction, etc.? There are literally hundreds of books being written every year on digital photography, and the critics are close behind in evaluating them. Check out the photo magazines or their websites for book reviews. In our investigation, most books are pretty close to each other in what they cover. The standard approach is to set the stage with how digital works, how it differs from film, how to use the digital specific creative controls and then the authors typically go into standard photographic lessons on composition, etc.

Most of the books also address how to fix your images on the computer! We are now seeing a bit of a backlash from experienced photographers as they spend more and more time in front of the computer monitor. The hue and cry is to get back to basics: expose properly, compose well, and all of the other stuff we did with film. Then you don’t spend all that time fixing problems that could have been eliminated in the camera.

But we digress! Before you spend your money on digital photo books, classes, or lots of cool software, try the free information route first. Remember we opened with the tens of millions of Google hits? Well, there are a few that we found had good, free information, tips and ideas. Some of these are pretty basic, some get into quite a bit of detail and several provide additional links for more information. We tried to limit the sites to those with minimal advertising and offers to sell stuff, but the internet is now a commercial tool so buyer beware! See our Links Page.





Digital Sensors and Lens Performance

Digital SLRs come in two flavors of sensors: APS size and full frame.  The full frame sensor behaves just like 35mm film; in other words, the image from a normal lens is projected onto the sensor so that what the lens “sees” is what is recorded.  The APS size sensors have a crop factor and thus show a magnification ranging from 1.3x to nearly 1.7x.  We discussed this in a previous Digital Corner, so if you want a refresher, wander over to our website and review “Sensor Size and Magnification”.

This is a real advantage for wildlife in that we get the extra effective focal length without losing light like you would with a teleconverter.  We pay the penalty at the wide angle end, however. 

There is another, more subtle advantage to this smaller size.  The lens still projects the image on the sensor as if it were a 35mm film plane.  That means that those parts of the image which would normally be around the edge of the 24mm by 36mm rectangle are not recorded by the sensor pixels.  That, in turn, means that those problems we used to see with lenses (especially inexpensive lenses) are gone.  The soft focus, distortion and vignetting are off of the recording surface.

Practically, this means that you can open up your lens to a larger aperture without seeing the edge problems that full frame or film would show. Your lens’ “sweet spot” just got better: what you used to shoot at, say, f/8 to reduce distortion, you may now be able to shoot at f/5.6 or even f/4.

Check out your camera system and see how much improvement you can get!  

 


Digital Corner: Taking the plunge on multi-exposure software - HDR.

Digital image capture has opened up the opportunities for photographers to capitalize on the tremendous advances the computer has brought to the world.  The main things we benefit from are the rapid advances in technology and the software tools that follow.  In the old film days, our ability to capture and present images was really under the control of the engineers at the camera companies and at the film producers. Very few individuals could develop and market things other than gadgets.  The world of "high tech" has known for many years that with a computer and a good knowledge of software engineering, just about anyone with a good idea can successfully develop and market great tools.

Today, with the extensive use of image processing software, a relatively open architecture in operating systems (even Apple™ has opened theirs somewhat!) and ever more powerful computers, many very talented software entrepreneurs have flooded the market with really cool image processing capabilities.  This month I wanted to give a brief overview of some software that allows photographers to go way beyond what was "doable" with film. 

To record high contrast images you can turn to several HDR software applications.  HDR, or high dynamic range, is a tool that in the "old days" we could only approximate with graduated ND filters and sandwiched slides in the darkroom/projector.  Even then the outcome was not all that great, or was limited to exposing for the foreground and preventing the sky above from burning out.  Today there are several very nice programs that allow the photographer to take a series of images of the same scene at different exposures.  On one hand, the exposure for the shadows is optimized, letting the highlights burn out.  On the other hand, the exposure for the highlights is optimized, letting the shadows block up.  Add a midtone-optimized exposure and one each slightly overexposed and underexposed, and we have a sequence of 5 (for example) exposures of the same scene.  Let an HDR program have these and the result is a pretty decent product: a properly exposed image with shadows, highlights and midtones.

There are a couple of things that will make this process much more manageable.  First, lock your camera to a tripod and then compose your image.  Although the software can adapt somewhat to slight variations in the image, that takes a lot of computer time and may not always give optimum results.  Start with a proper exposure for midtones and then move up and down in 1 stop increments to start.  Shoot in RAW and check the histogram for each image to make sure you are capturing the entire range.  If your camera has an auto bracket feature, now is the time to use it!  You will need to experiment a bit to get a feel for how your software handles HDR images.  Experience will help you decide the number of exposures either side of midtone and the number of stops between each image.

You may now ask which HDR software package is best.  Photoshop™ can be made to do HDR with many of the normal image tools, but there are specific products designed just for HDR applications.  The most widely used is Photomatix (see http://www.hdrsoft.com), which now offers a "lite" version for under $40.00 as a download.  Most HDR programs include "tone mapping" and detail enhancement features.  Like most features, these have good points and challenges.  Be careful: tone mapping can lead to a "comic book" appearance if you go to the extreme.  Additionally, the final output from the HDR program after tone mapping can be a little flat, so plan on some time in Photoshop™ or another post processing program to spark it up.  As with any other powerful application, time and experience will make HDR one of your "go to" tools.
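If you want to plan a bracket before heading out, here is a minimal Python sketch of the idea: exposure offsets in stops around a metered midtone, with the shutter speed doubling or halving for each stop.  The 5-frame, 1-stop bracket and the 1/125 s base exposure are just example values.

    # A minimal sketch of planning an HDR bracket: generate the exposure
    # offsets (in stops) around a metered midtone exposure, then compute
    # the shutter speed for each frame. The base exposure is an assumption.
    def bracket_evs(frames=5, step=1.0):
        half = frames // 2
        return [step * i for i in range(-half, half + 1)]

    base_shutter = 1 / 125  # metered midtone exposure, in seconds
    for ev in bracket_evs(frames=5, step=1.0):
        print(f"{ev:+.0f} EV -> {base_shutter * (2 ** ev):.5f} s")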

 



How do digital sensors work?

Most of us have made the shift from film to digital over the past years.  When we were shooting film, a number of us also experienced the sights and smells of the darkroom, so we had a pretty good idea of how film worked.  Light interacted with photosensitive chemicals in the film emulsion and, during the developing process, other chemicals stabilized the transformed images on a negative or transparency.   Digital technology is not all that much different.  As I have stated many times before in this column as well as in my classes, the only difference between film and digital image making is the medium on which the image is captured.  Let’s talk digital!

In the digital sensor world, photosensitive has a different meaning than in the film world.  Digital sensors are electronic parts (integrated circuits or ICs) with a physical structure that allows incoming light to generate electric signals.  By the way, ICs are typically referred to as chips in the industry, so I’ll be mixing terms. These signals are conducted away from the sensor site (the picture element, or pixel) by very tiny wires that are part of the IC.  The wires take the signal to amplifiers (on the same chip).  The amplifiers boost the very tiny signal to a level that can be manipulated and digitized by yet more circuits.  The output of the digitizing circuit is a light level, period.  The individual sensor sites on a digital sensor are monochromatic; they only see light in terms of intensity, not color.  So why aren’t all digital cameras black and white? 

The clever design engineers who develop sensor technology also have a pretty good understanding of the human eye and how we perceive color.  This is actually a carryover from the color film technology where the designers used color sensitive layers in the emulsion.

The basic sensor is a grid pattern of structures that convert light energy (photons) into electrical energy (electrons) in a way that is not all that different from solar cells.  Color is added to the image by filtering the light before it strikes the sensor.  Our old friends red, green and blue (RGB) are at work here.  The light is filtered to allow those colors to strike specific sensor sites, and when the final signal is digitized into a light intensity level, the tiny little computer chip in the camera can correlate that intensity to a color, and thus adds color to the data file for that sensor site.  If you were to magnify the filter structure on a sensor, you would find about 25% of the sites detect red, 25% blue and 50% green.  This ratio was established to allow the sensor to more closely match the response of the human eye and thus make further “post processing” easier.
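For the technically curious, here is a small Python sketch (using the numpy library) that builds the repeating 2x2 red/green/green/blue tile behind that 25/50/25 split.  Real sensors vary in the exact arrangement, so treat this as an illustration, not any particular camera's filter.

    # Build an 8x8 corner of a Bayer-style filter grid and count the colors.
    import numpy as np

    tile = np.array([["R", "G"],
                     ["G", "B"]])
    mosaic = np.tile(tile, (4, 4))  # repeat the 2x2 tile into an 8x8 grid
    print(mosaic)
    counts = {c: int((mosaic == c).sum()) for c in "RGB"}
    print(counts)  # {'R': 16, 'G': 32, 'B': 16} -> 25% / 50% / 25%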

After all of the sites have been scanned for light intensity and the camera settings have been added, the data is ready to be stored as a RAW image.   As a side thought, a 10 megapixel sensor has about 10 million sites; imagine how fast that little computer is working if you can shoot about 8 images a second.

 


Image Stabilization

Several camera and lens manufacturers offer features on cameras or lenses that compensate for camera shake or movement. The methods do vary from one manufacturer to another.

The most common, and probably the most successful method, is the use of sensors within the lens. These are known as Image Stabilization (Canon), Vibration Reduction (Nikon), and Optical Stabilizer (Sigma). Within the lens is a set of sensors that detect small movement and correct for it by moving a small optical element in the opposite direction of the shake or movement.

Other companies (such as Konica Minolta) employ a similar function in the camera body, moving the sensor assembly itself to compensate for the motion.

Video cameras use a digital method where the image is retrieved from different pixels on the sensor to compensate for vibration or camera movement. That works well in the video arena but causes significant image blurring in still work.

The movement compensation feature was originally designed to allow slower shutter speeds while hand holding the camera and still produce sharp images. Typical claims are an apparent increase of two to three stops.

Use of this technology is not without drawbacks. There is an added weight and cost factor for the lens based approach. The in camera version also adds cost to the body but does allow use of many more lenses.

When using the stabilization capability on a tripod, there is a potential problem. If the camera and lens are very stable, the electronic circuits in the lens may become slightly unstable and cause the image to blur a small amount. Some lenses have tripod sensors and correct for this. Others have a recommendation in the manual suggesting that the feature be turned off when the lens is on a tripod. As with all photographic “rules” there is a lot of controversy about this. The stabilization feature can compensate for movement and for vibration induced by tripping the shutter at slow speeds. Even when mounted on a tripod, the ability to reduce apparent shutter vibration can be a valuable tool.

The best approach is to do some research before buying or do some testing if you already own one of these lenses or cameras.

Is the feature worth the money and extra weight? In our opinion, YES. We have a 100-400 IS zoom from Canon and love it. Everyone we have talked with has a similar feeling about that particular lens. We’d be happy to publish accounts (positive and negative) concerning members’ experiences with this or any other stabilized lens.

 


The changing face of ISO!

The newest digital cameras are touting ISO numbers that stretch the imagination.  Back in film days, the very idea of shooting a film with an ISO (or ASA, for the seasoned photographers in the club) exceeding 10,000 was unheard of.  Now a good many digital SLRs exceed that by staggering amounts.  How can we properly utilize this new tool?  Recent magazine articles have addressed the use of ISO as a creative tool in ways we've not seen before.  The ability to get decent saturation in very low light is the first thing that comes to mind; second, we can crank up shutter speed and freeze action with reasonable depth of field in ways we only dreamed of in years past.   OK, what are the drawbacks to this great tool?  As with everything in this field, there are tradeoffs.  As I have mentioned before in articles and in the classes I teach, ISO does not change the sensitivity of the sensor; it changes the overall "system gain". 

Let's get into the details! The digital sensor, by its very design, has a system gain which, simply put, defines the amount of electrical current generated by a given amount of light striking each pixel (photons to electrons).  The engineers designing these sensors call this parameter "native sensitivity".  The electrical circuits that take these extremely tiny signals and boost them to a usable level are called amplifiers.  The signal coming from the amplifier (at this point it's an analog signal, related more or less linearly to the amount of light striking the sensor) is routed to another circuit called an analog to digital converter.  This circuit takes the signal and converts it into digital form.  The number of bits that you see referenced in a lot of literature comes from the design of this circuit; the more bits, the more information (light level) conveyed to the processor.  The ISO setting on a DSLR controls the gain of the amplifier.  If the ISO is set to the "native sensitivity", the camera is pretty close in sensitivity to the equivalent ISO of film.  This is essentially the optimum setting for gain.  As ISO is increased, the gain goes up and, along with it, any noise in the input signal as well as noise generated by the amplifier itself.  If the ISO is reduced below the native sensitivity, the signal gets attenuated a bit and some information is lost, impacting the saturation of the image.  The software in the image processor can do quite a bit to alleviate the negative impacts of noise and, to some degree, the reduced information, so deviating from native sensitivity is manageable. 
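For the technically inclined, here is a toy Python model of that signal chain.  The photon counts, native ISO and read noise figures are illustrative assumptions, not measurements from any real sensor, but they show why the signal-to-noise ratio falls as the gain rises.

    # A toy model of ISO as amplifier gain: the amplifier multiplies the
    # photon signal and its shot noise together, and adds a (simplified)
    # noise contribution of its own. All numbers are illustrative.
    import math

    def snr_db(photons, iso, native_iso=100, read_noise_e=5.0):
        gain = iso / native_iso
        signal = photons * gain
        shot_noise = math.sqrt(photons) * gain  # amplified along with the signal
        amp_noise = read_noise_e * gain         # amplifier noise, simplified model
        return 20 * math.log10(signal / (shot_noise + amp_noise))

    for iso in (100, 400, 1600, 6400):
        photons = 40000 // (iso // 100)  # higher ISO = less light for the same scene
        print(iso, round(snr_db(photons, iso), 1), "dB")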

So how can we translate this into something usable?  Well, first you need to find the ISO for your camera that corresponds to the native sensitivity.  You can use that as a starting point for your shooting and increase or decrease as the situation dictates, but at least you'll know that starting point.   How to find out?  Good question.  It took a while, but the conventional wisdom is that Canon native sensitivity is at ISO 100 while Nikon tends to be ISO 200.  I'll bet some internet searching can produce better as well as conflicting numbers, but such is life in the digital age.  I searched on "native sensitivity + Canon + 7D" and got 342 hits on Google.  Many had additional links, and a good many had wrong information!

Set your camera to its native value, move the ISO up and down, and see for yourself the variability in images.  Hint: you probably won't see a lot until you get to the extremes.  Once you know, you can then use ISO as a creative control with more understanding.

 

 

Megapixels and image quality

 
The larger the number of megapixels, the better the image, right? After all, when we all learned the basics of photography we learned a few axioms. Lens quality was number one, and then film grain, which we could relate to ISO film speed. Well, welcome to the world of high tech. The number of pixels in a camera's sensor is not a good indication of the ultimate image quality, nor is the lens quality. The design engineers have added something a whole lot more difficult to measure with a simple number. What is it? Software!

Pixels are small light sensitive elements that convert light (photons actually) into electricity (electrons). A series of filters on top of the sensor determines the basic color information, and then some electronic circuitry near the light sensitive areas of each pixel amplifies the signal and sends it on to the micro computer chip. There are some nasty characteristics of the electronic devices that convert light to electricity. First, the smaller the pixel, the less efficient it is, meaning it takes more photons to generate a given amount of electrons. This means smaller pixels don't work as well in low light as larger ones do. Also, the amount of surface area that gathers light is reduced because space is needed for the electronic circuits that amplify the signal. Worse yet, small pixels tend to generate proportionally more noise than larger ones.
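A quick back-of-the-envelope calculation makes the point.  Here is a small Python sketch; the sensor dimensions are typical published sizes for a compact camera and an APS size digital SLR, used here as assumptions.

    # Divide sensor area by pixel count to see how much light-gathering
    # area each pixel gets. The dimensions below are typical, assumed values.
    def pixel_area_um2(width_mm, height_mm, megapixels):
        return (width_mm * 1000.0) * (height_mm * 1000.0) / (megapixels * 1e6)

    print(round(pixel_area_um2(5.8, 4.3, 8.0), 1), "square microns per pixel (8 MP compact)")
    print(round(pixel_area_um2(22.5, 15.0, 8.0), 1), "square microns per pixel (8 MP APS size)")

The APS size pixel gets more than ten times the area of the compact camera pixel, which is exactly why the bigger sensor handles low light and noise so much better.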

This is where we look to the software. Each camera manufacturer has developed their own image processing software that turns the electrical signals into an image. This software does an amazing amount of work in a very short time. Among other things, it has noise reduction algorithms, routines that integrate the signals according to color and the capability to smooth out the edges of pixels. Only a few companies make sensors and signal processing chips, but each major camera company has its own proprietary software. Not only that, but even within one company the software can vary between camera models. The most important capability of the software is the noise reduction, as it is the most difficult thing to do well.

How does this impact the camera buyer? Well, when you are looking at point and shoot digital cameras, don't just go for the 12-16 MP model. An 8 MP model may give you a better image. Research the web and the magazine rack to get reports on image quality before buying. Also, the sensor pixels in digital SLRs tend to be bigger, so the impact of noise is reduced and the image quality improved by most every camera company's software.



Metering Modes

We have all heard the comment, “well, I’ll just fix it in Photoshop™”.  I’m not too sure about you, but I would rather spend my time in the field shooting than in front of the computer fixing images.  One of the best pieces of advice I ever heard, whether shooting film or digital, is “get the exposure correct when you take the picture”.  If you do this, you will not need to “fix” it later.  Over the next few months, the Digital Corner will go back to exposure and metering basics.  This month we’ll look at the metering modes available in today’s digital SLR camera bodies.

Most digital SLRs have followed their film predecessors in having a variety of metering modes.  The 3 basic ones are matrix (or pattern), center weighted and spot. In the case of my Canon 40D, the engineers have added a 4th they call “partial”.  This fourth one is really a variation on the spot meter capability. 

The “meter” in the camera is really a combination of the light sensors and the computer’s microprocessor.  The processor takes input from the sensors and computes the exposure based on the sensor light readings, the aperture setting, shutter speed, focal length, and a lot of data stored in “look up tables”.  Let’s look at each of the selectable metering modes and consider how to use them most effectively.

Matrix or pattern metering looks at the entire image divided into a number of segments.  In the case of the Canon 40D, there are 35 segments or zones evaluated; simpler matrix meters may divide the frame into as few as 13 zones.
The zones are not uniform; they are designed to give the camera’s processor information according to a “best case” for normal compositions.  Normal is defined as an average distribution of light, medium and dark tones.  Matrix is usually good for front lit scenes, or scenes with minimal contrast or a moderate mix of light and dark tones. 

Center weighted and partial modes look at the entire scene but give substantially more “weight” to the center.  Typically 70% of the information used to calculate exposure comes from the center area and 30% from the periphery.
Center weighted exposure is probably one of the better tools we have.  This mode draws on a lot of the experience developed over the years of film shooting.  Center weighting, along with the exposure lock feature on the camera, provides a very good tool for getting the best exposure possible.  This mode is best for scenes with highly directional light, scenes with very bright sections and very dark sections (like a landscape with bright sky and dark foreground), and high contrast scenes.  To most effectively use center weighting while controlling overexposure, always include the brightest area in the center (biasing toward the highlights).  Take the reading in manual or any of the creative exposure modes (shutter or aperture priority) and lock the exposure with the AE lock.  Then recompose and shoot!  A word of caution: this mode, like any other, averages to middle grey, so you may need to compensate for very white or very black subject matter.
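To make the 70/30 idea concrete, here is a tiny Python sketch of a weighted reading; the EV numbers and the weights are illustrative assumptions, not any particular camera's algorithm.

    # Center-weighted metering as a weighted average of two readings.
    def center_weighted_ev(center_ev, edge_ev, center_weight=0.7):
        return center_weight * center_ev + (1 - center_weight) * edge_ev

    # Darker subject in the center (EV 11), bright sky at the edges (EV 15):
    print(center_weighted_ev(11, 15))  # 12.2 - pulled mostly toward the center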

Spot and partial metering look at a small area in the center of the frame.  All exposure information is calculated from that area.  This metering mode is very good for subjects that are in shadow, for situations where you need to control or saturate highlights, and for high contrast scenes.
Spot metering also averages to middle grey so you may need to compensate or look to use white balance to get a correct color rendition.

Next time you’re out shooting, try these different metering modes, consider the lighting of the scene and take a few notes on which mode you selected for each shot.  Consider adding some comments on why you chose that mode, and see how your images come out.  Do this a few times and maybe you’ll spend less time in front of the computer and more behind the lens!

 

 

What’s all the noise about?

Back in the days of film photography, we worried about grain in our images. It was the wisdom of the time that the faster the film, the more pronounced the grain and the less sharp the image. Exposure times, film processing parameters and such had some impact but that was really a second order effect. The guys in the green and in the yellow boxes were constantly working to improve grain structures to reduce the impact of faster speeds on image quality. That was then, this is now.

The digital revolution has introduced the photographer to the wondrous technologies of electronics and digital signal processing. But one of the things we now must deal with in our photography is noise, or as some authors call it, digital grain. Let’s discuss its sources and how to deal with it.

What is noise? It’s all of the additional extraneous signals that come from the sensor, the analog electronics and the digital circuits in the camera and show up in our images (usually with a negative impact). It is particularly visible in large areas of a single color, such as the sky.

Noise has a number of sources. These range from dust on the sensor to the intrinsic errors in converting an analog signal to a digital data stream.

Dust on the sensor is the easiest to understand but can be quite difficult to control. Point and shoot cameras have a pretty well sealed environment around the sensor, so dust is normally not a problem, but if you do get dust problems, correcting the root cause can be tough. Digital SLRs are much more prone to sensor dust, but most have a “clean setting” that allows the photographer to blow away dust (carefully!).

The other sources of noise are a bit more technical, but let’s start with light striking the sensor. Light comes in little packets called photons. When these little guys strike the sensor they are converted to electrons, which are then used to generate the electrical signals containing image data. Since the sensor is warm (that is, at a temperature above absolute zero), other “thermal” electrons are generated just by the heat. These electrons create a baseline signal (noise) that contaminates the image information.

The signal coming off of the sensor is very small and cannot be used directly by the circuits that convert the analog data to digital information. To remedy this, analog amplifiers are used to boost the signal. The background noise is amplified along with the image data, and the amplifier adds its own noise contribution. The amount of amplification is controlled to a large degree by the ISO setting on the camera. The higher the ISO, the greater the noise contribution: higher ISO means the camera will use less light to generate the image, less light means less signal from the sensor (fewer photons converted to electrons), more amplification is needed, and the ratio of signal (image data) to noise drops. This shows up as more noise, or grain, in the image.

Another noise source, one that’s easier to relate to film grain, is inherent in the nature of digital imaging. The process of producing a digital image means we are taking an infinitely variable light spectrum and converting it to discrete pieces of information. The first place this happens is the sensor itself, made up of a fixed number of photosites, not unlike the grains in film. The analog data generated has, as we’ve seen, a noise component. This signal is then digitized into discrete pieces of data for processing. Each time an analog to digital conversion takes place, information is lost and edge detail is compromised. When the signal is then compressed into a jpeg file, more information is lost.
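Here is a small Python sketch of that digitizing step: the fewer bits in the conversion, the more of the original signal is rounded away, and that rounding error is the quantization noise. The sample value is arbitrary.

    # Quantize a smooth 0..1 signal to a fixed number of discrete levels.
    def quantize(value, bits=8, full_scale=1.0):
        levels = 2 ** bits - 1
        return round(value / full_scale * levels) / levels * full_scale

    smooth = 0.123456789
    print(quantize(smooth, bits=8))   # about 0.12157 - coarse
    print(quantize(smooth, bits=14))  # about 0.12348 - much closer to the original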

So, we have a lot of noise sources. How can we deal with them? The component and camera manufacturers have done very good things to reduce the impact of noise on our images. The first is the development of larger and larger sensors. Larger image sensors have more light gathering capability at a constant noise level; a 6 megapixel sensor that is APS size generates more effective noise than a larger full frame sensor. The second, and probably most important improvement, is the noise reduction software in the camera itself. This “post processing” activity in the camera (with the RAW data) actually removes a lot of the noise from the signal, producing a cleaner and more grain free image.

What can the photographer do to reduce noise? Well, first use a camera with the biggest sensor you can afford. Unfortunately for the wallet, this usually means a digital SLR. If that’s not in the budget, then shop for the camera with the biggest sensor you can get and do research on the quality of the camera manufacturer’s noise reduction software.

With any camera, the following tips will also help.

1) Shoot at the lowest ISO setting you can for the conditions. Remember, this means less amplification and more light converted to image data.

2) Try to avoid very long exposures. The longer the shutter is open, the more total sensor noise is generated.

3) Use the lowest compression setting you can (high resolution jpeg has less inherent noise than low resolution – but it takes more storage space).

When you do get your images on to the computer, there are several really good noise reduction software packages available. Do some research and see what you can find. More are coming out every day.

Dealing with digital grain is really not that much different from film grain. Learn the causes and the remedies and keep on shooting.



 

 

Preparing images for the Internet or projection

Over the last year or so we have seen our monthly critique images in 35mm format go to zero.  That seems to be a good indication that our members are either shooting digital or are having their film images processed into digital files. 

There are always lots of questions and a bit of confusion on how to manipulate digital files so they are optimized for whatever you need to do with them.  There are also a myriad of sources for information and not a few opinions on the “best” way to do things.  The club has set up guidelines for our submissions for monthly member images used in the theme and critique part of the meeting.  These are pretty much set up so that you can easily modify your images so they can be emailed without taking a very long time and be projected with reasonable quality. 

We plan a series of articles over the next few months on the hows and whys of taking your digital images and making them Internet or projection ready.  This month we’d like to start with a brief tutorial on jpeg files and how they impact display and projection.  Jpeg (JAY-peg) is an acronym for Joint Photographic Experts Group.  It is a commonly used term for a family of file formats used for photographic images.  All digital still cameras can produce images in jpeg format.  So what is the key advantage of jpeg?  These files can take the digital image data and selectively throw away a certain portion of the image data without significantly impacting the image quality.  The term used for this is “compression”.  Compression is a technical term for reducing the size of a digital file (the number of kilo- or megabytes) while maintaining the overall integrity of the information to a set criterion.  When using jpeg in the camera, the photographer can select the “quality” of the image.  In a point and shoot camera, such as the Canon Digital Elph™, the compression options are Normal, Fine and Superfine, indicating increasing quality and file size.  In a digital SLR, such as the Canon 40D™, the photographer has more choices (6 jpeg options plus RAW).  RAW is just the basic image data from the sensor with no compression.  The choice of compression is, like just about everything in photography, a tradeoff.  Look in your camera manual for the options you have and the opinion of the manufacturer as to the “quality” of each setting.

When selecting the highest quality, the number of images you can store on a memory card is reduced, but the quality of each image is usually better.  Here is where the eventual use of the image comes into play.  If your intent is to make large prints for display in a gallery or even your home, you want the absolute highest quality image you can get (and probably should stick with RAW files from the camera).  If you want small prints, medium quality is usually just fine.  If you are going to email them to friends (or for the club critique) you can stand a much higher degree of compression.  The reason is that computer monitors and projectors, for the most part, cannot display all of the information in a very high quality image file.  Very good flat screen computer monitors display 1280 by 1024 pixels – that’s 1.3 megapixels.  If your 10 megapixel camera produces a superfine jpeg that is the equivalent of 5-6 megapixels after compression, you are still way over what your monitor can reproduce.  Looking at projectors, they range from 0.5 megapixels to 2.2 megapixels, so again we can see that a 5-6 megapixel image is overkill for projection. 
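If you’d like to automate the downsizing, here is a minimal sketch using the Pillow imaging library for Python.  The file names, the 1024 x 768 target and the quality setting are just example values; check the current club guidelines for the actual numbers.

    # Shrink a copy of an image for e-mail or projection with Pillow.
    from PIL import Image

    img = Image.open("wild_horses.jpg")            # hypothetical file name
    img.thumbnail((1024, 768))                     # resize in place, keeping the aspect ratio
    img.save("wild_horses_small.jpg", quality=85)  # higher quality = bigger file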

So what does all of this mean?  When you shoot, you can pick your compression based on the anticipated use of an image, or when you are doing your “post processing” at the computer, you can resize your image for a specific application.  One other specification of projectors is the “aspect ratio”, which is essentially the ratio of horizontal to vertical pixels.  This is another thing you can use to decide on final image size.  We’ll look into that in the future.



 

Saving your Digital Images

Now that you have that new digital camera and are starting to amass a large quantity of images, where do you plan to store them? When we were all shooting slides, they went into plastic pages and then into three ring binders (unless they stayed in boxes with illegible notes written on the side – I've got lots of those). Now the world has changed and your images are nothing more than a large number of 1's and 0's on some form of storage medium. There are a large number of possible long-term storage methods, each with pros and cons.

Most cameras use Compact Flash (CF) memory cards to record as you shoot. It's possible just to keep buying those cards, especially if you only store jpeg files. 32-gigabyte (GB) cards are down below the $100 range and getting cheaper. A 32 GB card can store several thousand jpeg images from an 8 megapixel camera. That's still a lot of images to trust to one card, though, and beware that CF cards can be easily erased.
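A quick bit of arithmetic shows why one card goes so far.  The 4 MB average jpeg and 25 MB average RAW file sizes below are rough assumptions for a camera in this class.

    # Estimate how many images fit on a memory card.
    def images_per_card(card_gb, avg_image_mb):
        return int(card_gb * 1024 / avg_image_mb)

    print(images_per_card(32, 4))   # roughly 8000 jpegs
    print(images_per_card(32, 25))  # or about 1300 RAW files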

Let's assume you make the choice to download your images to your PC (or Macintosh system). You now have many more choices for long term storage (notice I did not use the word permanent!). It's possible to add large hard disks to your system, either internal (if you are comfortable taking your system apart) or external. External drives connect by either serial or USB (Universal Serial Bus) interfaces. Hard drives hold lots of data, hundreds of gigabytes in some cases. These are somewhat expensive, but you have the luxury of having your images on line and at your fingertips. On the down side, these things are mechanical and wear out, sometimes without warning, and if your drive crashes, you may lose every bit of data on the drive.

There are also on-line storage services that allow you to store data at very low cost, sometimes at no cost. These are great as you can access your images from any computer on the internet, using a password. My only caution here is that no one can predict how long these service providers will be around and how much notice, if any, will be given if they do go out of business.

Most people store images on CDs or DVDs that they burn directly from their PC. These are flexible, very inexpensive and allow many images to be stored in a very small space. So what's the drawback here? The recording method is called burning for a reason. If you record to a CD-R or a DVD-R (as opposed to one with an RW) you are using what's known as an ablative process, where a tiny laser actually burns marks in the surface of the disc. These marks aren't permanent; corrosion can start soon after recording and eventually make parts of the data unreadable. The current technology for write-once CD and DVD products will support a usable life of about 10 years for HIGH QUALITY discs. Don't use the very cheap discs, as their lifetime is a gamble; buy the best you can get. Rewritable (RW) CDs and DVDs use a reversible recording process and, much like magnetic media, can degrade over time (and lose parts of the image), so they are not the best for long-term storage.

So, just like every other facet of photography, storage of digital images is a handful of tradeoffs between price, ease of access, longevity, etc. The nice part is that you can easily make multiple, identical copies of your images with no degradation, and you can re-record data, again with no image degradation.

My recommended approach is to use high quality CD's or DVD's and note on each one's case when you recorded the data. Then every few years copy the data on to a new disc. As technology moves forward, new, better and more permanent products will become available.

 


Sensor Cleaning:

A number of members asked about sensor cleaning, so the Digital Corner did some research.

We got 1.1 million hits on a Google™ search of “sensor cleaning digital cameras”. We also queried Nikon’s™ and Canon’s™ websites. There are two schools of thought. Canon™ and Nikon™ say use clean, dry air from a squeeze bulb. Don’t use compressed air or anything that touches the sensor. (Actually you can’t really touch the sensor; the surface that is exposed is the optical low pass filter that’s over the sensor itself.)

The other school of thought was summarized in about 35 pages of text and images on the website http://www.cleaningdigitalcameras.com/ . This site is number one on the Google™ search. It is a very good reference on all of the methods, with pros and cons spelled out clearly. Our conclusion is that if you can’t clean all of those annoying blotches and dust spots using an air bulb, you can be very brave (or maybe cavalier) and use one of the methods mentioned on the website, OR you can take your camera to a professional and let someone else assume the liability.

Once you have it clean, there are a few good rules of thumb for keeping it clean. Don’t change lenses in a dusty environment, minimize the amount of time the camera is exposed to the open air w/o a lens installed, etc. Sorry there are no magic formulas, but like everything else in photography, there are always tradeoffs!




Sensor Magnification

This month we'd like to address two things that a lot of photographers who are fairly new to digital find perplexing: why do digital cameras give an apparent magnification, and what are the tradeoffs? The apparent magnification of a digital camera starts with the relative size difference between 35 mm film and the electronic sensor used for digital image capture: "35 mm" film has an effective image size of 24 mm high and 36 mm wide for a "horizontal" image. When we photograph something through a lens, we record a certain size image of the subject on the film.

Digital cameras have sensors that vary in size, and except for a few very high end cameras, the sensor is smaller than the 35 mm image size. In the case of the Canon EOS 7D™, the sensor is about 15 mm high and 22.5 mm wide. If you do a quick calculation you'll see that the dimensions of the 35 mm image are 1.6 times bigger than those of the digital sensor.
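That quick calculation, sketched in Python (our addition; the 100 mm lens below is just an example, not from the article):

# Crop factor from the sensor dimensions given above.
film_w, film_h = 36.0, 24.0        # 35 mm film frame, in mm
sensor_w, sensor_h = 22.5, 15.0    # Canon EOS 7D sensor, in mm

crop_factor = film_w / sensor_w    # 36 / 22.5 = 1.6
print(f"Crop factor: {crop_factor:.1f}x")

# Apparent (equivalent) focal length of a lens on this body.
lens_mm = 100                      # hypothetical example lens
print(f"A {lens_mm} mm lens frames like a {lens_mm * crop_factor:.0f} mm lens on 35 mm film")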

If we assume the same conditions when we photograph the same subject, that is, the same camera-to-subject distance and the same lens focal length, the image on the digital sensor will be the same size as the image on the film plane. That's just standard photographic optics. But remember, the sensor is smaller than the 35 mm film frame.

Now the real impact of digital! The software in the camera enlarges the image to give an equivalent 35 mm image size.

In doing so, this software magnifies the image on the sensor by the amount needed to make the sensor image look like the 35 mm image; in our example of the Canon EOS 7D™, this is 1.6x.

OK, we now have a 1.6x magnification; what did it cost? If we had used a 1.6x teleconverter, we'd have lost some of the image because the angle of view decreases as the effective focal length of the lens increases. The same happens with digital, but in this case the information the lens gathered was focused beyond the edges of the sensor, so it was lost; the effect is the same as a reduced angle of view. The second thing we lose with a teleconverter is light: the effective f stop of the lens is increased by about one stop (f4 to f5.6, for example). With a digital camera this is not the case; the camera will still show the same f stop. BUT the resolution of the sensor (number of pixels per unit of area) is fixed, so a slight increase in what is equivalent to grain will be seen. Digital camera noise reduction software does a very good job of smoothing out this grain effect, so the apparent magnification gained is pretty close to free.

Now let's think about a few other things that may have slipped by in our discussion. First is the aspect ratio. That's a fancy mathematical term for the relative size of the horizontal and vertical dimensions. 24 x 36 and 15 x 22.5 have the same ratio, 2 to 3. This has a real impact on printed image size and readily explains the popularity of printing images at 8 x 12 instead of the venerable 8 x 10: 8 x 12 does not require cropping of one dimension. When digital scanning and printing became popular, the long held 8 x 10 dimension was challenged and quickly abandoned.
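A quick sketch of that ratio arithmetic in Python (our addition; the print sizes are the ones mentioned above):

from math import gcd

def aspect_ratio(width, height):
    # Scale by 2 so the 22.5 mm dimension becomes a whole number first.
    w, h = int(width * 2), int(height * 2)
    d = gcd(w, h)
    return w // d, h // d

print(aspect_ratio(36, 24))    # (3, 2): the 35 mm film frame
print(aspect_ratio(22.5, 15))  # (3, 2): the 7D sensor, the same ratio
print(aspect_ratio(12, 8))     # (3, 2): an 8 x 12 print needs no cropping
print(aspect_ratio(10, 8))     # (5, 4): an 8 x 10 print crops the long side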

The second thing to think about is some of the new lenses being marketed. If you look at the magazine ads for some new products, such as Canon EF-S™ lenses, you'll see a note indicating these lenses are only for digital cameras like the Canon EOS 7D™. This is because they focus the image not over a full 24 by 36 mm area but over the size of the image sensor. Remember we said earlier that the equivalent of a reduced angle of view was due to information falling off the edge of the sensor? That doesn't happen with these new lenses. If one of these lenses were used on a film camera body (assuming the computer in the camera would allow the photo to be taken), the result would be an image that covers only part of the film frame, leaving the edges dark.

 

 

Digital Corner - Color!

While my wife and I were coloring with our granddaughter the other day, it struck me that the range of colors available for children (of all ages) to create pictures is incredible.  Just looking at her collection of crayons was quite a revelation.  Then when I considered the fact that the computer can manage over 16 million colors, my interest was really piqued.

When we are making photographic images, how can we use color to make them better, more dramatic, more moody, etc.?  I started doing some research and came across a few tidbits that I wanted to share.  First I looked at which colors are complementary and how that could be used.  Most of us who have worked in a color darkroom remember the old color wheel.

[Color wheel diagram: complementary colors sit at opposite ends of the arrows]

It turns out that artists have long known that complementary colors, those on opposite ends of the arrows, look very good together.  That doesn't mean green and yellow don't look good next to each other, but stronger images result from combining complementary colors.  Take for instance a flower image where the blossom is a shade of magenta and the stem and leaves green.  That can make a strong image.

Another possibility, if you have an image with vibrant colors, is to go ahead and combine non-complementary colors.  If an image has brilliant red and strong magenta tones, the "wow" factor can be quite strong.  In that case it is important to have strong light producing saturated colors so the image doesn't look subdued.

On the other end of the spectrum (pun intended) is the technique of getting moody images by using subdued colors.  Here the use of complementary colors is not really of concern.  The idea with subdued or moody is to move away from saturated colors and toward more uniform, flatter light.  Using fog or overcast skies to generate soft color is a way to get these moody images.

Some other color techniques you may want to try include having the main subject be a bright color and the rest of the image more subdued.  This works well if your main subject is small relative to the size of the image.

Evening or nighttime photography offers a lot of color variations.  Artificial lights have varied and interesting colors that can be used to accentuate a subject in a new way.

Then of course, there is monochrome!  This does not have to be black and white.  Monochrome means one color!  This is something that the computer allows you to do in almost unlimited ways.

Next time you are out shooting, go beyond the composition and look at ways to use color to make your images stronger.




White Balance and Color Temperature

This month, as promised, we’ll look at white balance and color temperature. Most high end point and shoot cameras and virtually all digital SLRs have provisions for selecting white balance and color temperature. What this means is pretty simple: the photographer can select the camera’s method of representing color in the recorded image.

White balance is simply the camera’s way of looking at all colors of the spectrum and generating true white. The basic setting is AWB or auto white balance. In this mode the camera looks at the entire scene, assumes the light is about midday and comes to a conclusion as to what white should look like. The most creative white balance control is the custom one. Depending on your camera (check the manual) you can find a white object and use it to set the white balance so the camera can accurately record white in the light that’s available. You can also use other neutral colors to cause a shift in white balance but be careful; you can get some “interesting” and perhaps not too pleasant results.

If you really want to be able to alter a scene however, try the color temperature adjustment on your camera. Light is measured in “color temperature” and there is a relation between this value and warmth or coolness of the light. It’s a little confusing but the lower the color temperature, the warmer the light. This is a collision between science and art. Some examples are:

Candle light: about 1500 K
Incandescent light bulbs: about 3000 K
Morning and evening light: 3400 to 4000 K
Midday: 5000 to 5500 K
Flash: typically 6000 to 6800 K
Heavy overcast: 6000 to 7500 K

Check out the final paragraph of this article for more details.
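To see the warm/cool relationship at a glance, here is the same list as a small Python lookup table (our addition; the source names and the simple warm/cool split are illustrative, not camera settings):

# The rough Kelvin values from the list above, as a lookup table.
color_temps = {
    "candle light":      (1500, 1500),
    "incandescent bulb": (3000, 3000),
    "morning/evening":   (3400, 4000),
    "midday":            (5000, 5500),
    "flash":             (6000, 6800),
    "heavy overcast":    (6000, 7500),
}

for source, (low, high) in color_temps.items():
    midpoint = (low + high) / 2
    feel = "warm" if midpoint < 5000 else "neutral to cool"
    print(f"{source:>17}: {low}-{high} K ({feel} light)")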

There are two ways to compensate for color temperature. One is in the post processing phase (on the computer with an image processing software tool) and the other is in the camera. Anything you can do in the camera is inherently better because you are not disturbing the digital image through recompression and manipulation. It does take some experimenting to use color temperature corrections, but consider these tips:

To warm up a scene, adjust the color temperature setting to a higher (cooler) value. This is not as confusing as it first appears. The color temperature adjustment on the camera tells the software what color the light on the subject is. If you tell the camera that the light is very cool (high color temperature), the processor will “warm it up” to reach the neutral midday color. If you’re shooting at midday, try setting the color temperature to 6500 if your camera uses Kelvin values, or to the “overcast” setting if it uses named presets. In both cases the camera will warm up the scene.

Don’t rely on the LCD screen on the camera to give you instant feedback on color. These little screens are great for checking to see if you recorded the image, if the exposure was relatively OK and if your composition was close to what you wanted. They are not good for checking color, focus or details of the image. You don’t see those until you view the image on a calibrated monitor or properly produced print.

Using color temperature creatively requires the old fashioned concept of taking notes and then reviewing the final images. From this you can start to learn how to most effectively use this creative control.

By the way, for those of you interested in how color temperature measurements are determined, here is a brief physics lesson. If you were to take an object physicists call a black body and heat it until it started to glow, the color of the object would be related to how hot it is. The lower temperatures generate a warm orange glow that becomes whiter as the temperature increases. The temperature is measured on a scale known as the Kelvin scale. This uses the same size degree as the Celsius scale but is shifted so that 0 degrees Kelvin is the theoretical absolute zero.
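The shift is simple arithmetic; a quick sketch in Python (our addition, just illustrating the conversion):

# The Kelvin scale parallels Celsius, shifted so that 0 K is absolute zero.
def celsius_to_kelvin(c):
    return c + 273.15

def kelvin_to_celsius(k):
    return k - 273.15

print(celsius_to_kelvin(0))     # 273.15 K, the freezing point of water
print(kelvin_to_celsius(0))     # -273.15 C, absolute zero
print(kelvin_to_celsius(5500))  # about 5227 C, a black body glowing "midday white"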

White Balance (cont'd)

As we all migrate to the world of digital photography, we need to be prepared to learn new terms and new ways of applying the tools of this medium to image capture. Digital has brought with it a whole new set of terms that can be daunting. The other day I was in a discussion of one of these, white balance, and it became clear that this is a scary topic for a lot of people. I want to take this opportunity to explain what white balance is, for those of you who have questions, and to look at how to use it creatively.

To understand white balance, we need to look at an old friend and some basic human vision processes. First let's consider color temperature. This concept has always been part of film photography as we have used special films, flash units etc. Color temperature refers to the color of light. The concept is one we have borrowed from the world of physics and of course we have applied our own photo-related spin to it. Color temperature is the value assigned to the color of light that radiates from a theoretical object physicists call a black body. As the temperature is raised, this body starts to glow, first a dull red, then brighter red and finally a blue-white. Since it's a theoretical object, melting is not a concern. Color temperature is measured in degrees Kelvin. This is a temperature scale that is the same as the Celsius (or Centigrade) scale except it is offset so that 0 degrees Kelvin is the same as about -273 degrees C. (This value is referred to as absolute zero, the coldest temperature that can exist in nature.) In nature we have low color temperature (red) moving up to high color temperature (blue). As photographers, we look at this in the opposite way, red being warm and blue being cool but so what!

Now let's look at human vision and how a digital sensor tries to emulate it. The human eye has two types of light sensitive organs in the retina. One type is called a rod. The rods are special organs used to provide vision in very low light and do not differentiate color. The other organ is called a cone. The eye has three types of cones, which detect red, green and blue. The optic nerve conveys information from all of these sensors to the brain where the information is processed to give us vision. The designers of the sensors in digital cameras (whether CCD or CMOS) have taken the structure of the human eye and emulated it in the camera. The sensor elements or pixels are arranged so that they detect light and produce a signal that represents the amount of light energy striking the surface. In order to produce color, a series of red, green and blue filters is used in front of the sensor. This very closely emulates the concept in the human eye. This information is passed to the camera's "brain" where a software program assembles it into an image. This is where the analogy starts to break down. In the human brain we use experience to recalibrate the image we see. For example, when we read characters on a sheet of paper, the paper is white regardless of the color temperature of the light illuminating it. Just try reading a book under fluorescent lights, outside in the evening, or under incandescent lights. In each case the page appears white.
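One common arrangement of those red, green and blue filters is the Bayer pattern. This little sketch (our addition, not from the article; specific cameras may use other layouts) prints a few rows of it:

# A few rows of the Bayer mosaic: every photosite sits behind a single
# red, green or blue filter, and the camera's software combines neighboring
# values to estimate full color at each pixel.
def bayer_filter_color(row, col):
    if row % 2 == 0:
        return "G" if col % 2 == 0 else "R"
    return "B" if col % 2 == 0 else "G"

for row in range(4):
    print(" ".join(bayer_filter_color(row, col) for col in range(8)))

# Half the sites are green, echoing the eye's greater sensitivity to green.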

The digital camera tries to emulate the brain by setting a white reference or balance. To do this, it must be calibrated to what is truly white. Most cameras have three kinds of white balance settings. With auto white balance, the camera assumes that the scene has significant white content or, more correctly, a "normal" ratio of blue and red. The second is a preset white balance, where the camera allows the photographer to choose a setting based on evaluating the scene. This is similar to choosing a special film or a compensating filter to adjust for the color temperature of the light source (tungsten film, fluorescent light filters, etc.). The third option is manual white balance, and it represents the best creative tool. Manual white balance allows the photographer to set the white balance for the actual situation by using a white card to let the camera establish the correct baseline for the conditions. (Hint: a white card can easily be overexposed, so draw a black line on the card with a felt pen, take a shot, and verify on the LCD screen that the image is properly exposed and shows the line accurately.)

The manual white balance sets the camera for "normal" conditions but it can be used creatively as well. By using a light blue card instead of a white one, you can achieve a warm tone to the image, similar to adding an 81 series filter to the camera. If you wish to cool the image, calibrate the white balance with a light pink card. There are several companies that make cards for this use. They have been around for quite a while supporting the television and film industries. One is www.warmcards.com. Also, since you're not burning film, you can experiment!

There is a tendency to set white balance to auto with the idea that you can compensate with the computer later. By all means fight this impulse! White balance is a very non-linear process, treating red, blue and green differently. When you use your computer, it will change the effective color temperature of the image uniformly across the image and not give you the same color rendition as if you properly calibrated the white balance.
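To see why the per-channel correction matters, here is a minimal sketch in Python (our addition; the pixel values and gains are invented for illustration, real cameras derive their own):

# In-camera white balance applies separate gains to the red, green and blue
# channels; a single overall shift on the computer treats them all alike.
def apply_gains(pixel, gains):
    r, g, b = pixel
    gr, gg, gb = gains
    return (min(255, round(r * gr)),
            min(255, round(g * gg)),
            min(255, round(b * gb)))

tungsten_pixel = (200, 150, 100)   # warm light: too much red, too little blue
wb_gains = (0.8, 1.0, 1.6)         # per-channel gains a camera might choose
print(apply_gains(tungsten_pixel, wb_gains))        # (160, 150, 160): roughly neutral

uniform = 1.1                      # one uniform shift cannot do the same job
print(apply_gains(tungsten_pixel, (uniform,) * 3))  # (220, 165, 110): still warm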

 


Winter Photography

I know that I said we’d look at additional exposure topics this month but since we are deep into winter, I thought that it might be timelier to look at some of the aspects of winter shooting.  There are two areas that come to mind when one thinks of winter photography.  The first is equipment and the second is selection of subjects.

With regard to equipment a whole series of things jump out.  These things include care of your equipment in harsher than normal conditions, and care of the primary piece of equipment, the photographer.  The subjects available in winter are more limited than other seasons but we’ll look into that as well. The use of the creative controls to achieve great images is also an important part of the process.

Let’s look at care of equipment.  The two things that impact equipment in cold weather are moisture and, of course, temperature.  Moisture comes mostly in the form of snow and condensation from your breath.  The condensation can freeze and be a bit more insidious than snow because it builds up gradually.  Keep a soft cloth available to remove condensation from your camera (especially the LCD screen and the viewfinder).  Snow can become a problem if you shoot during a snowstorm, so a commercial lens and camera body cover, or just a plastic bag, can be very helpful.  Also consider that carrying your camera around, whether on or off a tripod, exposes it to a sudden “dunking” in a snow bank if you trip and fall.  If you’ll be in areas with potential tripping hazards, consider putting your camera and lens in your pack, or at least putting a plastic bag around them, while moving from one location to another.

Batteries are very much “allergic” to cold weather.  The typical digital (and most film) cameras use a rechargeable lithium ion battery.  These batteries provide power for everything in the camera.  You’ll find that if you are out shooting in weather significantly below around 20 degrees, the battery will appear to lose charge after a while.  Keep a spare battery or two in a warm place, like a pocket inside your cold weather jacket.  You can take the battery that has lost charge and warm it up to regain usable charge.  If you are on a multi-day shoot, be sure to bring adequate chargers so you can recharge your batteries each evening.  Unlike the older NiCad batteries, newer lithium ones don’t exhibit the half charge “memory”, so charging a partially discharged battery is acceptable.

There is an ongoing debate about winterizing cameras by changing lubricant on the shutter mechanism.  I doubt that many of us will ever be shooting in such extreme temperatures where this will be an issue.  I suggest that if you do plan to go to the deep Arctic to shoot, that you contact the camera manufacturer for guidance.

Now that we’ve protected our equipment, what about us?  Modern fabrics and materials have given us a myriad of boots, gloves, coats, pants and other garments, so comfort is no longer a pipe dream when shooting in winter.  That being said, we still need to make the best use of these items.  I could write volumes on the different clothing items available, but suffice it to say that dressing in layers and wearing boots with insulation and moisture resistance are two basic requirements.  Your layered clothing should be appropriate for the conditions (moisture resistance, insulation value, etc.).  Gloves are another key piece of equipment for the photographer.  I like to wear a pretty heavy pair when I’m out wandering and then a pair of glove liners while shooting to allow the dexterity I need.  I’ve tried the “fingerless” gloves with liners but they are a bit awkward for me.  This is a personal choice; just make sure to have adequate hand protection.

Let’s now look at subjects and creative controls.  As I said before, subject matter is somewhat more limited in winter.  That doesn’t mean a lack of subjects.  Landscapes (both large and small) abound.  Look around and see if you can find “snowscapes”, shadows, ice formations and other winter images.  Make use of the shadows, shapes and patterns to get really interesting images.  When shooting snow or ice, remember that your camera meter is designed to expose for midtones, so snow in bright sun will come out a grey/blue color.  Use added exposure, or your color temperature or white balance controls, to get proper color.  That saves time in front of the computer correcting the balance.
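The "added exposure" arithmetic is simple; here is a short sketch in Python (our addition; the metered shutter speed and compensation values are purely illustrative, not a recommendation for any specific camera):

# The meter aims for a midtone, so bright snow needs added exposure.
# Each +1 EV of compensation doubles the exposure time.
def compensated_shutter(metered_seconds, ev):
    return metered_seconds * (2 ** ev)

metered = 1 / 500  # what the meter might suggest for sunlit snow
for ev in (1, 1.5, 2):
    s = compensated_shutter(metered, ev)
    print(f"+{ev} EV -> about 1/{round(1 / s)} s")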

Wildlife in winter is a very valid topic, but more than in any other season we need to be very careful not to stress these creatures.  Getting through winter is tough enough without people adding extra stress.  Watch for the classic signs of stress in an animal and back off if even an inkling of stress is present.  Winter wildlife shots can be very good if you are careful.  Stressed animals don’t make great shots.

Take advantage of the season and get out shooting!

 

Winter (additional thoughts)

For this month I thought I'd do a short set of reminders on winter photography and how to make the most of your time outdoors during the cold weather.  First, some words of caution for you and your equipment. 

Cold weather is very hard on batteries.  Keep spares with you at all times and keep them warm.  An inside jacket pocket is a good place.  When batteries appear to drain quickly because of cold, warming them up will typically restore them.  Keep rotating between the camera and the warm pocket!  When shooting in the snow, do your best to keep your camera and lenses out of prolonged contact with snow and ice.  They are, after all, water, and when they come in contact with a warm camera they melt and can cause a lot of problems.

When you go between cold and warm with a camera and lens, especially a big lens, remember condensation.  Bringing a cold item into a warm place can cause moisture to condense on surfaces, including inside ones!  Tightly wrap your camera and lens in a plastic bag before coming in from the cold and leave them there until they come to room temperature.

While taking precautions with your equipment, don't forget about yourself!  Hats, gloves, boots and layers of clothing are very important so we avoid problems and enjoy our photographic passion.  The latest in clothing technology has been a boon for photographers.  Check out some of the new light weight, weatherproof gear.

One final note: remember proper exposure for snow!  Your camera's metering system is designed to render images as a midtone, or 18% grey.  With digital cameras we can add exposure for snow scenes, just as we did with film, or adjust white balance instead.  Re-read your camera manual and be prepared to get white snow instead of grey or blue.

Enjoy this marvelous winter season and get some great shots to share.



 
