Bracketing your exposures and automatically blending them together in your preferred software is probably the handiest way to get the most dynamic range from your camera without introducing a load of noise. But how can you blend shots accurately when your subject is moving?
In this video, Spencer Cox explains clearly and concisely how to avoid the ugly, noticeable ghosting you get when your software spits out an HDR photo blended from exposures of a moving subject, such as tree leaves against a sky. For landscape photography, I would always recommend a tripod and manually blending exposures in Photoshop; you have more control, and the final product looks much more natural. However, sometimes you just want to take a quick snap and move on. But when you've got deep shadows and a bright sky, a single shot isn't enough for most cameras to capture the full dynamic range of the scene.
Image averaging works on the assumption that noise in an image is random, so it fluctuates from frame to frame. Where HDR takes a dark exposure of a sky and a bright exposure of a shaded area and attempts to blend them together, image averaging doesn't require different exposure levels. Instead, it takes a stack of images shot with exactly the same settings and averages them: the signal is identical in every frame, so it survives, while the random fluctuations in each pixel's brightness partially cancel, pulling each pixel toward its true value (the noise itself averages out toward zero).
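A minimal NumPy sketch of the idea. The flat gray "scene," the noise level, and the frame count here are all made up for illustration; real frames would come off the camera, but the averaging step is the same:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a real scene: a flat gray patch "captured"
# 16 times with identical settings, each frame corrupted by random noise.
true_scene = np.full((100, 100), 128.0)
noise_std = 10.0
frames = [true_scene + rng.normal(0.0, noise_std, true_scene.shape)
          for _ in range(16)]

# Image averaging: the mean across the stack keeps the signal (it is the
# same in every frame) while the random noise partially cancels out.
averaged = np.mean(frames, axis=0)

print(f"single-frame noise: {np.std(frames[0] - true_scene):.2f}")
print(f"averaged noise:     {np.std(averaged - true_scene):.2f}")
```

With 16 frames, the residual noise in the averaged image comes out around a quarter of the single-frame noise, which is exactly the square-root behavior described next.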
The rough rule is that your noise will drop by the square root of the number of images averaged. So, if you take four images, you can reduce your noise by half, but if you want one quarter of the noise, you need to take 16 images. As Cox notes in the video, the main drawback of image averaging is this law of diminishing returns.
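The square-root rule is easy to verify with simulated noise (the noise level and frame counts below are arbitrary, chosen just to show the trend):

```python
import numpy as np

rng = np.random.default_rng(1)
noise_std = 8.0  # assumed per-frame noise level for this simulation

# Averaging N frames of pure random noise should leave a residual
# noise level of roughly noise_std / sqrt(N).
results = {}
for n in (1, 4, 16):
    stack = rng.normal(0.0, noise_std, size=(n, 500, 500))
    results[n] = np.std(stack.mean(axis=0))
    print(f"N={n:2d}: measured {results[n]:.2f}, "
          f"predicted {noise_std / np.sqrt(n):.2f}")
```

Note how each step of the trend costs four times as many frames as the last: that quadrupling for every halving of noise is the diminishing return Cox mentions.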
Do you use image averaging to deal with noise in your photos?