Here's another alternative to consider: If you can get a consumer camera that shoots HD for just a couple hundred bucks, why not load up on the cameras and get multiple angles of an event for next to no cost? And since you can move them around easily, why not perch them in unusual places so you don't need a half-dozen video camera operators? Sounds great, doesn't it?
Well, the reality is that the rolling shutter CMOS image distortion in video cameras is just as prevalent in digital still cameras. You can easily see it when you bounce the camera up and down lightly or pan the camera side to side: things that you do routinely when you are recording video with the camera in your hands instead of on a tripod. These motions distort the image from what exists in reality. Camera flashes are partially represented across multiple frames. When you play these images back, they look completely unnatural.
To quantify these CMOS distortions, I secured two brand-new digital still cameras (Figure 1, below) that shoot HD video and pitted them side by side in some critical tests. The results clearly demonstrate the difference between CMOS and CCD when it comes to capturing video that faithfully represents what happened.
The first contestant in this competition is Canon's 10 MP CMOS-based SX1 IS. Previously available only outside the U.S., it has garnered rave reviews for the image quality of its full 1920x1080p30 HD video. This camera is nearly identical to the 10 MP CCD-based SX10 IS. However, even though there are more than enough pixels, the CCD model does not record any flavor of HD; it shoots only 640x480p30 SD.
The second competitor in this test is Canon's newer 12 MP CCD-based SX200 IS. The "IS" in these camera model names means "image stabilized." Since both the cameras tested are IS models, I'll omit that part of the model name from the rest of the article.
The SX200 has a completely revamped menu system and records 1280x720p30 HD video. It's not as high resolution as the "Full HD" SX1, but I'll leave it up to the reader to decide if accurate video is more important than the number of pixels per frame. As I've previously noted on my blog, TechThoughts.org, CMOS image problems are so widespread that other companies warn against using them, lest shooters encounter the image distortion associated with the rolling shutter. Apple goes so far as to say, "Camcorders that use CMOS image sensors often use a technology called a 'rolling shutter.' This exposes different parts of the frame at different times until the entire frame is fully exposed. If the camcorder is moved before the entire frame is fully exposed, the resulting image may appear distorted. Sometimes, video stabilization [in editing software] may make this distortion more apparent. Videos that show signs of distortion due to your camcorder's rolling shutter may not be suitable for use with video stabilization [in editing software]."
So there's image stabilization in the cameras, and there's image stabilization in the editing programs, but neither can fix the distorted images that CMOS chips record. But how bad and how critical is it? What does it look like? There are individual examples of rolling shutter-derived distortion out on the web, but I haven't seen head-to-head tests with two similar cameras with different chips. So this is what I tested.
For this comparison, I chose to test three things: panning distortion, tilting distortion, and flash distortion. By distortion, I mean a distortion from reality, not that the image gets messed up in and of itself, although I have seen examples of CMOS chips exhibiting that on the web. I wanted to test these distortions head-to-head, CCD to CMOS: same panning speed, same tilting speed, same motion in front of the camera with the flash going off.
To do this, I mounted both cameras to one tripod head with several Manfrotto clamps (Figure 2, below). When I moved the handle of the tripod, both cameras had the same exact panning speed and motion. I could not favor one camera over the other in any way. They would also have the same tilting speed. They would be inches from each other as a flash illuminated my little daughter while she walked around the house.
Almost all the images you see here are frame grabs from the video. They are all full resolution and have only been converted to JPEG for the web and not retouched in any way. The first image of the vertical strips is a digital still from the SX1, not video. Images of the cameras themselves were taken by another camera: a CCD-based 5 MP Canon S2 IS.
To begin, I set up two tall, white strips on a brown wall (Figure 3, below). This would clearly illustrate if the motion blur was horizontal or diagonal. I panned in both directions at multiple speeds, and I found that the distortion was evident at every speed.
The rolling shutter of the CMOS chip does not "snap" a frame; it gathers the image one row of pixels at a time until it has read the entire chip. This approach presents no problem if there is no movement in front of the camera. However, if what is in front of the camera moves, or if the camera itself moves, whatever you are trying to record is in a different place by the time the camera gets around to recording pixel row two, and in a different place again when it records pixel row three, and so on. Accumulate those errors row after row and you get images that bear no resemblance to reality.
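To make the row-by-row readout concrete, here is a minimal numerical sketch of the skew. The frame size, readout time, and pan speed below are assumptions chosen for illustration, not measurements from either camera:

```python
# Minimal rolling-shutter skew sketch. All numbers are illustrative
# assumptions, not measured values from the SX1.

FRAME_ROWS = 1080        # rows in a 1080p frame
READOUT_TIME = 1 / 30    # assume readout spans the whole 1/30 s frame period
PAN_SPEED_PX_S = 600     # apparent horizontal scene motion, pixels per second

def apparent_x(true_x, row):
    """Horizontal position at which a vertical edge is recorded on a row.

    Each row is read slightly later than the one above it, so the edge
    has drifted by pan_speed * row_delay pixels by the time that row
    is captured.
    """
    row_delay = READOUT_TIME * row / FRAME_ROWS
    return true_x + PAN_SPEED_PX_S * row_delay

top = apparent_x(500, 0)
bottom = apparent_x(500, FRAME_ROWS - 1)
print(f"the 'vertical' edge slants {bottom - top:.1f} pixels across the frame")
```

Reverse the sign of the pan speed and the slant flips direction, which is exactly what Figure 4 shows for the left and right pans.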
While some may argue that it's all motion and viewers aren't paying attention, the reality is that we notice when something leans forward and back or bends left and right, whether the camera is panning or not. Moreover, this distortion is also evident on things that move within the frame: cars driving by, cyclists, runners, golf swings, kids playing soccer, animals' legs moving, and the like. Anything that moves is distorted to some extent. The amount of movement determines the amount of distortion.
In the top half of Figure 4 (below), you can see that panning the CCD camera across the white vertical panels yields strips that are sharp on their top and bottom edges and blurred on their sides, but still vertical. This shows motion but properly represents the vertical panels. The CMOS camera turns the vertical panels into diagonals that slant according to the panning direction (in the right pan, the strips slanted in the opposite direction). So the mind sees this moment as white panels that are leaning, which is not what is really there.
What's occurring here is that the CMOS chip is slowly gathering the image from top to bottom. As the camera pans to the right, the vertical white strip moves to the left. Over the course of scanning the CMOS chip for the frame, the white vertical strip almost moves one complete width sideways. Move back and forth a couple times and you'll really question whether those strips are actually vertical.
Another issue is that the CMOS rolling shutter distortion is not limited to horizontal panning. Vertical tilting is also affected. To demonstrate this, I tilted the cameras up and down on the same strips. For this section, I did do a bit of image editing: I combined an image from when the cameras were not tilting with the cameras tilting up and down. This way you can more easily see if there was any sort of distortion from reality (Figure 5, below).
In the top image of Figure 5, you can see that the CCD camera has blurry edges on the top and bottom, but the size of the wall remains the same as in the original still image. Interestingly, the white band seems to have become longer, while the dark brown wall has become shorter. This is because the white band throws more light into the camera than the darker wall. I split the image down the middle, and you can compare the left and right bands on the first image to see their comparative length.
The CMOS comparison (Figure 5, bottom) is much more interesting. When the camera tilts down, the wall moves up through the frame while the CMOS chip is scanning down its rows. With each successive row, the wall has moved farther up, so the scan reaches the bottom of the wall prematurely. This makes the wall look several feet shorter than it is.
Conversely, when you tilt the camera up, the wall moves down the image, and as the CMOS chip scans down the rows, it finds the same piece of wall over and over again. This seemingly extends the wall several feet taller than it is. The center image in the bottom of Figure 5 is the same wall from the same position and the same zoom. I compiled this image by taking the left white band and sliding it to the left and taking the right band and sliding it to the right, revealing both bands in the static image for comparison. This can be verified by the amount of tree evident over each white band.
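The same row-by-row model explains both tilt results. If the scan marches down the chip at a fixed rate while the scene moves up or down, a little algebra gives the factor by which vertical features are compressed or stretched. The scan rate and scene speed here are assumed values for illustration, not measurements from these cameras:

```python
# Sketch of rolling-shutter vertical scaling during a tilt. Numbers are
# illustrative assumptions, not measured values.

SCAN_RATE = 1080 * 30   # rows read per second (1080p at 30 fps, full-period readout)
SCENE_SPEED = 5000      # rows per second the scene appears to move during the tilt

def recorded_scale(scene_speed, scan_rate, moving_up=True):
    """Factor by which the rolling shutter scales vertical features.

    The scan proceeds top to bottom. A scene moving up (camera tilting
    down) meets the scan line early, compressing the image; a scene
    moving down (camera tilting up) keeps being re-found by later rows,
    stretching it.
    """
    if moving_up:
        return scan_rate / (scan_rate + scene_speed)   # < 1: compressed
    return scan_rate / (scan_rate - scene_speed)       # > 1: stretched

print(f"tilt down: wall recorded at {recorded_scale(SCENE_SPEED, SCAN_RATE, True):.2f}x its height")
print(f"tilt up:   wall recorded at {recorded_scale(SCENE_SPEED, SCAN_RATE, False):.2f}x its height")
```

The asymmetry matches the photos: the compression and stretch factors are not mirror images of each other, because the relative speed between scan and scene differs in the two directions.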
Regardless of whether you're tilting or panning, CMOS chips clearly distort the reality that they are being counted on to capture.
For the flash test, I enlisted my 1-year-old daughter to walk around and show me the things she likes to play with while I carried the dual camera mount in one hand and fired a separate camera flash with the other. This simulates video you might record while other people are taking pictures. The flash was not connected or related to either of the two digital cameras in any way.
Figure 6 (below) shows the frame before the flash and the frame(s) of the flash as captured by the CCD camera. This is just one moment of time represented in still images. There is a bit of horizontal offset between the two cameras because the subject is only 3' from the two cameras. This does not affect the test.
The CCD image represents the flash as a full-on, blinding blanket of light. As many of us have experienced, that is a very accurate representation of how a flash feels and looks in reality, including the temporary blindness that you get.
In Figure 7 (below), two separate CMOS frames (middle and bottom) show the strobe. It appears in the bottom part of the first frame, then the top part of the second frame. Does this mean that the strobe fired twice, once for the first thirtieth of a second and then a little bit more another thirtieth of a second later? That's what the frames of video say. If you look closely, you'll also see that there is a portion of the subject that is never illuminated by the strobe (her top three fingers holding the bowl). How does that happen in reality?
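A simple timing sketch shows how a millisecond flash ends up split across two frames, lighting only the rows whose exposure happens to overlap it. The frame rate, readout, and exposure figures below are assumptions for illustration, not measurements from the SX1:

```python
# Sketch of why a brief flash lights only part of a rolling-shutter
# frame, and can straddle two frames. Timing values are illustrative
# assumptions, not measurements from the cameras in this article.

FPS = 30
ROWS = 1080
READOUT = 1 / FPS            # assume readout spans the full frame period
EXPOSURE = 1 / 120           # per-row exposure time
ROW_TIME = READOUT / ROWS    # delay between one row starting and the next

def lit_rows(frame, flash_time, flash_len=0.001):
    """Rows of `frame` whose exposure window overlaps the flash."""
    lit = []
    for row in range(ROWS):
        start = frame / FPS + row * ROW_TIME   # row begins exposing
        end = start + EXPOSURE                 # and stops here
        if start < flash_time + flash_len and end > flash_time:
            lit.append(row)
    return lit

# Fire a 1 ms flash just before the frame boundary at t = 1/30 s.
flash_at = 0.0331
for f in (0, 1):
    rows = lit_rows(f, flash_at)
    if rows:
        print(f"frame {f}: rows {rows[0]}-{rows[-1]} catch the flash")
```

With these numbers, the flash lands on the bottom rows of one frame and the top rows of the next, which mirrors what Figure 7 captured; any rows whose exposure misses the flash entirely, like the fingers in the frame grab, are simply never lit.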
Moreover, my test results suggest that this particular CMOS division of the image is not repeated. Each time it's random: top, bottom, a single frame or two frames, a partial frame, and so forth. Several times it followed a "bottom, then top" sequence, which looks very weird on video; the sequence appears backward, as if there were two flashes.
This footage would be especially troublesome to watch if you slowed it down, an effect commonly used for event video. Imagine your subjects coming into a room, arms up, greeting the cheering crowd. You want to slow down this shot for a video montage. Well, those partial flashes will now last much longer, and their random nature will become very visible.
The test video clip shows clearly how the CMOS chip misrepresented the various flashes. For event video where there will be flashes going off, this is a serious concern. If your CMOS camera will be used handheld, following motion (such as dancing) while pictures are being taken, you can understand how the images you capture will not accurately represent reality.
In comparison, a CCD chip faithfully captures motion, fast and slow. It also properly represents flashes of light to be the single, momentary blowouts that they are. And these differences will persist whether you're working with a still camera that does video or a more conventional CCD or CMOS-based video camera model.
When you select a camera to capture precious moments for posterity, remember that the images you record will be all that is left years or decades from now. Do you want them to faithfully represent the people and action that were in front of you? Or are you looking to add something creative to your video? CMOS distortions can be used creatively, provided that you know what will happen and that you deliberately want to make images with the peculiarities inherent to CMOS-generated images of motion or strobed images.
As CMOS chips are embedded in more and more consumer and pro gear year after year, I sincerely hope the manufacturers will figure out how to correct the chips' inherent rolling shutter distortions so that we can record undistorted images. Camera manufacturers already use software to correct for chromatic aberrations from inexpensive lenses focusing light onto ever-tinier chips. Software can do amazing things. Perhaps the engineers working on upcoming technology will figure out how to gather the data from a CMOS chip all at once, much like what happens with a CCD. Let's keep our fingers crossed.
Anthony Burokas (VidPro at ieba.com) of IEBA Communications has shot award-winning corporate video internationally and recorded events since the days of 3/4" tape. He is currently technical director for the PBS series Flavors of America and resides just outside of Dallas.