LOMMA asked me to shoot the event, and in the small-town way that characterizes Blue Ridge country, left all the details and deliverables up to me. I think their expectations were in the unedited VHS dub range, but when I asked what they wanted, they responded, "You figure it out." What I "figured out," ultimately, was edited video authored to DVD. My last column dealt with the five Wows I tried to earn with the DVD authoring. Here I cover shooting and editing.
Far and away, my biggest gambit was to shoot the event with two camcorders. Running two camcorders in two different locations with one cameraman (in this case, me) is somewhat risky, since one accidental bump can knock either camcorder out of position. But if you pull it off, the production quality rises dramatically.
Here's how I set the cameras up. In the back was my trusty Sony DCR-VX2000, the primary camcorder that framed the entire stage and provided the main footage I would use in the production. I stationed this camera next to the soundboard, and asked the sound technician to make sure no one touched it during the show.
Camera B was a loaner Sony HC40 that I positioned just below the stage, moving from the left edge to the right edge depending upon who was playing lead on the current song. I spent most of the time driving this camera, taking two basic shots: medium shots of the performer from waist to head, and close-ups of the instrument itself.
Note that this two-camera strategy works just as well for a conference, a speech, or a wedding. One largely unattended camera captures the entire event, while the second provides the visual garnish. Just be sure to use a fluid-head tripod with Camera B; otherwise, you'll waste valuable time steadying your pans and tilts, and you'll miss critical footage.
Wow number two was capturing audio from the house sound system to the VX2000, providing a much higher-quality audio feed than any camera-mounted microphone could capture. Output from the sound system was a single XLR connector, which I fed into a BeachTek DXA-8 Adapter sitting beneath the VX2000. For backup, I attached a shotgun microphone to the HC40 and recorded audio there as well.
Back in the lab, I captured the video from both cameras and inserted the output of Camera A—the VX2000 with the wide-angle view of the entire stage—on the bottom video track. Above that was Camera B, the stage camera with close-ups of the performers. Since the event lasted slightly longer than two hours, I switched tapes in both cameras three times. Though I planned to use only the audio from Camera A, I still had to synchronize the footage from the two cameras so I could cut back and forth without the picture drifting out of sync with the sound.
I knew this synchronization challenge was coming, so I brought a cheap disposable camera with a flash to the concert. Each time I changed the tape in either camera, I quickly walked to the front and took a picture (off-camera, of course). Both video cameras captured the flash, which vastly simplified synchronizing the two streams. With that accomplished, I deleted the audio feed from Camera B and started editing.
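I did the alignment by eye in the editor, but the same flash trick lends itself to automation. As a hypothetical sketch (not part of my actual workflow), if you extract the mean luminance of each frame from both clips, the flash shows up as a one-frame brightness spike, and the distance between the two spikes is the offset you need to slide one clip on the timeline:

```python
import random

def flash_offset(luma_a, luma_b):
    """Frame offset between two clips, using a camera flash as the
    sync mark: the flash is the single brightest frame in each
    clip's per-frame mean-luminance series."""
    spike_a = max(range(len(luma_a)), key=luma_a.__getitem__)
    spike_b = max(range(len(luma_b)), key=luma_b.__getitem__)
    return spike_b - spike_a  # slide clip B by this many frames

# Simulated per-frame luminance: dim stage footage plus one flash.
random.seed(1)
luma_a = [random.uniform(20, 40) for _ in range(300)]
luma_b = [random.uniform(20, 40) for _ in range(300)]
luma_a[112] = 250.0  # flash lands at frame 112 on camera A
luma_b[127] = 250.0  # the same instant is frame 127 on camera B
print(flash_offset(luma_a, luma_b))  # prints 15
```

The frame numbers here are invented for illustration; the point is simply that one shared light event pins both streams to the same instant.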
All multitrack timelines work the same way. The top track gets precedence and, in the absence of some kind of overlay effect, obscures all tracks below. With the feed from Camera B atop Camera A on the timeline, I had three options. I could leave the video from Camera B on top, and show Camera B in the final video. I could split and delete portions of the Camera B video and show Camera A in the final video. Or, I could convert Camera B into a picture-in-picture (PIP) effect that displayed as a window within the full-screen video from Camera A.
I quickly settled into the following routine. Show Camera A, the entire stage, at the start and end of every song. Intersperse full-screen clips from Camera B when it showed a medium shot of the performer; that was Wow number three. Show Camera B footage as a PIP within Camera A when I zoomed in on the instrument, positioning the window in the bottom left-hand corner of the Camera A video, where it did not obscure any performers. PIP was Wow number four.
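In compositing terms, the PIP effect simply pastes a scaled-down Camera B frame over a corner of the Camera A frame before the result goes out. A minimal sketch, with frames modeled as 2-D lists of pixel values and hypothetical frame sizes (the editor does all of this for you in practice):

```python
def picture_in_picture(frame_a, frame_b, margin=10):
    """Overlay frame_b (already scaled down) onto the bottom-left
    corner of frame_a, inset `margin` pixels from the edges.
    Frames are row-major 2-D lists of pixel values."""
    out = [row[:] for row in frame_a]   # copy the background frame
    top = len(frame_a) - len(frame_b) - margin  # bottom-left placement
    for y, row in enumerate(frame_b):
        for x, pixel in enumerate(row):
            out[top + y][margin + x] = pixel
    return out

# A 72x128 grey "Camera A" frame and an 18x32 white "Camera B" window.
stage = [[128] * 128 for _ in range(72)]
closeup = [[255] * 32 for _ in range(18)]
composite = picture_in_picture(stage, closeup)
print(composite[61][10])  # inside the PIP window: prints 255
print(composite[0][0])    # background untouched: prints 128
```

Because the window occupies a fixed corner, keeping performers out of that corner of Camera A's framing is a shooting decision, not an editing one.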
I did not use transitions when I switched from track A to track B, but I did use a half-second dissolve when displaying or removing the PIP effect. I also slightly feathered the edge of the PIP effect to soften the hard edge of the frame. Overall, the video from the second camera—and the way it was integrated into the production—elicited far more Wows than I needed to complete my five shooting and editing Wows.
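A dissolve is just an opacity ramp applied to the incoming layer over a fixed number of frames. As an illustrative sketch (the editor computes this internally; the numbers here are only examples), a half-second dissolve at 30 fps ramps the PIP window's opacity from 0 to 1 across 15 frames, and each on-screen pixel is a weighted mix of the two layers:

```python
def dissolve_weights(duration_s=0.5, fps=30):
    """Per-frame opacity ramp for a dissolve: climbs from near 0
    to 1.0 over `duration_s` seconds at `fps` frames per second."""
    n = round(duration_s * fps)
    return [i / n for i in range(1, n + 1)]

def blend(bg, fg, alpha):
    """Mix one background and one foreground pixel value."""
    return (1 - alpha) * bg + alpha * fg

weights = dissolve_weights()      # half-second dissolve at 30 fps
print(len(weights), weights[-1])  # prints: 15 1.0
print(blend(100, 200, weights[7]))  # a pixel partway through the fade
```

Feathering works the same way spatially rather than temporally: pixels near the PIP border get intermediate alpha values instead of a hard 0-to-1 step, which is what softens the frame's edge.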
Note that these efforts required lots of advance planning. Days before the event, I tested the connection to the sound system, assessed the various camera setup positions, and checked white balance and exposure settings under the planned concert lights. I also checked when breaks were planned, so I could change tapes during pauses and avoid gaps in the primary audio track on the VX2000.
In addition, note that working with two cameras significantly increased editing time, not only because capture time doubled, but because my editing options doubled as well. Though this was volunteer work, the Wows more than justified the time and effort involved. They always do.