
Crossing The MMAxis

January 24, 2012

When you cover a sport like hockey, basketball or football, there is a clear, defined direction of play, that being east-west, or for the viewer, left-right. However, what happens in a sport when there is no clearly-defined direction of play? With MMA becoming more and more mainstream every day, and more events being televised, I felt it time to write down my thoughts on where the line is drawn in crossing the axis in MMA.

However, before we go there, let’s talk about what crossing the axis is and how it affects the broadcast of a sport. Let’s take football as our example. When we watch a game on TV, the main game camera is set up in the stands around the 50-yard line. With the camera here, the action goes from left to right (endzone to endzone). Thus, the axis is created…an imaginary line that runs through the field parallel to the direction of play. If a camera is placed on the opposite side of this line, then we create a “reverse angle”. However, if we switch to it live, then a jarring effect happens. If a player is running the ball right-to-left on the screen, and we cut to our reverse-angle camera, that player is now running left-to-right, making it seem as if the player is going in the wrong direction.

This is why reverse-angle cameras are only used for replays. We’ve established the direction of play by watching it from beginning to end, so we know that the player isn’t actually going the wrong way. To avoid crossing the axis, all cameras are placed on one side of the axis. Cameras placed dead-centre in the endzone are okay, as you have the full 180-degree axis to work with. This is why netcams in hockey and above-the-rim cameras in basketball are allowed…they sit on the axis without crossing it.

Now, that brings us to MMA. By the very nature of the field of play (in this case, the cage), we create a sport that has a true 360-degree direction of play…or, more realistically, no real direction of play. Grappling can take place anywhere in the cage, and wins can happen at any spot. There is no goalline, no net…no target to reach. In this sense, an axis cannot be defined.

But let’s take the sport out of the equation. Let’s look at this purely from a broadcasting point of view. First, let’s look at the placement of your cameras. For the most part, you’ll usually only need 3 handheld cameras cageside. You’ll probably have a jib too, but we’ll put that aside for now. Around the cage, your cameras will, more than likely, be set up in a triangular formation. What this creates is a unique situation. With no clear direction of play, we can treat our shot as being ON the axis, rather than facing the axis.

Now, the rules of the axis in any sport dictate that, if you switch to a camera on the axis, you can then go to any camera you want, on either side of the axis. The reason…because by switching to a camera on the axis, we have actually changed the direction of play for the viewing audience. In the case of hockey, if we go to the behind-the-net camera, the action now goes north-south. We can now safely switch to a reverse-angle shot, because from north-south, it doesn’t confuse the viewer to switch to either east-west or west-east. This is the same reason jibs are so useful…they can cross the axis in one fluid motion, but because it’s the same shot, you actually witness the axis being crossed, so you’re no longer disoriented.

Now that we’ve established that we can flip back and forth across the axis so long as we start on the axis, we allow ourselves the ability to switch to either camera 2 or 3 from camera 1 in our triangular camera positioning around an MMA cage. And since we have established that there is no direction of play, we actually create a new axis with every camera change. Or, if you’re a stickler for rules, consider this…since there is no set direction of play, it can be assumed that there is an infinite number of directions for play to go…therefore, an infinite number of axes for us to use. So, no matter what camera you use, you are technically ON an axis and can therefore switch to any camera.
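
If you like seeing an argument like that made concrete, here’s a toy model of the idea in Python. To be clear, this is purely my own illustration…the angles and function names are made up for the sake of the sketch, not pulled from any switcher or broadcast standard.

```python
# Toy model of the axis rule (my own illustration, not a broadcast
# standard). Cameras sit at angles around centre field; the axis is a
# line through centre along the direction of play. A live cut is safe
# if either camera sits on the axis, if both sit on the same side, or,
# as in MMA, if there is no direction of play at all.
import math

def side_of_axis(camera_deg, axis_deg):
    """+1 or -1 for the side of the axis a camera is on, 0 if on it."""
    s = math.sin(math.radians(camera_deg - axis_deg))
    if abs(s) < 1e-9:
        return 0
    return 1 if s > 0 else -1

def cut_is_safe(cam_a_deg, cam_b_deg, axis_deg=None):
    if axis_deg is None:          # MMA: no direction of play, no axis
        return True
    a = side_of_axis(cam_a_deg, axis_deg)
    b = side_of_axis(cam_b_deg, axis_deg)
    return a == 0 or b == 0 or a == b

# Football, axis running endzone to endzone (0 degrees):
print(cut_is_safe(90, 270, axis_deg=0))  # False -- classic reverse angle
print(cut_is_safe(90, 0, axis_deg=0))    # True -- endzone cam sits on the axis
# MMA, two of our three cageside cameras, no axis to cross:
print(cut_is_safe(0, 120))               # True
```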

So there you have it…a clear case of breaking the rules by knowing the rules. Granted, this is all in theory, but having done a number of MMA shows, the theory holds. No shot seems out of place or mirrored, as in left-right sports. I would love to hear your thoughts on this, because as with all things in television, things change and rules become mere guidelines. So please have your say and let the discussion begin!


It’s More Than Just Words On The Screen – It’s A Message

January 11, 2012

Over the course of an edit session, you may spend hours or longer fretting over a number of edit choices. Tempo, music, colour tinting, which take to use…the list goes on and on. Let me throw in another decision that, whether you realize it or not, can have a huge impact on your project.

Choice of font.

Now, before you write this off as just a perfectionist nitpicking, have a quick watch of this video, then we’ll continue.

Admittedly, this is completely overboard, but the wrong font choice can have a negative impact on the effectiveness of what you’re trying to accomplish. The problem is, with so many font choices out there, you could agonize over which font best works for your piece. Add to that the choice of bolding the font, or writing in italics, the placement on the screen, upsetting the visual balance of an image…all of this goes into adding font. Here are some things to consider when it comes time to put words on the screen.

1) Does the font match the feel of the piece?

If you’re doing a fact board about deaths caused by second-hand smoke over the last 5 decades, I’m pretty sure you don’t want to write it in Comic Sans. You laugh, but some people really like that font. The truth is, not all fonts are universal. Think about what you’re trying to convey, then really look at it typed in a number of different fonts. Here are a few examples for you…a simple message, but notice how the change in font can completely change the impact.
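
The examples I had in mind were images, but if you want to run the experiment yourself, here’s a quick sketch using the Pillow library that renders the same message in a few different fonts. The font file paths are placeholders…swap in whatever fonts actually live on your system.

```python
# A quick experiment with Pillow: the same message rendered in several
# fonts, stacked in one image. The .ttf paths are placeholders -- point
# them at fonts installed on your own system.
from PIL import Image, ImageDraw, ImageFont

MESSAGE = "We need to talk."
FONT_FILES = [  # hypothetical paths, for illustration only
    "/usr/share/fonts/truetype/dejavu/DejaVuSerif.ttf",
    "/usr/share/fonts/truetype/dejavu/DejaVuSans-Bold.ttf",
    "/path/to/Comic_Sans_MS.ttf",
]

canvas = Image.new("RGB", (720, 90 * len(FONT_FILES)), "black")
draw = ImageDraw.Draw(canvas)
for i, path in enumerate(FONT_FILES):
    font = ImageFont.truetype(path, 48)
    # Same words, same colour, same placement -- only the font changes.
    draw.text((20, 20 + 90 * i), MESSAGE, font=font, fill="white")
canvas.save("font_comparison.png")
```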

You see how just a simple change in font can change the meaning intended by the words? The font you choose can put more emotion into your message than your choice of words. Just having the words on-screen isn’t enough anymore. Consider this…how many times have you been waiting in line somewhere and there’s a TV on that you can see, but you can’t hear? Or in a restaurant or bar? You need to be able to convey the same sentiment visually as you do audibly. Also…choose one font and stick with it. If you have 5 different fonts on the screen, by the time the viewer has deciphered and read everything, the words are gone. Simple and clear are the key.

2) Does your screen have a proper balance to it?

Believe it or not, where you put the words on the screen in relation to what’s behind the font can affect what people see. It’s called screen balance…creating a weighted symmetry to your screen, so that no one thing overpowers or is overshadowed by something else. Consider this…you have a picture of a person on the right side of your screen, and you have some font you need on the screen. If you put it across the bottom of the screen, the left side is empty, making your picture right-heavy. It’s the same reason why, when you shoot a sitdown interview with two people, you make sure the framing is the same on both shots. That way, going back and forth from one camera to the other is not jarring. Also, if you have a tight, close-up shot on the right, use a bigger-sized font. A looser picture…smaller font size. The key here is symmetry. Make your screen balance to avoid skewing the message.

3) Watch out for what’s behind the words!

Have you ever wondered why there are options for drop shadows and borders/edges on your font? Allow me to enlighten you…

You should be conscious of the colour scheme of the shot, and keep it in mind when choosing your font colour. A simple drop shadow and edge can also help differentiate the font from the background. The last thing you want is for the message to be lost in the medium…or worse yet, misinterpreted, as in the video example above.

Above all else, though, is clarity. This is a good time to bring in the buddy system. If you bring in a friend and they have to openly question what is written, or ask you to bring it back so they can re-read it…change the font. TV is a visual medium…make the visuals clear and easy to read, and the piece will be better for it. But, above all else…have fun!

The Buddy System

December 30, 2011

Any editor will tell you that the actual act of editing is a solitary experience. You get into your suite, lock the door, black out any windows and don’t come out until your masterpiece is complete. The problem is reaching the endgame with a sense of objectivity. By being so secluded and glued to every edit, you don’t have the ability to properly judge what you have done. Think I’m kidding? Go back and take a look at some of your earlier works. Go on, find that box of unlabeled DVDs, old VHS tapes with nothing on them but your name, or whatever medium they live on, and watch it. I’ll wait.

Done yet?

Good. Now, what did you see? The answer in all cases should be “something I didn’t see back then”. Some of that you can attribute to “if I knew then what I know now”, but that’s not always the case. Whenever we purposefully do something in the edit suite, we do it because it seems like a good idea at the time. We have some kind of brainstorm/post-production moment of illumination and set the thought process in motion, assured that it is the absolute best thing we could possibly do for the project. The problem is, most of the time, you’re the only one judging the piece, and unless you have the ability to come back and look at it days later before submitting, you’ll still think that it’s the best piece you’ve ever done.

The problem is the isolation factor. Editors get into a zone, and when they’re in that zone, nothing can get them out of it. Think about every step you take in putting a piece together. You load the material, most times watching it while it’s loading and making mental edits as you go, going through the clips meticulously and placing them in order, fretting over every shot and audio edit, then fine tuning and sometimes colour correcting for hours or even days. After so much edit time on one project, you have a perfect picture in your head of how it should look and sound…but the end result isn’t necessarily what you envision. And because you have such a clear picture, you can’t see the true edit from your mental projection.

Now, the last thing anybody wants is someone in that small broom closet of an edit suite with you for every single minute of the edit session questioning your every splice and split edit. Most times, it not only results in lost time, but countless screaming matches and multiple levels of frustration. This is where I like to rely on the Buddy System as a form of instant criticism. Now, before you go and get your best friend and volun-tell him/her that they have to watch everything you edit, make sure you pick your Buddy well. Here are some criteria to look for…

1) Find someone with an idea of what it is you are trying to do. You want to recruit someone who watches a lot of TV, and may even be in the industry themselves. One thing you want to avoid, though, is another editor. While they’ll be able to see things that you may have missed in your tunnel-vision state, like frame flashes, they’ll also be watching it as an editor, and making their own editorial decisions. You’ll most likely get responses like “oh, I wouldn’t do that”, or “are you sure that’s what you want to do”, not necessarily based on what they feel the piece needs, but based on how they would have edited it. No two editors will cut a piece the same way, so be wary of someone who wants to inject their own editorial thought process into it. Camera operators are good choices, as are audio operators. That way, you’ll get both the video and audio perspective.

2) Find someone who isn’t afraid to say your stuff stinks. How many times have you shown someone a piece of work that you have done, only to have them say “Oh, that’s nice”, as the answer to any question you ask them about it? They could be saying it’s nice because they don’t want to hurt your feelings, because they don’t know what else to say…or maybe because it’s genuinely nice. Either way, it’s not CONSTRUCTIVE criticism. Finding someone who isn’t afraid to make you see something, or who isn’t shy about saying that something really doesn’t feel right, is rare. If you find that person, have them on speed dial. That opinion is valuable. Just don’t piss them off by returning the favour and saying that something they’ve done is “nice”.

3) Find someone outside the project. If someone has spent the same amount of time shot listing the material and knows it as well as you do, then they too are too close to the project. Remember…the people watching your stuff in the end have never seen a second of the raw footage, so they go in with a clean slate and will take everything in that state of mind. So for your proof watch, get someone who has an equally clean slate as your first-time viewers. You’ll get a true sense of the effect and impact your work will have on its intended audience.

Now, this may seem like a lot to ask of someone, so get a few people. That way, if you have to make changes after the first run-through, you’ll still have a fresh set of eyes once you’ve made them. If you can get a review group together, then you’re laughing. Just be good to them. Otherwise, everything you do will be “nice”. And always remember that constructive criticism is meant to be constructive. Don’t be deconstructed by it. In the end, it’s about what’s best for the project, and your work will undoubtedly be better for it.

To Jump Or Not To Jump

December 29, 2011

There’s an old saying in television – “you have to know the rules before you can break the rules”. I don’t think there’s a single traditional edit rule that this applies to more than the Jump Cut. Considered a serious no-no in some video venues, it can be an effective tool if done right, and in the right context. But where is it right and where is it wrong, and is there a level of right and wrong in those contexts?

For newer editors currently scratching their heads because they aren’t familiar with the concept, a jump cut is a cut from one shot to another in which the location, framing and positioning of the shot are very similar, bordering on identical, but the subject in the shot has moved, causing it to appear as if the subject has instantly jumped from one position to the next. Now that I’ve described it, you’re probably saying to yourself, “Oh, like in…”. That’s where the dilemma/debate comes in. How can something considered to be a “do not” have so many obvious examples where it works?
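
As an aside, that definition is mechanical enough that you could even sketch a crude detector for it. Here’s a toy version of the “same framing, subject moved” idea using OpenCV histogram comparison…the thresholds are uncalibrated guesses for illustration, not anything from a real QC tool.

```python
# Toy jump-cut heuristic (an illustration, not a production tool):
# compare the frame just before a cut with the frame just after it.
# Near-identical composition that isn't literally the same image is the
# "same framing, subject moved" signature. Thresholds are uncalibrated
# guesses. Requires opencv-python; frames are BGR numpy arrays.
import cv2

def looks_like_jump_cut(frame_before, frame_after,
                        similar=0.90, identical=0.995):
    hists = []
    for frame in (frame_before, frame_after):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        hist = cv2.calcHist([gray], [0], None, [64], [0, 256])
        hists.append(cv2.normalize(hist, hist).flatten())
    score = cv2.compareHist(hists[0], hists[1], cv2.HISTCMP_CORREL)
    # Very similar framing, but not the same shot simply continuing.
    return similar < score < identical
```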

First, let’s take a look at where it is a no-no, and for that, you need look no further than your TV at 6pm every night. News editing is one of the last bastions of a genre of editing that strictly abides by all of the fundamental rules of editing. Never crossing the axis, b-rolling over jump cuts, low BG sound under broll to fill out the audio spectrum, letting action in the shot begin and end…all of these rules are adhered to religiously by news editors. The reason is that, while I consider editing to be 10% technical and 90% artistic, the end goal of editing in news is to inform and tell a story, and as with most stories, being too artistic detracts from the end message. The viewer shouldn’t have to deduce intent from artistic style. That’s why you can hear clear audio edits in interview clips under broll…because we don’t want you to see the edit, as it would result in a jump cut.

Now, before we go any further, let’s take a look at examples of things that aren’t jump cuts, and why…

1) Dissolves – this is probably the easiest way of getting around the jump cut, as it smoothes the transition between the two shots, takes away the instantaneousness of the cut and creates the illusion of time change. You see this sometimes in sports highlight cutting. If Mariano Rivera of the New York Yankees were to strike out the side in the 9th, you wouldn’t want to just cut between the three strikeouts because, more often than not, you only have one camera shot to edit from…the main game camera. Putting a dissolve between each strikeout helps us see the transition of time while smoothing the harshness. For a non-sports analogy, think about how many times you see a shot of a clock, and then time passes. How do they do it…by dissolving to a later time. Technically speaking, if they were to just cut between the two shots, it would be a jump cut, as the framing and composition would be the same.

2) Stop-motion animation – If you’re really strict about the definition and follow it to the letter, then all stop-motion animation is a jump cut. The framing stays the same, the composition is usually the same, only the subject changes position. However, since the frequency of the usage of jump cuts is what creates this animation style, it can’t be considered the breaking of an edit rule. The same goes for…

3) Time Lapse – While this example doesn’t really carry into the world of non-linear editing, since pulling off a time lapse now takes only a couple of keystrokes, let’s go back to the days of tape. Back then, time lapse meant shooting sometimes as little as one frame per second. Then, you would take all those frames and put them together at normal speed, creating a clip that ran at 30 times normal speed (the math is simple, as the sketch below shows). The key here is that to perform time lapse photography, you must in essence commit a jump cut. However, as the old saying goes…once is a mistake, twice is an oversight, three times is an effect. Time lapse photography requires a lot of jump cuts, therefore it’s an effect.
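
For the record, that time-lapse arithmetic is easy to sanity-check in a few lines…the speed-up is just playback rate over capture rate, using the 1 fps and 30 fps figures from above.

```python
# The time-lapse math from above: the speed-up factor is simply the
# playback rate divided by the capture rate.
def time_lapse(capture_fps, playback_fps, real_seconds):
    speedup = playback_fps / capture_fps
    return speedup, real_seconds / speedup

# One frame per second, played back at 30 fps: 30x normal speed, so an
# hour of real time becomes two minutes of screen time.
speedup, screen_seconds = time_lapse(1, 30, 3600)
print(f"{speedup:.0f}x speed, {screen_seconds / 60:.0f} minutes on screen")
```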

Now that we’ve cleared that up, let’s deflate the elephant in the edit suite and admit that jump cuts are also considered an artistic style. Some of the earliest examples of the jump cut as style can be seen in the works of Jean-Luc Godard. But you don’t have to go back that far for an example. Any show that has a handheld, documentary-style look to it uses jump cuts to add to the “grittiness” of the mood. Fans of Homicide: Life on the Street will now start to recognize what I’m talking about.

I think where you see a lot of it, though, is in comedy shows. Not scripted sitcoms, but in guerrilla-style comedy shows. For Canadian readers, think back to early Tom Green Show episodes. While perhaps not done intentionally, the use of jump cuts created a style for the show that was unlike anything on Canadian TV at the time. It was raw, fresh…and made rules by breaking them. It said “it’s okay to do this”, and people since have followed suit.

So, when deciding whether or not you want to use jump cuts or adhere to the “rules”, ask yourself this…does it suit what I’m cutting? In the end, that’s the only question that matters, and it should be the only deciding factor in any edit that you do.

Can The Cans!

December 28, 2011

When I first started editing, you had a lot of gear in a small room, and you took a lot on faith that you were getting a good audio mix. The reason…the best speakers in the place were in the control room, as that’s where the meat of audio production took place. You were usually relegated to stereo speakers wired into the system, or worse, the tiny speakers built into the monitor in the rack. However, I would happily take those days of editing in a broom closet on speakers that crack at the slightest onset of a popped “p” over the trend I’m seeing today.

With laptop computers becoming more and more powerful, and able to handle editing in 720p and 1080p, people are editing bigger projects on smaller computers. That’s not the problem…it’s how they’re listening to it. Headphones are becoming the studio monitor of the Edit DIYer, and it’s affecting their products without them really knowing it until it’s too late.

Headphones…really good headphones…have a way of bringing out the best in music. I’ve lost track of how many times I’ve listened to a song at home, then listened to it on my mp3 player with a good set of headphones. It was only then that I fully got the layering of tracks that was done in the studio, and fully appreciated everything that the song had to offer. When you’re editing, the same thing happens. You hear everything that’s on your timeline clearly, and so you therefore think that everyone else can too. The truth is, you’re hearing it too well. To get a true sense of the audio mix on your piece…unplug the headphones and listen to it on speakers. There are a few reasons for this…

1) Most people at home don’t have a sweet enough setup to replicate the mix you would be getting out of your headphones. It’s the same reason why we adhere to action safe and title safe guides when putting font into our pieces…because along the way, something is lost between our screen and the viewer’s. The same holds true for audio. To help remedy this, use “The Crap Test”. The theory behind it is quite simple…if you can make something sound great on crap speakers, then the people at home watching, no matter what their setup, will have a good audio mix. I used this theory when making an album years ago. We would set our levels on the audio board, do a quick dump to cassette, then run out to the singer’s car and listen to it on the really bad speakers in there. If something wasn’t coming through (background vocals, bass, etc.), then we would go back in, adjust the levels on the board, then make another mix and test it before we okayed the mix. Only after you are happy with your audio on crap speakers will everyone be happy at home.

2) If you stick by the “one hour for one minute of finished product” edit ratio, then by the time you get to the point where you are ready to give something a full watch-through, you can probably recite your piece word-for-word like it’s your favourite song. It’s at this point that you are officially “too close” to the project. You’ve lost objectivity in listening to it because you KNOW what’s supposed to be there. This is where the Buddy System of editing really comes in handy. I’ll go more into the Buddy System in another post, but to briefly touch on it, you need a fresh set of ears to get a true feeling for whether the viewer will hear everything. If at any point during the watch-through, your buddy says “wait, back that up…what did he say?”, then you should probably adjust the mix.

3) What you do in the edit suite may not translate well to the home viewer. Have you ever listened to something and wondered “why the hell did they put that in there?” Some kind of audio effect that, while it seemed like a good idea at the time, really doesn’t translate well in the long run? It’s the same thing as video effects…some seem like a really good idea at the time, but by the time you’re done with it, you have trouble determining what you’re actually watching. Audio can be the same way. You can have BG sound, music, SFX and actual interview sound…but if one overpowers the other, or if everything resides in the same frequency, then things start to get muddy. Remember this…sparseness breeds clarity. If you’re trying to say something with your piece, let it be heard. Don’t let the medium obscure the message.

Simply put, you can avoid a bad audio mix by doing a few simple things…listen to it on speakers, watch your audio peak meters and don’t put in too much when a little will do. In the end, you’ll have a more effective piece because it can be heard.
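
And if you want to watch those peak meters in code, here’s a minimal sketch, assuming a 16-bit WAV export of your mix and numpy installed. The file name is just a placeholder, not anything from a real workflow.

```python
# A sketch of "watch your peak meters" in code: report peak and RMS
# levels in dBFS for a 16-bit WAV export of the mix. Requires numpy.
import wave
import numpy as np

def mix_levels(path):
    with wave.open(path, "rb") as wav:
        assert wav.getsampwidth() == 2, "sketch assumes 16-bit samples"
        raw = wav.readframes(wav.getnframes())
    samples = np.frombuffer(raw, dtype=np.int16).astype(np.float64) / 32768.0
    peak_db = 20 * np.log10(np.max(np.abs(samples)))
    rms_db = 20 * np.log10(np.sqrt(np.mean(samples ** 2)))
    return peak_db, rms_db

peak, rms = mix_levels("final_mix.wav")  # placeholder file name
print(f"peak {peak:.1f} dBFS, RMS {rms:.1f} dBFS")
# Peaks brushing 0 dBFS will clip; a huge peak-to-RMS gap can mean one
# element is burying everything else in the mix.
```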


In An Ideal World…

November 25, 2011

What if we all spoke the same language?

It’s an intriguing question when you look at it in a global context. Gone would be the inability to communicate with people in different countries. No longer would anyone fear that a weaker grasp of a foreign language could hold them back. The world would have one less difference for people to discriminate with. For fans of Douglas Adams, it would be as if we had all inserted a Babel Fish into our ear.

The same thought could be put into the world of editing systems. What if there were one format, one codec that everyone agreed upon? Right about now, anyone who has ever waited patiently for the blue bars of a conversion program to finish has dreams of a Codec Utopia.  Not only would it eliminate some of the “hurry up and wait” that all editors know all too well, it would help avoid some of the digital degradation of our footage.

Let’s take a step back for a second and relive the days of analog tape. Betacam SP, 3/4″ U-matic and other formats that gave us headaches with all their flaws. In those days, we were very cognizant of how many generations we lost through the various versions and dubs we had to make. Every time we went from one tape to the next, we lost a generation. If you were the kind of person to edit your stories on a work tape, and then edit from the work tape to package the show on to the master show tape, then you lost another generation. Then, if you had to do a “Best Of” show, you were probably cutting from a master onto a new master…you see the vicious cycle. Every generation meant you lost a little bit of the picture quality. The same went for ENG departments on a budget who couldn’t always go out into the field with single-pass tapes. Tape hits were the norm after a few run-throughs, and editors sometimes lost great shots because of simple wear and tear.

Nowadays, cameras shoot file-based media on to digital storage devices, and you can copy files as many times as you like without fear of quality loss. But when it comes to putting that media into a show, you run into problems. First, there’s the question of what system you use. Some programs, like Avid, take your source media and convert it into their native media format. Yes, it’s high quality, but in essence, you’re taking the shots, breaking them down and re-encoding them in an entirely different language. Anyone who’s seen a badly translated sign will all of a sudden see the potential for error.

Now, what happens if you have a large shop with different edit systems? If one edit system’s high-quality export format isn’t read by the other system, you tend to have to find a compromise, which usually means a lower-quality, compressed codec. That means you aren’t working with the best quality footage possible. And then comes the occasional need to transfer media across the internet, via FTP or some other transfer protocol. In order to facilitate this in a low-bandwidth environment, you usually have to compress the file…another language change, which may then have to be re-encoded for the systems on the other end.

Even our country plays a part in the incompatibility of systems. In the world, there are 3 television formats…NTSC, PAL and SECAM. Each of these has different frame rates and scan rates. PAL, for example, works at 25 or 50 fps, while NTSC works at 29.97 or 59.94. To be able to cut something encoded at 50 fps for NTSC, you must first convert the file to the NTSC frame rate, which affects the overall look of the viz. I won’t even begin to get into cameras that shoot at 24 fps that then need to be cut at 29.97.
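
If you’ve never seen why “29.97” is such an awkward number, it helps to write the frame rates out as exact fractions. A quick sketch, using only the frame rates mentioned above:

```python
# Frame rates as exact fractions: NTSC's "29.97" is really 30000/1001,
# which is why PAL-to-NTSC conversion can never map frames one-to-one.
from fractions import Fraction

PAL = Fraction(25)             # 25 fps
NTSC = Fraction(30000, 1001)   # ~29.97 fps
print(float(NTSC))             # 29.97002997...

def frames_at_target_rate(src_frames, src_fps, dst_fps):
    """Frames the same running time occupies at the target rate."""
    return (Fraction(src_frames) / src_fps) * dst_fps

# One second of PAL (25 frames) needs ~29.97 NTSC frames -- the converter
# has to invent the difference, which is what changes the look of the viz.
print(float(frames_at_target_rate(25, PAL, NTSC)))
```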

You see, despite all of our advances in technology, we fall into the same problems we had 10-20 years ago. Commercial competition breeds not only better products, but also incompatibility on a larger scale. Now, there are edit systems out there that read many different codecs and formats, and even read them off of the original source, without having to pre-convert the media. However, what happens when you render? When you export? It may be idealistic, naïve and entirely inconceivable for us to one day see one system that every newsroom, production house and independent editor working at home uses. It may never happen…but it offers some food for thought. Instead of making products that make our picture bigger, crisper and 3D, let’s remember the golden rule of television editing…garbage in, garbage out. At the very least, it would end conversations like this…

Even In The Quietest Moments

November 15, 2011

In my life, I try to juggle a number of things. There are the needs of managing a department, my work on shows that have me out of town at times, prepping class lessons…all the while making sure to have quality at-home time with my family. It’s a very rewarding life I lead, but it does require some time management. What I’ve discovered is that there are stretches of time that used to just slip by where I could be getting more done. Riding the train in to work provides 20 minutes of quality writing time. Waiting for the train allows me to go through emails. Having two computers on my desk leads to multi-tasking.

The same can be said in the edit suite. There is an old axiom in television…”Hurry up and wait”. There may never have been a truer saying for the editor’s life in the non-linear world. In my previous post, I talked about being prepared before you even step into the edit suite…but what about once you’re into the project?

The beauty of the non-linear world of editing is that it IS non-linear. In the days of tape-to-tape editing, you had to wait for certain things to be completed before you moved forward. Oh sure, you could cut a story or feature on a separate tape and then dump that in once you got to that point in the show…but you lost a generation of video quality. Not exactly rewarding for you to think ahead if it compromises the quality of your work. But that’s not a problem anymore. You can start a new sequence, cut what you need to cut, and in some cases drop that sequence down on to your master timeline as one chunk without any video degradation. This allows you to keep working while waiting for an element of the show that’s taking a little longer than expected.

In some cases, even rendering time can be beneficial. It’s true that the life of the non-linear editor can sometimes be measured in progress bars. At times, it seems we watch those green/blue/black units of completion scroll across our screen more than we edit. But that “can’t do anything else” time doesn’t have to be a mandatory break. Think about what you can do that doesn’t require your editing computer. Anything from email correspondence with clients updating them on progress, gathering shots from an archive system on another computer, listening to music for the next piece in advance so you’re not wasting valuable edit time…all this can be accomplished in the time it takes your system to do a long render.

Of course, above all else, make sure that you feel ready for the next part of the project. Sometimes, this means taking those 10 minutes or so and grabbing a snack, heating your lunch or simply refueling on coffee. The more alert and nourished you are, the less likely you are to make little mistakes. When I pull an all-nighter, I go in prepared…not just in having my materials in order, but in making sure I have what I like to call Edit Fuel. Have a late small meal prepared for the long renders, and some snacks for the shorter ones. If you’re tired, and you feel like you’ve been up for 3 days straight when you’re only in hour 6, you will start to miss things. This is where frame flashes, unwanted jump cuts and bad audio mixes happen…when the editor is not at 100%. It’s true that sleep deprivation can affect your hearing, so if you’re too tired to hear properly, how can you make a good mix? Take care of yourself and you’ll be better able to take care of your project.