Forum List > Café LA > Topic
Broadcast Monitor
Posted by Jen
I am new to the world of broadcast video, so I need a little clarification with regard to using an external monitor. Why is it that when I view my footage on a computer monitor, it appears much darker than on the television set? Viewed on my Apple display, the luminance and chrominance values look good. But when viewed on the television, the brightness levels are almost unbearable. Does this have something to do with RGB/CMYK (is television CMYK?)? Levels of brightness on a monitor? How do you prepare your media for both situations? Thanks so much.
a computer monitor is never an accurate way to view images intended for NTSC viewing.
you NEED a broadcast monitor - or at the very least a proper television connected via camera, deck or converter box like a Canopus ADVC. computer monitors and televisions are RGB, NOT CMYK. the only reason to work in CMYK is if you are doing work for print media. there is no need to prep your media for use in two places. monitor it on a broadcast monitor and your finished product will be fine.
> How do you prepare your media for both situations?
Do as Wayne suggests and go with what you see on a properly calibrated broadcast monitor. A television set is, unfortunately, no substitute, but it's better than watching the deceptive images on the Canvas. Also, use FCP's tools to help you stay within broadcast limits -- the Video Scopes and Waveform Monitor. Range Check is a good, basic, fast check that gets you in the ballpark. If it's good for NTSC, it'll be OK for computer monitors.
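To make the "broadcast limits" idea concrete: in 8-bit Rec. 601 digital video, legal luma runs from code 16 (black) to 235 (reference white), and a range checker flags anything outside that. A minimal sketch of the concept (this is illustrative, not FCP's actual Range Check code; the function name is made up):

```python
# Sketch of what a broadcast "range check" conceptually does: flag 8-bit
# luma samples outside the Rec. 601 legal range of 16 (black) to 235 (white).
# Function name and structure are hypothetical, for illustration only.

LUMA_MIN = 16    # 8-bit code value for black in Rec. 601 digital video
LUMA_MAX = 235   # 8-bit code value for reference white

def out_of_range(luma_values):
    """Return the luma samples that fall outside broadcast-legal limits."""
    return [y for y in luma_values if y < LUMA_MIN or y > LUMA_MAX]

# Example: 5 and 240 are "illegal"; 16, 128 and 235 are fine.
print(out_of_range([5, 16, 128, 235, 240]))  # [5, 240]
```

Computer graphics tools often assume full-range 0-255, which is one more reason footage that looks fine on the Canvas can misbehave on a broadcast chain.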
derekmok wrote:
> A television set is, unfortunately, no substitute, but it's better than watching the deceptive images on the Canvas.

So if I'm plugging a regular TV set in via a deck, how can I calibrate/adjust it so that it's as helpful as possible? Do I do it with color bars? Is there a specific adjustment for color/brightness/contrast that I should pay attention to? BTW, the TV is a Sony Trinitron 29", not a flat-screen.
Jen-
The answer to your basic question about the difference in brightness has to do with an attribute of displays called gamma. In a nutshell, the relationship between how much energy is applied to a screen and its lightness is not a linear one. If you think of the power applied to a screen as a value from 0 to 1, .5 is NOT exactly half bright. The actual relationship falls on a curve, and each display type has a different one. If you want to investigate this, go to the "Displays" setting in System Preferences on your computer. Go to the Color tab and walk through the Calibrate procedure. You can see the result of playing with the gamma. -Vance
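To put rough numbers on Vance's point, assume a simple power-law display model (real displays and color-managed pipelines are more complicated, so treat these figures as illustrative). A "half" signal on a gamma-2.2 display comes out at only about 22% of full brightness, and displays with different gammas disagree most in the midtones:

```python
# Illustrative gamma math, assuming a simple power-law display model:
# displayed luminance = signal ** gamma, with signal normalized to 0-1.

GAMMA = 2.2  # a common display gamma; older Macs used roughly 1.8

def displayed_luminance(signal, gamma=GAMMA):
    """Relative light output for a normalized 0-1 input signal."""
    return signal ** gamma

# A 0.5 signal is nowhere near half brightness on a gamma-2.2 display...
print(round(displayed_luminance(0.5), 3))        # 0.218
# ...and the same signal on a gamma-1.8 display is visibly lighter.
print(round(displayed_luminance(0.5, 1.8), 3))   # 0.287
```

That midtone gap between the two curves is exactly why footage graded on one display can look too dark or too washed out on another.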
Or, the TV could be adjusted wrong.
How is the TV connected to the video system? That can make a big difference. Put the color bar signal up on the timeline and push it over to the TV. Do you see the three really dark gray short vertical stripes on the lower right? You're not supposed to be able to see them. Adjust the Brightness control on the monitor until only one of the gray stripes is visible. Yes, this will change with room lighting. Put the room lights where they are going to be for final viewing.

You will discover almost immediately that Sony TVs "help you out" way more than they should. For one thing, as the brightness of the show changes naturally, Sony pushes the TV brightness around a little to make it "look better". The other thing you notice immediately is that flesh tones are completely different on the TV than on the timeline. That's Sony again, making everybody look nice and healthy whether they started out that way or not. Just between those two, the TV is pretty much useless as a test for picture quality.

There is one absolute place to set the color and phase (hue) controls on the monitor, whereas the TV leaves it up to your imagination. You would think the TV should be the final place to judge quality. It would be, if everybody had the same TV. Each manufacturer has a different distortion, and the only standard is the broadcast monitor.

Koz
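For reference, the three dark stripes Koz describes are the PLUGE bars in SMPTE color bars; in NTSC they sit at roughly 3.5, 7.5 and 11.5 IRE, with black at 7.5 IRE. A toy model of the adjustment logic (the stripe levels are the standard SMPTE values, but the visibility threshold here is a deliberate simplification):

```python
# The PLUGE (Picture Line-Up Generation Equipment) stripes in SMPTE bars
# sit just below, at, and just above NTSC black (7.5 IRE "setup").
# Brightness is set correctly when only the brightest stripe is visible.

PLUGE_IRE = [3.5, 7.5, 11.5]   # standard SMPTE PLUGE stripe levels
BLACK_IRE = 7.5                # NTSC black level

def visible_stripes(display_black_ire):
    """Toy model: a stripe shows only if its level is above whatever the
    display is currently rendering as black (set by Brightness)."""
    return [ire for ire in PLUGE_IRE if ire > display_black_ire]

print(visible_stripes(BLACK_IRE))  # [11.5] -> Brightness correctly set
print(visible_stripes(2.0))        # [3.5, 7.5, 11.5] -> Brightness too high
```

When Brightness is cranked up, the display treats everything above a very low level as above black, so all three stripes show; backing it down until only the brightest stripe remains puts black where it belongs.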