Optimizing photos for HD import
Posted by jclocke
Hello,
I'm wondering if anyone knows of any tutorials on formatting and optimizing still images for import into an FCP HD project? Something like this [www.lafcpug.org] but updated for HD? Thanks!
The concept is the same. Resize to the size you need; there are quite a few HD frame sizes in use. Just make sure the size is under 4K.
www.strypesinpost.com
The thing that I'm wondering about specifically is the final pixel-shape adjustment where you "squish" it from 534 height down to 480 before importing it into FCP, as well as which dpi is best.
Can I assume that 300 dpi is best, and that whatever the final size of the photo, I need to bring the height down to about 90% (480/534 ≈ 0.899) of its original height before importing?
Depends on your timeline format. Uncompressed HD and ProRes timelines use square pixels. DVCPRO HD uses non-square pixels.
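To make the square/non-square distinction concrete, here's a minimal sketch. The frame sizes and PAR values below are the commonly published ones for these codecs, not figures from this thread, so verify them against your codec's documentation:

```python
# For an anamorphic (non-square-pixel) timeline you design the still at the
# square-pixel display size, then squeeze it to the storage size on import.
def storage_width(display_width_px: int, par: float) -> int:
    """Width in storage pixels, given the codec's pixel aspect ratio (PAR)."""
    return round(display_width_px / par)

# DVCPRO HD 1080 stores 1280x1080 with a PAR of 1.5:
print(storage_width(1920, 1.5))    # 1280
# DVCPRO HD 720 stores 960x720 with a PAR of 4/3:
print(storage_width(1280, 4 / 3))  # 960
# Uncompressed HD / ProRes timelines use square pixels, so no squeeze:
print(storage_width(1920, 1.0))    # 1920
```

A square-pixel timeline needs no correction at all; the "squish" step only exists for anamorphic formats.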
And no, the concept of "dpi" ceases to have any meaning once you get out of the print world. Like Strypes said, ignore the whole "dots per inch" print convention and just scale your stills to something under 4K, applying whatever pixel-aspect-ratio correction is appropriate for your timeline format.
Best to resize to the size you need and not exceed 4K; or better yet, do that sequence in After Effects or Motion.
There are lots of frame sizes in HD. There are two standards (720p and 1080); whether they use square pixels depends on the format and codec. HDV, DVCPRO HD and some variations of XDCAM are anamorphic formats. ProRes supports anamorphic, non-anamorphic and square-pixel formats from 2K down to SD. You need to get the still out at the frame size/aspect that you need; the smaller the better, without losing quality.
www.strypesinpost.com
No no no no no! There is no "rule of thumb" or assumption that a dpi figure is fine without the context of the original image size! You have to be specific, and it's very easy to learn: pixels are the only thing that counts for image dimensions in video. Read the FAQ Wiki! [www.lafcpug.org]
For instant answers to more than one hundred common FCP questions, check out the LAFCPUG FAQ Wiki here: [www.lafcpug.org]
[Can I assume that 300 dpi is best]
Ben is right; a 320 dpi scan will perfectly fill a 1920 x 1080 frame-- if you're scanning a 6" wide photo! Blanket formulas aren't always best. What if you need to scan a 1" wide postage stamp? Or a 14" wide map? Here's a tutorial which integrates the real-world flat art sizes we normally encounter: Intelligent Photo Scanning
- Loren
Today's FCP keytip: Toggle Dynamic Trim with Command-Shift-D!
Final Cut Studio 2 KeyGuide Power Pack. Now available at KeyGuide Central.
www.neotrondesign.com
Just to drive the point home: motion picture effects work and DI are typically done at 2K resolution before being printed to 35mm film. That means a frame of 35mm film print has a resolution of approximately 2360 dpi. That sounds impressive, until you remember that that same frame is typically shown on a screen that's about 60 feet across, meaning the same image has an actual resolution of about three dpi.
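The arithmetic behind those figures, assuming a ~0.87" projection-aperture width for 35mm (an approximate, commonly quoted value, not one stated in this thread):

```python
frame_width_px = 2048    # "2K" horizontal resolution
film_gate_in = 0.868     # assumed 35mm projection aperture width, in inches
screen_width_ft = 60     # typical large cinema screen

# Same pixels, wildly different "dots per inch" depending on the medium:
dpi_on_film = frame_width_px / film_gate_in              # ~2360 dpi on the print
dpi_on_screen = frame_width_px / (screen_width_ft * 12)  # ~2.8 dpi on the screen

print(round(dpi_on_film), round(dpi_on_screen, 1))
```

Identical pixel data, three orders of magnitude apart in "dpi" -- which is exactly why the unit is meaningless without a physical surface attached to it.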
The whole concept of "dots per inch" simply doesn't apply to video or film.
It really doesn't matter at all what your dpi settings are for the image once it is in Photoshop...
All you want to know is that you have scanned enough pixel data for your final screen size. So YES, 2000 dpi would be fine, because it doesn't make a blind bit of difference. If your source material was 1 inch wide and you want it for 1080 HD video, then you scan it in at HIGHER than 1920 dpi to get at least 1920 pixels from that 1 inch! Go read the Wiki properly, because it's explicit:

To work out your scan size in pixels:
Image size (width in inches) x dpi setting = horizontal pixel size
Image size (height in inches) x dpi setting = vertical pixel size

To work out what dpi setting you need for a particular image size:
Horizontal pixel size you require (square pixels) / image size (width in inches) = dpi setting
Vertical pixel size you require (square pixels) / image size (height in inches) = dpi setting
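Those formulas, expressed as a quick sketch (the function names are mine, for illustration):

```python
def scan_pixels(width_in: float, height_in: float, dpi: float):
    """Pixel dimensions a scan will produce: inches x dpi = pixels."""
    return round(width_in * dpi), round(height_in * dpi)

def dpi_needed(target_px: int, size_in: float) -> float:
    """Scanner dpi setting needed to get target_px across size_in inches."""
    return target_px / size_in

print(scan_pixels(6, 4, 320))       # (1920, 1280): a 6x4" photo at 320 dpi
print(dpi_needed(1920, 1))          # 1920.0: a 1" stamp needs a very high setting
print(round(dpi_needed(1920, 14)))  # 137: a 14" map needs far less
```

The dpi knob on the scanner is just a means to an end: enough pixels for the frame size you're delivering.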
Joey, you're thinking of the PostScript definition of a "point," a unit of measurement equal to 1/72nd of an inch. Early Macs had a screen resolution of 72 pixels per horizontal or vertical inch so the math of translating on-screen graphics into PostScript would be simpler. But it's been decades since computer screens had 72 pixels to an inch, and besides, none of that ever applied to television or film.
No, actually, I am not. To illustrate, open Photoshop CS2 or above and select File / New. Under Preset, go down to HDV 1920 x 1080. It automatically loads in 72 pixels/inch in the Resolution window. Why would it load those settings if they weren't a standard?
BTW...loads 72 ppi in all the Film presets as well. When life gives you dilemmas...make dilemmanade.
That's just because Photoshop, for some reason, lacks the ability to disregard the whole (in this context) pointless "dpi" concept entirely. It has to have some notion of how many pixels in your image correspond to an inch on paper, even if your image will never actually appear on paper.
After Effects, which is conceptually similar to Photoshop in most respects, has no concept of "dpi" at all. The fact that Photoshop plugs in a totally arbitrary, historically rooted number into a mandatory spot doesn't really mean anything.
Nope, a TV doesn't have a DPI. Unless you tell me the physical size and effective resolution of your screen, you can't work one out.
It's all a confusion from the old Mac days, when the WYSIWYG screen resolution was 72 ppi (pixels per inch) for actual-size desktop publishing. This hangover has continued, and along the way gotten confused with dpi, and so we are here with the crazy questions about dpi on screen!

If you had a perfect 50" SD monitor (40" x 30" at 4:3) showing every single pixel of a 720x480 image, its PPI would be 18 ppi horizontally and 16 ppi vertically.

Avoid using dpi to describe images for video or electronic display UNLESS they are destined for a print output.
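The ppi arithmetic for that hypothetical monitor can be checked in a few lines (a sketch; the helper name is mine):

```python
import math

def screen_ppi(diag_in, aspect_w, aspect_h, px_w, px_h):
    """Horizontal and vertical pixels-per-inch of a display, derived from
    its diagonal size, aspect ratio, and pixel grid."""
    unit = math.hypot(aspect_w, aspect_h)  # 4:3 -> 5 diagonal units
    width_in = diag_in * aspect_w / unit   # 50" * 4/5 = 40"
    height_in = diag_in * aspect_h / unit  # 50" * 3/5 = 30"
    return px_w / width_in, px_h / height_in

# 720x480 shown 1:1 on a 50" diagonal 4:3 screen:
h_ppi, v_ppi = screen_ppi(50, 4, 3, 720, 480)
print(h_ppi, v_ppi)  # 18.0 16.0
```

The point stands either way: the ppi is a property of one particular screen, not of the image.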
you can tell this is one of my pet hates can't you...
[It really doesn't matter at all what your dpi settings are for the image once it is in Photoshop... All you want to know is that you have scanned enough pixel data for your final screensize. So YES 2000dpi would be fine because it doesn't make a blind bit of difference.]

When you're bringing analog art into a digital world it makes an extreme difference what your dpi setting is, because it will translate to video PPI-- pixels per inch. There is no other translation pipeline. On a scanner you will not find "pixels per inch"-- the language is dots per inch, from the publishing world. Once scanned, it's immediately pixels per inch. Believe me, the system works, because the approach comes from what producers throw at you for scanning-- everything from a postage stamp to a world map.

Try it. Or don't. Hundreds use it successfully; if you're satisfied with your method, by all means use that. But blanket formulas for screen sizes and dpi settings don't work, because one size doesn't fit all needs. It's not a size or setting, it's the approach that counts, and that always depends on what you encounter for scanning and what format it's intended for.

- Loren
How do you figure? From where I sit, After Effects is "resolution independent" only in the sense that it can scale stills and footage. Every comp in After Effects has a defined size in pixels, as well as a pixel aspect ratio. In that sense, it's totally resolution dependent.

The whole confusion here arises from the use of the word "resolution" in the print world. In print, "resolution" is a function of pixels per inch. If you have enough pixels per inch, your image will look sharp and clear regardless of whether it's as small as a postage stamp or as big as a movie poster. If you take an image with sufficient pixels per inch and enlarge it, you're reducing the number of pixels per inch, which will make it fuzzy.

But television and film just don't work like that, period. Talking about film or television frames in terms of pixels per inch is like describing music in terms of notes per gallon. The concept just doesn't apply.
Loren, you misread what I said... (or maybe I took your post wrong)
Looking at the image dimensions - it was an approx 1.0"x0.5" image at 2000 dpi, which translated to the 1920x1080 pixel image. I also said in the quote you pulled out: "It really doesn't matter at all what your dpi settings are for the image once it is in Photoshop..." We are talking about the digital image on screen - not printing it out again or scanning it in! I think we agreed on that earlier, no?

Oh yeah, and anyone who doesn't agree with me will have to arm wrestle me and explain why I'm wrong at NAB if they don't concur... ...and I've been workin' out. No wrestling by proxy either!
"Heeeee so youz a wiseguy huh? Wayayaorta!"
Joey, the relation is between dpi, pixels and physical length.
Export a still from FCP and open it up in Photoshop. Go to Image Size, take down the physical dimensions in inches and the length in pixels. Change the dpi from 72 to 300, then re-adjust your pixels back to what they were, and notice that the physical dimensions have changed.
www.strypesinpost.com
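The relationship that exercise demonstrates is simply physical size = pixels / dpi (a sketch; the helper name is mine):

```python
def physical_width_in(px: int, dpi: float) -> float:
    """Print width in inches implied by a pixel count and a dpi tag."""
    return px / dpi

# The same 720-pixel-wide still, with only the dpi metadata changed:
print(physical_width_in(720, 72))   # 10.0 inches
print(physical_width_in(720, 300))  # 2.4 inches
```

Changing the dpi tag moves the implied print size; the pixels, which are all video cares about, stay put.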
the logic I've always applied has been:

let's say you're working on a 1920x1080 HD show. you have a still photo that you're prepping in Photoshop. you know that you want to be able to scale it up to 200% and not lose quality, so you build it at 3840x2160.

now I'm like Joey in that everything I've ever built in Photoshop for video has been 72 dpi - just because that's the way I've done it since like 1995. on a side note, I also always just assumed that if I made the images 144 dpi I'd have 2x scaling headroom in, say, AE or FCP. I was so wrong.

I just did an experiment (embarrassed that I've never tried this prior to tonight). I built two identical graphics in Photoshop, one 720x480@72dpi and one 720x480@200dpi - I just assumed the 200dpi file would give me nearly 3x as much headroom to scale once brought into After Effects. they both looked identical when blown up 300% in After Effects. so I tried it in reverse, and again in Photoshop built the same 720x480 image, but this time at 12 dpi. to my surprise, it wasn't noticeably any different from the 72 or 200 dpi version.

I now wonder if the 72 dpi (or any dpi) baseline actually made a difference 10+ years or so ago and software advances have now negated it - or if we were just always following an incorrect assumption...???
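That experiment in arithmetic form: scaling headroom comes from pixel dimensions alone, so the dpi tag never enters the formula (a sketch; the function name is mine):

```python
def max_scale_pct(still_w, still_h, frame_w, frame_h):
    """Largest scale (in %) at which a still covers the frame without
    interpolating past its native pixels."""
    return 100 * min(still_w / frame_w, still_h / frame_h)

# A 3840x2160 still in a 1920x1080 comp: 200% of headroom.
print(max_scale_pct(3840, 2160, 1920, 1080))  # 200.0
# A 720x480 still in a 720x480 frame: 100%, whether it was saved
# at 12, 72, or 200 dpi -- the tag is nowhere in the calculation.
print(max_scale_pct(720, 480, 720, 480))      # 100.0
```

This is why the 72, 200, and 12 dpi versions of the same 720x480 graphic behaved identically at 300%: they all had exactly the same pixels to interpolate from.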
>i now wonder if the 72dpi (or any dpi) baseline actually made a difference 10+ years or so ago
Jeff got it. I used to think it was 72 dpi / 720x576 pixels. But basically there's no relation between actual screen size and dpi, since all that concerns video is how many pixels you have. Print, however, is a different story: 720x576 without a stated dpi doesn't give you a physical size to print out to.
www.strypesinpost.com