DIGITAL TECH - PHOTOGRAPHER

Shop Talk

Beyond Resolution: Is There Such A Thing As An Image That Is Too Sharp?

It seems like just yesterday we were discussing the problems the introduction of HD created for makeup artists and production designers. What a difference a few years makes, huh? Many of today's camera companies and television makers, like Sony, are pushing for 4K and beyond to become the new standard.

As cinematographers, we understand that every digital camera today works essentially as its own film stock, and each has its own proclivity toward a certain look.

As we move toward cameras that resolve higher and higher resolutions, there are several questions we must ask ourselves. Is it better to capture in 4K or higher and then downsample to 2K? Is there such a thing as too much resolution at the acquisition stage? Have we already reached that point digitally?

First, I'll defer to David Mullen, ASC: "Generally there is a principle that you should sample an image at a higher resolution than it contains. A 4K scan barely allows this; generally 35mm film seems to resolve around 3K in detail, which means it is best to scan it at 4K to 6K to avoid aliasing. Essentially you want to oversample it. Since the ARRISCAN can do 6K, a lot of archival work is now done starting with 6K scans, which are then downsampled to 4K.

"A number of D.I.s involve a 4K scan that is downsampled to 2K for the rest of the work. The idea here is that even though the finished master is in 2K, a 4K scan ensures that every bit of grain in the original is faithfully reproduced.

"There was an even bigger reason to finish at 4K if one was going to film-out the results, because there is some sharpness loss, especially if you were recording a 4K dupe negative and then striking an IP and multiple INs to make mass release prints. So if you had started with a 2K film-out and then gone through multiple generations to release print, the results would be a bit softer (unfortunately, a lot of movies did it this way). But now most movies are released as a 2K DCP, so the quality doesn't really drop further down.

"And some studios (Warner Bros. mainly, perhaps Sony) are pushing for a 4K finish because they want movies that went through a D.I. archived at 4K, and there are plans for 4K distribution in the future.

"Most D.I. facilities can work at 4K; it's just that it is 4X the data to handle, and they charge more for dealing with it. The other limitation has been that visual effects companies are averse to doing the work at 4K, particularly if the movie is 3D, because that's already 2X the work."
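
A quick sanity check on that "4X the data" figure: a DCI 4K frame is 4096 x 2160 = 8,847,360 pixels, while DCI 2K is 2048 x 1080 = 2,211,840 pixels, so every 4K frame really is exactly four times the pixels (and roughly four times the storage and processing) of its 2K counterpart.

Mullen's oversample-then-downsample principle is also easy to demonstrate. The sketch below is purely my illustration, not a description of any facility's pipeline; the file names are hypothetical and the choice of Python's Pillow library is my own. It downsamples a 4K frame to 2K with a Lanczos filter, which low-pass filters the image as it resizes, so detail finer than the 2K grid is averaged away instead of aliasing into false patterns.

    # Illustrative sketch: downsample a hypothetical 4K scan to a 2K master.
    # The Lanczos kernel acts as a low-pass filter, so frequencies above
    # the 2K Nyquist limit are filtered out rather than aliased.
    from PIL import Image

    frame = Image.open("scan_4k.tif")        # assumed 4096 x 2160 input
    master = frame.resize((2048, 1080), resample=Image.LANCZOS)
    master.save("master_2k.tif")

Simply throwing away every other pixel, with no filtering, is exactly the kind of naive downsampling that turns fine film grain into shimmering artifacts, which is why an oversampled scan is filtered down rather than cropped or decimated.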

Roger Deakins, ASC, BSC, says: "There is an obsession about whether a camera captures in 4K or 6K, but the fact is that the images produced by all of these cameras are compressed before you actually get to see them. If you are really interested in image 'quality' and not just a number that sounds 'cool,' then use your eyes and/or study the output of the camera (that is, the image that comes out at the end of the workflow) rather than its theoretical capabilities."

All that being said, there are a lot of marketing tricks involved in the way these camera sensors are described. Once the sensor processes the light through its photosites, the image must be compressed. Essentially, as my former professor Dejan Georgevich, ASC, used to say, "you are squeezing 10 lbs. into a 5 lb. bag." There are also factors like the Modulation Transfer Function (MTF) of the lens and sensor temperature that play a huge part in determining your actual resolution at acquisition. 4K doesn't really mean 4K.
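
To make the MTF point concrete: to a first approximation, the contrast the whole imaging chain delivers at a given level of detail is the product of the MTFs of its parts, so the weakest component drags the entire system down. The numbers below are invented purely for illustration:

    MTF_system ≈ MTF_lens × MTF_OLPF × MTF_sensor
               ≈ 0.80 × 0.90 × 0.70 ≈ 0.50

So even a lens that still delivers 80% contrast at the sensor's finest detail, combined with an optical low-pass filter and the sensor itself, leaves you with only about half the contrast at that detail level. The "4K" on the spec sheet was never the whole story.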

Instead of focusing so much on higher resolution, we would be better served by continuing to work on dynamic range and color gamut, and on telling good, visually dynamic stories.

Edward Pages