by John Schuermann » Sat Jan 31, 2015 4:49 pm
Here are excerpts from a much longer article I wrote about this stuff for a different outlet:
Much of what is discussed here has to do with resolution in terms of pixel counts, but there are numerous hard facts of the movie-making and post-production processes that we keep coming up against. One of the things we are discovering is that 35mm film really does not hold much visible picture detail at scans much above 2K (essentially 1080P), especially if the original was shot Super 35mm (4K scans are likely overkill). Scope films shot anamorphically, on the other hand, do have a bit more visible (and therefore usable) resolution. I've sat in on comparisons between film-sourced 4K scans and 2K down-rezzed versions, and the visual difference is almost zero. Unless the film in question was shot with a native 4K / 5K camera or sourced from 70mm, there seems to be little visible benefit to going much above 1080P.
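To put the numbers in context, here is a quick sketch using the standard DCI container sizes, showing why 2K is "essentially 1080P" and why a 4K scan carries four times the pixel data (the dictionary layout is just for illustration):

```python
# Standard DCI container sizes (full frame, before cropping to aspect ratio),
# plus broadcast HD for comparison
RES = {"2K DCI": (2048, 1080), "HD 1080p": (1920, 1080), "4K DCI": (4096, 2160)}

pixels = {name: w * h for name, (w, h) in RES.items()}

# 2K and 1080p share the same 1080-line height; 4K doubles both dimensions,
# so a 4K scan has exactly 4x the pixels of a 2K scan:
print(pixels["4K DCI"] / pixels["2K DCI"])  # 4.0
```

That 4x factor is why every 4K step - scanning, storage, rendering - costs so much more than its 2K equivalent.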
I also quite often hear from people who (quite understandably) would like to see their favorite blockbusters released in 4K. However, here we come up against another limitation in how movies are typically assembled in post-production. In almost every FX-heavy movie of the last 10 years, right up until today, the FX have been rendered at 2K. So even if you went back and re-scanned the film at 4K, you would have to recreate every CGI effect and composite from scratch at 4K resolution - something that would be incredibly expensive and time consuming. As others have pointed out, even films with very little obvious FX work are edited and graded at 2K. Even the average romantic comedy can have numerous CGI fix-ups and composites - lighting alterations, adding or subtracting buildings, changing parts of locations, etc. It will perhaps be the rare movie which *doesn't* have any 2K-rendered FX elements incorporated. The downside is that any movie finished in 2K would need to be totally re-assembled in 4K with all-new FX renders. All this means time and money.
It is true that post-production suites are moving toward 4K capabilities, but this will only help with future content. If a movie was shot digitally in 4K and transferred in 4K AND the FX / composites were rendered in 4K, then yes, it might be more impressive at 4K / 5K resolutions - assuming you are sitting less than 2 picture heights from the screen (beyond about 2X the screen height, the difference between HD and 4K becomes very difficult to discern). Remember, the difference between HD and 4K is in fine detail, not picture sharpness: you can simply make out more fine detail in a 4K image than in a 1080P image, again assuming you are close enough for your eye to resolve that level of detail. (BTW, the number of movies shot and finished in 4K is extremely small - most of them are Sony titles.)
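That "2 picture heights" rule of thumb follows from simple visual-acuity arithmetic. A minimal sketch, assuming the commonly cited ~1 arcminute acuity limit for 20/20 vision (the function name is my own):

```python
import math

def pixel_arcmin(picture_heights_away, lines):
    # Angle subtended by one pixel, in arcminutes, for a viewer sitting
    # `picture_heights_away` screen heights back from a display with
    # `lines` rows of pixels.
    rad = math.atan((1.0 / lines) / picture_heights_away)
    return math.degrees(rad) * 60

ACUITY = 1.0  # ~1 arcminute: typical 20/20 visual acuity limit

for d in (1, 2, 3):
    print(d, pixel_arcmin(d, 1080), pixel_arcmin(d, 2160))
# At 2 picture heights a 4K pixel subtends ~0.8 arcmin - already below
# the ~1 arcmin acuity limit - while a 1080p pixel subtends ~1.6 arcmin.
```

In other words, at 2 picture heights the extra 4K pixels are finer than the eye can resolve, which is exactly why the HD/4K difference fades past that distance.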
Here's where it gets even more complicated. At 4K resolutions we are approaching the limits not only of what the human eye can perceive on a display at reasonable seating distances, but also of what can be resolved on the source medium. For example, if you move the camera (especially at 24 frames per second, the motion picture standard), motion blur becomes such an issue that fine detail gets destroyed. If your shot is even slightly out of focus, fine detail gets destroyed. If your lenses don't resolve 4K, those fine details won't even be captured. If you are shooting a landscape on a hazy day, if the camera shakes, if your shutter speed is too long, if the camera's photosensors don't actually resolve 4K - the list of things that can destroy fine detail goes on and on.
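The motion-blur point is easy to quantify. A rough sketch, assuming the standard 180-degree shutter at 24 fps (a 1/48 s exposure) and a hypothetical slow pan - the function name and pan speed are illustrative, not from any real production:

```python
def blur_pixels(frame_width_px, seconds_per_frame_width, shutter_s):
    # Distance the image moves across the sensor during one exposure,
    # in pixels: pan speed (px/s) times the exposure time (s).
    speed_px_per_s = frame_width_px / seconds_per_frame_width
    return speed_px_per_s * shutter_s

SHUTTER = 1.0 / 48  # 180-degree shutter at 24 fps

# A leisurely pan crossing a 4K-wide (4096 px) frame in 10 seconds:
print(blur_pixels(4096, 10, SHUTTER))  # ~8.5 px of smear per frame
```

Even that gentle pan smears everything by roughly 8 pixels per frame - an order of magnitude coarser than the single-pixel detail that separates 4K from 2K, which is why camera movement erases most of 4K's advantage.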
If you look at the 4K demo clips TV manufacturers use on their sets, you will notice that they all feature ROCK STEADY, high frame rate footage of subjects with fine patterns and very little movement. Your typical Hollywood movie, of course, features quick cutting, a moving camera, etc., so the likely visible benefit of 4K is greatly diminished. All of this is precisely why Hollywood is looking to things like High Dynamic Range and an expanded color gamut for next generation video and theatrical formats - the benefits of those technologies are obvious at ANY seating distance and with any material. That's where most of the real picture improvements will come from.
All of that said, there are cases where a film shot in 4K or on high resolution film stock WILL definitely benefit from higher resolution technology. Films shot in 70mm, such as Patton, 2001, The Sound of Music, or Interstellar (the latter partly shot in IMAX), would definitely benefit, as those films often have nice long static shots and the film stock itself supports high detail. Films shot in 1.85:1 or anamorphic 2.40:1 35mm could also benefit, but to a lesser degree.