With all the mass-market hype behind the ‘new, never-before-seen’ benefits of 4K television, the average consumer can be forgiven for not knowing that 4K content has been an increasingly regular feature of our media for the past decade.
I found myself reflecting on this back in November, following the conclusion of a 70mm IMAX presentation of Nolan’s space-time epic Interstellar. Being at one of only two locations in the UK equipped to show the movie in that format, I count myself lucky. The scale was spectacular, the depth of colour jaw-dropping, and the flawed realism an almost out-of-body experience, only enhanced by film grain… film grain? You know, that quirk of cinema created as the projector lamp shines through the imperfections in a reel of film? Yeah, I can’t remember when it ceased to exist either.
The whole ‘digital cinema’ transition has passed most of us by, not least because film was, for years, the technically superior medium. Sure, digital projection is cheap, but what cinema is going to advertise a cheaper system with the exorbitant price of a cinema ticket in 2015? Since the advent of digital cinema in 1999 (Star Wars: Episode I), the industry’s initial focus was on simplifying the running of a theatre. No longer is a skilled projectionist required on hand throughout the film; a single button press at the start of the day will launch a playlist for the next 24 hours. Any improvement in the digital cinema experience could be seen as incidental catch-up. That is, until now.
“Sure, digital projection is cheap, but what cinema is going to advertise a cheaper system with the exorbitant price of a cinema ticket in 2015?”
Claims from Nolan, Tarantino et al. that shooting digital would ‘compromise their artwork’ begin to sound hollow once the technology matches the quality and artistic flexibility of analogue film. That point is now (for one example of a beautifully shot digital film, see last year’s Gone Girl). On the horizon, however, is a whole new breed of laser projection systems, coupled with new imaging standards, that threatens to blow the entire established hierarchy out of the water. Dolby (still better known in the industry for its surround-sound prowess) has identified that the key difference between how we perceive a screen and how we perceive reality is brightness. Not the brightness of the whole picture, specifically, but the contrast between the brightest highlights and the darkest shadows in an image. With brighter highlights comes a wider range of colours that are completely absent from the typical screen. If we consider the image below as what we see outside on a sunny day (‘nit’ being a unit of brightness*), it is incredible to think that a conventional theatre projector is currently limited to a paltry range of 0–100 nits.
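To put those numbers in perspective, dynamic range is often expressed in photographic ‘stops’ — each stop is a doubling of luminance. A quick back-of-the-envelope calculation (the black-level and laser-projector figures below are illustrative assumptions of mine, not official specifications) shows how much headroom a brighter, deeper-black system gains:

```python
import math

def stops(black_nits, peak_nits):
    """Dynamic range in photographic stops between two luminance levels."""
    return math.log2(peak_nits / black_nits)

# Illustrative figures, not official specs: a conventional projector
# covering roughly 0.05-100 nits vs a hypothetical laser system
# reaching 0.005-1,000 nits.
print(round(stops(0.05, 100), 1))    # conventional: 11.0 stops
print(round(stops(0.005, 1000), 1))  # laser (assumed): 17.6 stops
```

Each extra stop doubles the contrast between the deepest shadow and the brightest highlight, which is why a modest-sounding jump in peak nits translates into a dramatically more lifelike image.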
For human beings, numbers are a poor way to tell the story. Journalists and filmmakers who have attended demos attest to the visceral power of this technology. By tricking the mind into thinking that it truly perceives reality, instinct gives way to phantom sensation: flames feel hot; an icy scene seems to chill the room. Glasses-free 3D becomes an irrelevant concept; after all, do you ever doubt that the real world has spatial depth? Laser projection has the potential to make this level of immersion possible.
“By tricking the mind into thinking that it truly perceives reality, instinct gives way to phantom sensation.”
Dolby Vision is just one sign that the technology allowing us to reach the pinnacle of cinema has arrived. There will be minor leaps in streamlining and efficiency over the years but, fundamentally, filmmaking in its current form can get no better. All that is left is for the industry to adopt these new standards while raising the bar on other fronts to keep pace with an unprecedented degree of audio-visual clarity. The disparity between ‘technology’ and ‘other fronts’ was a jarring shock for those of us who braved Peter Jackson’s ‘Hobbit’ trilogy in HFR (High Frame Rate). Plot complaints aside, much was made of how the higher frame rate exposed unrealistic make-up and set design.
Yet, over the course of the three movies, I found the negative side-effects became less pronounced as artists and cinematographers alike learnt to aspire to the new standards of perfection within their trade. Even Dolby’s own Atmos surround-sound system (an insane 128 channels, as opposed to the standard six or eight) is struggling to gain mass-market adoption. Imagine closing your eyes and being able to physically hear an object travelling around your body, at varying heights, in 3D space. This is all possible with current technology; it’s just up to sound engineers to record and mix 128 channels of surround sound. I, for one, will definitely be making the effort to see Avengers: Age of Ultron at an Atmos-equipped cinema when it’s released in the format.
“…fundamentally, filmmaking in its current form can get no better.”
I suppose the point this article is trying to make is that cinema has outgrown the whole ‘film vs digital’ argument. Digital can and will go above and beyond anything we ever thought possible in the realm of traditional cinema. The debate itself is much like the vinyl-vs-CD argument in music. Yes, we lose the ‘fuzzy warmth’ of film grain in exchange for a cold, ethereal perfection, and some love film’s imperfections while others scorn them. In the real world, though, the average cinema-goer (even if they had noticed) simply doesn’t care. To go beyond this apex of theatre would be to change the entire filmmaking paradigm. And while we wait for that to happen, I’ll begin saving up for my Oculus Rift.
*1 nit = 1 candela per square metre (cd/m²)