From A Whisper to A Screen – Digital Cinema On The Rise

Thursday, March 15th, 2012

By Andrew Horn

I remember once reading what I thought was an amusing quote from Joseph Campbell, something about how he thought computers were like religion – you expect them to solve all your problems, but in fact they drive you crazy.

The same could be said for the whole so-called digital filmmaking thing. That being said, I will be the first to declare that working with increasing degrees of digital technology made the last three films I did possible. Not only would I not have been able to do what I did in those films, but let’s face it – if I had done them on actual film, I couldn’t have afforded to do them at all.

The idea – on my latest – that I no longer even need to make a 35mm negative in order to show it was, shall we say, very liberating. Pretty much every film festival, most if not all art film venues, and an ever-increasing number of regular theaters have digital projection. And it’s been looking pretty good these days.

The idea is that you send over your digital tape – or now, your hard drive – and that’s that. No more lugging around film cans, no more print costs and, even better, no more scratches or dirt or rips. Every time you show it, it looks brand new. All problems solved, right?

Well that would be what my little ignorant mind would think.

And then I got this press release from the Berlinale a couple of days ago. Usually these things are little more than bragging about how great everything went, how well equipped the festival was, all their latest technology and blah blah blah. But this time the press release read almost like one long kvetch, going into minute detail on all the “substantial technical, logistical and financial challenges” involved in just screening the movies. Where once upon a time we had the choice between showing a 16 or 35mm film – with maybe a few wide-screen formats thrown in there – they now announce they have all sorts of “diverse digital formats” requiring “a network of 37 digital cinema servers at the festival venues – each with a capacity of 2.75 terabytes” as well as “5 SAS data libraries, each with a capacity of 24 terabytes,” plus an additional 11 high-performance digital projectors, some with 3D capabilities, all requiring the support and assistance of companies including Barco, Dolby, Kinoton, Colt Technology Services, and Media Logic.

They go on to say that, “to complicate matters, the process of creating DCP files that work flawlessly and providing them each with a valid key (KDM) so they can be played back at the right time and place is in itself a common source of errors – a problem that is often underestimated by producers and/or post-production companies. In combination with the infrastructure available at a specific location, this can, in the worst case, make it impossible to play back a specific film or to show it without errors.”

As a result, “an elaborate procedure to test the technical quality and structural integrity of the submitted DCP formats was set up in close cooperation with the Fraunhofer Institute for Integrated Circuits.” Aside from the cadre of experts belonging to the above-mentioned companies, the Berlinale had its own team of 45 people in charge of technical co-ordination, while prior to the festival itself, 80 projectionists had to receive special “in-depth training” to cope with the various anticipated technical challenges.

That was this year. God knows what it’s going to be like next year.

I have to say I was hyperventilating just reading about it.

As one of the aforementioned “producers”, it’s just more scary things to think about. But frankly I shouldn’t be all that surprised. Trying to make things easier can just get so complicated. I’ve been going through my own similar kvetch with my last movie. We shot the film with a Canon 5D, which, if you haven’t heard of it, is actually a still camera with video capability. This was supposed to be an added attraction, but it turned out so well that Canon was at first totally unprepared to deal with the popularity of it all – and never mind the other two companion models, the 7D and the T1 Rebel (now T3!), all in true HD, mind you, with a resolution, or shall we say image quality, so crisp that I find it a bit disconcerting. And it was cheap!

All well and good, except that it necessitated a whole series of three-way international phone calls between me, my cameraman and my editor to try and determine how all this data – no more video tape now – was to be downloaded, stored and encoded. Since this was not officially a video system, there were no official instructions on how it all went. The good part was that it spawned a whole grassroots dialogue on YouTube and Vimeo about what to do and how to do it, along with the accompanying “you-ask-three-people-and-you-get-four-answers” effect.

Then came all the – what felt to me like endless (and endlessly confusing) – phone calls between me and my editor, which just prepared us for the weeks of rendering the material into compressed “proxy files” that we could actually edit with, without crashing the system under the sheer size of the “camera original”.
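
For anyone curious what that proxy step actually amounts to, here is a rough sketch of how you might script it yourself with Python and the free ffmpeg tool. The folder names and settings are invented for illustration – this is not the exact workflow my editor and I pieced together, just one plausible version of it.

    import subprocess
    from pathlib import Path

    # Hypothetical example: turn the huge H.264 clips off the 5D's cards into
    # small ProRes Proxy files that an editing system can handle comfortably.
    SOURCE_DIR = Path("camera_original")   # the untouched clips off the memory cards
    PROXY_DIR = Path("proxies")            # the lightweight copies used for editing
    PROXY_DIR.mkdir(exist_ok=True)

    for clip in sorted(SOURCE_DIR.glob("*.MOV")):
        subprocess.run([
            "ffmpeg", "-i", str(clip),
            "-c:v", "prores_ks", "-profile:v", "0",   # profile 0 = ProRes Proxy
            "-vf", "scale=1280:-2",                   # shrink the 1080p frames
            "-c:a", "pcm_s16le",                      # plain uncompressed audio
            str(PROXY_DIR / clip.name),
        ], check=True)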

For all this I needed a series of hard drives for the camera original, backups and second backups as well as drives for the proxy files and their backups. Then there was my drive for the various edited playouts so I could look at stuff at home. (You probably know the story.)

We could all take pride in having doped it all out, and in a better world we would now all be prepared for the next time we had to do this. But of course, by the time I finish this film and am on to the next (assuming I should live so long), this format will probably no longer exist.

In fact, on the first film I ever did with digital editing, way back in the far-flung 90s, our contract spelled out various technical specifications for delivering the final product. As we were finishing, we found out that they were not only obsolete, but that the TV station we had to deliver to didn’t even understand the questions we were asking.

According to “Side By Side”, a documentary about the transition to digital filmmaking that I saw at the Berlinale this year, something like 84 different formats have come and gone since the development of the original Portapak self-contained video system. What’s more, for the last few years I’ve been struggling with how to digitally archive the various films I’ve made, and trying to get a straight answer on what format to put them on has been basically impossible. The best suggestion (at the point I stopped asking) was DigiBeta, which itself is on the way out, but is at least only lightly compressed and relatively stable. For the time being.

One of the points made in “Side By Side” was that even though film is on its way out, it’s still the most stable and constant medium we have. Somebody in the movie pointed out that you can keep shooting on film as long as you want. The only question is, who’s going to develop it? (Ironically, the announcement of Kodak’s bankruptcy came only a couple of weeks before the movie premiered.)

Progress can be just so paralyzing.

Which reminds me, I’ve still got several 400 ft. rolls of 16mm film sitting in my fridge. And they’re only 17 years old. Any takers?

Ahorn