There's a particular discussion that has come up again and again on our forum that we've now noticed cropping up in reactions to the footage we posted last week. So I decided to open it up here for a more public debate.
I know that this discussion is a heated one, and there is no correct opinion. The intention of this post is to tell you about the decisions we have made when designing our camera and about how we arrived at those decisions.
To me the core functions of digital cameras are:
1. Drive the sensor in a clean way with good A/D conversion.
2. Transport and store the image data collected by the sensor in the best way possible.
3. Provide the user a good experience and a high-value proposition.
To me this means we create the electronics that run our amazing Kodak-designed sensor, and then get out of the way so that filmmakers can have an image as close to sensor data as possible. Kinda like a film camera does with film.
Many camera makers believe their job is to make your life easier by giving you a few limited shooting styles and smaller file sizes through compression, again limiting your choices, this time in post. We believe our job is to make a camera that gives the maximum control and freedom to the artist, both on set and in post. This is our North Star, the guiding light behind all of our design choices. How do we get the most accurate representation of what the sensor captured to the filmmaker in the most pliable format?
RAW VS. COMPRESSION
Debayering is hard. When running a really nice debayer algorithm at 2K resolution, most desktop computers can manage only a few frames per second at best, and 4K takes even longer. To do this on the fly, most cameras use inferior algorithms.
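To make the tradeoff concrete, here is a minimal bilinear debayer for an RGGB Bayer mosaic. This is a hypothetical sketch in Python/NumPy, not our camera's actual pipeline: it shows the "fast but basic" end of the interpolation spectrum, where each missing color value is simply averaged from its nearest neighbors.

```python
import numpy as np

def conv3x3(img, k):
    """Correlate img with a 3x3 kernel k, replicating edge pixels."""
    p = np.pad(img, 1, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def debayer_bilinear(raw):
    """Bilinear demosaic of an RGGB Bayer mosaic (H x W) into H x W x 3 RGB."""
    h, w = raw.shape
    r = np.zeros((h, w)); r[0::2, 0::2] = 1   # red sample positions
    b = np.zeros((h, w)); b[1::2, 1::2] = 1   # blue sample positions
    g = 1 - r - b                             # green fills the other half

    # Kernels that average the nearest known samples of each channel:
    # green has 4-connected neighbors; red/blue use row, column, and diagonal.
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

    return np.dstack([conv3x3(raw * r, k_rb),
                      conv3x3(raw * g, k_g),
                      conv3x3(raw * b, k_rb)])
```

Even this crude version touches every pixel several times per channel; the really nice algorithms add edge-direction detection and cross-channel correlation on top, which is where the compute cost explodes.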
D16 footage is impressive. Our designers and engineers have worked really hard, researching components, tuning the sensor to perfection, designing amazing analog-to-digital conversion modules, optimizing data paths and write speeds, and generally doing everything we can to protect the image integrity as it travels through the camera from sensor to storage. Basically, it takes a lot of work to protect a 12-bit raw file as it travels through the camera. It isn't automatic. Cameras are either built for raw or they're not.
In the near future, when people inevitably make their camera comparison tests comparing raw footage on the D16 to other cameras, they will be impressed, even when the other cameras are much more expensive. But if/when we add compression formats, that will change completely. The processing power in our camera won't be good enough to run the best debayer algorithms. And when people do their camera comparison tests and compare our compressed footage to other cameras' compressed footage, the image will be pretty much the same, except without the rolling shutter. All of our other advantages, all of the research, all of the hard work, all of our design efforts will be washed away by the tide of compression.
This is why I am hesitant to do it.
There has been a big push from a lot of companies recently for 4K. They say it is the future, and I'm sure it is. But there is another, more quiet tech revolution happening, and it is one I think may be more important in the long run. It's the Color Revolution.
When you go to a movie these days, most of the time you are seeing a 2K resolution image from a DCP, which in size isn't that different from the 1920 x 1080 resolution of a Blu-ray disc (yes, there are 4K theaters, but I'm talking about your average screen in an average movie theater). However, there is no way a Blu-ray looks anywhere near as good as the 50-foot movie theater projection. Part of the reason is that theaters use amazing projectors that are DCI compliant, but another reason is that the images they are projecting have 12-bit color depth. This is a huge difference from the 8-bit color we see at home, and the 8-bit color most reasonably priced cameras shoot, including many of the new 4K cameras.
Let's break it down. With 8-bit color you get 256 shades each of red, green, and blue, which combined get you 16,777,216 colors. That sounds like a lot, but it's not when you compare it to higher bit depths. With 10-bit color you get 1,024 shades of RGB, giving you over a billion different colors. And 12-bit is 4,096 shades of RGB and over 68 billion colors! That's some color rendition.
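The arithmetic is easy to check for yourself; a b-bit channel stores 2^b shades, and an RGB pixel combines three channels:

```python
# Shades per channel and total RGB colors at each bit depth.
# A b-bit channel stores 2**b shades; an RGB pixel combines three
# channels, so the full palette is (2**b) ** 3 colors.
for bits in (8, 10, 12):
    shades = 2 ** bits
    colors = shades ** 3
    print(f"{bits}-bit: {shades:,} shades per channel, {colors:,} colors")

# 8-bit:  256 shades per channel,         16,777,216 colors
# 10-bit: 1,024 shades per channel,    1,073,741,824 colors
# 12-bit: 4,096 shades per channel,   68,719,476,736 colors
```

Going from 8-bit to 12-bit multiplies the palette by 4,096, which is why banding that's obvious in an 8-bit gradient simply disappears at higher depths.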
Why does this matter? Because just like resolution is advancing, so is bit depth. There are affordable 10-bit monitors and 10-bit video cards these days. They don't get as much radio play as 4K does, but they are every bit as revolutionary, possibly more so. So in the future, when everything is Ultra HD, it will also be high bit-depth.
Bit depth vs. resolution in imaging is analogous to bit depth vs. sample rate in audio. In my opinion, it is much easier to hear the difference between 16-bit and 24-bit recordings than it is to hear the difference between 48K and 96K sample rates. It's true that both 24-bit and 96K probably make recordings sound better, just as the extra detail in 4K does for images, but audio gear tends to advance both together. People in audio don't generally push 96K paired with 8-bit the way video / digital cinema companies push 4K paired with 8-bit. When they do, it seems a little wonky to me.
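To put rough numbers on the audio side of the analogy: the theoretical dynamic range of linear PCM is about 6.02 dB per bit plus a small constant (a textbook approximation, not a claim about any particular recorder), so each extra bit buys real, audible headroom:

```python
# Theoretical dynamic range of linear PCM audio:
# roughly 6.02 dB per bit, plus ~1.76 dB.
for bits in (16, 24):
    dr = 6.02 * bits + 1.76
    print(f"{bits}-bit PCM: ~{dr:.0f} dB of dynamic range")

# 16-bit PCM: ~98 dB of dynamic range
# 24-bit PCM: ~146 dB of dynamic range
```

That 48 dB jump from 16-bit to 24-bit is the audio cousin of the jump from 8-bit to 12-bit color: more gradations between silence and full scale, just as bit depth gives more gradations between black and white.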
High bit-depth has been around for years, just like 4K. And professionals and tech junkies have been preaching about it for years, just like 4K. And it is finally reaching a price point normal people can afford, just like 4K. And just like 4K, the distribution side of the industry isn't really ready for it yet, unless you are going theatrical in a major theater chain. There are very few computers and monitors that can handle 10-bit images right now.
I'm not suggesting anyone go out and purchase a new computer / video card / monitor in order to work in 10-bit right this minute. I'm proposing that when thinking about the future of imaging, we consider color depth at least as important as resolution.
Technology moves fast and we need to keep up, or at least we feel that way. But it actually isn't moving that fast. The first CDs were released in 1982, 30 years ago, yet it has only been in the last five years that digital music distribution has become a major player in that marketplace. Blu-ray discs were first released in 2006. It's entirely possible that Blu-ray will take as long to dominate the marketplace as the CD and DVD did; both took 15 years to reach a 75% market share. In today's fast-paced, high-tech YouTube world there are still almost no TV broadcasts in 1080p. Most of the big players in online media delivered to your TV, like iTunes and Netflix, adopted 1080p just a little over a year ago, and most of the content on these platforms is still 720p. For television, 720p is even considered a premium for which subscribers pay extra.
The current HD standards were put into place in the mid-'90s, yet standard-definition DVDs still outsell Blu-rays almost 4:1. Many analysts thought Blu-rays would be outselling DVDs by 2012, but adoption has been slower than expected. Many financial papers are still talking about the growing popularity of HD even today. HDTVs only reached 75% market saturation here in North America last year!
How long will it take for all of our content delivery to be in HD of any kind? How long before it's 1080p? How many years will it take for a majority of screens to be 4K? How many millions or billions of dollars will it take? How much will it cost for servers to host libraries of 4K content? How long will it take to build the infrastructure and bandwidth capable of streaming 4K to average homes? In essence, how long will it take to even show your 4K film to an audience in the format it was created in? Probably longer than we expect, considering all of the tiny moving parts it takes to embrace new technology on a worldwide scale.
So is 4K the future? Yes, it definitely is. Is it here today? Well, sort of, but not really. Netflix / iTunes in 4K? Sure, in 2030. Is 4K necessary for me to make movies? Absolutely not. Is 4K right for me? That's really the question at the heart of this debate, and only you can answer it.
I would say if you get hired to make AVATAR, by all means, use the highest K you can find. But if you're making a gritty indie film, or most TV shows, I think 2K is more than appropriate. In the film world there were dozens of formats in the early years, and eventually the market settled down to S8, 16/S16, 35/S35, and 65. I believe the same will happen with digital. Over the next 20 years the markets will settle into a few tiers. 4K will be one of them, but so will 2K.
I'm a low budget filmmaker, and I'm proud of that. To me, a higher bit depth is more important than faster sample rates or more pixels. I think in the end what's most important is that you can fall in love with the creative work you're doing. I had that years ago with 16mm film, and I'm finding it again with the D16. If you fall in love every time you see a 4K image, then that's a good choice for you. I just don't want you to feel like if you don't have 4K you can't have great images, and you can't tell stories. At the end of the day resolution is only one of many, many factors, and they all should be weighed evenly, at least in my opinion.