After coming home from a local sports bar and bearing witness to the worst monitor calibration I have ever seen, I felt I needed to release some frustration through this blog!
Working in the TV and film industry, I spend a lot of my day making decisions and providing guidance on aspect ratios and color flow between different departments.  However, what I find really crazy is that when I go out to a local sports bar, for example, EVERY screen looks different!?  To the point of some screens literally looking bright green, or yellow…  And it’s shocking how common it is for the average viewer (not in our industry) to not really pay attention.

I find this really funny considering you will have commercial clients wanting to be 100% sure that their Coca-Cola red is perfectly on point, or that the Chase Bank blue is exactly correct… to the secondary color correction done focusing specifically on our actors’ skin tones, to the painstaking attention to the SLIGHTEST detail in every facet of color… It kills me that once this actually hits our average consumer, the color is drastically changed.  And it’s not just the sports bar setting I am referring to.  That just happens to be a great example where you will see it right in your face…

For any of this content that has now been completely butchered, people on set live color corrected the footage. Then dailies were color corrected, ensuring the color matched perfectly from set into the lab… Then the VFX department needed to ensure the color metadata tracked through perfectly into their visual effects and back into the offline edit… Then final color correction, where we needed to ensure our looks tracked through… where thousands and thousands of dollars were spent having color meticulously perfected… After ALL of this work… we get this:

[Photos: BarMonitors1, BarMonitors2, BarMonitors3]

Where did the ball get dropped…  What is the real issue at hand!?

I suppose it’s easy to point the finger at the consumer and say: what the hell are you doing?  Fix your monitor!  Unfortunately, the TV and film industry, which treats these decisions like the biggest deal ever, forgets that the average Joe… can’t tell the difference.
There are several points off the top of my head worth mentioning.

It is very common for people on set these days to want to live grade and monitor strictly with a beautiful OLED monitor.  Unlike the LCD, plasma, or LED screens people have at home, an OLED can actually turn individual pixels completely off, emitting no light at all and giving the viewer the cleanest black I have ever seen on a monitor!  That’s simply not possible with the other screens (the ones people typically have at home), as each pixel, whether it’s showing black or another shade, is still emitting light… So it’s certainly an interesting question: should we actually be monitoring on set with this monitor?  Some may say yes, we should be working with the best technology there is, and the consumers will one day catch up!  Or could the argument be made that maybe we should be making decisions based on what people will actually see in their homes?

Unfortunately, I don’t think the answer is that we try to mimic what people will see in their homes, because every single person’s home will have a different-looking screen.  So that would be a total crapshoot with the way televisions are set up today.

Aside from color, aspect ratios and framing are a whole other beast.  As a personal example, I often have funny conversations with my father when I go home and set his television to fit the entire 2.40 image of a movie on the screen.  When I leave, he asks me to put his TV back to “full screen”.  But what he and many other consumers don’t understand is how different aspect ratios fit onto their screen.  So let’s take this 2.40 image and scale it to “fit height”, essentially giving you your “full screen”.  However, now you’re cutting off the sides of your image, as the quick math below shows… Certainly not what our crew on set were framing for!
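To put a rough number on it, here’s a quick back-of-the-napkin sketch in Python (the helper name is just made up for illustration) of how much picture that “full screen” setting actually throws away:

```python
# How much of a widescreen movie disappears when it is scaled to fill
# the height of a 16:9 TV ("full screen") instead of letterboxing?

def width_lost_to_fit_height(source_ar: float, screen_ar: float) -> float:
    """Fraction of the source image's width cropped when the image is
    scaled to fill the screen's height."""
    if source_ar <= screen_ar:
        return 0.0  # image is no wider than the screen; nothing is cropped
    return 1.0 - screen_ar / source_ar

# A 2.40:1 feature on a 16:9 (~1.78:1) television:
lost = width_lost_to_fit_height(2.40, 16 / 9)
print(f"about {lost:.0%} of the picture is gone")  # -> about 26% of the picture is gone
```

Roughly a quarter of the frame the crew composed for, gone with one button on the remote.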

I think the real issue at hand, however, is the disconnect between the people making content (and setting the standards for us people making this content) and the people making the devices we consume this content on.  As many of you reading this know, we have a lot of standardization when it comes to how we work on set and in post production.  We are constantly ensuring that we are mindful of legal vs. extended ranges, and of which color space and gamma we are working in.  What space will others down the chain be working in?  Let’s ensure we set them up for success.  Every person judging color at any stage of the game also has to calibrate their monitor to a specific spec…  Then on to delivery, where we ensure that what gets sent out for broadcast meets certain specs.
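As one tiny example of the kind of bookkeeping I mean, here’s a hedged sketch (Python, 8-bit luma values for simplicity) of the legal-to-full range expansion that has to be handled correctly at every handoff; getting it wrong is the classic washed-out or crushed-blacks mistake:

```python
# 8-bit broadcast "legal" range maps black to code 16 and white to 235;
# "full" (extended) range uses the whole 0-255 scale.

LEGAL_BLACK, LEGAL_WHITE = 16, 235

def legal_to_full(code: int) -> int:
    """Expand an 8-bit legal-range luma code value to full range."""
    scaled = (code - LEGAL_BLACK) * 255 / (LEGAL_WHITE - LEGAL_BLACK)
    return max(0, min(255, round(scaled)))  # clamp out-of-range values

print(legal_to_full(16))   # -> 0   (legal black becomes full-range black)
print(legal_to_full(235))  # -> 255 (legal white becomes full-range white)
```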


Now, I don’t think I really have the ultimate technical solution.  But I have to ask: who is setting the same kind of standardization and specs for the people manufacturing televisions that we adhere to when creating the content?  Who was the person who decided “True Motion” was a good thing to add to our home televisions?  Who decided that it’s a wise move to have a ton of different color options, including “Vivid”?  Or the fact that TVs are now pushing their refresh rates incredibly high… I understand that may be great for gaming, sports, and other things… but scripted dramatic shows now look like F&^%$ soap operas with all of these settings turned on!?  Would it not be possible for our cable box, or whatever box we are plugged into, to read some type of technical metadata from the program being streamed and set the TV up for the particular program we are watching?  Now, this would not necessarily solve every calibration issue, as every monitor is unique, but it would certainly be cool if it could change your refresh rate, internal color setting, and a few other things that could help get us into a better range!
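Just to make the idea concrete, here is a purely hypothetical sketch in Python, with made-up field names, of what that metadata handshake could look like:

```python
# Hypothetical only: the stream carries a small content descriptor, and
# the box/TV maps it to sane display settings. None of these field names
# come from any real standard.

CONTENT_PROFILES = {
    "scripted_drama": {"motion_interpolation": False, "picture_mode": "cinema"},
    "live_sports":    {"motion_interpolation": True,  "picture_mode": "standard"},
}

def settings_for_program(program_metadata: dict) -> dict:
    """Pick display settings for the incoming program, defaulting safely."""
    content_type = program_metadata.get("content_type", "scripted_drama")
    return CONTENT_PROFILES.get(content_type, CONTENT_PROFILES["scripted_drama"])

# A drama tagged in the stream would switch the soap-opera look off:
print(settings_for_program({"content_type": "scripted_drama"}))
```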


Over and out!

Jesse