DDC?


Digital Dailies Colorist. 

In the January issue of ICG Magazine, this short statement by cinematographer Paul Cameron, ASC, caught my eye.

[Images: Paul Cameron's statement from ICG Magazine]

Soon after, I began hearing rumors of a push to establish a new union classification, something like an "On-Set Colorist," to distinguish the task of creating color-corrected dailies from the traditional role of the Digital Imaging Technician (DIT).

Now here we are in April, with IATSE Local 600 elections around the corner, and I was very surprised to find this mailer in my email tonight -

To CAMERA GUILD/ Local 600

Brothers and Sisters,

My name is Paul Cameron. I am a Western Region Candidate for NEB Director of Photography & Visual Effects Supervisor.  I have been a Director of Photography and member of the Camera Guild for over 20 years.  Recently I had the opportunity to testify on behalf of Local 600 in a closed deposition aimed at reducing rates for First Assistant Cameramen.  It reminded me that decisions are made every day to challenge our Union and its Members.  That now is a time to be focused and aware of our future. That now is the time for all members including myself to participate in our Union .   A union that provides the Film Industry highly talented and qualified craftsmen and women.   A union that secures our rights in the work force.  Secures our rates as Technicians.  Provides its members options for Health benefits and secures a plan for Retirement funds. 

It became apparent to me at the last Board Meeting that all the issues that came to the floor were vital.  Being a board member means looking at the issues we face as a union in an environment that wants organized labor to vanish.   It's also a time of great change with the advent of Digital Capture.  A time where technology is changing exponentially.

One of the issues I would like to address immediately is the re-classification of DIT - Digital Imaging Technician to DDC - DIGITAL DAILIES COLORIST. The responsibilities of the DIT have changed dramatically over the last 10 years. It's my firm belief that the position of DIT was created to help bridge the transition and development of new technology in Digital Capture. It was a time when the DIT matched digital camera bodies during the check out and painted the cameras on set while wrangling the endless amounts of new cables and conversion boxes. The DIT became this hybrid Electronic Engineer/Camera Assistant. Cinematographers and Camera Assistants were relying on DIT's to handle Camera settings and make decisions regarding how images were captured.

Then almost overnight Cinematographers lost the Off Set Dailies paradigm. Dailies were no longer sent to be transferred or graded by labs or Telecine facilities. The DIT became the On Set Colorist. Now Cinematographers are relying on DIT's to handle all Color Timing and Transcoding of Dailies with those looks applied. The job of the DIT has shifted. Producers are aware of the change but confused what a DIT really does. The DIT color corrects and transcodes all Dailies at this point. That is certainly the present + future of that position.

Camera Assistants need to stop relying on DITS for Camera Settings and Reloads. Traditionally the Camera Assistant never let anyone touch much less change a frame rate or shutter angle on a Camera. It is my belief Camera Assistants need to reclaim that space. Part of that means additional training for Camera Assistants on current Digital Capture Systems. Current DITS's also need further training in Color Correction as well as Data Management. Establishing the best Digital Workflow is the responsibility of the Cinematographer and the DIT. The DIT is now responsible for helping establish and securing Cinematographers looks through the Metadata/Digital Pipeline.

Once Producers + Producers understand DDC - Digital Dailies Colorist they will value that position more than ever. Once there is an honest acknowledgement that DDC's Color Correct all Dailies from Cinematographers there will be a new found respect for that position. The DDC's are also integral in interfacing with Production and Post Production in the ever-changing world of constantly changing Digital Workflows and needs in Deliverables. There is also the opportunity now to welcome into Local 600 the best Digital/Telecine Colorists from Labs + Post Production facilities and make them union members. The DDC can also be a position in Post Production facilities and Studios can hire to ensure Timed Dailies and Work Flows are being handled correctly in Near + Off Set Labs.

I hereby advocate the transition of DIT to DDC. Regardless of whether I am asked to serve on the Board or not I plan to push this through. This is one example of a few changes I will try to bring to the Board. We need this re-classification as soon as possible.

This would be the first time I serve on the board. It would be an honor to represent and be a voice for each and every member.

Thanks for your consideration. Vote now and send your Ballots in.  Your voice counts.  Every small step or action you take means something.

Paul Cameron, ASC

Candidate - Western Region - NEB Director of Photography & Visual Effects Supervisor.

To anyone actively working as a Digital Imaging Technician, the constantly evolving nature of our job description is something that keeps the work very interesting, but it can also be nerve-racking, as the future of the position is clearly so tenuous. What I take from this piece is that we need more standardization among our ranks. The reality of so many independent owner/operators offering such a helter-skelter variety of services at completely arbitrary rates has done little more than put a gigantic question mark in the minds of many producers as to what a DIT is, what a DIT does, and most importantly - why do I need one? 

The problem is that two distinct skill sets have emerged - video and computer. There are many talented operators out there who excel at both and can effortlessly jump back and forth between doing a live color correct for four cameras one day and handling Codex deliverables on a workstation the next. But then there are those who are really better suited to one or the other because of their background, interests, or convictions about what the position "is and isn't." The problem is obvious: in the eyes of a producer, we are all the same because we are all DITs. Is there a 1st AC out there who doesn't pull focus? Or one who only builds the camera? No matter what your opinion is on this sensitive subject, I think we can all agree we have a "brand" problem as a union classification. 

What do I personally want to happen? It doesn't matter. The market will decide. I, like everyone else, just want to keep busy and for the position to remain viable. Whatever it's called. 

negativespaces.com UPDATES. 

I have not been posting because this site is undergoing a massive overhaul: new design, new functionality, opening it up to advertisers for the first time, and, the biggest new feature, a forum for discussion and information sharing. This has been in development for a while and will hopefully go live within a month or two.

Thanks for stopping by. 

Codex XR.. finally


At last.. integrated Arriraw recording in the Alexa camera.


Codex Announcement >

Codex has worked closely with ARRI during the development and launch of the Alexa family of cameras. The Codex Onboard Recorder was the first recorder to be certified to record ARRIRAW from the Alexa back at the beginning of 2011 and has since been used on hundreds of feature films and commercials worldwide.

ARRIRAW has become the output of choice for feature films, including The Avengers and Skyfall, and Codex recorders have become the recording standard. The ARRI Alexa features a 35mm CMOS Bayer sensor. The sensor data is output over T-Link to the Codex Onboard Recorder, where it is recorded on a Codex Datapack or Capture Drive. ARRIRAW is 12 bit logarithmic raw Bayer data. The resolution for 16:9 is 2880 x 1620 and for 4:3 it is 2880 x 2160. ARRIRAW can be output and recorded at up to 60FPS for 16:9 and up to 48FPS for 4:3 to an external recorder.

Fast forward to 2013. ARRI and Codex announce the new Alexa XR  (extended recording) Module. Building on the success of the Alexa/Codex combination,  Codex and ARRI have developed a module that incorporates Codex recording technology directly into the camera. This alleviates the need for cables between the recorder and camera, makes the camera package smaller, and further simplifies ARRIRAW recording.  It also enables higher speed ARRIRAW - up to 120FPS for 16:9 and up to 96FPS for 4:3.  These developments are bound to intrigue cinematographers and further cement the Alexa/Codex/ARRIRAW workflow as the standard for digital production.

The XR Module provides several recording options in a single package. ARRIRAW at up to 120FPS (16:9) can be recorded onto a high performance Codex Capture Drive. In addition, Apple ProRes or Avid DNxHD can be recorded to a Capture Drive, making longer recording times possible (up to 2.1 hours of ProRes 4444), or with an SxS adapter, to an SxS PRO card.
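To put those specs in perspective, here's a quick back-of-the-envelope estimate of the raw data rates involved. These are my own rough numbers, assuming straight 12-bit packing of every photosite with no container or metadata overhead, so real-world figures will differ somewhat:

```python
# Rough ARRIRAW data-rate estimate: straight 12-bit packing of every
# photosite, no container/audio/metadata overhead - ballpark figures only.

def arriraw_rate(width, height, bit_depth=12, fps=24):
    """Return an approximate data rate as (MB/s, GB per hour)."""
    bytes_per_frame = width * height * bit_depth / 8
    mb_per_sec = bytes_per_frame * fps / 1e6
    gb_per_hour = mb_per_sec * 3600 / 1e3
    return mb_per_sec, gb_per_hour

for label, (w, h) in {"16:9 (2880 x 1620)": (2880, 1620),
                      "4:3  (2880 x 2160)": (2880, 2160)}.items():
    rate, hourly = arriraw_rate(w, h)
    print(f"{label}: ~{rate:.0f} MB/s, ~{hourly:.0f} GB/hour at 24fps")
```

Either way you slice it, that's hundreds of gigabytes per hour of raw material at 24fps, and considerably more at the new 120fps speeds, which is exactly why the recording side of this system matters so much.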

Anyone who's spent some time working with the Alexa and Codex on-board recorders can attest to how desperately needed internal Arriraw recording is. Just about everything about the Codex system is thoughtfully designed and executed - the stability of the recording, the robust equipment, and the control of metadata and deliverables through the Virtual File System. The big problems are that the deck is quite large, it's an expensive rental, and there are inevitably BNC cables running between it and the camera. Not to mention that getting the Codex working happily with the Alexa means matching many menu items on the two devices, several of which are labeled differently, and a mismatch can result in irreversible recording errors. It's a good system, but it's by no means bulletproof.

In any video recording system, every cable introduced between the camera and the recorder adds another opportunity for error. A native, integrated recording system controlled solely by the menus within the camera greatly reduces these problems and makes for a faster, more intuitive user experience. Needless to say, I'm thrilled to see this functionality coming to the Alexa, and I think Codex was the obvious partner given their expertise with Arriraw. Why reinvent the wheel if you don't have to?

All new Alexas (except the entry-level 16:9 camera) will ship with the XR Module and will be called Alexa XT (Extended Technology).

[Image: ARRI Alexa XT cameras]

All existing Alexa cameras can be upgraded with the XR Module. Sony SxS cards (for ProRes recording only) can still be used with an adapter. In addition to native Arriraw recording, XT cameras will feature new processing hardware, 120 fps recording, a 4:3 sensor and anamorphic desqueeze, and the IFM In-Camera Filter System for behind-the-lens IR ND filtration. <via Film & Digital Times>

Behind-the-lens ND is actually another godsend and something I've been preparing a separate post on. Conventional front-of-the-lens filtration with heavy ND, sometimes 7 or more stops, is the most destructive factor in digital imaging, and it's easily remedied by putting the glass behind the lens. For Digital Imaging Technicians, painting out the color temperature offsets introduced by neutral density filters is one of our most common tasks, and another one it seems technology will soon relieve us of. Much more on this later.
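In the meantime, just to illustrate the basic idea: neutralizing an ND's color shift is conceptually nothing more than applying per-channel gains that bring a grey card back to neutral. Here's a toy sketch in linear RGB with made-up grey-card values; real on-set tools work in the camera's own color space and with far more nuance:

```python
import numpy as np

# Toy illustration of "painting out" an ND filter's color shift:
# measure a grey card through the filter, then compute per-channel
# gains that bring it back to neutral. The card reading below is
# invented for the example.

def neutralizing_gains(grey_rgb):
    grey_rgb = np.asarray(grey_rgb, dtype=float)
    return grey_rgb.mean() / grey_rgb   # boost the deficient channels

card = [0.20, 0.19, 0.16]               # reads slightly warm through heavy ND
gains = neutralizing_gains(card)
print("RGB gains:     ", np.round(gains, 3))
print("corrected card:", np.round(np.array(card) * gains, 3))
```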

Conveniently, the integration of Arriraw recording into the camera solves another big problem for us - and means another blog entry I don't have to finish writing - which is the problematic workflow for on-set color correction with external Codex recording.

Because all the current external Codex decks - M, S, and Arriraw - require the Alexa's REC video outputs, getting a Log-encoded video signal for on-set color correction presents a host of challenges. Ordinarily, the DIT would use one of the camera's REC outputs for a clean, 10-bit, 4:2:2, Log-encoded video signal to feed color correction software such as LinkColor or LiveGrade. This allows the operator to create data directly on the set, in the form of ASC-CDL or 3D LUTs, to be used for color-corrected production dailies. When the camera's REC Outs are being used to feed the deck, and the deck in turn doesn't output a video signal that's useful to us, the user has to get creative to get a workflow working.

This tends to be the workaround -

Alexa REC Outs to Codex inputs. Even with a single-link 3G recording, the camera's other REC Out still outputs a data stream that isn't useful for video monitoring.

So in order to get Log-encoded video for the DIT's use, the camera's MON Out needs to be set to Log C. Additionally, the MON Out can only output a Legal-levels video signal, unlike the REC Out, which can be set to Legal or Extended levels. Some workflows require an Extended-levels signal, so if we can't get one out of the camera, there are more problems to be solved.
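For anyone fuzzy on what Legal versus Extended actually means at the code-value level, here's a tiny sketch of the re-scaling involved. It assumes the usual 10-bit mappings (legal luma 64-940, extended 4-1019); real conversion tools, and the camera itself, handle chroma and clipping more carefully than this:

```python
# Re-scale a 10-bit legal-range code value to extended range, assuming
# the common mappings of legal 64-940 and extended 4-1019. This is only
# to show where the extra code values come from, not a production tool.

LEGAL_MIN, LEGAL_MAX = 64, 940
EXT_MIN, EXT_MAX = 4, 1019

def legal_to_extended(code_value):
    scale = (EXT_MAX - EXT_MIN) / (LEGAL_MAX - LEGAL_MIN)
    return round((code_value - LEGAL_MIN) * scale + EXT_MIN)

for cv in (64, 502, 940):   # black, mid, and white in legal range
    print(cv, "->", legal_to_extended(cv))
```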

With a 3-wire BNC video harness, one cable is used to monitor the Codex's output, which unfortunately isn't useful for anything other than verifying the recording. The second cable takes the Log-encoded MON Out from the camera to the DIT, where it's color corrected and then, via the third cable in the harness, fed back to the camera for the assistants' on-board monitors.

The main problem with this is that it's often impractical to send a color-corrected return back to the camera for the AC and operator. This can be for any number of reasons - long cable runs, wireless video for a vehicle shot, process trailers, etc. In practice, it really isn't fair to ask them to focus using Log C video, as the flat, washed-out image makes their job even more difficult. The operators tend to hate it as well. Codex XR solves these problems. A separate video channel for the users at the camera, an in-camera recording, and a single video link to the DIT is always ideal - or better yet, a single wireless link. Less is more. Keep the recording and monitoring as simple and discreet as possible, and everyone in the camera department - DP, operator, assistants, and engineering - can do their best work.
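As a footnote for anyone newer to on-set grading: the ASC-CDL data mentioned above boils down to three numbers per channel (slope, offset, power) plus a single saturation value. Here's a minimal sketch of that math; the grade values are invented for illustration, and real tools like LiveGrade handle clamping and color space details far more rigorously:

```python
import numpy as np

# Minimal ASC-CDL sketch: per-channel slope/offset/power, then a global
# saturation built on Rec.709 luma weights. Clamping is simplified here.

REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])

def apply_cdl(rgb, slope, offset, power, saturation):
    """rgb: float array of shape (..., 3) with values in the 0-1 range."""
    rgb = rgb * slope + offset
    rgb = np.clip(rgb, 0.0, 1.0) ** power
    luma = np.sum(rgb * REC709_LUMA, axis=-1, keepdims=True)
    return np.clip(luma + saturation * (rgb - luma), 0.0, 1.0)

pixel = np.array([[0.18, 0.18, 0.18]])   # mid-grey test patch
graded = apply_cdl(pixel,
                   slope=np.array([1.05, 1.00, 0.95]),
                   offset=np.array([0.00, 0.00, 0.01]),
                   power=np.array([0.95, 1.00, 1.00]),
                   saturation=1.1)
print(graded)
```

The appeal of CDL for dailies is exactly this simplicity: ten numbers that travel easily through the metadata pipeline and can be re-applied consistently anywhere downstream.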

Cutting the Cord - Follow Up


February 4, 2013

As is always the case with content published on this site, the first version is certainly not the last. Through valued reader feedback, these articles are amended and updated over time. Chris MacKarell at Arri CSC Digital, whose feedback has routinely been beneficial to this site, raised some valid points about my post on Wireless HD-SDI that I'd like to include. The main concern is that because a wireless HD video signal has to be converted to analog for transmission and then back to digital upon reception, the resulting signal can never really be "identical" to a cable-bound one, no matter how high quality the link is.

How true! 

From Chris -

.. as far as wireless monitoring is concerned, here are some questions you didn't touch on which certainly affect the quality of the image on the monitor:

At the Transmit end, precisely how is the deserializing and digital-to-analogue conversion of the original HDSDI signal performed?

How is the reverse, the analogue-to-digital conversion and serializing of the signal at the Receive end performed?

How high a fidelity are these DAC and ADC functions, what quantizing steps are taking place to perform them?

How are overly high peak-to-average power ratios in the modulated signal dealt with? - Note that this is usually done by clipping the transmitted sine wave.

How is intermodulation distortion i.e. non linearity in the signal chain handled?

If COFDM, how is this managed?

Of course, there are industry standards ( e.g. IEEE 802.20 ), but were they designed for the kind of critical application the DIT requires?

The key question is, can you trust what you see on your monitor at all times?

I suspect that for many, any questions at all around the image processing techniques applied and therefore the integrity of the subsequently transmitted image, are enough by themselves to mitigate against that signal's utility for serious critical imaging work.

And since all this processing will always be a necessary precursor of image transmission and reception, the wireless utility you seek for on-set monitoring applications perhaps may never be approached.

In short, whatever future technology developments occur, the underlying principles governing wireless transmission will not change. Wireless may therefore never yield the kind of critically accurate signal you require in your day-to-day work.

All of the above points will certainly affect image quality and are something to be aware of. As Chris pointed out, "the underlying principles governing wireless transmission will not change," so no matter how good these systems are, just how cautious should we be in using them? Should we even be using them at all?

Unfortunately, we often don't have much choice in the matter: if the shot calls for wireless, then it has to be wireless. In my opinion, if you know the nuances of your wireless system and have spent some time testing and evaluating the image it produces, you can come to trust it. My favorite thing to say on set when things become questionable is, "the scopes don't lie." The waveform and vectorscope are probably the only things on the entire set that are truly objective. They reveal problems now that will certainly be problems later, and they basically tell you everything you need to know about the video images you're working with. In the case of wireless video, the scopes can tell you just how well your system is putting that signal back together upon reception.

Here's a great way to test this: using scopes, compare the exact same image, one coming to you over the air and the other through a cable. Use one of the camera's HD-SDI outputs to feed the wireless Tx, then run a hardline to yourself from the camera's other output. Now switch between the two and study the waveform and vectorscope. Are you seeing any shift in chroma in the wireless image compared to the cabled one? What about contrast? Does the highlight and shadow information in the wireless waveform sit at the same place as in the cabled image? What about midtones - are they more compressed in the wireless image? 
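If your setup lets you grab matching stills from both paths, the same comparison can be done numerically. A rough sketch - the synthetic frames below just stand in for whatever your capture hardware hands you:

```python
import numpy as np

# Compare luma and (rough) chroma statistics of the same frame captured
# over cable and over wireless - a numerical version of eyeballing the
# waveform and vectorscope. Frames are float RGB arrays in the 0-1 range;
# the synthetic frames below are placeholders for real grabs.

REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])

def scope_stats(rgb):
    luma = rgb @ REC709_LUMA
    return {"black": np.percentile(luma, 1),
            "mid": np.median(luma),
            "white": np.percentile(luma, 99),
            "b-y": (rgb[..., 2] - luma).mean(),
            "r-y": (rgb[..., 0] - luma).mean()}

def compare(cabled, wireless):
    a, b = scope_stats(cabled), scope_stats(wireless)
    for key in a:
        print(f"{key:6s} cable={a[key]:+.4f}  wireless={b[key]:+.4f}  "
              f"delta={b[key] - a[key]:+.4f}")

rng = np.random.default_rng(0)
cabled = rng.uniform(0.0, 1.0, size=(1080, 1920, 3))
wireless = np.clip(cabled + rng.normal(0.0, 0.002, cabled.shape), 0.0, 1.0)
compare(cabled, wireless)
```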

Really, if you're going to use this video signal for anything other than basic monitoring, any differences between the wireless and cabled image should be minimal. In the case of the Boxx Meridian, in my experience very little, if any, shift in chroma or contrast is evident in the wireless image. As the signal degrades, the image becomes noisier and blockier but largely maintains its correct luma and chroma information. This is why we spend so much time testing gear. You're going to have to use it, often in very compromised situations, and the more you know about its strengths and weaknesses, the more confident you can be in its operation. 

I encourage anyone who follows this site to contribute to the knowledge base, and feel free to go through anything I've written here with a fine-toothed comb. I'm a technician, not an academic, so many of the topics I address here are focused much more on practical application, or the "end-user experience," than on the hard science driving it.