ACES DCI sRGB Codec Supported???
  • Is there a way for ProRes, DNxHD or any other codec to support ACES (DCI, sRGB, etc.)? In Sony Vegas the only option I've found that works is Sony YUV 10-bit, which is almost uncompressed. Any suggestions, please?

  • Can't believe that no one has run into this. Does no one use ACES?!?

  • If you're looking for an intermediate format for working in ACES you have OpenEXR. ProRes or DNxHD would be inputs or outputs from an ACES workspace as they represent limited, (usually) display-oriented color spaces and not ACES.
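
    As a rough illustration of that interchange, here is a minimal sketch that writes a scene-linear ACES2065-1 frame to OpenEXR with half-float channels. It assumes the classic Python OpenEXR bindings plus NumPy, and the frame data and file name are made up for the example:

        import numpy as np
        import OpenEXR, Imath

        # Pretend this frame has already been converted to ACES2065-1
        # (scene-linear, AP0 primaries); random data stands in for real pixels.
        height, width = 1080, 1920
        aces = np.random.rand(height, width, 3).astype(np.float32)

        # Half-float channels are the usual choice for ACES container files.
        header = OpenEXR.Header(width, height)
        half = Imath.Channel(Imath.PixelType(Imath.PixelType.HALF))
        header["channels"] = {c: half for c in "RGB"}

        out = OpenEXR.OutputFile("aces_frame.exr", header)
        out.writePixels({
            "R": aces[:, :, 0].astype(np.float16).tobytes(),
            "G": aces[:, :, 1].astype(np.float16).tobytes(),
            "B": aces[:, :, 2].astype(np.float16).tobytes(),
        })
        out.close()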

    Sony has designed some of their cameras, like the F55 and F65, to work more explicitly as inputs to an ACES workflow. The recording format itself isn't necessarily as important, and there isn't much cause for transcoding from any other format into one of these if you're ultimately going to be working in ACES anyway. You're just creating more work and consuming more disk space.

    Other cameras work in ACES with their camera-specific IDT (Input Device Transform). So you would take the CinemaDNG or R3D or ProRes or whatever the camera records and transform it into the ACES space, while applying an sRGB or rec709 ODT (Output Device Transform) for viewing, and whatever target space is needed for distribution.
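
    If it helps to see that chain in code form, here is a hedged sketch using OpenColorIO 2.x driven by one of the ACES OCIO configs; the color-space names are taken from the ACES 1.x config and may differ in other config versions, and the ARRI input space is just an example IDT:

        import PyOpenColorIO as OCIO

        # Assumes an ACES OCIO config is active (e.g. via the OCIO environment variable).
        config = OCIO.GetCurrentConfig()

        # IDT: camera/source space -> ACES2065-1 working space.
        # The source name below is an example entry from the ACES 1.x config.
        to_aces = config.getProcessor(
            "Input - ARRI - V3 LogC (EI800) - Wide Gamut",
            "ACES - ACES2065-1").getDefaultCPUProcessor()

        # RRT + ODT: ACES2065-1 -> sRGB for viewing.
        to_srgb = config.getProcessor(
            "ACES - ACES2065-1",
            "Output - sRGB").getDefaultCPUProcessor()

        pixel = [0.18, 0.18, 0.18]                    # some camera-encoded RGB value
        aces_pixel = to_aces.applyRGB(pixel)          # into the ACES working space
        display_pixel = to_srgb.applyRGB(aces_pixel)  # out to the display space
        print(aces_pixel, display_pixel)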

    edit: unfortunately there isn't widespread adoption yet. Old dogs don't like new tricks, even if it makes their life better and easier.

    edit2: also, if you aren't shooting a RAW camera there is basically no reason to be messing with ACES. For any GH, or any camera that shoots h.264 or 422 ProRes, it's a waste of time.

  • No film has yet worked with the full ACES color space; a few attempts were made, but their accuracy was far from 100%, and companies usually use their own proprietary R&D color spaces. ACES was mainly created to deal with the variety of different containers: everything can be brought into the ACES color space, resulting in a much simpler way of working (mostly in visual effects).

    A basic example: I take a JPEG picture on a scout (because I had to take 999999999 pictures and raw would not work), then I treat that JPEG (rec709) in Photoshop (sRGB) to make a layered texture that is going to be mixed with a float shader in my 3D software, and eventually combined with other very friendly formats; say you receive a shot in DPX (Cineon) and a last-minute reference in TIFF (YCbCr), and all of that MUST be linearized to be composited together. (You guessed right, it's a pain in the ass.)

    Most VFX companies have had their own kind of "ACES" color space for quite some time. ACES is just a standard the Academy is trying to set, but very few use 100% of it. It's not new; it's just an attempt at a standard, which is why they are making noise like it's a totally new feature that is going to save the world.

    I strongly advise you not to use it; it's really meant for very complex productions and it requires tons of development in your workflow to be fully integrated. Plus, if you don't do heavy FX composites or use more than 6 different cameras (with different color spaces), there is almost zero reason to use it.

  • Wat?

    ACES is a large color gamut, larger even than the visible spectrum. There is no such thing as using 100% of it.

    You can work this way in Nuke and in DaVinci and even After Effects to a certain extent. You don't have to develop anything.

    This is part of the reason we can't have nice things...

  • Nuke doesn't like ACES (2.0) at all (I don't know about AE and DaVinci, but I really doubt there's much difference). As I said, the point of such a large color gamut is to make all the others fall into it, to simplify the linear aspect of a production. Unfortunately it creates more problems than it solves (at least in Nuke). By using 100% I meant taking the advantages it offers (i.e. a much smoother workflow), and to my knowledge no one has yet managed to get all the benefits of what it is supposed to do.

    Plus, as I said, many production companies have had that kind of thing for years. Where I worked we used to work with a space called bMut (with a lot of numbers after it for the R&D versions). It's not quite as wide as ACES (but not far from it) and it did the job (with a lot of scripting).

  • Below is a test of GH2 with ACES sRGB:

    Whatever the theory says, ACES gives colors and tonality that the GH2 can't produce by itself or with any normal grading, filter or plug-in. And I don't see any headache in working with ACES at all.

  • @producer How do you use ACES in your workflow? Which software and process? I'm assuming you are grading in ACES colour space using GH2 sRGB footage?

    What exactly do you mean when you say that the video you posted is with "ACES sRGB"? Those are two different, mutually exclusive color spaces. Either something is in sRGB, or it is in ACES. Perhaps you could share a little more about your workflow? I think one of the main reasons that you don't see any headaches in working with ACES is because you misunderstand what it is.

    As far as stuffing ACES into an 8-10 bit codec, you could certainly do it, but I'd advise against it. Capturing the full ACES gamut generally requires floating point, and your color data will occupy only a tiny subset of the total gamut, so most of the bits of the codec would be wasted on colors you will never care about, effectively giving you something like the equivalent of 5-bit sRGB chroma resolution. That's why the ACES spec explicitly spells out that OpenEXR is the interchange format.

    "ACES give a colors and tonality that GH2 can't produce by itself or with any normal grading, filter or plug-in." Uh, my only response is "citation needed." What are you doing with ACES, and what do you consider "normal grading"? Ultimately the math of an IDT to transform GH2 footage to ACES is just linearization, and a matrix multiply to move from sRGB primaries to ACES primaries. I'm not sure why you think this math has magical powers, or that it's terribly exotic.

    As far as I know, I was one of the first people to actually have to set up a Baselight to do end-to-end ACES grading for a national TV spot in the US. I also attended a very interesting SMPTE presentation about ACES workflow a while back, and I genuinely have no idea what you are talking about.

  • Another test with ACES RRT (sRGB):

    For comparison, below is a framegrab from the original source.

  • @producer How about letting us in on your secret?

  • @caveport: What secret?!? My first post says it all.

  • @producer I thought you might like to share how you do your ACES workflow test videos.

  • @caveport: If you use Sony Vegas, just choose 32-bit ACES RRT (sRGB) and then do your job. The only problem for me is that only the Sony YUV 10-bit codec carries this. I tried ProRes, DNxHD and other codecs and they don't show any picture, just vertical lines like a defective LCD TV panel.

  • Ok. You do realise that even the Sony YUV 10bit codec is rec709? It's the Vegas export that can't do the colour space conversion to other codecs.

  • And another quick one:

  • "Nuke doesn't like ACES (2.0) at all (I don't know about AE and DaVinci, but I really doubt there's much difference). As I said, the point of such a large color gamut is to make all the others fall into it, to simplify the linear aspect of a production. Unfortunately it creates more problems than it solves (at least in Nuke). By using 100% I meant taking the advantages it offers (i.e. a much smoother workflow), and to my knowledge no one has yet managed to get all the benefits of what it is supposed to do."

    Nuke has problems with a lot of things thanks to The Foundry. ACES or no, it doesn't properly deal with Alexa conversion. It doesn't have reliable streaming media support and, currently, it has a very unhealthy habit of writing corrupted project files from the Mac version. One can only hope they successfully sell the company, as is their aim, to a more capable licensee. And yet none of that has anything to do with how other software works either.

    Besides the larger gamut, its aim is also to normalize the results from various cameras. Most of them nominally produce imagery that must first be "fixed" before you can actually craft a look. It's easy enough to find solid references for the benefit of better nominal imagery through an ACES pipeline versus the standard sub-standard methodologies. And I've seen it first hand with RED imagery as well, given that footage right off the camera looks pretty horrible through their typical color science.

    The only thing that really makes it harder is having to deal with other people who think it's trying to take their job, who haven't read the white paper, old dogs, new tricks, etc.

    "Plus, as I said, many production companies have had that kind of thing for years. Where I worked we used to work with a space called bMut (with a lot of numbers after it for the R&D versions). It's not quite as wide as ACES (but not far from it) and it did the job (with a lot of scripting)."

    I worked at two of the premier shops for color in film (among others), Digital Domain and Sony Pictures Imageworks, who, along with ILM, set the standard that the entire industry followed; ACES is the result of this, along with OpenEXR. The methodologies for color prior to this were terrible, with the wheel constantly being re-invented and wasteful, show-specific solutions. Those were the dark times and they should be viewed from a place of "glad we don't have to live that way anymore." All that pain was necessary to arrive at this better solution.

    @producer's conflation of sRGB with ACES is unfortunate, though.

  • @producer - Whatever the technical wherefores (explained quite elegantly by the posters above), I find it hard to believe that you're holding up the examples you've posted here as having desirable characteristics that 'no color grade or plug-in could hope to achieve'. The footage looks like it was originated on early Sony Betacam tube cameras.

  • In the context of producer's footage, he isn't starting with anything that would benefit any more from an ACES workflow than from working in a 32-bit project in After Effects. Most of the work you see online is from people working in 8-bit or 16-bit workflows. The results you can get from a float workflow can, yes, be startling compared to the conventional 1990s (non-Nuke) way most people still seem to work, but this isn't really a function of ACES.

    An irony here is that because he's starting with rec709 footage specifically and is fixated on sRGB, he's actually, in a very roundabout way, transcoding to a smaller-gamut colorspace than where he started, though this is often just an academic distinction and shouldn't adversely affect the footage in any way that matters here. It's the space appropriate for his monitor (sRGB), if he doesn't at least have the equivalent of an HP DreamColor, but it's much ado about nothing. ACES is for higher-dynamic-range sources. Even the white paper says not to bother for low-dynamic-range video sources.

  • +1 I did some testing in DaVinci Resolve using sRGB source footage in ACES colour space and concluded that working in rec709 gave the same result with an easier workflow. Unless the source footage uses a colour space greater than sRGB/rec709, I really can't see the point of using ACES.

  • "Unless the source footage uses a colour space greater than sRGB/rec709, I really can't see the point of using ACES."

    Yes, there is none. I guess some might argue that using whatever footage someone has available to get familiar with the workflow has some merit, but none of that really ends up in the imagery.

    I will be pretty happy when rec709 goes the way of the dodo bird though.

  • And another very fresh test (1080p):

  • You will be waiting a very long time for rec709 to disappear. It is the standard for nearly all editing software, cameras & broadcast delivery!

  • Rec2020 is already on its way in; 4K/UHD is ushering in the end of rec709. Hopefully gamma will be rigidly enforced with this new standard, not the sometimes-2.4, sometimes-2.2 liberal application of gamma that you find with rec709 implementations and deployments.

    Unfortunately, it too keeps the anachronistic and unnecessary "studio swing". I dearly hope that I see that nonsense disappear in my lifetime.
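
    For anyone unfamiliar with the term, "studio swing" (a.k.a. legal or video range) reserves footroom and headroom in the code values instead of using the full integer range. A small sketch of the standard luma mappings, 16-235 at 8 bits and 64-940 at 10 bits:

        def full_to_studio_luma(v, bits=8):
            """Map a normalized 0..1 luma value to studio-swing ('legal range') codes.

            8-bit:  black = 16, white = 235 (219 usable steps out of 256)
            10-bit: black = 64, white = 940 (876 usable steps out of 1024)
            """
            scale = 1 << (bits - 8)                 # 1 for 8-bit, 4 for 10-bit
            black, span = 16 * scale, 219 * scale
            return round(black + span * v)

        print(full_to_studio_luma(0.0), full_to_studio_luma(1.0))                    # 16 235
        print(full_to_studio_luma(0.0, bits=10), full_to_studio_luma(1.0, bits=10))  # 64 940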

  • @producer, these samples look like some kind of film look has been added; the final part of an ACES workflow is usually an ODT, depending on your target media.

    I can't speak for Nuke under Mac, but I find it difficult to believe there are still any import issues; one thing The Foundry are very good at is updates and beta lists. What problems specifically? I'm not aware of any recent Alexa ones. I've never had a corrupt project (and it's just a Python text file, so it's pretty easy to fix), and why on earth would you want to reference streaming media as a footage source? They've had to totally rip apart QT support recently because of QuickTime itself, and they're one of the only apps under Windows that encodes ProRes 4444 with official sanction.

    No, I don't own shares, just a happy user. It does what it says on the tin.

    But to the original point: ACES is just a colorspace and a way for manufacturers to profile the characteristics of their sensors to enable a shared workflow between different cameras. An sRGB IDT probably isn't an official release and would be, by its very nature, generic, as is the CinemaDNG IDT created by Blackmagic; they're not that useful.

    Cheers, Paul