Logarist Color Correction for DaVinci Resolve, Vegas Pro, and Final Cut Pro X
  • 92 Replies
  • Hi @balazer. It is not easy to grasp all the implications of wide gamut vs. sRGB. I can see the advantage of a monitor that offers both a wide gamut and an sRGB mode (by the way, could you suggest a 4K model to choose?). What I don't yet see is the advantage of the wide gamut itself. At this point, why not just use an sRGB monitor? Maybe the wide-gamut monitor is useful only in this scenario: color grading at the same time for big-screen projection and for web and home viewing. In that scenario I would grade for wide-gamut delivery and check on a second display (or on the same display, if it has an sRGB mode) how it looks in sRGB. In that scenario, how and when would I use Logarist?

  • Balazer, is a consumer computer display calibrated with an X-Rite i1 good enough for home color grading? What would it take to make the grading acceptable by pro standards?

  • I don't agree with most of that and I don't want to debate it with you. Plus it's not really relevant to this topic.

    You got tired. :-) Your topic, your preference. Still, I suggest that anyone selecting a monitor read this and think hard.

  • I don't agree with most of that and I don't want to debate it with you. Plus it's not really relevant to this topic.

  • Display standards don't specify the spectra of the red, green, and blue pixels. They specify the primary chromaticities in terms of the CIE 1931 XYZ color space. Display makers must invoke the CIE color model. It is unavoidable.

    The CIE 1931 model was derived from a very specific experimental task, if you read up on it, and a large display is far from those conditions. Yes, it makes mostly accurate predictions, except in the cases where it is wrong.

    Maybe someday in the era of laser and LED displays we will have display standards with spectral primaries, but not today.

    With Rec.2020 you are already literally forced into extremely narrow spectral primaries.

    And the current idea is to use quantum-dot (QLED) backlights further cut with filters (which can get very close to Rec.2020), and quite cheaply, actually.

    The problem with making a display with a natively sRGB gamut is that you rely entirely on the physical properties of the materials used to make the display. You depend on the emission spectra and transmission spectra of your phosphors, backlight, and/or color filters. It's very difficult to control those physical properties to achieve exactly the primary chromaticities you want. That's why we have a whole lot of ostensibly sRGB displays that are sort of close to sRGB, but not really that close.

    It is not as hard as you think it is. Making proper P3 primaries, which you suggest looking for, is much harder.

    Also, if you just look at actual statistical experiments, you will see that the variation in human spectral sensitivity is not as small as pro display manufacturers want you to believe :-) With a wider gamut it becomes a real issue, and because of the narrow spectra it remains an issue even after calibration.

    The better approach is to aim for a gamut somewhat wider than sRGB, measure the chromaticities of the resulting primaries, and use math to make up the difference. Done correctly, it yields almost perfectly accurate color. You can read about it at http://www.displaymate.com/Color_Accuracy_ShootOut_1.htm Again, there is no downside to using math and the CIE color model.

    http://www.displaymate.com is kind of a paid-promotion outfit (loving Samsung lately) without a very deep understanding of what they are doing and promoting.

    Yet I agree that on a good pro-level monitor with a high-bit-depth panel you can do it with the proper tools (note that a colorimeter is built as an analog of the eye's sensors, so with a wider gamut you get growing issues from sample variation and filter degradation). I am just not sure this advice is good for the average Joe reading this. The average Joe instead has a big chance of getting something with a strange, absolutely non-standard wide gamut, an 8-bit panel, and a lot of trouble on his head.

    You need to do that anyway, because the display standards are specified in terms of the CIE color model.

    Well, to be really pro-level accurate, as stated, a display would have to reconstruct the specific viewer's sensitivity curves instead of the CIE standard observer, plus meet requirements for the working space, the display border materials and colors, etc., etc.

    with 8-bit panels it'll always be easy to spot issues.

    Yes. If the display is doing any kind of color space transformation, color precision errors (banding) will be minimized if the panel has 10-bit or higher precision.
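
    To put a number on it, here is a small Python sketch (the 0.8 scale factor is just an illustrative stand-in for an emulation transform, not any real display's math). Shrinking a channel's range before quantization leaves far fewer distinct steps on an 8-bit panel than on a 10-bit one, which is what you see as banding:

      import numpy as np

      ramp = np.linspace(0.0, 1.0, 4096)   # a smooth input gradient
      scale = 0.8                          # hypothetical emulation shrinks the channel range

      for bits in (8, 10):
          peak = (1 << bits) - 1
          codes = np.round(ramp * scale * peak)   # quantize to the panel's precision
          print(f"{bits}-bit panel: {np.unique(codes).size} distinct steps")
      # ~205 steps at 8 bits vs. ~819 at 10 bits: coarser steps show as banding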

    No emulation mode can actually change the spectra of the pure R, G, and B pixels. All it can do is mix the existing ones based on CIE models.

    Display standards don't specify the spectra of the red, green, and blue pixels. They specify the primary chromaticities in terms of the CIE 1931 XYZ color space. Display makers must invoke the CIE color model. It is unavoidable. Maybe someday in the era of laser and LED displays we will have display standards with spectral primaries, but not today.

    To be short: the ideal monitor for sRGB work is one whose native R, G, and B primaries match sRGB.

    The problem with making a display with a natively sRGB gamut is that you rely entirely on the physical properties of the materials used to make the display. You depend on the emission spectra and transmission spectra of your phosphors, backlight, and/or color filters. It's very difficult to control those physical properties to achieve exactly the primary chromaticities you want. That's why we have a whole lot of ostensibly sRGB displays that are sort of close to sRGB, but not really that close.

    The better approach is to aim for a gamut somewhat wider than sRGB, measure the chromaticities of the resulting primaries, and use math to make up the difference. Done correctly, it yields almost perfectly accurate color. You can read about it at http://www.displaymate.com/Color_Accuracy_ShootOut_1.htm Again, there is no downside to using math and the CIE color model. You need to do that anyway, because the display standards are specified in terms of the CIE color model.
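
    For anyone who wants to see the "use math to make up the difference" step concretely, here is a minimal numpy sketch (the measured native primaries below are made up for illustration). It derives the RGB-to-XYZ matrices from the chromaticities and combines them into the 3x3 correction that gets applied to linearized values:

      import numpy as np

      def rgb_to_xyz_matrix(xy_r, xy_g, xy_b, xy_w):
          # Standard derivation: scale the primary XYZ columns so that
          # RGB = (1, 1, 1) maps to the white point.
          def xyz(xy):
              x, y = xy
              return np.array([x / y, 1.0, (1.0 - x - y) / y])
          P = np.column_stack([xyz(xy_r), xyz(xy_g), xyz(xy_b)])
          S = np.linalg.solve(P, xyz(xy_w))
          return P * S

      # sRGB primaries and D65 white point, from the sRGB spec
      srgb = rgb_to_xyz_matrix((0.640, 0.330), (0.300, 0.600),
                               (0.150, 0.060), (0.3127, 0.3290))
      # Hypothetical measured primaries of a slightly-wider-than-sRGB panel
      native = rgb_to_xyz_matrix((0.660, 0.320), (0.290, 0.620),
                                 (0.148, 0.055), (0.3127, 0.3290))

      # 3x3 matrix taking linear sRGB values to the panel's native linear RGB
      correction = np.linalg.solve(native, srgb)
      print(np.round(correction, 4))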

  • In my experience, the monitors with the most accurate colors are wide-gamut displays with an sRGB mode. And that's what I recommend for anyone who's serious about video. A wide-gamut display with an sRGB mode will be more versatile, because you can use it two ways. You can use it in a wide-gamut mode with color management turned on, or you can use it in sRGB mode with color management turned off. The latter will show colors correctly in applications that don't support color management. A wide-gamut display without an sRGB mode can't do that. Also, being able to operate in sRGB mode or a wide-gamut mode gives you an extra point of comparison to be sure that your color management is working correctly. Color management can get a bit hairy. You avoid all the complications of color management by doing everything in the sRGB color space.

    In my opinion, even if it is a high-end monitor with a 12-bit, or at least 10-bit, panel, it is best for non-professionals doing grading to avoid wide-gamut monitors; with 8-bit panels it'll always be easy to spot issues. No emulation mode can actually change the spectra of the pure R, G, and B pixels. All it can do is mix the existing ones based on CIE models.

    To be short: the ideal monitor for sRGB work is one whose native R, G, and B primaries match sRGB.

  • Hi, @Nino_Ilacqua.

    sRGB is the standard color space of computer and HDTV monitors. You want to be able to accurately preview in that color space to produce content that will look the same on other people's displays. BT.709 is a standard way of rendering images for sRGB displays.
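
    (For readers who want the actual numbers: these are the two published curves side by side. This is just a quick sketch of the standard formulas; the curves are close but not identical, which is why you convert deliberately rather than assume they match.)

      def bt709_oetf(L):
          # Rec. BT.709 OETF: scene-linear light -> encoded value
          return 4.5 * L if L < 0.018 else 1.099 * L ** 0.45 - 0.099

      def srgb_encode(L):
          # sRGB encoding (IEC 61966-2-1): linear light -> encoded value
          return 12.92 * L if L <= 0.0031308 else 1.055 * L ** (1 / 2.4) - 0.055

      for L in (0.01, 0.18, 0.50, 1.00):
          print(f"L={L:4.2f}  BT.709={bt709_oetf(L):.3f}  sRGB={srgb_encode(L):.3f}")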

    The LG UltraFine 4k display for Macs uses a P3 gamut, and it has no sRGB emulation mode. To properly display sRGB content on that display, you'll be relying on the ColorSync color management of macOS and color management support in your applications. Final Cut Pro X is supposed to support ColorSync, and it has a Rec.709 working mode in project settings. So in theory it should work for FCPX with Logarist.

    But some applications don't support color management. E.g., I'm not sure if Premiere Pro does. Therefore it's good to have a display with an sRGB mode, so that colors will look correct even in applications that don't support color management.

    In my experience, the monitors with the most accurate colors are wide-gamut displays with an sRGB mode. And that's what I recommend for anyone who's serious about video. A wide-gamut display with an sRGB mode will be more versatile, because you can use it two ways. You can use it in a wide-gamut mode with color management turned on, or you can use it in sRGB mode with color management turned off. The latter will show colors correctly in applications that don't support color management. A wide-gamut display without an sRGB mode can't do that. Also, being able to operate in sRGB mode or a wide-gamut mode gives you an extra point of comparison to be sure that your color management is working correctly. Color management can get a bit hairy. You avoid all the complications of color management by doing everything in the sRGB color space.

  • I used Logarist in FCPX and I found it very useful. I am considering buying the LG 4K to use as an external monitor with my 15" mid-2014 MacBook, which uses an sRGB color space. Do you think Logarist would benefit from this possible configuration?

  • In Vegas Pro, turning off GPU acceleration will often resolve stability problems.

  • I sent balazer an e-mail about this, but using this procedure crashes Vegas Pro 14 on my 16 GB i7 computer with 4 GB of video RAM (GTX 1050). The original files were 4K UHD ProRes HQ files shot in V-Log L 10-bit, which I was attempting to render to XAVC S 8-bit.

    A screen shot with the settings is attached.

    [Attachment: Logarist Fail.jpg]
  • If you use FCPX and the Canon C300mk2 and you record Canon Log 1/2/3 internally and you upgraded the camera's firmware to version 1.0.6, I have new Logarist LUTs that compensate for the levels shift caused by Canon setting the video_full_range_flag erroneously in recordings made with this firmware version. Contact me to obtain the LUTs.
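
    (For the curious, the mechanism behind the shift is easy to see in a few lines. This sketch is my illustration using the standard 10-bit video-range levels, not Canon's or Logarist's actual code:)

      # 10-bit video-range luma spans code values 64 (black) to 940 (white).
      def decode(code, full_range_flag):
          # Normalize a 10-bit code value to 0..1 the way a decoder would.
          if full_range_flag:                    # assume data spans 0..1023
              return code / 1023.0
          return (code - 64) / (940.0 - 64.0)    # video range: 64 -> 0, 940 -> 1

      print(decode(64, False), decode(940, False))  # correct flag: 0.0, 1.0
      print(decode(64, True),  decode(940, True))   # wrong flag: ~0.063, ~0.919
      # With the flag set erroneously, blacks are lifted and whites are dimmed,
      # which is the levels shift the compensating LUTs undo.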

  • Sorry, I overlooked the BT.709 output section. Thanks. wslagter

  • @wslagter, the BT.709 output options are described here. The only differences are how highlights are rendered. The choice will depend on the shot and on your personal preference.
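
    To give a feel for what a highlight-handling variant can do, here is a generic soft-knee roll-off in Python. This is only an illustrative shape under my own assumptions; the actual LumaComp and HighComp curves are not published here:

      def soft_knee(x, knee=0.66):
          # Linear below the knee; above it, compress smoothly so bright
          # values approach 1.0 instead of clipping. Input may exceed 1.0.
          if x <= knee:
              return x
          span = 1.0 - knee
          t = (x - knee) / span
          return knee + span * (t / (1.0 + t))

      for x in (0.5, 1.0, 2.0, 4.0):
          print(f"{x:4.1f} -> {soft_knee(x):.3f}")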

  • @balazer, thanks so much for all your great work, really helpful in Vegas Pro. Question: the BT.709 output LUT has 11 options. You recommend the Logarist to BT.709, LumaComp 0.66 cube. What is the difference between LumaComp and HighComp, and when should each be used? I don't really notice much difference between the BT.709 LUTs.

  • @balazer In Resolve there is no hue shift when applying contrast or luma curves, or when using the primary controls, in Rec.709 colour space. I have never experienced the issues you mention. Which software exhibits this problem?

    I just finished a more extensive test and I do not like the shadow artefacts that your Logarist process adds to the image. I was able to perfectly match the Logarist result in Resolve using a single node in Rec.709 YRGB colour space, using only the primary sliders for gain and gamma and a little mid-tone detail. The main visible difference was that the standard Resolve settings maintained a better and cleaner skin tone. Logarist introduces a contrast change that made the skin-tone gradation from the lit side of the face to the shadowed side less smooth, and it changed the rendition of skin colour in those shadow areas. The skin tone in shadows was less saturated and a bit less pleasing to the eye.

    I think for FCPX it may be OK but for an experienced colourist in Resolve, I think it is unnecessary and creates more issues than it solves.

    Don't get me wrong; I think it's a really good idea and I applaud your hard work and thorough approach, but I don't need it for working in Resolve. In Premiere it would be great because of how awful the grading tools are.

  • Sony S-Log1/2/3 users: the previous version of Logarist for DaVinci Resolve (Mac and Win) had a level mapping bug. Contrast was lower than it was supposed to be, with elevated blacks. This problem is fixed in Logarist 1.2.0. FCPX and Vegas Pro were not affected. No other color spaces were affected.

    Panasonic GH4 users: if you record V-Log L externally and you use a Mac, I have added a "Panasonic GH4 V-Log L external" transform in Logarist 1.2.0. It maps levels differently from the normal (internal) transform. This distinction is necessary on the Mac due to the way QuickTime decodes the video. No such distinction is necessary in Windows.

    logarist.com

  • I've always wanted color correction to be this simple. And exposure compensation has to be done in linear space... Lift, gamma, gain, and the color wheels, though powerful, have never felt like simple math the way Lightroom does; those wheels are more of a dark art.
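
    (The linear-space point is easy to demonstrate. A minimal sketch using the published sRGB formulas; the clamp at 1.0 is my simplification:)

      def srgb_decode(v):
          # sRGB code value -> linear light (IEC 61966-2-1)
          return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

      def srgb_encode(L):
          # Linear light -> sRGB code value
          return 12.92 * L if L <= 0.0031308 else 1.055 * L ** (1 / 2.4) - 0.055

      def exposure(v, stops):
          # +1 stop doubles the *linear* light, then re-encodes
          return srgb_encode(min(srgb_decode(v) * 2.0 ** stops, 1.0))

      print(exposure(0.5, 1.0))   # ~0.686: not a simple offset of the encoded value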

  • @caveport, you don't need three nodes. You can do it all in one node.

    Logarist is for people who are very particular about good color, novice or pro. I was never able to achieve the same look with standard tools, no matter how hard I tried. You can come close, sure, but standard tools always left me wanting better color.

    A lot of it has to do with how changes in contrast are applied. Anytime a color correction operation applies a curve with a slope other than one, it alters the saturation and shifts the hues. Increasing the contrast, for example, shifts skin tones towards green. In a small-gamut space, that shift is larger than in a wide-gamut space like Logarist's. You can compensate for the hue shifts, but that's just extra work. The shift depends on how much contrast you are adding. Of course some shots don't need extra contrast, and so none of this matters. But in my experience, almost every shot benefits from at least a little contrast adjustment. I prefer to minimize the hue shift problem from the outset by operating in the better color space.
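
    (You can watch the effect in a few lines of Python. The S-curve and the skin-tone triple below are my own illustrative picks, not any tool's actual math:)

      import colorsys

      def s_curve(v):
          # Smoothstep: a gentle S-shaped contrast curve, slope > 1 near mid-gray
          return 3 * v**2 - 2 * v**3

      skin = (0.78, 0.57, 0.44)                  # a skin-tone-like encoded RGB
      boosted = tuple(s_curve(c) for c in skin)

      h0, l0, s0 = colorsys.rgb_to_hls(*skin)
      h1, l1, s1 = colorsys.rgb_to_hls(*boosted)
      print(f"hue: {h0*360:.1f} -> {h1*360:.1f} degrees (toward yellow-green)")
      print(f"saturation: {s0:.2f} -> {s1:.2f}")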

    Maybe you can post some comparison frames for us and say a little about how you did the correction. It's really hard to say what's going on without seeing the images and knowing what you did.

  • Thanks for the LUTs. I tried them in Resolve and found nothing I could not replicate easily and exactly, using the standard Resolve tools. I don't really understand the benefit of using this approach. Why use 3 nodes when one will create the exact same look? Is this for people who don't know how to grade?

  • Can Magix Video Pro X7 load LUTs?

    I'm not sure, @balazer. I will check next week; I'm traveling now.

  • @wslagter, yes, disable ACEScc by setting the View Transform to Off. Once you do that, the color space settings in the media properties and in the render dialog no longer have any effect, so it doesn't matter what you set them to.

  • Hi, @balazer. Sorry, a correction: indeed, I use your ACEScc Cine1 setting with Still or Pro color successfully. So in order to use S-Gamut3.Cine color and Cine1 with Logarist, I can disable ACEScc by setting the View Transform to Off (in Vegas Pro project properties)?

    To render a project in Vegas Pro with the ACEScc workflow, you recommended ACES RRT (Rec.709 video file) in the color space settings. But when I render with the Logarist workflow, should the color space setting be set to Log or Default (the only non-ACES choices)?

    Thanks for your help. wslagter

  • Hi, @wslagter. I never had settings for S-Gamut3.Cine in my ACEScc config. If you were shooting with the camera set to S-Gamut3.Cine color, the colors would not have been quite right in ACEScc. Logarist does support S-Gamut3.Cine with Sony Cine1 gamma, and also Sony Cine1 gamma with the standard sRGB/BT.709 gamut. The Cine1 profile in my ACEScc config is the same as in Logarist.

    On the Sony A6300, I recommend shooting in Cine1 gamma with Still color. You can also shoot in Cine1 gamma with S-Gamut3.Cine color if you really want the wider gamut. Just choose the matching Logarist input transform (Cine1 with or without S-Gamut3.Cine).

    The workflow in ACEScc is very similar to, but not identical to Logarist's. Logarist uses different primaries and a different log scale. So you can't just take your existing ACEScc project and switch it to Logarist and have everything look the same. But the color correction process is the same. Use the same filters, just with slightly different correction values, and you'll get a nearly identical result, assuming it's the same output transform. The Logarist BT.709 output transform is the same as the "Standard Rec.709" transform in my ACEScc config. It's not the same as the ACES RRT.
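
    (If it helps to see what a log scale looks like, this is the main branch of the published ACEScc encoding, from AMPAS spec S-2014-003; Logarist's own curve is different and isn't reproduced here:)

      import math

      def acescc_encode(lin):
          # ACEScc log encoding, main branch (valid for linear values >= 2**-15)
          return (math.log2(lin) + 9.72) / 17.52

      for lin in (0.02, 0.18, 1.0, 16.0):
          print(f"linear {lin:6.2f} -> ACEScc {acescc_encode(lin):.4f}")
      # 18% gray lands at ~0.41 on the ACEScc scale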

  • @balazer, your Logarist tutorials are greatly appreciated! Thank you very much. Before I found your Logarist for Vegas Pro, I was using your ACEScc workflow with Cine1 gamma and S-Gamut3.Cine color mode on the Sony A6300, with the same recommended Bright/Contrast and Channel Blend filters. Would the end result be any different when I switch to the Logarist workflow and use the LUTs?

    Thanks, wslagter