Logarist Color Correction for DaVinci Resolve, Vegas Pro, and Final Cut Pro X
  • @balazer, Thanks so much for all your great work, really helpful in Vegas Pro. Question: the Output LUT BT.709 has 11 options. You recommend Logarist to BT.709, LumaComp 0.66 cube. What is the difference between LumaComp and HighComp, and when should each be used? I don't really notice much difference between the BT.709 LUTs.

  • @wslagter, the BT.709 output options are described here. The only differences are how highlights are rendered. The choice will depend on the shot and on your personal preference.
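
    Roughly speaking (this is a generic illustration, not Logarist's actual math), such output options differ in where the highlight roll-off begins and how much over-range input they accept. A hypothetical linear knee, sketched in Python:

        # Hypothetical highlight knee, NOT Logarist's actual curves.
        # Values below the knee pass through unchanged; values above it
        # are compressed so that an input of max_in lands exactly at 1.0.
        def knee_compress(x, knee=0.8, max_in=1.33):
            if x <= knee:
                return x
            return knee + (1.0 - knee) * (x - knee) / (max_in - knee)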

  • Sorry, I overlooked the BT.709 output section. Thanks, wslagter

  • If you use FCPX and the Canon C300mk2 and you record Canon Log 1/2/3 internally and you upgraded the camera's firmware to version 1.0.6, I have new Logarist LUTs that compensate for the levels shift caused by Canon setting the video_full_range_flag erroneously in recordings made with this firmware version. Contact me to obtain the LUTs.

  • I sent balazer an e-mail about this, but use of this procedure crashes Vegas Pro 14 on my 16GB i7 computer with 4GB of video RAM (GTX1050). The original files were 4K UHD ProRes HQ files shot in 10-bit V-Log L, attempting to render to 8-bit XAVC S.

    A screen shot with the settings is attached.

  • In Vegas Pro, turning off GPU acceleration will often resolve stability problems.

  • I used Logarist in FCPX and I found it very useful. I am considering buying the LG 4k to use as an external monitor with my MacBook 15" mid 2014, which uses an sRGB color space. Do you think Logarist would benefit from this configuration?

  • Hi, @Nino_Ilacqua.

    sRGB is the standard color space of computer and HDTV monitors. You want to be able to accurately preview in that color space to produce content that will look the same on other people's displays. BT.709 is a standard way of rendering images for sRGB displays.
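
    For reference, the two standard transfer curves in play, per IEC 61966-2-1 (sRGB) and ITU-R BT.709, sketched in Python:

        # Both take normalized linear light in 0-1 and return a 0-1 code value.
        def srgb_encode(L):   # sRGB encoding (inverse EOTF)
            return 12.92 * L if L <= 0.0031308 else 1.055 * L ** (1 / 2.4) - 0.055

        def bt709_oetf(L):    # BT.709 camera OETF
            return 4.5 * L if L < 0.018 else 1.099 * L ** 0.45 - 0.099

    The curves are close but not identical, which is part of why an accurate preview chain matters.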

    The LG UltraFine 4k display for Macs uses a P3 gamut, and it has no sRGB emulation mode. To properly display sRGB content on that display, you'll be relying on the ColorSync color management of macOS and color management support in your applications. Final Cut Pro X is supposed to support ColorSync, and it has a Rec.709 working mode in project settings. So in theory it should work for FCPX with Logarist.

    But some applications don't support color management. E.g., I'm not sure if Premiere Pro does. Therefore it's good to have a display with an sRGB mode, so that colors will look correct even in applications that don't support color management.

    In my experience, the monitors with the most accurate colors are wide-gamut displays with an sRGB mode. And that's what I recommend for anyone who's serious about video. A wide-gamut display with an sRGB mode will be more versatile, because you can use it two ways. You can use it in a wide-gamut mode with color management turned on, or you can use it in sRGB mode with color management turned off. The latter will show colors correctly in applications that don't support color management. A wide-gamut display without an sRGB mode can't do that. Also, being able to operate in sRGB mode or a wide-gamut mode gives you an extra point of comparison to be sure that your color management is working correctly. Color management can get a bit hairy. You avoid all the complications of color management by doing everything in the sRGB color space.

  • In my experience, the monitors with the most accurate colors are wide-gamut displays with an sRGB mode. And that's what I recommend for anyone who's serious about video. [...]

    In my opinion, even with a high-end monitor that has a 12-bit or at least 10-bit panel, it is best for non-professionals doing grading to avoid wide-gamut monitors; with 8-bit panels it will always be easy to spot issues. No emulation mode can actually change the spectra of the pure R, G, and B pixels. All it can do is make some mix from the existing ones based on CIE models.

    In short, the ideal monitor for sRGB work is one with matching R, G, B primaries.

    with 8-bit panels it will always be easy to spot issues.

    Yes. If the display is doing any kind of color space transformation, color precision errors (banding) will be minimized if the panel has 10-bit or higher precision.
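
    A toy sketch of the precision argument (not a measurement): push an N-bit ramp through even a mild transform and count how many distinct output codes survive.

        # Re-quantize an N-bit ramp through a mild gamma adjustment and
        # count the distinct output codes; merged codes appear as banding.
        def distinct_levels(bits, gamma=1.1):
            n = 2 ** bits
            return len({round(((i / (n - 1)) ** gamma) * (n - 1)) for i in range(n)})

        print(distinct_levels(8))   # fewer than 256: some 8-bit codes collapse
        print(distinct_levels(10))  # also loses codes, but the steps are 4x finer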

    No emulation mode can actually change the spectra of the pure R, G, and B pixels. All it can do is make some mix from the existing ones based on CIE models.

    Display standards don't specify the spectra of the red, green, and blue pixels. They specify the primary chromaticities in terms of the CIE 1931 XYZ color space. Display makers must invoke the CIE color model. It is unavoidable. Maybe someday in the era of laser and LED displays we will have display standards with spectral primaries, but not today.

    In short, the ideal monitor for sRGB work is one with matching R, G, B primaries.

    The problem with making a display with a natively sRGB gamut is that you rely entirely on the physical properties of the materials used to make the display. You depend on the emission spectra and transmission spectra of your phosphors, backlight, and/or color filters. It's very difficult to control those physical properties to achieve exactly the primary chromaticities you want. That's why we have a whole lot of ostensibly sRGB displays that are sort of close to sRGB, but not really that close.

    The better approach is to aim for a gamut somewhat wider than sRGB, measure the chromaticities of the resulting primaries, and use math to make up the difference. Done correctly, it yields almost perfectly accurate color. You can read about it at http://www.displaymate.com/Color_Accuracy_ShootOut_1.htm. Again, there is no downside to using math and the CIE color model. You need to do that anyway, because the display standards are specified in terms of the CIE color model.
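
    As a concrete sketch of that math (numpy; the "measured" panel chromaticities here are invented for illustration): build RGB-to-XYZ matrices for sRGB and for the panel from their primary and white-point chromaticities, then chain them into a 3x3 correction applied in linear light.

        import numpy as np

        def rgb_to_xyz_matrix(r, g, b, w):
            # chromaticity (x, y) -> XYZ column with Y = 1
            def xyz(x, y):
                return np.array([x / y, 1.0, (1.0 - x - y) / y])
            P = np.column_stack([xyz(*r), xyz(*g), xyz(*b)])
            S = np.linalg.solve(P, xyz(*w))  # scale primaries to hit the white point
            return P * S

        srgb  = rgb_to_xyz_matrix((0.640, 0.330), (0.300, 0.600),
                                  (0.150, 0.060), (0.3127, 0.3290))
        panel = rgb_to_xyz_matrix((0.655, 0.335), (0.290, 0.620),
                                  (0.148, 0.055), (0.3127, 0.3290))  # invented measurements

        # sRGB linear RGB -> panel linear RGB, applied per pixel in linear light
        correction = np.linalg.inv(panel) @ srgb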

  • Display standards don't specify the spectra of the red, green, and blue pixels. They specify the primary chromaticities in terms of the CIE 1931 XYZ color space. Display makers must invoke the CIE color model. It is unavoidable.

    The CIE 1931 model was derived for a specific task, if you read about it. And a large display is far from those viewing conditions. Yet it makes mostly accurate predictions, except in the cases where they are wrong.

    Maybe someday in the era of laser and LED displays we will have display standards with spectral primaries, but not today.

    With Rec. 2020 you are already literally forced into extremely narrow spectral variation.

    And the current idea is to use QLED further cut with a filter (it can get very close to Rec. 2020), which is actually quite cheap.

    The problem with making a display with a natively sRGB gamut is that you rely entirely on the physical properties of the materials used to make the display. You depend on the emission spectra and transmission spectra of your phosphors, backlight, and/or color filters. It's very difficult to control those physical properties to achieve exactly the primary chromaticities you want. That's why we have a whole lot of ostensibly sRGB displays that are sort of close to sRGB, but not really that close.

    It is not really as hard as you think it is. Making the proper P3 primaries that you suggest looking for is much harder.

    Also, if you just look at actual statistical experiments, you will see that the variation in human spectral sensitivity is not as small as pro display manufacturers want you to believe :-) With a wider gamut it becomes a real issue, and because of the spectra it remains an issue even after calibration.

    The better approach is to aim for a gamut somewhat wider than sRGB, measure the chromaticities of the resulting primaries, and use math to make up the difference. Done correctly, it yields almost perfectly accurate color. You can read about it at http://www.displaymate.com/Color_Accuracy_ShootOut_1.htm. Again, there is no downside to using math and the CIE color model.

    http://www.displaymate.com is run by for-sale guys (loving Samsung lately) without a very deep understanding of what they are doing and promoting.

    Yet I agree that on a good pro-level monitor with a high-bit panel you can do it with the proper tool (note that a colorimeter is made as an analog of the eye's sensors, so with a wider gamut you get rising issues due to sample variation and degrading filters). I am just not sure this advice is good for the average Joe reading this. The average Joe instead has a big chance of getting something with a strange, absolutely non-standard wide gamut, an 8-bit panel, and a lot of trouble on his head.

    You need to do that anyway, because the display standards are specified in terms of the CIE color model.

    Well, to be really pro-level accurate as stated, a display must reconstruct the specific viewer's curves, instead of the CIE standard observer. Plus add requirements for the working space, the display's border materials and colors, etc., etc.

  • I don't agree with most of that and I don't want to debate it with you. Plus it's not really relevant to this topic.

  • I don't agree with most of that and I don't want to debate it with you. Plus it's not really relevant to this topic.

    You got tired. :-) Your topic, your preference. Yet I suggest that guys selecting a monitor read and think hard.

  • Balazer, having a consumer computer display calibrated with an X-Rite i1 - is it good enough for home color grading? What would it take to make grading acceptable by pro standards?

  • Hi @balazer. It is not easy to grasp all the implications of wide-gamut vs. sRGB. I see the advantages of using a monitor that has both a wide gamut and an sRGB mode (by the way, could you suggest which 4K model to choose?). What I don't see yet is the advantage of the wide gamut itself. At this point, why not use just an sRGB monitor? Maybe the wide-gamut monitor would be useful only in this scenario: color grading at the same time for big-screen projection and for web and home viewing. In that scenario I would grade for wide-gamut delivery and check on a second display (or on the same display, if it has an sRGB mode) how it works in sRGB. In that scenario, how and when would I use Logarist?

  • @Joshua_G, I'm not an expert on calibrators. I had an i1Display Pro for a while. Every time I tried to calibrate my display with it, the picture looked worse than without calibration. I concluded that my display was more accurate than the calibrator, and therefore calibration couldn't help.

    I think calibrators made sense back in the days of analog displays, which had a bunch of adjustments that could be grossly misadjusted. I don't think they make as much sense with modern displays. A lot of modern displays are quite good, and you need a really accurate calibration device if you want to improve your display. Frankly I wouldn't expect any calibration device that costs less than a few thousand dollars to be very good. Precision measurement devices are not cheap.

    Also, calibration on your PC can only apply some 1-D LUTs. It can't do any gamut remapping. To have an accurate gamut, that needs to be done in the display, or with profiling and color management, which are tricky. In my opinion, rather than having any kind of calibrator, you're better off putting the money towards a good factory-calibrated display.

    I'm really not sure what professionals would say is good enough, but lower Delta-Es are better. An average Delta E under 2 is pretty good. The Dell UltraSharp UP3216Q is an example of a monitor that checks off all the boxes that would be important to me:

    • Wide gamut

    • sRGB mode

    • 10-bit panel and 10-bit input

    • Factory calibrated

    Again, I'm not recommending the Dell. I've never seen it. It's just an example. I have a laptop with a Dell PremierColor display with similar specs, and I love it.
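
    About those Delta-E figures: the simplest variant, CIE76, is just Euclidean distance in CIELAB (modern reviews often use the fancier CIEDE2000, but the idea is the same). A minimal sketch:

        # CIE76 Delta-E: Euclidean distance between two CIELAB colors.
        def delta_e_76(lab1, lab2):
            return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

        print(delta_e_76((50.0, 10.0, -5.0), (51.0, 11.5, -5.0)))  # ~1.8, under the "pretty good" line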

    @Nino_Ilacqua, see the above for some guidelines on choosing a display. One reason to choose a wide-gamut display would be for previewing wide-gamut images. But if you are delivering standard HD (or 4k) video for the web, TV, or Blu-ray, it's all sRGB, and there's no reason to preview in a wide gamut mode. The reason to choose a wide-gamut display, as I said before, is because wide-gamut displays with an sRGB mode tend to have more faithful reproduction of the sRGB color space than standard gamut displays do. I don't recommend that LG UltraFine 4k display for Macs, because it lacks an sRGB mode. Keep in mind you don't need a 4k display to see accurate color. A good 1080p display probably costs a lot less than a similar 4k display.

  • I'm not an expert on calibrators. I had an i1Display Pro for a while. Every time I tried to calibrate my display with it, the picture looked worse than without calibration. I concluded that my display was more accurate than the calibrator, and therefore calibration couldn't help.

    Well, it is a colorimeter, and ideally it requires a proper adjustment matrix for the specific backlight. Some older ones also can't measure wide-gamut displays. Besides that, the latest models are pretty accurate devices.

    I think calibrators made sense back in the days of analog displays, which had a bunch of adjustments that could be grossly misadjusted. I don't think they make as much sense with modern displays.

    Cool, especially taking into account the proposal to buy wide-gamut displays and calibrate them. :-)

    Of course these are wrong statements. A modern display can have something like 10x more adjustments :-)

    Frankly I wouldn't expect any calibration device that costs less than a few thousand dollars to be very good. Precision measurement devices are not cheap.

    Well, the ColorMunki (same as the i1Display Pro) is considered among the best consumer colorimeters ever made. The Spyder5 is worse, but still better than anything older.

    Also, calibration on your PC can only apply some 1-D LUTs. It can't do any gamut remapping.

    Is it possible to look at documents stating this?

    The ICC documentation lists two options, one being a 3x3 matrix of the colorant primaries' tristimulus values plus a one-dimensional tone curve for each colorant; but it can be more complex: the same 3x3 matrix, a one-dimensional tone curve for each colorant, a 3-D LUT, and a second 1-D tone curve for each channel.
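
    The first option is easy to sketch in Python (all values here are placeholders; the matrix shown is the standard sRGB-to-XYZ one, used only as an example):

        import numpy as np

        # Matrix + per-channel-curve profile, applied device -> PCS:
        # linearize each channel through its 1-D tone curve, then apply
        # the 3x3 colorant matrix.
        trc = [lambda v: v ** 2.2] * 3           # placeholder tone curves
        M = np.array([[0.4124, 0.3576, 0.1805],  # example: sRGB-to-XYZ (D65)
                      [0.2126, 0.7152, 0.0722],
                      [0.0193, 0.1192, 0.9505]])

        def device_rgb_to_pcs(rgb):
            linear = np.array([curve(v) for curve, v in zip(trc, rgb)])
            return M @ linear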

  • Final Cut Pro X users:

    I have released Logarist for Final Cut Pro X 1.3.0, with these changes:

    • Fixed a level mapping bug for V-Log L and V-Log

    • Reduced the number of LUT points, to improve loading times in the mLUT Plugin and to fix out-of-memory conditions (especially for Cinelike D, which used too many points and could experience failures during rendering)

    • Added support for Canon C300 Mark II internal XF-AVC recordings made with firmware version 1.0.6. This firmware version incorrectly sets the video_full_range_flag in its internal Canon Log 1/2/3 recordings, which causes levels to be mapped incorrectly in FCPX. For these recordings, use the full_range_flag LUTs, which compensate for the problem (a sketch of the level math follows this list).
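
    To illustrate the kind of shift involved (assuming, and this is my assumption about the direction, that the data is really video range but gets read as full range because of the flag), the compensation re-expands the levels:

        # Assumed scenario: video-range data (luma codes 16-235 in 8-bit
        # terms) decoded as if it were full range (0-255). v is the value
        # the app decoded, normalized to 0.0-1.0; the result restores the
        # intended normalized level.
        def compensate(v):
            return (v * 255.0 - 16.0) / (235.0 - 16.0)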

    I've also updated the documentation to note that Final Cut Pro's built-in log processing should be turned off for Canon Log, S-Log, and VariCam V-Log.

    http://www.logarist.com/

  • @balazer, A question about the Camera to Logarist LUTs:

    The Sony Cine 1 (Hypergamma 4), S-Gamut3.cine to logarist.cube LUT corresponds to my camera's PP Cine 1 with S-Gamut3.cine as the Color Mode. What would be the best Color Mode to use with the other Cine 1 (Hypergamma 4) to logarist.cube LUT? (In ACEScc, I remember that you recommended Still or Pro as the Color Mode.)

    Thanks

  • @wslagter, on Sony cameras I recommend setting the color mode to Still. But feel free to experiment and pick your favorite. None of my cameras support S-Gamut3.Cine, so I'm not sure how it compares.

  • Logarist now supports Adobe Premiere Pro and Premiere Elements in Windows. Please download and test and let me know how it works: logarist.com

  • Thank you so much @balazer - I admit to being a noob at most things video, but I had struggled with one clip that was just a bit underexposed and couldn't get it looking half decent. I had tried "Photon Pro" noise reduction, but was unable to run it on the 4k clip (Panny G7) in temporal mode, as FCPX would just quit, so I used the spatial settings only. The best I could do is shown here:

    However, once I downloaded and followed the instructions for Logarist - well, it was night and day! No, the end result is not perfect - but, to me, it's fully acceptable and makes the clip watchable. The result is here (at about 2:30 min, in the full video, which was shot with GH1, G7, and G85 cameras):

    So, a big thank you for all the time and effort you put into this tool. I use it regularly for WB and exposure correction. I will try it with noise reduction at the same point as the WB/Exposure, next time I have noise issues!

  • Thanks, @JanH. Glad to hear it.

    I'm not really clear on how noise fits into this. Logarist doesn't do anything to improve noise. If the noise is in the source, it will be in the output too, but at least it won't be any worse. To minimize the noise, increase the exposure in the camera.

    Which Logarist input transforms did you end up using for each of your cameras?

  • Hi @balazer. You are correct - the original file did not appear to have much noise in it, but after all the (over)processing I originally did, there it was. Going back to the original source and using Logarist to correct WB and exposure "up front" may have helped to keep the noise level / blockiness low. In fact, in the final version I didn't use Color Finale. I just used the FCPX color board between the transforms and that was it.
    Transforms used: GH4 Cinelike D 0-255 to Logarist cube, then Logarist to Rec 709 LumaComp 1.33, with exposure and colour correction in between. Yes, I know I didn't shoot in Cine D - I was not too impressed with how it worked for me when I first tried it, but the transforms seemed to work anyway! I'm happy with the end result. Thanks!

  • Dear balazer,

    I just started using your 3D LUTs with DaVinci Resolve to color correct videos produced with a Panasonic FZ1000.

    I found out that the output of this camera works great with the GH4 LUTs. I've been able to rescue clips that I thought were completely useless before finding your code (mostly overexposed clips).

    So first of all I would like to thank you for your great work!

    There is one thing that puzzles me: the FZ1000 has two different Luminance Level settings, 16-255 and 0-255. When I realized that I had to choose one of the luminance levels to transform my footage to the Logarist color space, I checked my settings and found they were set to 16-255, which seemed odd to me. After searching through forums I found people stating that some video editors have problems managing this 16-255 range, and therefore people who care about that tended to use either 16-235 (for compatibility reasons) or 0-255 (for slightly higher color depth). On the other hand, Panasonic states: 16-255 - best for normal video recording / 0-255 - best for creating still pictures from recorded video.

    As all those statements seem quite confusing to me, I was wondering if you have any idea what the advantage of using the 16-255 data range instead of 0-255 could be. Most important, I would like to know if there is any way of checking which luminance levels were used to record a given clip, without having to try the different options and guess which one works best. I already tried the ffprobe output but didn't find any clue...
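
    In case it helps anyone else: I queried the stream metadata with something like ffprobe -v error -select_streams v:0 -show_entries stream=color_range,pix_fmt -of default clip.mp4, and color_range came back unset, so apparently the camera doesn't flag which range it used.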

    Thanks again for your work!