Visualising encoder performance using Avisynth
  • I have observed that the GH2 encoder's performance varies considerably with luma level and image complexity. I wrote this script to look into this and I am happy with how it turned out. It might even come in handy for looking at GH3 footage.

    # Better Living through Visualisation 0.2 by vicx
    # An Avisynth script inspired by the discussions at http://personal-view.com  
    # Visualise the effects of quantisation and intra prediction at different luma levels using Avisynth
    #
    # Requires Avisynth or an Avisynth host - Tested with Avisynth 2.6 (x86).
    # For tweaking scripts I use and recommend AvsPmod Ver 2.3  http://forum.doom9.org/showthread.php?t=153248.
    # It is easy to take screenshots inside AvsPmod.
    # When I want to share noise in motion I open the avs file with AnotherGui http://www.stuudio.ee/anothergui/ and render to a 6 Mbps x264 preset 
    # Presets available from here http://www.stuudio.ee/anothergui/Presets.html
    #
    # Brought to you by the number 4
    # watch?v=7XefWDGHtNA 
    
    # If your file won't open Google LAVfilters / If you want fullsize windows remove ReduceBy2()
    source = DirectShowSource("F:\___Video\New folder\00129.MTS")
    #source = Imagesource("F:\___Photo\image.png")
    global clip = source.converttoYV12(matrix="Rec709").ReduceBy2()
    
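    # Render_Window: Limiter(show="luma") flags pixels whose luma falls outside [min_luma, max_luma],
    # Histogram("luma") exaggerates small luma differences so blocking and banding stand out,
    # then Overlay blends the original frame back over the luma map (masked by itself, at chroma_background opacity) and the window is labelled.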
    function Render_Window(clip c, int min_luma, int max_luma, string label, int label_color, int label_halo)
    {
    Range = c.Limiter(min_luma, max_luma, min_chroma=0, max_chroma=255, show="luma")
    Range_Luma = Range.histogram("luma")
    Range_clip = overlay(Range_Luma, c, 0, 0, c, chroma_background, "blend")
    Range_clip_labeled = Range_clip.Subtitle(label + " = " + string(min_luma) + "-" + string(max_luma), font="Tahoma", size=20, text_color=label_color, halo_color=label_halo, align=9)
    return Range_clip_labeled
    }
    
    #chroma_background is 0.0-1.0 and is how bright the background appears behind the luma maps
    global chroma_background = 1
    
    #Set this lower than your source to slow things down if you want
    framerate = 24
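    # Note: framerate is not actually applied anywhere below; if you want the slowdown,
    # uncomment the next line (assumption: AssumeFPS is the intended mechanism)
    #global clip = clip.AssumeFPS(framerate)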
    
    #You could tighten the black/white limits to 16/235, but 0/255 makes sure you catch everything
    black_limit = 0 
    white_limit = 255 
    
    #Set the luma limits for the four windows 
    mud_limit = 32
    coarse_limit = 128
    fine_limit = 128
    
    #Overlay Highlights in green
    clip_limit = clip.Limiter(min_luma=0,max_luma=234,min_chroma=0, max_chroma=255, show="luma" )
    clip_mask = clip_limit.ConverttoRGB32.ColorKeyMask($00FF00)
    clip_mask = Mask(clip_limit.ConverttoRGB32(), clip_mask.ConverttoRGB32())
    
    #Render Windows
    Top_Left = clip.Subtitle( "Original", font="Tahoma", size=20, text_color=$FFFF00, halo_color=$000000, align=9).converttoRGB32.layer(clip_mask).ConvertToYV12
    Top_Right = Render_Window(clip, fine_limit, white_limit, "Fine Mids and Highlights Luma ", $FFFFFF,$000000)
    Bottom_Left = Render_Window(clip, black_limit, mud_limit, "Mud Luma ", $555555, $DDDDDD)
    Bottom_Right = Render_Window(clip, mud_limit, fine_limit, "Coarse Mids Luma ", $BBBBBB, $000000)
    
    #Arrange Windows
    StackVertical(StackHorizontal(Top_Left,Top_Right) ,StackHorizontal(Bottom_Left,Bottom_Right) )
    
    #EOF
    

    I have attached some images which are NOT commentary on the GH2 encoder. I just included them to demonstrate the script using a low luma image and a high luma image.

    I also attach an image of the kitchen re-visualised with a basic Avisynth one-liner.

     DirectShowSource("F:\___Video\New folder\kitchen.MTS").histogram("luma")
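
    If you want a more conventional readout rather than the surreal look, Histogram has other standard modes; for example "levels" draws per-frame Y/U/V histograms (same clip, just a different mode string):

     DirectShowSource("F:\___Video\New folder\kitchen.MTS").histogram("levels")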
    
    Attachments: BLtV_kitchen.png, BLtV_Campos_Luma_levels_in_AVCHD.png, BLtV_Street_Luma_levels_in_AVCHD.png
  • 11 Replies
  • Are there any dependencies or should I just save it as an .AVS and expect it to run with a standard installation?

  • @thepalalias It uses DirectShowSource, so you need DirectShow splitters for MTS, MOV, etc. containers and well-behaved DirectShow H.264 decoders (an alternative source filter is sketched below). To see what filters are being used on your machine you could use GraphStudio http://blog.monogram.sk/janos/tools/monogram-graphstudio/

    "should I just save it as an .AVS"

    Yes
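
    If DirectShow keeps tripping you up, one alternative (assuming you have the FFMS2 plugin installed; swap in your own path) is to replace the source line with FFVideoSource, which decodes through FFmpeg instead of DirectShow:

     source = FFVideoSource("F:\___Video\New folder\00129.MTS")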

  • With my FZ200 AVCHD PSH clip I got an error. "Convert to YV12: invalid matrix parameter (RGB data only)"

    I dropped the matrix="Rec709" argument and the default matrix seemed to display OK.
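
    Another option, if you want to keep the Rec709 matrix for sources that do decode to RGB, is to guard the conversion (just a sketch against the script above; IsRGB() checks what the decoder hands back, and the error message says the matrix argument only applies to RGB input):

     global clip = source.IsRGB() ? source.ConvertToYV12(matrix="Rec709").ReduceBy2() : source.ConvertToYV12().ReduceBy2()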

    ps: pardon the mess in the workshop. ISO 1600 f2.8 1/30 (yes, it's dark in here. That white woofer pipe in center middle measures 3 Lux with my meter. I usually shoot, and test, indoors )

    pps: thanks for the script :)

    Attachment: clip00001.jpg
  • @trevmar That is pretty clean for such low light. FZ200 looks like a contender.

  • Well, I also have an LX7, and its lens gives it an extra 2 f-stops even over the FZ200, which is very handy at these low light levels, as long as you can handle the bokeh when the LX7 lens widens to f1.4. But the parfocal x24 reach of the FZ200 is amazing ... Luckily, both these sensors have pretty good control of their noise levels.

    I am surprised how well the new Panny AVC encoder handles the noise at these light levels. It reports "High@L4.2" CABAC, 2 ReFrames, 27.4 Mbps, taking an overall 0.22 bits per pixel. The stream parser says there is a good mix of I, P and B frames. Attached is an analysis of the FZ200 video shown above.

    Attachments: IPBframes.jpg, DataRate.jpg
  • [Deleted question already answered]

    @vicx Here are I, P and B frames from ISO 3,200 footage shot at 1/125 in 60P SH with Sedna Q20 A. Avg bitrate for the footage was 87 Mbps.

    Attachments: vicx BLtV 60P SH - Sedna Q20A - I frame0.png, vicx BLtV 60P SH - Sedna Q20A - P frame0.png, vicx BLtV 60P SH - Sedna Q20A - B frame0.png
  • Just posting this information regarding Avisynth or AnotherGUI usage in case anyone has run into problems encoding video. The 32-bit and 64-bit video systems in Windows are completely separate, so if you are using 32-bit Avisynth you must use a 32-bit ffmpeg/ffmbc/x264 executable.

    @trevmar I went online and looked at some FZ150/FZ200 footage and it is pretty good at full range. If VK can hack one or both cameras I might choose an FZ camera over a Lumix telephoto lens for my GH2.

    @thepalalias It looks like you have an interest in what I have always thought were unacceptably high ISO settings. I might have to reconsider ignoring ISOs over 1250. It is clear from the images you post that putting more of the image in the fine luma region by dropping the shutter to 1/60 would give a better encoded image.

    As for ISO >= 3200, I suspect that at high ISO settings chroma noise becomes more prominent than luma noise, and the encoder won't handle it in the same way. To evaluate high-ISO chroma noise I might have to make a script that isolates chroma noise instead of luma. I will have to think about how to visualise it.
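
    As a rough starting point (just a sketch using standard Avisynth plane tools, dropped in place of the final StackVertical in the script above, and not something I have tuned), the chroma planes can be pulled out as greyscale, or the luma flattened so only the chroma noise survives:

     # U and V planes viewed as greyscale, side by side (half resolution for YV12)
     u_plane = clip.UToY().Subtitle("U plane")
     v_plane = clip.VToY().Subtitle("V plane")
     # Flat mid-grey luma merged with the original chroma, so only the chroma remains visible
     chroma_only = clip.MergeLuma(BlankClip(clip, color_yuv=$808080))
     StackVertical(StackHorizontal(u_plane, v_plane), chroma_only)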

  • @vicx I find that the way each of the settings handles high ISOs varies greatly, so I often use them to test settings and illustrate differences. I don't generally use sideways shots of messy stoves in my professional work, nor shoot above ISO 2,000 professionally unless I am specifically targeting a grainy look. Sometimes you get an edgy music video or a documentary project using available light that seems to call for it, though.

    The shot above was stopped down by at least 2 stops from that lens's maximum aperture, because the purpose of that shot (and the 20+ like it with other settings) was to try and have a combination of static detail, motion, different colors and underexposure in one shot. As you can see, the cereal box in the back is the focus point, but had the aperture been much wider, the flames in the front would have been so completely blurred that I was unsure they would do much good in testing the motion. If I was shooting it for narrative purposes, I probably would have opened the aperture wide open and then lowered the ISO to no higher than 2000, without needing a slower shutter.

    Still haven't found an ideal way to get everything that I wanted to test in a single shot, though some of the guys like LPowell have provided ideas.

    As far as chroma noise goes, if you find a way to visualize it, I would be interested. The chroma noise starts to creep in at ISO 2500 and above, and by 12,800 you pretty much want to knock the whole thing down to black and white for most shots. The blue specks really stand out in the shadows.

    The different settings handle it in their own ways, but there's no getting around just how much noise is there for the sensor.

  • @vicx

    I also attach an image of the kitchen re-visualised with a basic Avisynth one-liner.

    That is some awesome surrealist art, I absolutely have to try this!

  • @LPowell It's not entirely unlike the "chrome" filter for Photoshop that a lot of people started using right around the middle of the 90s. :)

  • Yeah that chrome effect looks even better in motion. Cyberdelic!