We need a benchmark for codec testing
  • Seriously, shooting static torture charts doesn't cut it, nor does it tell us anything useful about how a patch is doing, except in Full Intra mode. We need something that is pure (uncompressed), has texture, has rich tones, has complex non-linear motion (like fire or water), and is repeatable and easily accessible to everyone to play on a Full HD screen.

    I've been searching the Web for flash animations to no avail. Does anyone have an idea or can come up with something that fits this bill?

    An OpenCL/GL or DirectX demo perhaps?
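Short of a ready-made demo, the "repeatable and accessible to everyone" requirement can at least be illustrated with a tiny seeded frame generator. This is a hypothetical sketch, not a real test chart: the names (`write_frame`), the frame size, and the noise-plus-gradient pattern are all illustrative. The point is that a fixed RNG seed makes every machine produce byte-identical frames, so the source material itself is perfectly repeatable.

```python
import random

W, H, FRAMES = 192, 108, 3  # tiny for illustration; a real source would be 1920x1080

def write_frame(path, seed, t):
    """Write one deterministic pseudo-random textured frame as a binary PPM."""
    rng = random.Random(seed)  # fixed seed -> identical bytes on every machine
    pixels = bytearray()
    for y in range(H):
        for x in range(W):
            # High-frequency noise plus a moving gradient: texture + motion.
            n = rng.randrange(256)
            g = (x + t * 7) % 256
            pixels += bytes(((n + g) % 256, n, (g * 2) % 256))
    with open(path, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (W, H))  # PPM P6 header
        f.write(pixels)

for t in range(FRAMES):
    write_frame("frame_%04d.ppm" % t, seed=1234 + t, t=t)
```

Fire- or water-like motion would of course need a real renderer; this only shows the repeatability idea, and pure noise is arguably a worst case rather than a representative one for a predictive codec.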

  • 17 Replies
  • It will be quite hard to make. And I doubt that the results on different monitors will be the same.

  • Difficult challenges for the AVCHD encoder (brightly-lit, sharp-focus):

    1. Cross-hatched chart
    2. Gnarly bush
    3. Running water
    4. ETC mode
    5. High ISO patch
    6. Lumix lens correction
  • @Vitaliy_Kiselev : I'm sure every user would get different results, but the idea behind repeatability is that I could put the camera on a tripod, test one patch, then load another and test it against the first under similar conditions (similar exposure, lighting & framing). It's nowhere near perfect, but it's a step forward from static resolution/torture charts, where P/B-frames currently get to make the codec shine without much effort.

    I'm looking at DirectX demos ATM, but I'd like to have something that would be more cross-platform.

  • OTOH, something like this ticks a lot of the boxes:

    I wonder if we could get hold of something like that but with the least compression possible...

  • @duartix

    You don't need any 3D or similar stuff.

    Simple 2D graphics is absolutely enough.

    Look at demoscene and find something suitable and ready to use.

  • Thanks @Vitaliy_Kiselev !

    It's amazing what all those groups could do with 64k and still find the space to host a virus. :O

    Meanwhile I found a few that look safer but then I stumbled into this:

    I'll have a look and then I'll post back.

  • Over the weekend I've downloaded over 65GB of uncompressed video from here:

    I now have the following synthesized videos as PNG sequences:




    I've got a pretty good idea of what scenes tax the codec and I've confirmed it by exporting the files to AVI with x264 compression (Superfast preset) and looking at the x264.stats file. Now before I compress all the footage there are a few issues that need addressing:

    The scene I've chosen from BBB weighs in at ~3GB (Lagarith lossless) and around 960MB when encoded with x264 (QP=2, no B-frames, GOP12). Even if I compress it at QP=8 (~500MB), it still looks absolutely clean and has peaks over 200Mbps. Considering that the codec can be stressed even more by raising ISO, I'm wondering:

    • What minimum QP is reasonable for encoding these test videos?
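For context on those figures, the arithmetic behind the quoted sizes can be sketched in a few lines of Python. The clip duration is not stated in the thread, so the 60-second figure below is purely an assumption for illustration; the uncompressed baseline assumes 8-bit RGB at 1080p24.

```python
def avg_mbps(size_bytes, seconds):
    """Average bitrate in megabits per second for a file of the given size."""
    return size_bytes * 8 / seconds / 1e6

# Uncompressed 1080p24 8-bit RGB as a baseline:
raw = 1920 * 1080 * 3 * 24 * 8 / 1e6     # ~1194 Mbps

# Hypothetical 60 s clip duration (NOT stated in the thread):
lagarith = avg_mbps(3 * 1024**3, 60)     # ~430 Mbps for the ~3 GB lossless file
qp2      = avg_mbps(960 * 1024**2, 60)   # ~134 Mbps for the 960 MB x264 QP=2 file
```

Even under that assumed duration, the QP=2 average sits well above typical hacked-GH2 bitrates, which is consistent with the observation that peaks exceed 200Mbps.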

    Since these are PNG sequences, they can be sped up at will; however, playback issues have been a constant. I have an Intel i7 870 @ 3.2GHz and an Nvidia GT430, but trying to play the footage encoded @60fps means lowering the quantizer almost to QP=20 and using the UltraFast preset (no CABAC use, IIRC), so I'm also wondering:

    • Is 50/60 fps really necessary for FullHD tests?
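Part of why 60fps playback hurts is simple pixel throughput. Decode cost is not strictly linear in pixel rate (CABAC and motion complexity matter too), but a back-of-the-envelope comparison gives a first-order sense; the function name here is illustrative.

```python
def mpix_per_sec(w, h, fps):
    """Raw pixel throughput the decoder must sustain, in megapixels/second."""
    return w * h * fps / 1e6

fullhd_24 = mpix_per_sec(1920, 1080, 24)  # ~49.8 Mpix/s
fullhd_60 = mpix_per_sec(1920, 1080, 60)  # ~124.4 Mpix/s
ratio = fullhd_60 / fullhd_24             # 2.5x the raw decode work at 60fps
```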

    P.S. I'm still looking at OpenGL renders, as they could solve the FPS and distribution issues, but they probably won't represent real footage conditions as well as these synths do.

  • @duartix I'm not sure how useful CGI-rendered and recompressed footage samples will be in evaluating patch performance. The filtering and artifacts that occur in synthetic images are quite different than real-life scenery, and I don't think many people use the GH2 to film CGI sequences off an LCD monitor.

  • @LPowell : Yes, I know that CGI usually compresses much better than real footage, but that's probably because in the early days CGI wasn't as rich, dense, and full of texture and physics as it is today.

    OTOH, members these days are testing the patches in unrealistic situations, against static torture charts. I mean, how much further from real footage can that be? :(

    I know this isn't perfect but at least it's a better fit for the task, unless we can get our hands on uncompressed HD footage. OH WAIT!!! We can...

    And it doesn't get better than this (downloading as you read):

  • Thanks for the pushback, @LPowell , I think I'll give these new sources a go instead. I'll keep the CGI in my back pocket, though, in case these fail to stress the codec.

    Now, about re-compression: that's why I asked how far we could go on QP, because on the CGI footage I can't tell the difference from the original even at QP=16! I'm not really sure, for example, whether @driftwood 's 146Mbps patches will even go below that. Should I encode the sources with a better codec than H.264 before redistribution?

    To tell you the truth, with such a pure source I'm sure the GH2 will introduce so much noise that it will completely overshadow the re-compression artifacts, making them the least of our worries.

  • @duartix What encoder are you planning on using? QP only controls the coarseness of quantization. I'd want to use something with a higher color depth, such as ProRes 4444. That's probably the most practical distribution format, since Adobe CS5 can read ProRes files on PC's as well as on Macs.

  • I was also thinking of going ProRes (which I know little to almost nothing about, except that it has less chroma subsampling in some of its variants), but the goal here is wide playback (including on PCs that aren't cutting edge). I have AE 5.5 installed; I'll probably need QuickTime Pro to do it. Any tips on encoding parameters? Perhaps it's finally time to check @shian storm's tutorial on export.

  • @duartix For distribution of high quality reference video, playback is not the issue, fidelity and portability are. Most likely few people would have a suitable projection system anyway.

  • Fidelity, unfortunately, will always be a casualty since, as you said, few people will have a suitable projection system, and even fewer could ever match one another's. The real main goals are easy distribution, playback, and repeatability for each tester.

    This way each tester can easily put different patches on a fixed camera (read: tripod) and challenge them on such things as bitrate usage, IQ, spanning, etc., taking footage content variation out of the equation.

  • @duartix Yes, but what I'm pointing out is that few will have a suitable projection system for displaying reference videos. To obtain useful results, you need to shoot a surface that reflects light. You can't use an LCD monitor for this purpose - the camera's shutter won't be synced with the monitor's refresh rate.
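The sync problem @LPowell describes boils down to the exposure needing to span a whole number of display refresh cycles, otherwise each frame samples a different fraction of a refresh and brightness flickers. A first-order sketch in Python; real panels add PWM backlights and sample-and-hold behavior, so this is only a rough model, and the function name and 60Hz figure are illustrative.

```python
def flicker_free(shutter_denom, refresh_hz, tol=1e-6):
    """True if each exposure spans a whole (nonzero) number of refresh cycles.

    shutter_denom: shutter speed expressed as 1/shutter_denom seconds.
    """
    cycles = refresh_hz / shutter_denom  # refresh cycles per exposure
    return abs(cycles - round(cycles)) < tol and round(cycles) >= 1

# On a 60 Hz panel, 1/30 and 1/60 span whole cycles; 1/40 and 1/80 do not:
[flicker_free(d, 60) for d in (30, 60, 40, 80)]  # [True, True, False, False]
```

By this model only a handful of shutter speeds are usable against any given panel, which matches the narrow usable range reported later in the thread.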

  • If the codecs could be implemented as QuickTime components (like the Panasonic AVC-Intra encoder for Compressor), it would be possible to test how the different patches behave with different kinds of material. Marco Solorio tested most of the available QT codecs a few years ago, and the test charts and graphics can still be downloaded from his website.

  • Bloody hell! Can't say I'm happy about it...

    There are REAL flicker issues. It's serious enough that on my IPS LCD it leaves me with just one and a half usable shutter speeds (1/40s is OK and 1/80s isn't fully stable).

    On top of that, when I fill the camera's frame with the LCD, the pixel structure comes through in such a way that this benchmark becomes a total FAIL for assessing IQ. All I'm left with is a repeatable way to stress the codec...

    The interest in it now is a lot more limited, so I'll hibernate this project for now. Next time I'm off to d/l 100GB I'll remember what fast prototyping is all about... :(

    Off to try a sheet of € bank notes on a rotating display now...

    @LPowell : Thanks for the reality check. I know myself, and I know I'd have been submerged in the compression/playback playground for the rest of the week before coming to my senses and starting to record...

    @Rafa : Thanks for the link. It's very valuable.