What CPU and GPU to edit 10-bit 4K video?
  • I believe the minimum ideal CPU for 4K 10-bit editing would total about 26 GHz (cores × clock) to keep the processor below 50% and make editing smooth regardless of clip length and the number of cuts. For example (the sketch after this list shows the math):

    6 cores at 4.4 GHz each

    8 cores at 3.3 GHz each

    10 cores at 2.6 GHz each
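
    A minimal Python sketch of that rule of thumb, assuming decode load scales linearly with cores × clock (a simplification; real editing software rarely scales perfectly across cores). The 26 GHz target and the three configurations are the ones listed above:

    # Rough "aggregate GHz" heuristic: total = cores * clock.
    # Assumption: decoding load scales roughly linearly across cores,
    # which real-world editing software only approximates.
    TARGET_GHZ = 26.0  # suggested minimum for smooth 4K 10-bit editing

    configs = [
        (6, 4.4),   # 6 cores at 4.4 GHz
        (8, 3.3),   # 8 cores at 3.3 GHz
        (10, 2.6),  # 10 cores at 2.6 GHz
    ]

    for cores, clock in configs:
        total = cores * clock
        verdict = "meets" if total >= TARGET_GHZ else "falls below"
        print(f"{cores} cores x {clock} GHz = {total:.1f} GHz total "
              f"({verdict} the ~{TARGET_GHZ:.0f} GHz target)")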

  • These two pages have original 10-bit clips from the GH5 if you want to run tests; here are the links:

    http://nofilmschool.com/2017/01/exclusive-filmmaker-luke-neumann-shooting-panasonic-gh5

    https://vimeo.com/search?q=gh5+cameralabs

  • I overclocked the quad-core to 4.5 GHz, the maximum stable clock at auto voltage. Now the files play fine in the timeline without stutter, 7 files in sequence, 10-bit. That is an 18 GHz total (4 × 4.5), so a computer above 18 GHz should play fine in the timeline.

    The processor varies from 52% to 100%. I noticed that when playback gets near a new file the processor jumps to 100%, probably because the software is buffering the next file. After a few seconds of playing the next file the processor load drops and varies between 52% and 73%.

    So every cut is a processor-intensive task. Lots of cuts close to each other can cause stutter. I tried small cuts, 3 seconds each, and mixed different clips cut into 3-second parts. The first play was full of stutter; the second play was fine because everything was buffered into memory. So a good way to review the edit is to play it twice: the first play will stutter, the second play will be smooth.

    This test was done in a 23.976 timeline. All clips were interpreted to 23.976. A higher frame rate will need a stronger processor (rough scaling in the sketch at the end of this post)...

    The computer has just one monitor. Playback size was set to 25% and quality was set to maximum. To my surprise, playback was smoother with quality set to maximum.
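
    To illustrate the frame-rate point, here is a small Python sketch that scales the ~18 GHz aggregate measured above at 23.976 fps to other frame rates. It assumes decode cost grows linearly with frame rate, which is only a rough approximation:

    # Scale the ~18 GHz aggregate (4 cores x 4.5 GHz) that just managed
    # smooth 23.976p playback above to other frame rates, assuming the
    # decode cost grows linearly with frames per second.
    BASELINE_GHZ = 4 * 4.5      # 18 GHz total, from the test above
    BASELINE_FPS = 23.976

    for fps in (23.976, 25, 29.97, 50, 59.94):
        needed = BASELINE_GHZ * fps / BASELINE_FPS
        print(f"{fps:>6} fps -> roughly {needed:.1f} GHz aggregate needed")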

  • If it's any use,

    I just tested the Neumann 10-bit footage on my setup. The CPU is a bit old, but it plays in real time at full resolution with a Lumetri LUT etc. applied.

    Windows 10, latest Premiere CC 2017, 2 x Xeon 3.30 GHz (16 threads), 64 GB RAM (16 x 4 GB DIMMs), 2 x Titan X 12 GB cards, and a dedicated PCIe SSD (SSD 750, 800 GB) which is insanely fast, the best investment I've made in a long time, but obviously not relevant for this test; this is all about decoding and shifting the decoded data around, and an old hard drive should cope with one stream of 150 Mbps.

    The 180 fps stuff scales up nicely as well; that's so much better than the old GH4 variable frame rate.

    CPU ranged from 7-40% as it played; I didn't check GPU load. About 15 GB of memory was allocated.

  • "These days RAID and HD speed matter more than the processor in some cases"

    For video it is not necessary, unless you edit heavy multicam footage, especially raw.

    And all such problems are now solved by using big PCIe-based SSDs.

  • These days RAID and HD speed matter more than the processor in some cases.

  • My computer with an i7-5820K, 32 GB RAM, a GTX 980 and 2 x Samsung 950 Pro NVMe SSDs cannot edit 4K footage from the Panasonic LX100 smoothly in Premiere Pro CC 2015.

  • It seems a 6-core is a must-have for 4K 10-bit 4:2:2.

    I overclocked the quad-core to 4.4 GHz and got better playback in the timeline; it stutters less and it is possible to edit, but it is not a comfortable experience.

  • @apefos I can play back those clips quite smoothly on the Premiere CC 2017 timeline, keeping full preview resolution. I'm using an HP Z840 six-core machine, an Nvidia Titan X GPU, and a dedicated PCI Express SSD to read the footage. I get CPU usage around 50%, which then drops a bit lower, and GPU usage around 15%.

  • There are several other bottleneck situations in Premiere Pro, and in general, that can produce that 'fatigue'. For GH5 10-bit footage it would get us nowhere to discuss this here without precise tests in somewhat controlled situations.

  • @inqb8tr I did some more tests. The stutter is worse when there are several clips one after another; I put 4 clips in the timeline and then the stutters start to get worse.

    I used Interpret Footage to set everything to 23.976 so all clips play at the same frame rate in a 23.976 timeline, but when there are 4 clips, the stutter is worse.

  • Since we have continued this discussion from another thread, just to be precise for everyone reading this without the previous thread: on my system (i7-5820K 6-core @ default 3.3 GHz clock, 32 GB of DDR4 RAM, GTX 970 4GB, CUDA-accelerated Mercury engine in Premiere Pro CC 2015) 10-bit 4K GH5 files do not play smoothly from the timeline at any frame rate (50p, 25p and 24p original files tested, downloaded from some guy's Vimeo), but they do play smoothly when playback resolution is set to 1/2.

  • Yes, especially if overclocked a bit it could be an easy ride.

  • OK, so a 6-core CPU is OK for the GH5.

  • Yes, sorry, hardware-accelerated Mercury engine on the GTX 970. I was just pointing out that I don't use anything like a BM DeckLink or similar.

  • By software mode, let me understand:

    are you using Mercury engine CUDA acceleration or not?

  • Just by software mode, no dedicated playback monitoring card

  • Every file, 50, 25 and 24, is totally smooth full screen on a 2560x1440 screen with 1/2 playback resolution set.

    @inqb8tr are you using two monitors? One for Premiere and the other for preview?

    Yes, two monitors.

  • I did a test enabling the Intel HD Graphics. Now the computer has two GPUs, the Nvidia GeForce and the Intel.

    But the Intel HD Graphics is not used when playing back videos in the timeline; the GPU-Z measurements show 0% usage.

  • "may be it is possible by dedicating second 10bit monitor as main monitoring/playback device?"

    I think it is not possible if the video is in premiere timeline. Premiere timeline needs a Quadro GPU for display 10bit. I am not sure but probably only media players will show 10bit with a GeForce.

    There is the idea of using a I/O card for 10bit preview in second monitor. Maybe it can work.

    "If you buy a 4K 10-bit monitor and a 4K-ready Nvidia GeForce, you can see the final video in 10-bit, but not in the timeline; you will see it in a 10-bit player after rendering, and you need a player compatible with 10-bit."

    Maybe it is possible by dedicating a second 10-bit monitor as the main monitoring/playback device?

  • I found this in the Adobe forum:

    "Actually yes the Geforce cards can output 10bit color via HDMi and displayport. However they only support that with applications that use Direct x for the API. Quadro's support 10bit color via Open GL. Adobe uses Open GL so you would need a Quadro card for 10 bit with Adobe with the video cards. However you can get a I/O card like a Blackmagic card or Aja card and get 10bit color preview that way which is what I would suggest since those a raw output from the application and the color space is not effected by the OS or video driver."

  • So I found that 10-bit-per-channel editing can be done in two ways:

    1 - Edit 10-bit but see 8-bit: any GeForce GPU and any regular monitor will do. You will edit the 10-bit footage and see 8-bit on the computer monitor, but you can still render the video to 10-bit, no problem (the sketch at the end of this post shows one way to check the rendered file). So you do not need to buy another monitor or another GPU.

    If you buy a 4K 10-bit monitor and a 4K-ready Nvidia GeForce, you can see the final video in 10-bit, but not in the timeline; you will see it in a 10-bit player after rendering, and you need a player compatible with 10-bit.

    I am using two players to view the GH5 10-bit footage downloaded from Vimeo: KMPlayer and Media Player Classic (with a codec pack).

    https://www.mediaplayercodecpack.com/

    http://www.kmplayer.com/

    2 - Edit 10-bit and see 10-bit: you need an Nvidia Quadro GPU and a 10-bit monitor like the Samsung LU28E590DS/ZD, Asus PB287Q or similar. This way you will edit and see 10-bit in the timeline.

    See Nvidia Quadro website: http://www.nvidia.com/object/quadro-desktop-gpus.html

    By my calculations, to play back the GH5 10-bit 4:2:2 files at 24p and 25p you will need an Intel 6-core 3.4 GHz CPU, and for 30p you will need an Intel 6-core 4.2 GHz CPU. It would be better to get an 8-core CPU... (remember: this is just some math extrapolated after seeing my quad-core's poor performance)
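
    Regarding checking the rendered file in option 1 above: here is a minimal Python sketch that asks ffprobe (part of FFmpeg, assumed to be installed and on the PATH) for the clip's pixel format. "render.mov" is just a placeholder file name:

    # Check whether a clip is really 10-bit by reading its pixel format.
    # Assumes ffprobe is installed and on the PATH; "render.mov" is a
    # hypothetical file name, replace it with your own clip.
    import subprocess

    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,pix_fmt",
         "-of", "default=noprint_wrappers=1", "render.mov"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)
    # A 10-bit 4:2:2 file typically reports something like pix_fmt=yuv422p10le,
    # while an 8-bit file reports yuv420p or similar.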

  • Well, it seems that it is true. It is an intentional block in the drivers.

    Nvidia allows only DirectX surfaces to work in 10-bit.