What CPU and GPU to edit 10-bit 4K video?
  • The GH5 offers 10-bit 4:2:2 4K video.

    There are some 4K 10-bit monitors from Samsung, Asus, and so on...

    So the question is: which Nvidia GPU offers 10-bit support for editing 4K 10-bit 4:2:2 in Premiere with CUDA acceleration? Will a GeForce GPU be good enough, or is a high-end Nvidia Quadro needed?

  • 39 Replies
  • I did some research and found this information:

    A GeForce GPU cannot display 10-bit while editing with Premiere; a Quadro GPU can. But you do not need a Quadro to do the editing: with a GeForce, editing is possible, you will just see the image in 8-bit on the monitor.

    I do not know if this is completely true; I found it in a forum.

  • Could be old information.

    In theory the latest generation should support 10-bit without issue, as these are standard UHD modes.

  • There is a difference in how the GPU outputs 10-bit: via DirectX or via OpenGL.

    UHD playback and games use DirectX, so a GeForce can do 10-bit there.

    Editing apps use OpenGL, so maybe a GeForce cannot do 10-bit there.

    This is how Nvidia separates Quadro from GeForce: Quadro can do 10-bit in OpenGL/CL, GeForce cannot.
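The split described in this post can be summarized as a small lookup table. This is a sketch of the claims made in this thread only, not something verified against Nvidia documentation:

```python
# 10-bit output support per GPU family and graphics API,
# as claimed in this thread (not verified against Nvidia docs)
TEN_BIT_SUPPORT = {
    ("GeForce", "DirectX"): True,   # UHD playback, games
    ("GeForce", "OpenGL"):  False,  # pro apps like Premiere
    ("Quadro",  "DirectX"): True,
    ("Quadro",  "OpenGL"):  True,   # the feature the Quadro price buys
}

def supports_10bit(gpu_family: str, api: str) -> bool:
    """Look up claimed 10-bit support; unknown combinations default to False."""
    return TEN_BIT_SUPPORT.get((gpu_family, api), False)
```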

  • Nvidia calls this feature "30-bit color" (10 bits per channel); in the PDF they say "30-bit ready", see:

    http://images.nvidia.com/content/pdf/quadro/data-sheets/75509_DS_NV_Quadro_K420_US_NV_HR.pdf
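The arithmetic behind the "30-bit" name is just 3 color channels times 10 bits each. A quick check of the numbers:

```python
# "30-bit color" = 10 bits per channel across 3 channels (R, G, B)
bits_per_channel = 10
channels = 3
total_bits = bits_per_channel * channels       # 30 bits per pixel
levels_per_channel = 2 ** bits_per_channel     # 1024 levels, vs 256 for 8-bit
total_colors = levels_per_channel ** channels  # about 1.07 billion colors
print(total_bits, levels_per_channel, total_colors)
```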

  • Well, it seems it is true. It is an intentional block in the drivers.

    Nvidia allows only DirectX surfaces to work in 10-bit.

  • So I found that 10-bit-per-channel editing can be done in two ways:

    1 - Edit 10-bit but see 8-bit: any GeForce GPU and any ordinary monitor will do. You will edit the 10-bit footage and see 8-bit on the computer monitor, but you can render the video to 10-bit, no problem. So you do not need to buy another monitor or another GPU.

    If you buy a 4K 10-bit monitor and a 4K-ready Nvidia GeForce, you can see the final video in 10-bit, but not in the timeline; you will see it in a 10-bit-capable player after rendering.

    I am using two players to watch the GH5 10-bit footage downloaded from Vimeo: KMPlayer and Media Player Classic (with codec pack)

    https://www.mediaplayercodecpack.com/

    http://www.kmplayer.com/

    2 - Edit 10-bit and see 10-bit: you need an Nvidia Quadro GPU and a 10-bit monitor like the Samsung LU28E590DS/ZD, Asus PB287Q or similar. This way you will edit and see 10-bit in the timeline.

    See Nvidia Quadro website: http://www.nvidia.com/object/quadro-desktop-gpus.html

    By my calculations, to play back the GH5 10-bit 4:2:2 files at 24p and 25p you will need an Intel 6-core 3.4 GHz CPU, and for 30p an Intel 6-core 4.2 GHz CPU. It would be better to get an 8-core CPU... (remember: these are just rough calculations based on my quad-core's bad performance)
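One way to reproduce that kind of estimate is a naive linear extrapolation, assuming playback cost scales with framerate and with aggregate cores × clock. This is only my reading of the rough math above, not a benchmark:

```python
def required_clock(base_clock_ghz: float, base_fps: float,
                   target_fps: float, base_cores: int = 6,
                   target_cores: int = 6) -> float:
    """Naive estimate: needed aggregate GHz scales linearly with framerate."""
    aggregate_ghz = base_clock_ghz * base_cores * (target_fps / base_fps)
    return aggregate_ghz / target_cores

# Starting from the 6-core 3.4 GHz / 24p figure in the post above:
print(round(required_clock(3.4, 24, 30), 2))        # 4.25, near the 4.2 GHz figure
print(round(required_clock(3.4, 24, 30, 6, 8), 2))  # 3.19, why 8 cores look attractive
```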

  • I found this in the Adobe forums:

    "Actually yes the Geforce cards can output 10bit color via HDMi and displayport. However they only support that with applications that use Direct x for the API. Quadro's support 10bit color via Open GL. Adobe uses Open GL so you would need a Quadro card for 10 bit with Adobe with the video cards. However you can get a I/O card like a Blackmagic card or Aja card and get 10bit color preview that way which is what I would suggest since those a raw output from the application and the color space is not effected by the OS or video driver."

  • "If you buy a 4K 10-bit monitor and a 4K-ready Nvidia GeForce, you can see the final video in 10-bit, but not in the timeline; you will see it in a 10-bit-capable player after rendering."

    Maybe it is possible by dedicating a second 10-bit monitor as the main monitoring/playback device?

  • "Maybe it is possible by dedicating a second 10-bit monitor as the main monitoring/playback device?"

    I think it is not possible while the video is in the Premiere timeline. The Premiere timeline needs a Quadro GPU to display 10-bit. I am not sure, but probably only media players will show 10-bit with a GeForce.

    There is the idea of using an I/O card for a 10-bit preview on a second monitor. Maybe that can work.

  • I did a test enabling the Intel HD Graphics. Now the computer has two GPUs, Nvidia GeForce and Intel.

    But the Intel HD Graphics is not used when playing back video in the timeline; GPU-Z shows 0% usage.

  • Every file (50p, 25p and 24p) is totally smooth full screen on a 2560x1440 screen, with 1/2 playback resolution set.

    @inqb8tr are you using two monitors? One for Premiere and the other for preview?

    Yes, two monitors.

  • Just software mode, no dedicated playback monitoring card.

  • By software mode, let me understand:

    Are you using Mercury engine CUDA acceleration or not?

  • Yes, sorry, hardware-accelerated Mercury engine on a GTX 970. I was just pointing out that I don't use anything like a BM DeckLink or similar.

  • OK, so a 6-core CPU is OK for the GH5.

  • Yes, especially if overclocked a bit it could be an easy ride.

  • Since we have continued this discussion from another thread, just to be precise for everyone reading this without the previous thread. On my system (i7 5820k 6-core @ default clock 3.3 GHz, 32 GB of DDR4 RAM, GTX 970 4GB, CUDA-accelerated Mercury engine in Premiere Pro CC2015) 10-bit 4K GH5 files do not play smoothly from the timeline at any framerate (50p, 25p and 24p tested, original files downloaded from some guy's Vimeo), but they do play smoothly when playback resolution is set to 1/2.
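For reference, 1/2 playback resolution means half the width and half the height, so only a quarter of the pixels are processed, which is why it helps so much:

```python
# UHD 4K frame vs the frame Premiere decodes at 1/2 playback resolution
full_w, full_h = 3840, 2160
half_w, half_h = full_w // 2, full_h // 2   # 1920 x 1080
print(full_w * full_h)   # 8294400 pixels per frame at full resolution
print(half_w * half_h)   # 2073600 pixels per frame, i.e. 1/4 of the work
```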

  • @inqb8tr I did some more tests; the stutter is worse when there are several clips one after another. I put 4 clips in the timeline and then the stutter gets worse.

    I used Interpret Footage to conform all clips to 23.976 so they play at the same framerate in a 23.976 timeline, but with 4 clips the stutter is still worse.

  • There are several other bottleneck situations in Premiere Pro, and in general, that can produce that 'fatigue'; for GH5 10-bit footage it would get us nowhere to discuss this in the topic without precise tests under somewhat controlled conditions.

  • @apefos I can play back those clips quite smoothly on the Premiere CC 2017 timeline, keeping full preview resolution. I'm using an HP Z840 six-core machine, an Nvidia Titan X GPU, and a dedicated PCI Express SSD to read the footage. I get CPU usage around 50% (it then goes a bit lower) and GPU usage around 15%.

  • It seems a 6-core is a must-have for 4K 10-bit 4:2:2.

    I overclocked the quad-core to 4.4 GHz and got better playback in the timeline; it stutters less and it is possible to edit, but it is not a comfortable experience.

  • My i7-5820k, 32GB RAM, GTX 980 and 2x Samsung 950 Pro NVMe SSD computer cannot edit 4K footage from the Panasonic LX100 smoothly with Premiere Pro CC 2015.

  • RAID and HD speed are more important than the processor in some cases these days.

  • "RAID and HD speed are more important than the processor in some cases these days."

    For video it is not necessary, unless you edit heavy multicam footage, especially raw.

    And all such problems are solved now by using big PCI-E based SSDs.