What CPU and GPU to edit 10bit 4k video?
  • The GH5 offers 10-bit 4:2:2 4K video.

    There are some 4K 10-bit monitors from Samsung, Asus, and so on...

    So the question is: which Nvidia GPU will offer 10-bit support for editing 4K 10-bit 4:2:2 in Premiere with CUDA acceleration? Will a GeForce GPU be good enough, or is there a need for a high-end Quadro?

  • At that point, I might as well just use the Sony and have better autofocus and low light, plus higher-resolution stills (at the cost of bigger/heavier lenses). :)

  • @eatstoomuchjam

    Just use the 8-bit modes on the Panasonic.

  • @Vitaliy_Kisilev - I haven't tried generating optimized footage in Resolve for that footage yet, but I suspect that would also let me work with it without too much of a headache (though probably with really long export times). It also just cements that I'll probably be bringing the A7r III as my travel camera most of the time if I'm planning to edit on the go. :)

  • @eatstoomuchjam

    Well, that means the GPU hardware decoder can't decode 10-bit footage properly. Or it is a Resolve thing.

    So the thing you can do is make quick adjustments, export to H.264 8-bit, and be happy.

  • This seems like a nice topic to resurrect for this...

    I recently got a pretty maxed-out XPS 15 9570 and went for the i9, planning to repaste it to reduce thermal throttling issues (disclaimer: my day job is at a company which is mostly owned by Dell). The GPU is a 1050 Ti Max-Q. I imported one of my Resolve projects with a bunch of 10-bit GH5 long-GOP footage and was fairly happy at first when things played back at around 22-30fps before repasting (a little choppy, but tolerable). But then I was bummed after about 10 seconds when thermal throttling kicked in and my playback rate dropped to about 6fps (pretty much unusable). I messed around a bit with undervolting and underclocking in XTU to see if I could keep the power envelope low enough to hold a relatively decent FPS without the 6fps crater, and I had minimal/no luck.

    Then I got curious and tried some A7R Mark III 4K footage and everything was crazy smooth playing back at 29.97fps, even with 2 LUTs and some exposure/color tweaks applied. I only let it go for about 30 seconds to a minute, but there were no signs of anything heating up too much.

    I started checking resource monitor in Windows and I've noticed that when I'm trying to play back the GH5 footage, my CPU load is at about 85-90% and GPU is at about 10%. When I play back the A7R III footage, my CPU is at about 10% utilization and the GPU is at about 75-80% utilization.

    I double-checked on my desktop (8700k w/ 1080TI) and the pattern stays similar - CPU is heavily-used for GH5 footage and barely gets touched for Sony footage - but the beefy/non-throttling desktop CPU is able to power its way through the GH5 footage so it stays at a clean 29.97 fps.

    Has anybody else noticed this? I'm not seeing a lot of discussion about it online, but maybe I'm just searching for the wrong terms. I'm also thinking about trying Resolve 15, since I'm currently using 14 - maybe Blackmagic's engineers were able to move more 10-bit h.264 processing to the GPU for the new version...

    Also, now that I have this fairly nailed down, I'm going to look to see if there's a "performance mode" that cranks up the fans and try that and then I'll try repasting the CPU/GPU soon and see if that lets me keep better FPS with GH5 footage.

    This should also answer the question for people looking for a new laptop to edit with Resolve: as long as you're using a codec that Resolve implements on the GPU, a laptop with a 1050 Ti should be enough for general basic editing, with no need for an i9 (or maybe even an i7). YMMV if you're doing fancier stuff than I am (lots of compositing, etc.).
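A likely explanation for the CPU-vs-GPU split described above is the pixel format: consumer hardware decoders of that era handle 8-bit 4:2:0 H.264 (what the Sony records) but not 10-bit 4:2:2 H.264 (what the GH5 records), which falls back to software decoding. Here is a rough sketch of that support matrix as a lookup table — the entries are my assumptions based on the playback behaviour reported in this thread, not an official NVDEC/Quick Sync specification:

```python
# Rough support matrix: (codec, pixel format) -> hardware-decodable?
# Entries are assumptions inferred from the playback behaviour above,
# not an official support table.
HW_DECODE_OK = {
    ("h264", "yuv420p"): True,       # 8-bit 4:2:0 H.264 (Sony A7R III) -> GPU
    ("h264", "yuv422p10le"): False,  # 10-bit 4:2:2 H.264 (GH5) -> CPU fallback
    ("hevc", "yuv420p10le"): True,   # 10-bit 4:2:0 HEVC (Pascal-era NVDEC)
}

def decodes_in_hardware(codec: str, pix_fmt: str) -> bool:
    """Unknown combinations are assumed to fall back to the CPU."""
    return HW_DECODE_OK.get((codec, pix_fmt), False)

print(decodes_in_hardware("h264", "yuv422p10le"))  # False: GH5 footage
print(decodes_in_hardware("h264", "yuv420p"))      # True: Sony footage
```

You can read a clip's actual codec and pixel format with ffprobe, e.g. `ffprobe -show_entries stream=codec_name,pix_fmt clip.mov`.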

  • @cls105

    Well, most probably there will be no difference.

    A normal editor should use the built-in hardware decoder in the GPU or the CPU itself. But I am not sure about Avid, as they push hard for you to buy pro versions of boards (same as consumer ones, but with special drivers and such).

    I have three 4K monitors on Windows 10 Home Creators Update, all displaying at 4K. I get slightly choppy video when I play my timeline in Avid Media Composer. I have various media resolutions on my timeline, but everything is rendered at my project setting of 23.98 1080.

    Is my 980 Ti most likely the cause of this subtle choppiness when using all three screens with one playing full-screen video?

    The CPU is a 4790K with a Z97 ASUS Gryphon mobo (I'd say relatively high end) and 32GB of RAM.

    I'm curious what people think. However, I'll be getting my 1080 Ti tomorrow, so I'll post an update either way.

  • As I remember, the latest Premiere broke GH5 support.

  • Hi guys, I have a quad-core, a GTX 1080, and 64GB of RAM, and I just downloaded Neumann's test footage. The clips in Premiere only show audio. Am I the only one?

  • I've been using a Ryzen 1700 (found it for $290 on Slickdeals) OC'd to 3.9 GHz ($75 refurb Corsair i115 watercooler) with a Gigabyte AB350M motherboard ($100). My 32GB of DDR4 RAM cost me $225. Still using a 980 Ti from my previous Intel 4790K build. It sucks for Avid Media Composer, which is picky as fuck with hardware. Just kind of choppy.

    But it's great for Premiere and everything else. Great for transcoding with Adobe Media Encoder and FFmpeg.

    Here is a screenshot of task manager while transcoding to prores with ffmpeg - 16 threads at 100% :)

    I haven't used resolve much with it so can't say much about that. Reason is, when I transcode, I just like to drag and drop everything, and keep my source's framerate and resolution the same.

    [Attachments: RYZEN FFMPEG.JPG and RYZEN CUDA.JPG — Task Manager screenshots during the ffmpeg and CUDA transcodes]
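For anyone wanting to reproduce that ProRes transcode, here is a minimal sketch that builds the ffmpeg command from Python. The filenames are placeholders, and `prores_ks` profile 3 (ProRes 422 HQ) is one reasonable choice, not the only one:

```python
import subprocess  # only needed if you uncomment the run() call

def prores_cmd(src, dst):
    # prores_ks profile 3 = ProRes 422 HQ; yuv422p10le keeps 10-bit 4:2:2
    # end to end, so the GH5 footage isn't silently reduced to 8-bit.
    return ["ffmpeg", "-i", src,
            "-c:v", "prores_ks", "-profile:v", "3",
            "-pix_fmt", "yuv422p10le",
            "-c:a", "copy",
            dst]

cmd = prores_cmd("GH5_clip.MOV", "GH5_clip_prores.mov")
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually transcode
```

ffmpeg's `-threads` default already saturates all cores for ProRes encoding, which matches the 16-threads-at-100% screenshot described above.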
  • Yes, h265 is very much an issue for all but a few NLEs. However, h265 is a very compression-heavy codec, so conversion at some point to a proxy or an alternative format (DNxHD or ProRes) is a good idea anyway. If you are on Mac, then I can recommend iFFmpeg. Amazing! (Gives ffmpeg a lovely, simple GUI.)

    Have you played with 60p 4k V-LogL?

  • @alcomposer Only tried 10-bit, as that's all I've been shooting; it's so nice and easy to shoot with V-Log and the LUT! Once I saw how nice it came out, I never bothered trying 8-bit :) I will at some point when I need to do some slow-mo stuff, I'm sure. I just rolled it back to the old version and it's working OK now. I like Resolve and their ethos, but I use Audition a bit to re-time some of the stuff I do, and other bits, so Premiere is just convenient. I logged a bug for this one, and also logged another bug at the same time, as it won't load the 6K h265 stuff at all, just errors. I get the impression Adobe is taking it seriously — well, as seriously as any big corporation does until it hits their revenue/reputation :-)

  • @Umii All GH5 files? Including h264 8-bit?

    Maybe try Resolve until PP is fixed.

  • FYI, the latest Adobe build, 11.1, fully breaks GH5 files for me on Windows; it only imports the audio. So I suggest you don't upgrade if you want to edit any footage.

  • @eatstoomuchjam I thought that it was an h264 10-bit issue. I could be wrong. H265 from the GH5 isn't even advertised by Panasonic yet. Also, h265 files that large are very demanding. ProRes that large is also very demanding.

    So currently the 4K 10-bit files from the GH5 are h264.

  • "No real issues other than h265 playback."

    Isn't h.265 playback the exact problem that most people are complaining about in Premiere?

  • @apefos Currently FCPX works lovely with the GH5. Just mentioning. No real issues apart from h265 playback. However, ffmpeg does a good job of transcoding.

  • @jclmedia I think there is something else going on with Adobe. I'm running it on Windows; my spec is further up this thread. Now that I've got a GH5, I've had time to play a bit more. As I put there, it can play full quality with 10-bit (I have a 10-bit monitor as well) in V-Log + LUT + some adjustments just fine. What I have noticed is that if I ingest, say, 20+ 10-bit clips, Adobe has a real hard time getting all the thumbnails sorted out, and it really impacts performance overall for quite some time while it sorts itself out. So I think there is probably something going on there that Adobe needs to address. I did see someone else post that this was a known issue they were working on (edit: link here https://forums.adobe.com/thread/2299160).

    As a workaround, now that Adobe can read GoPro CineForm in a MOV wrapper natively in 64-bit with no QuickTime at last, I just auto-proxy to that, then flip it to proxy mode, and it flies. Plus, with CineForm you can preserve the bit depth, so you get a really good idea about colour correction even at a reduced size.

    So for me, it's fine at the moment. If I have a couple of clips I just want to tidy up and bang out, I can do that with no proxy; if there are a lot and I want to edit properly, I just let it proxy in the background and grab a cup of coffee. (I did try ingesting and creating proxies at the same time, but that took longer; it's easier to just copy them, then proxy.)
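The CineForm proxy route can also be done outside Premiere: recent ffmpeg builds include a `cfhd` (CineForm) encoder, so a half-resolution proxy pass might look like the sketch below. The filenames and the half-size scale are my assumptions; adjust to taste:

```python
def cineform_proxy_cmd(src, dst):
    # Half-resolution CineForm proxy in a MOV wrapper. CineForm preserves
    # 10-bit depth, which is what makes colour judgments on proxies workable.
    return ["ffmpeg", "-i", src,
            "-vf", "scale=iw/2:ih/2",   # half width and height
            "-c:v", "cfhd",
            "-c:a", "pcm_s16le",        # uncompressed audio for easy scrubbing
            dst]

print(" ".join(cineform_proxy_cmd("GH5_clip.MOV", "GH5_clip_proxy.mov")))
```

Check `ffmpeg -encoders | grep cfhd` first — older builds only ship the CineForm decoder.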

  • @apefos I believe that 6K photo mode on the GH5 is 10-bit 4:2:0, but using h265, so without transcoding it would be even more processor-intensive.

  • So I'd need a 6-core to run 4K 10-bit 4:2:2 smoothly in Premiere Pro? Right now I have an i7 3770K @ 4.2 GHz and a 980 Ti with 16GB of RAM, and it starts to play them smoothly, then halfway into a clip (or for a whole clip) it will just be super choppy, even if I drop the resolution scale in PP. I was told an i7 7700K would be fine for editing in PP; a friend told me his plays back smoothly. Maybe it's a combination with the SSD or PCIe SSD he has too.

  • J4206 that is uim

  • J4205 ?

  • It would be a good idea for Panasonic and other manufacturers to implement a 4K 10-bit 4:2:0 recording mode, because it would be less processor-intensive than 4K 10-bit 4:2:2, and the quality would be very similar, enough for SLog grading.

    This would save lots of people who have a quad-core computer.

    This would be a simple firmware implementation. How can we ask Panasonic to do this?
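The saving from 4:2:0 over 4:2:2 is easy to put a number on: 4:2:0 halves the chroma vertically as well as horizontally, so each frame carries 25% fewer samples at the same bit depth.

```python
# Raw sample counts per 4K frame at each chroma subsampling (equal bit depth).
W, H = 3840, 2160
luma = W * H
samples_422 = luma + 2 * (W // 2) * H          # chroma: half horizontal res
samples_420 = luma + 2 * (W // 2) * (H // 2)   # chroma: half in both axes
print(samples_420 / samples_422)  # 0.75 -> 25% fewer samples to decode
```

Actual bitstream savings depend on the codec, but the decode workload scales in roughly the same direction.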

  • I believe a minimum ideal CPU for 4K 10-bit would total 26 GHz across all cores, to keep the processor below 50% and make editing smooth no matter the size of the clips and the number of cuts.

    6 cores at 4.4 GHz each

    8 cores at 3.3 GHz each

    10 cores at 2.6 GHz each
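For what it's worth, the 26 GHz figure above is just cores multiplied by clock speed. It's a very rough heuristic, since it ignores IPC differences between CPU generations, but the three configurations listed do land in the same place:

```python
# Aggregate clock = cores x per-core GHz. A rough heuristic only:
# it ignores per-core IPC differences between CPU generations.
configs = [(6, 4.4), (8, 3.3), (10, 2.6)]
for cores, ghz in configs:
    print(f"{cores} cores x {ghz} GHz = {cores * ghz:.1f} GHz aggregate")
```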