What CPU and GPU to edit 10bit 4k video?
  • 54 Replies
  • if it's any use,

    I just tested the Neumann 10-bit footage on my setup. The CPU is a bit old, but it runs in real time at full quality with a Lumetri LUT etc. applied.

    Windows 10, latest Premiere CC 2017, 2 x Xeon 3.30GHz (16 threads), 64GB RAM (16 x 4GB DIMMs), 2 x Titan X 12GB cards, and a dedicated PCIe SSD (SSD 750, 800GB) which is insanely fast, the best investment I've made in a long time, but obviously not relevant for this test: this is all about decoding and shifting the decoded data around, and an old hard drive should cope with one stream at 150Mbps.

    The 180fps footage scales up nicely as well; that's so much better than the old GH4 variable frame rate.

    CPU load ranged from 7-40% as it played. I didn't check GPU load; it allocated ~15GB of memory.

  • I overclocked the quad-core to 4.5GHz, the maximum stable clock at auto voltage. Now the files play fine in the timeline without stutter: 7 files in sequence, 10-bit. That works out to 18GHz total, so a computer above 18GHz should play fine in the timeline.

    Processor load varies from 52% to 100%. I noticed that when playback nears a new file, the processor jumps to 100%, probably because the software is buffering the next file. After a few seconds of the next file playing, the load drops back and varies between 52% and 73%.

    So every cut is a processor-intensive task. Lots of cuts near each other can cause stutter. I tried small cuts, 3 seconds each, mixing parts of different clips cut into 3-second pieces. The first play was full of stutter; the second play was fine because everything was buffered into memory. So a good way to review an edit is to play it twice: the first pass will stutter, the second will be smooth.

    This test was done in a 23.976 timeline. All clips were interpreted as 23.976. A higher frame rate will need a stronger processor...

    The computer has just one monitor. Playback resolution was set to 25% and quality was set to maximum. To my surprise, quality set to maximum was smoother.

  • On these two pages there are original 10-bit clips from the GH5 if you want to run tests, find the links!!!

  • I believe the minimum ideal CPU for 4K 10-bit would total 26GHz, to keep the processor below 50% and make editing smooth no matter the size of the clips and the number of cuts.

    6 cores at 4.4GHz each

    8 cores at 3.3GHz each

    10 cores at 2.6GHz each
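    This "total GHz" rule of thumb can be sketched in a few lines of Python. It is crude, since it ignores IPC, boost behaviour, and how well the NLE scales across threads, and the 26GHz threshold is just the estimate from these tests, not a benchmark:

    ```python
    # Crude "total GHz" heuristic from the posts above: cores * clock speed.
    # Ignores IPC, boost clocks, and NLE thread scaling - a rough guide only.

    def total_ghz(cores: int, ghz_per_core: float) -> float:
        return cores * ghz_per_core

    def smooth_4k_10bit(cores: int, ghz_per_core: float, threshold: float = 26.0) -> bool:
        """True if the CPU clears the 26GHz estimate for smooth 4K 10-bit playback."""
        return total_ghz(cores, ghz_per_core) >= threshold

    # The three suggested configs clear the bar; the 4.5GHz quad-core does not:
    for cores, clock in [(6, 4.4), (8, 3.3), (10, 2.6), (4, 4.5)]:
        verdict = "smooth" if smooth_4k_10bit(cores, clock) else "borderline"
        print(f"{cores} x {clock}GHz = {total_ghz(cores, clock):.1f}GHz total -> {verdict}")
    ```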

  • It would be a good idea for Panasonic and other manufacturers to implement a 4K 10-bit 4:2:0 recording mode, because it would be less processor-intensive than 4K 10-bit 4:2:2, and the quality would be very similar, enough for SLOG grading.

    This would save lots of people who have a quad-core computer.

    This would be a simple firmware implementation. How can we ask Panasonic to do this?

  • J4205 ?

  • J4206 that is uim

  • So I'd need a 6-core to run 4K 10-bit 4:2:2 smoothly in Premiere Pro? Right now I have an i7 3770K @ 4.2GHz and a 980 Ti with 16GB RAM, and it starts to play clips smoothly, then halfway into a clip (or for a whole clip) playback is just super choppy, even if I drop the resolution scale in PP. I was told an i7 7700K would be fine for editing in PP; a friend told me his plays back smoothly. Maybe it's the combination with the SSD or PCIe SSD he has too.

  • @apefos I believe that 6K Photo mode on the GH5 is 10-bit 4:2:0, but it uses H.265, so without transcoding it would be even more processor-intensive.

  • @jclmedia I think there is something else going on with Adobe. I'm running it on Windows; my spec is further up this thread. Now that I've got a GH5 I've had time to play a bit more, and as I posted there, it can play at full quality with 10-bit (I have a 10-bit monitor as well) in V-Log + LUT + some adjustments just fine. What I have noticed is that if I ingest, say, 20+ 10-bit clips, Adobe has a real hard time getting all the thumbnails sorted out, and it really impacts overall performance for quite some time while it sorts itself out. So I think there is probably something going on there that Adobe needs to address. I did see someone else post that this was a known issue they were working on, but I can't find the link (edit: link here).

    As a workaround, Adobe can now read GoPro CineForm in a MOV wrapper natively, 64-bit, with no QuickTime at last! I just auto-proxy to that, then flip it to proxy mode and it flies. Plus with CineForm you can preserve the bit depth, so you get a really good idea about colour correction even if it's at a reduced size.

    So for me it's fine at the moment. If I have a couple of clips I just want to tidy up and bang out, I can do it with no proxy; if there are a lot and I want to edit properly, I just let it proxy in the background and grab a cup of coffee. (I did try ingesting and creating proxies at the same time, but that took longer; it's easier to just copy them, then proxy.)

  • @apefos, currently FCPX works lovely with the GH5. Just mentioning. No real issues apart from H.265 playback. However, ffmpeg does a good job of transcoding.

  • "No real issues apart from H.265 playback."

    Isn't H.265 playback the exact problem that most people are complaining about in Premiere?

  • @eatstoomuchjam I thought that it was an H.264 10-bit issue. I could be wrong. H.265 from the GH5 isn't even advertised by Panasonic yet. Also, H.265 files that large are very demanding. ProRes that large is also very demanding.

    So currently the 4K 10-bit files from the GH5 are H.264.

  • FYI: the latest Adobe build, 11.1, on Windows fully breaks GH5 files for me; it only imports the audio, so I suggest you don't upgrade if you want to edit any footage.

  • @Umii all GH5 files? Including H.264 8-bit?

    Maybe try Resolve until PP is fixed.

  • @alcomposer I only tried 10-bit, as that's all I've been shooting; it's so nice and easy to shoot with V-Log and the LUT! Once I saw how nice it came out, I never bothered trying 8-bit :) I will at some point, for when I need to do some slow-mo stuff, I'm sure. I just rolled back to the old version and it's working OK now. I like Resolve and their ethos, but I use Audition a bit to re-time some of the stuff I do, and other bits, so it's just convenient. I logged a bug for this one, and also logged another bug, as it won't load the 6K H.265 stuff at all, it just errors, so I thought I'd do both at the same time. I get the impression Adobe is taking it seriously, well, as seriously as any big corporation does until it hits their revenue / reputation :-)

  • Yes, H.265 is very much an issue for all but a few NLEs. However, H.265 is a very compression-heavy codec, so conversion at some point to a proxy or an alternative format (DNxHD or ProRes) is a good idea anyway. If you are on Mac, then I can recommend iFFmpeg. Amazing! (It gives ffmpeg a lovely simple GUI.)

    Have you played with 60p 4K V-Log L?
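  • Since the thread keeps recommending ffmpeg for transcoding, here is a minimal Python sketch that builds the sort of ffmpeg command line used to turn an H.265 camera file into a 10-bit ProRes mezzanine via ffmpeg's prores_ks encoder. The file names are placeholders, and in practice you would hand the list to subprocess.run:

    ```python
    # Build an ffmpeg command that transcodes a clip to 10-bit ProRes 422.
    # Uses ffmpeg's prores_ks encoder; file names below are placeholders.

    def prores_cmd(src: str, dst: str, profile: int = 3) -> list:
        return [
            "ffmpeg", "-i", src,
            "-c:v", "prores_ks",          # ProRes encoder in ffmpeg
            "-profile:v", str(profile),   # 0=Proxy, 1=LT, 2=422, 3=422 HQ
            "-pix_fmt", "yuv422p10le",    # keep the 10-bit 4:2:2 depth
            "-c:a", "pcm_s16le",          # uncompressed audio, usual in ProRes MOVs
            dst,
        ]

    cmd = prores_cmd("GH5_clip.mov", "GH5_clip_prores.mov")
    print(" ".join(cmd))
    # To actually run it: subprocess.run(cmd, check=True)
    ```

    Use profile 0 (Proxy) instead of 3 if you only want lightweight offline files; the ProRes files are huge but decode with very little CPU effort.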

  • I've been using a Ryzen 1700 (found it for $290 on Slickdeals) OC'd to 3.9GHz ($75 refurb Corsair i115 water cooler) with a Gigabyte AB350M motherboard ($100). My 32GB of DDR4 RAM cost me $225. I'm still using a 980 Ti from my previous Intel 4790K build. It sucks for Avid Media Composer, which is picky as fuck with hardware. Just kind of choppy.

    But it's great for Premiere and everything else. Great for transcoding with Adobe Media Encoder and ffmpeg.

    Here is a screenshot of Task Manager while transcoding to ProRes with ffmpeg: 16 threads at 100% :)

    I haven't used Resolve much with it, so I can't say much about that. The reason is that when I transcode, I just like to drag and drop everything and keep my source's framerate and resolution the same.

  • Hi guys, I have a quad-core, a GTX 1080, and 64GB RAM, and I just downloaded the Neumann test footage. The clips in Premiere only show audio. Am I the only one?

  • As I remember, the latest Premiere build broke GH5 support.

  • I have 3 4K monitors on Windows 10 Home Creators Edition, all displaying at 4K. I get slightly choppy video when I play my timeline in Avid Media Composer. I have various media resolutions on my timeline, but everything is rendered at my project setting of 23.98 1080.

    Is my 980 Ti most likely the cause of this subtle choppiness when using all 3 screens and 1 playing full-screen video?

    The CPU is a 4790K with a Z97 Asus Gryphon mobo (I'd say relatively high-end) and 32GB of RAM.

    I'm curious what people think. However, I'll be getting my 1080 Ti tomorrow, so I'll post an update either way.

  • @cls105

    Well, most probably it'll make no difference.

    A normal editor should use the built-in hardware decoder in the GPU or in the CPU itself. But I am not sure about Avid, as they very much push you to buy pro versions of boards (same as consumer, but with special drivers and such).

  • This seems like a nice topic to resurrect for this...

    I recently got a pretty maxed-out XPS 15 9570 and went for the i9 (planning to repaste it to reduce thermal throttling) (disclaimer: my day job is for a company mostly owned by Dell). The GPU is a 1050 Ti Max-Q. I imported one of my Resolve projects with a bunch of 10-bit GH5 Long GOP footage and was fairly happy at first when things were playing back at around 22-30fps before repasting (a little choppy, but tolerable), but then I was bummed after about 10 seconds when thermal throttling kicked in and my playback rate dropped to about 6fps (pretty much unusable). I messed around a little with undervolting and underclocking in XTU to see if I could keep the power envelope low enough to hold a relatively decent FPS without the 6fps crater, and I had minimal/no luck.

    Then I got curious and tried some A7R Mark III 4K footage and everything was crazy smooth playing back at 29.97fps, even with 2 LUTs and some exposure/color tweaks applied. I only let it go for about 30 seconds to a minute, but there were no signs of anything heating up too much.

    I started checking resource monitor in Windows and I've noticed that when I'm trying to play back the GH5 footage, my CPU load is at about 85-90% and GPU is at about 10%. When I play back the A7R III footage, my CPU is at about 10% utilization and the GPU is at about 75-80% utilization.

    I double-checked on my desktop (8700K with a 1080 Ti) and the pattern stays similar: the CPU is heavily used for GH5 footage and barely gets touched for Sony footage, but the beefy, non-throttling desktop CPU is able to power its way through the GH5 footage, so it stays at a clean 29.97fps.

    Has anybody else noticed this? I'm not seeing a lot of discussion about it online, but maybe I'm just searching for the wrong terms. I'm also thinking about trying Resolve 15 since I'm currently using 14; maybe Blackmagic's engineers were able to move more 10-bit H.264 processing to the GPU for the new version...

    Also, now that I have this fairly nailed down, I'm going to look to see if there's a "performance mode" that cranks up the fans and try that and then I'll try repasting the CPU/GPU soon and see if that lets me keep better FPS with GH5 footage.

    This should also answer the question for people looking for a new laptop to edit with Resolve: as long as you're using a codec that Resolve implements on the GPU, a laptop with a 1050 Ti should be enough for general basic editing, no need for an i9 (or maybe even an i7). YMMV if you're doing fancier stuff than I am (lots of compositing, etc.).
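    The CPU-vs-GPU split described above is consistent with NVIDIA's published NVDEC decoder capabilities: there is no hardware decode path for 10-bit H.264 (which the GH5 shoots), so it falls back to the CPU, while 8-bit H.264 and 4:2:0 HEVC can run on the fixed-function decoder. A simplified Python lookup of that rule (codec and pixel-format strings are the ones ffprobe would report; this only covers the cases discussed in this thread, and whether the NLE actually uses NVDEC is up to the application):

    ```python
    # Rough guess at whether an NVIDIA NVDEC hardware decoder can handle a clip,
    # keyed on codec name and pixel format as reported by ffprobe.
    # Simplified to the cases discussed in this thread (Pascal-era GPUs).

    def nvdec_can_decode(codec: str, pix_fmt: str) -> bool:
        ten_bit = pix_fmt.endswith("10le")
        if codec == "h264":
            return not ten_bit          # NVDEC has no 10-bit H.264 support
        if codec == "hevc":
            return "422" not in pix_fmt # 4:2:0 HEVC (8- and 10-bit) is supported
        return False                    # anything else: assume CPU decode

    # GH5 10-bit 4:2:2 H.264 -> no hardware path, so CPU load spikes:
    print(nvdec_can_decode("h264", "yuv422p10le"))   # False
    # A7R III 8-bit 4:2:0 H.264 -> hardware decode, GPU does the work:
    print(nvdec_can_decode("h264", "yuv420p"))       # True
    ```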

  • @eatstoomuchjam

    Well, it means the GPU hardware decoder can't decode the 10-bit footage properly. Or it is a Resolve thing.

    So the thing you can do is make fast adjustments, export to H.264 8-bit, and be happy.