Finally got around to some real video encoding work on my new computer, but I'm noticing it's not blazing fast...

And most of the work is still handled by the CPU... But I thought video processing was handled by the GPU, which seems to be barely used at all. I'm using Handbrake, but I thought the whole point of a dedicated GPU was intensive graphics and video processing?

Comments
-
rithvikp: I think it depends on the type of encoding you are doing. AFAIR x264 needs a lot of CPU, whereas NVENC takes advantage of the GPU.
-
donuts: @rithvikp I tried an h265 1080p preset. One file came out smaller, but the other came out 10x bigger.
What do the 2 you mentioned mean?
-
rithvikp: @billgates From what I remember from all the YouTube videos I watch: h264, h265, NVENC etc. all determine how the video gets compressed. Codecs like h264 or h265 use algorithms that are CPU intensive and may sometimes use the GPU for acceleration, so they naturally make your CPU the bottleneck. NVENC is Nvidia's hardware encoder, built specifically to take advantage of the GPU.
I don't know why the file sizes are behaving like that, I personally never did any video encoding.
I mentioned x264 earlier, that was my bad. h264 is the codec standard; x264 is just an open source library that implements h264.
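To make the CPU-vs-GPU split concrete, here is a minimal sketch (not from the thread) that drives ffmpeg from Python once with the software x264 encoder and once with NVENC. It assumes an ffmpeg build with NVENC support and an Nvidia GPU; the file names and quality settings are placeholders.

import subprocess

def encode_cpu(src: str, dst: str) -> None:
    # Software H.264: the x264 library does all the work on the CPU cores.
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", "libx264",
         "-preset", "medium", "-crf", "23", dst],
        check=True,
    )

def encode_gpu(src: str, dst: str) -> None:
    # Hardware H.264: the NVENC block on the GPU does the encoding,
    # leaving the CPU mostly free.
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", "h264_nvenc",
         "-preset", "slow", "-cq", "23", dst],
        check=True,
    )

encode_cpu("input.mp4", "out_x264.mp4")   # CPU pegged, slower
encode_gpu("input.mp4", "out_nvenc.mp4")  # GPU encoder busy, faster

Same source, same target format, but two completely different code paths: libx264 runs on the CPU, while h264_nvenc hands the frames to the GPU's dedicated encoder.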
-
thatsnotnice: It mostly depends on the codec used, and on how the encoding process is implemented.
AFAIK, the MainConcept suite takes huge advantage of GPUs.
Anyway, I know a couple of professionals in this field, and I understand that GPUs are used mostly for intermediate formats, while for the final output full CPU encoding is apparently still preferred. Not much different from what happens with 3D renderings.
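That split often shows up as a proxy workflow: a fast hardware encode for the intermediate files you edit with, and a slower software encode for the delivery file. A rough sketch of the idea, again with hypothetical file names and settings:

import subprocess

def make_proxy(src: str, dst: str) -> None:
    # Quick NVENC pass: fast to produce, good enough for editing/preview.
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", "h264_nvenc",
         "-preset", "fast", "-b:v", "8M", "-vf", "scale=1280:-2", dst],
        check=True,
    )

def make_final(src: str, dst: str) -> None:
    # Slow x265 pass on the CPU for the final output, where quality per bit
    # matters more than encode time.
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", "libx265",
         "-preset", "slow", "-crf", "20", dst],
        check=True,
    )

make_proxy("master.mov", "proxy.mp4")
make_final("master.mov", "final.mp4")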
-
donuts: @thatsnotnice But for video games it's the graphics card that matters, right? And that would be 3D rendering?
That's also why I'm confused: the GPU is the workhorse for rendering games (Ultra settings still run very smoothly), but it's not used for "rendering" videos...
-
thatsnotnice: @billgates For real-time 3D rendering like in games, or real-time video encoding when working on video editing, yeah, you actually *NEED* the GPU. In general this is true whenever time is a constraint.
Normally, I understand the CPU is still preferred for offline rendering, but it's mostly a matter of final output quality right now.
I don't know much more about that, but my guess is that the algorithms used on GPUs are at the moment optimised mostly/only for speed, to keep the working experience as fast and smooth as possible.
Nonetheless it's true that GPUs are capable of doing a good-enough job in most scenarios, in terms of final output.
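That speed-for-quality trade is also the most likely explanation for the file-size surprise above: hardware encoders spend fewer cycles per frame, so at a similar quality target they often produce larger files than a slow software encode. A hypothetical way to check it yourself, assuming an ffmpeg build with NVENC support and a placeholder clip.mp4:

import os
import subprocess
import time

def encode(args: list[str], dst: str) -> None:
    # Run one encode and report wall-clock time and resulting file size.
    start = time.perf_counter()
    subprocess.run(["ffmpeg", "-y", "-i", "clip.mp4", *args, dst], check=True)
    elapsed = time.perf_counter() - start
    size_mb = os.path.getsize(dst) / 1e6
    print(f"{dst}: {elapsed:.1f}s, {size_mb:.1f} MB")

# CPU (x265) vs GPU (NVENC HEVC) at roughly comparable quality settings.
encode(["-c:v", "libx265", "-preset", "medium", "-crf", "23"], "cpu_x265.mp4")
encode(["-c:v", "hevc_nvenc", "-preset", "medium", "-cq", "23"], "gpu_nvenc.mp4")

The exact numbers depend heavily on the source material and the settings, but a side-by-side like this makes it obvious which encoder is taxing the CPU and which one is trading file size for speed.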