Hello,
I've been doing some research on transcoding, but there is one question I still haven't found answered. If I missed it, please point me to the existing answer and sorry for reposting.
I want to know what affects CPU usage when transcoding. Is it only the bitrate and resolution of the original file, or also the quality it is transcoded to? For example: if you have a 10Mbps 1080p movie and only slightly reduce it to, say, 8Mbps 1080p, will this stress the CPU more or less than taking a 4Mbps 720p movie and reducing it all the way down to, say, 64Kbps?
In the first case the original is bigger, but the bitrate only needs to be reduced by 2Mbps, while in the second case the original is smaller, but the bitrate needs to be reduced by almost 4Mbps.
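To make the two scenarios concrete, here is roughly what I have in mind, assuming the transcode is done with something like ffmpeg and libx264 (just an illustration on my part, the file names and exact settings are made up):

    # Case 1: 10Mbps 1080p source, re-encoded at 8Mbps, resolution unchanged
    ffmpeg -i movie_1080p_10mbps.mkv -c:v libx264 -b:v 8M -c:a copy out_8mbps.mkv

    # Case 2: 4Mbps 720p source, re-encoded at a much lower 64Kbps
    ffmpeg -i movie_720p_4mbps.mkv -c:v libx264 -b:v 64k -c:a copy out_64kbps.mkv

As far as I understand, in both cases the whole source has to be decoded and re-encoded, so my question is really whether the target bitrate itself makes any difference to the CPU load.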
I'm tempted to think that it is only the size of the original file that determines the workload, and that very high bitrates and resolutions put a heavy load on the CPU regardless of whether the output is 8Mbps or 64Kbps, but I'm not sure at all.
Can anyone clarify this for me?
Thank you very much!