Video

Artemy, 2021-01-13 13:11:28

Is it true that resource consumption scales linearly with video playback speed?

  • Let's say that at 100% (x1) playback speed, a video requires 10% of system performance.
  • At 200% (x2), would it require 20%, or does the load vary non-linearly, or does it depend on the device, the video codec, etc.?


For example, is it possible that the device plays H.265 video smoothly at x2 and x2.5, but stutters at some x2.35?


1 answer
Sergey, 2021-01-13
@Artemonim

Generally it's linear, but it depends on the codec and the encoding settings.
For example, codecs split video into groups of frames, and only the first frame of each group of 12-20 (the keyframe) is stored in full; the rest are encoded as differences from it.
It may happen that at high speeds only the keyframe of each group needs to be shown, in which case the dependent frames could be skipped. But I don't know whether players actually implement such logic.
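
A minimal sketch of that idea in pure Python (GOP_SIZE, BASE_LOAD, and KEYFRAME_COST are illustrative assumptions, not measured values). It models two strategies: decoding every frame, where load grows linearly with playback rate, and a hypothetical keyframe-only mode that a player might switch to at very high rates:

```python
# Toy model of decode load vs. playback rate for a GOP-structured codec.
# All constants are illustrative assumptions, not measurements.

GOP_SIZE = 15        # frames per group of pictures (typically 12-20)
BASE_LOAD = 0.10     # fraction of system performance used at x1 playback
KEYFRAME_COST = 3.0  # assumed cost of a keyframe relative to a dependent frame

# Relative decode cost of one GOP under each strategy.
units_full = KEYFRAME_COST + (GOP_SIZE - 1)  # every frame decoded
units_key = KEYFRAME_COST                    # only the keyframe decoded

def full_decode_load(rate: float) -> float:
    """Every frame is decoded: load scales linearly with playback rate."""
    return BASE_LOAD * rate

def keyframe_only_load(rate: float) -> float:
    """Hypothetical keyframe-only mode: still linear in rate,
    but with a much smaller slope."""
    return BASE_LOAD * rate * units_key / units_full

for rate in (1.0, 2.0, 2.35, 2.5):
    print(f"x{rate}: full decode {full_decode_load(rate):.1%}, "
          f"keyframe-only {keyframe_only_load(rate):.1%}")
```

Under this model the load is linear within each strategy, so a stutter at x2.35 but not at x2.5 would only happen if the player switched strategies (or hit some other bottleneck, such as I/O or frame dropping) somewhere between those rates, which is implementation-specific.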
