Is a monitor's refresh rate essentially the same thing as a video's frame rate?
I bought a 4K monitor. I'm a programmer myself, I don't play games at all. After getting a laptop with a 167 DPI display, I couldn't look at a FullHD monitor without tears, so I bought a 4K one. I connected it over HDMI to the integrated Intel 630 graphics. The picture quality is tolerable, but now the mouse pointer seems to lag behind my movements, and it's terribly annoying.
From childhood and my father's stories I remember that in TVs the scanning frequency was originally, in essence, the frame rate. And if you compare 30 and 60 frames per second in video recorded on a phone, motion at 30 fps looks roughly the same, just slightly jerky. A question for those who have already connected a 4K monitor over HDMI and then switched to DisplayPort: did you have jerky mouse movement on HDMI, and did it go away on DP? Is it worth buying a discrete graphics card just for this?
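To put rough numbers on that "jerkiness", here is a minimal sketch of my own (not part of the original question): how long each frame stays on screen and how far the pointer jumps between frames at different refresh rates. The 1000 px/s pointer speed is an arbitrary assumption just to make the jumps comparable.

```python
# Rough illustration: frame interval and per-frame pointer jump
# at different refresh rates. The pointer speed is an assumption.

MOUSE_SPEED_PX_PER_S = 1000  # assumed pointer speed

for hz in (30, 60, 120):
    frame_interval_ms = 1000 / hz        # how long one frame stays on screen
    jump_px = MOUSE_SPEED_PX_PER_S / hz  # pointer displacement between frames
    print(f"{hz:>3} Hz: frame every {frame_interval_ms:5.1f} ms, "
          f"pointer jumps ~{jump_px:4.1f} px per frame")
```

At 30 Hz the pointer moves in ~33 px steps every 33 ms, at 60 Hz in ~17 px steps, which is roughly the difference the eye reads as "jerky" versus "smooth".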
Frame rate has little to do with monitor refresh rate. The monitor reads its rightful 120 frames per second out of video memory, but the video card may only put 20 frames per second in there, so the same picture is simply scanned out 6 refreshes in a row. As a result you will still perceive 20 FPS.
A monitor's refresh rate is an upper limit on the frame rate; you won't get above it on that hardware.
However, if your source material is 20 FPS, you will perceive 20 FPS. And if your video card doesn't have time to redraw the cursor, you'll get stuttering even at a 120 Hz refresh rate.
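As a back-of-the-envelope illustration of the point above (my own sketch, not part of the original answer): the perceived rate is capped by whichever of the two is lower, and a slow source simply gets repeated across several refreshes.

```python
# Sketch: how a 120 Hz monitor shows a 20 FPS source.
# Perceived rate is capped by whichever is lower; each rendered
# frame is simply scanned out several refreshes in a row.

def perceived_fps(source_fps: float, refresh_hz: float) -> float:
    return min(source_fps, refresh_hz)

def repeats_per_frame(source_fps: float, refresh_hz: float) -> float:
    # How many consecutive refreshes show the same picture.
    return max(refresh_hz / source_fps, 1.0)

print(perceived_fps(20, 120))      # 20   -> you still see 20 FPS
print(repeats_per_frame(20, 120))  # 6.0  -> same frame for 6 refreshes
print(perceived_fps(200, 120))     # 120  -> refresh rate is the upper limit
```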
Many modern TVs can interpolate the picture by inserting the missing frames, which gives dynamic scenes in old blockbusters that "soap opera" look.
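Real TVs use motion-compensated interpolation, but just to show the bare idea of "inserting a missing frame", here is a naive blend of two neighbouring frames with NumPy (my own sketch; actual TV processing is far more involved):

```python
import numpy as np

def midpoint_frame(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Naive interpolation: average two neighbouring frames.

    Real TVs estimate motion vectors instead of blending, but this
    shows the basic idea of inserting a frame that wasn't in the source.
    """
    return ((frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2).astype(np.uint8)

# Two dummy 4x4 grayscale "frames"
a = np.zeros((4, 4), dtype=np.uint8)
b = np.full((4, 4), 200, dtype=np.uint8)
print(midpoint_frame(a, b))  # all values are 100: the inserted in-between frame
```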