Monitors
First Name Last Name, 2015-12-07 14:57:54

Relationship between frame rate and refresh rate, or is there anything beyond 60 fps?

Hello everyone :-)
We had a fierce argument over lunch here at the office on the following subject. It is well known that the refresh rate of most monitors is set to 60 Hz. It is also well known that games (and not only games) report a frame rate (FPS).
The essence of the question: we need arguments proving or refuting the claim that on those very "most monitors" it is impossible to see or feel any difference once the FPS exceeds the refresh rate.
For the purity of the experiment, let's assume that our hardware is top-end and that in the games under discussion, with vsync disabled, the fps numbers are always higher than 60.
To clarify the question: say the refresh rate of an LCD monitor is set to 60 Hz. What actually happens when in a game we see, say, 100 fps (substitute any value > 60 here)?


5 answer(s)
Vladimir Sergeev, 2015-12-07
@Rockbass

The refresh rate of a monitor is how many different frames it can display per second. The frame rate in games is how many frames per second the video card sends to the monitor. It should be obvious that if the monitor can only update the image 60 times per second, then, no matter what you do, there will be no visible difference. A difference only appears if you connect a monitor with a higher refresh rate (there are 120 Hz gaming LCD monitors, for example, or CRTs, where this parameter could reach 250 Hz at low resolutions) or try to overclock the panel, which will not give much of an increase but does carry the risk of damaging it. Typically the panel is overclocked to reach a frequency that is a multiple of 23.976 (for example, 71.928 Hz) so that movies play back without judder; this cannot deliver any decisive increase in smoothness in games, and if that is really needed, the answer is a monitor with a higher refresh rate.
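To make that concrete, here is a minimal sketch (plain Python, not any real graphics API) of what happens when a game renders 100 frames per second on a 60 Hz panel: each refresh can only pick up the most recently completed frame, so a large fraction of the rendered frames is never displayed at all. The idealized "latest finished frame wins" model is an assumption for illustration.

```python
# Sketch: which rendered frames a 60 Hz monitor actually shows when the
# game produces 100 fps. Assumes an idealized pipeline where each refresh
# simply scans out the most recently completed frame (no tearing modeled).

REFRESH_HZ = 60    # monitor refresh rate
RENDER_FPS = 100   # frame rate reported by the game

def displayed_frame_indices(duration_s: float = 1.0) -> list[int]:
    """Index of the rendered frame shown at each monitor refresh."""
    shown = []
    for tick in range(int(duration_s * REFRESH_HZ)):
        refresh_time = tick / REFRESH_HZ
        # Last frame the GPU finished before this refresh started.
        shown.append(int(refresh_time * RENDER_FPS))
    return shown

shown = displayed_frame_indices()
print(f"frames rendered in 1 s: {RENDER_FPS}")
print(f"distinct frames that ever reach the screen: {len(set(shown))}")
# Roughly 40 of the 100 rendered frames are never shown on a 60 Hz panel.
```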
What increasing the frame rate above the refresh rate can be useful for:
1) the frame output delay is reduced (purely homeopathically, since it also depends on the monitor panel, the type of connection, the polling rate of the port the input devices are plugged into, buffering in the video card settings, and so on), which in especially dynamic games can be felt as a delay in the reaction to pressing a button;
2) in some games, the accuracy/granularity of processing in-game events (for example, movement physics, as in Half-Life) and artificial intelligence increases;
3) a warm video card will create comfort on a cold winter evening, and a laptop battery will remind you of the existence of real life sooner.
It is also worth remembering that the frames-per-second counter shows a current or average value; ideally, the hardware should keep this figure at or above the monitor's refresh rate in all situations, without dips. Therefore, the sensible approach is to lower the graphics settings in the game to a level at which fps never drops below 60, and then cap the value from above using RivaTuner or something similar, so that electricity and hardware resources are not burned in vain; a minimal sketch of such a cap follows below.
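A hedged illustration of that last point: the sketch below caps the frame rate from above by sleeping off the remainder of each 1/60 s budget. This is not how RivaTuner is implemented; it only shows the principle of a frame limiter, and render_frame() is a placeholder for the game's real work.

```python
import time

FPS_CAP = 60
FRAME_BUDGET = 1.0 / FPS_CAP   # ~16.7 ms per frame

def render_frame() -> None:
    """Placeholder for the game's actual rendering and simulation work."""
    time.sleep(0.005)           # pretend a frame takes 5 ms to produce

def run(frames: int = 120) -> None:
    next_deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        next_deadline += FRAME_BUDGET
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            # Sleep off whatever is left of the frame budget so the GPU
            # is not asked to draw frames the 60 Hz monitor can never show.
            time.sleep(remaining)

run()
```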

kazmiruk, 2015-12-07
@kazmiruk

In general, I am not particularly well versed in this subject, but at the very least there are monitors with 3D support that run at 120 Hz or more (this is in reply to your comment). As for arguments: the human brain processes information from the eyes in about 13 ms on average (googled in the first article that came up), which gives us roughly 77 frames per second. Accordingly, a person should notice the difference between 60 and 77 FPS on a monitor with a 120 Hz refresh rate. But from my own experience I will say that I no longer notice the difference between 40 and 60 FPS. If the picture does not stutter, it looks equally good to me at 40 FPS and at 1000. Although some individuals do see frames being redrawn before their eyes if the FPS is below 60.
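For what it's worth, the arithmetic behind those figures is just reciprocals (the 13 ms value itself is only the number this answer cites, not an established constant):

```python
# Numbers quoted in the answer above, checked with simple reciprocals.
brain_latency_s = 0.013
print(round(1 / brain_latency_s, 1))   # ~76.9 -> the "about 77 frames per second"

print(round(1000 / 40, 1))             # 25.0 ms per frame at 40 FPS
print(round(1000 / 60, 1))             # 16.7 ms per frame at 60 FPS
```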

Stalker_RED, 2015-12-07
@Stalker_RED

On a CRT monitor I personally see the difference between 70 and 85 Hz. And yes, FPS and the monitor's refresh rate are not the same thing.
The Internet is full of test videos with different FPS. Or here's an emulator: https://frames-per-second.appspot.com/
In games, if I remember correctly, the average person stops feeling an increase in FPS somewhere around 50-60 frames. But that's most people, not all.
If the FPS does not match the refresh rate of the monitor, artifacts (tearing) are possible in dynamic scenes.
To get rid of these effects, all sorts of clever things are invented, like www.geforce.com/hardware/technology/adaptive-vsync
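The gist of that adaptive vsync link, sketched very roughly (this is not NVIDIA's implementation, just the decision rule it is usually described with): keep vsync on while the renderer can sustain the refresh rate, and switch it off when fps drops below it, trading a bit of tearing for not falling straight to 30 fps as plain vsync would.

```python
REFRESH_HZ = 60

def use_vsync(current_fps: float) -> bool:
    """Enable vsync only when the GPU can keep up with the refresh rate."""
    return current_fps >= REFRESH_HZ

for fps in (120, 75, 59, 45):
    mode = "vsync on (no tearing)" if use_vsync(fps) else "vsync off (avoid the drop to 30 fps)"
    print(f"{fps:>3} fps -> {mode}")
```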

Viktor, 2015-12-07
@nehrung

To give a complete answer, the question needs clarification. As I understand it, FPS here is tied to the update rate of the image buffer of the software in use (specifically games, where it is determined by the requested image quality, or a video player, where the frame rate is set by the video being played: 24, 25, 30, 60 frames/sec). But then you have to specify whether you mean windowed or full-screen mode. The point is that in windowed mode the frame rate is entirely determined by the settings of the OS video driver. If the game runs in full-screen mode, the OS driver's settings are often overridden by the game's own, with separate values determined by the requested picture quality and the hardware's performance (at least, that was the case in old games like Warcraft 2).
In the first case, the FPS of the system has nothing to do with the FPS of the program. In the second case they will most likely be the same. Although I could be wrong.
