Does it make sense to enable 144 Hz if the game only delivers 80-90 FPS?
2K monitor, DisplayPort, 144 Hz panel. In some games the FPS is around 200; in others it jumps between 65 and 90.
So, does it make sense to always set 144 Hz, or can I leave it at 75 so as not to put extra load on the PC?
Here you need to understand the specifics of the issue.

First of all, modern monitors do not have "hertz" in the physical sense at all. Tube (CRT) monitors did: as you might guess, a single beam ran across the screen and refreshed the whole picture one point at a time (yes, just the one), and it swept over the monitor 60 times per second. LCD and other non-CRT panels have nothing like that: there, either a backlight shines or a liquid crystal twists. The pixel physically stays put and does not move; no matter how many times the image is "refreshed," the rest of the time it simply sits there as a fixed point.

The hertz figure on new-generation monitors was kept for compatibility. They do have a parameter that can be stretched to fit this concept, but it is already closer to FPS, so there is no longer such a direct relationship between the two.

By turning on 140 Hz on an old CRT you could get a better picture (less eye strain from flicker) even when playing at 20 FPS; another question is that few monitors could actually drive that.
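To make that decoupling of refresh rate and frame rate concrete, here is a minimal illustrative Python sketch (not tied to any real display API; the FPS values are simply the ones mentioned in this thread) showing how often the panel redraws versus how often the game delivers a new frame:

```python
def refresh_stats(fps: float, refresh_hz: float) -> None:
    """Show how a given frame rate maps onto a given refresh rate."""
    frame_time_ms = 1000.0 / fps          # the game produces a new frame this often
    refresh_ms = 1000.0 / refresh_hz      # the panel redraws this often
    redraws_per_frame = refresh_hz / fps  # average redraws of the same frame
    print(f"{fps:>3.0f} FPS @ {refresh_hz:.0f} Hz: "
          f"new frame every {frame_time_ms:.1f} ms, "
          f"redraw every {refresh_ms:.1f} ms, "
          f"each frame shown ~{redraws_per_frame:.1f} times")

for fps in (20, 60, 90, 144):
    refresh_stats(fps, 144)
```

At 20 FPS on a 144 Hz panel each rendered frame is simply redrawn about seven times; the redraw rate (and, on a CRT, the flicker) is set by the refresh rate, not by the game.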
And now we come to the main thing. Hertz can really only show themselves over a VGA cable, since that signal is analog; but you are not that old-school, are you? On a digital cable there is no such interference; the picture will simply be 60 FPS at most. So in the end it is purely a question of gaming comfort: for ordinary work 60 is enough, and you will not notice the difference.
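For the questioner's 80-90 FPS case specifically, a rough simulation helps show what the comfort difference between 75 Hz and 144 Hz amounts to. This is only a sketch assuming plain vsync (each refresh displays the newest finished frame) and 85 FPS as a stand-in for the 80-90 range:

```python
def displayed_frames(fps: float, refresh_hz: float, seconds: float = 1.0) -> None:
    """Simulate simple vsync: each refresh shows the newest finished frame."""
    refreshes = int(seconds * refresh_hz)
    shown = [int(i / refresh_hz * fps) for i in range(refreshes)]  # frame index per refresh
    unique = len(set(shown))                                       # frames that reached the screen
    duplicated = sum(a == b for a, b in zip(shown, shown[1:]))     # frame held for two refreshes
    rendered = int(seconds * fps)
    print(f"{fps:.0f} FPS on a {refresh_hz:.0f} Hz panel: "
          f"{unique} of {rendered} rendered frames shown, "
          f"{duplicated} refreshes repeat the previous frame")

for hz in (75, 144):
    displayed_frames(85, hz)
```

At 75 Hz some rendered frames never reach the screen at all; at 144 Hz every frame is shown, just with slightly uneven pacing (some frames persist for one refresh, some for two).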