Monitors
Chelovec, 2022-03-27 09:37:28

What is the real reason a 144 Hz monitor costs more than a 60 Hz one?

Why, if you feed an FHD 90 Hz signal to a regular FHD 60 Hz monitor, does a "not optimal mode" message pop up with no image on the screen? Apart from the bandwidth of the cable and the limitations of the video card itself, what can prevent a 60 Hz monitor from working in FHD 90 Hz mode? Why do you have to pay extra for a high-refresh-rate monitor? After all, a monitor is not a video card: it doesn't need to process anything, it just receives a signal from the video card and displays it.



2 answers
15432, 2022-03-27
@Chelovec

No, well, if you had an analog CRT monitor on VGA, then indeed you could just select 90, 120 or 240 Hz and it would (most often) work.
But nowadays digital LCD and OLED monitors are far more common, and they don't just "put the signal on the screen": they contain fairly complex electronics that process that signal.
To support higher refresh rates they need:
- pixels with a low response time, so they can switch without ghosting
- high-speed pixel-switching components that can operate at both 60 and 90 Hz
- more powerful electronics that can decode more data over HDMI in the same amount of time (see the rough estimate below)
All of this costs money. And since 60 Hz is enough for most users, components for high-refresh monitors are simply more expensive (premium class).
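To put the "more data in the same time" point in perspective, here is a back-of-the-envelope sketch (my own illustration, not part of the original answer) of the raw pixel data rate a monitor's electronics have to keep up with at FHD. It deliberately ignores blanking intervals and link-level encoding overhead (e.g. TMDS 8b/10b), so the real HDMI link rates are somewhat higher:

# Rough raw pixel data rate (Gbit/s) for FHD at various refresh rates.
# Illustrative only: ignores blanking, chroma subsampling and link encoding.
def raw_data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

for hz in (60, 90, 144):
    print(f"FHD @ {hz:3d} Hz: ~{raw_data_rate_gbps(1920, 1080, hz):.1f} Gbit/s")

# Prints roughly:
# FHD @  60 Hz: ~3.0 Gbit/s
# FHD @  90 Hz: ~4.5 Gbit/s
# FHD @ 144 Hz: ~7.2 Gbit/s

So at 144 Hz the monitor's receiver and panel-driving electronics have to handle roughly 2.4 times as many bits per second as at 60 Hz, which is where the faster (and pricier) components come in.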

Dr. Bacon, 2022-03-27
@bacon

Why do you have to pay extra for things with better specifications?

"it doesn't need to process anything, it just receives a signal from the video card and displays it"

What naivety. But that's not even the main point here: the components that can handle 144 Hz simply cost more.
