C#: how do I synchronize image output (a Bitmap) with the monitor refresh rate?
The problem is that the application displays an image (a Bitmap) for a very short time, for example 1/75 of a second or less. On a 75 Hz monitor the image may not be shown at all if drawing does not coincide with the start of a monitor refresh cycle. What I need is for the picture, in that case, to be displayed for the minimum interval the monitor works with, i.e. exactly one refresh period.
The application uses GDI+. Does GDI+ have any mechanism for synchronizing output with the monitor's refresh?
If there is no such mechanism, perhaps there is a reasonably simple way to implement this in C# together with DirectX, which does support vertical synchronization. In other words, I would like an example of rendering a two-dimensional image in a DirectX window from C# at more than 70 FPS.
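To illustrate the kind of example I am after, here is a rough sketch. It assumes the SharpDX wrapper (the SharpDX.Direct3D11, SharpDX.DXGI and SharpDX.Windows packages) and, instead of drawing a real Bitmap (which would need a textured quad or Direct2D interop), it simply clears the back buffer to alternating colors. The important part is Present(1, ...), which waits for the next vertical blank, so each pass through the loop corresponds to one monitor refresh:

using SharpDX;
using SharpDX.Direct3D;
using SharpDX.Direct3D11;
using SharpDX.DXGI;
using SharpDX.Windows;
using Device = SharpDX.Direct3D11.Device;

static class VSyncDemo
{
    static void Main()
    {
        var form = new RenderForm("VSync demo")
        {
            ClientSize = new System.Drawing.Size(800, 600)
        };

        // Swap chain tied to the window; the 75 Hz refresh rate here is just an example.
        var desc = new SwapChainDescription
        {
            BufferCount = 1,
            ModeDescription = new ModeDescription(form.ClientSize.Width, form.ClientSize.Height,
                                                  new Rational(75, 1), Format.R8G8B8A8_UNorm),
            IsWindowed = true,
            OutputHandle = form.Handle,
            SampleDescription = new SampleDescription(1, 0),
            SwapEffect = SwapEffect.Discard,
            Usage = Usage.RenderTargetOutput
        };

        Device device;
        SwapChain swapChain;
        Device.CreateWithSwapChain(DriverType.Hardware, DeviceCreationFlags.None, desc,
                                   out device, out swapChain);

        var backBuffer = Texture2D.FromSwapChain<Texture2D>(swapChain, 0);
        var renderView = new RenderTargetView(device, backBuffer);

        var showImage = true;
        RenderLoop.Run(form, () =>
        {
            // Stand-in for real drawing: alternate the "image" (white) and the background (black).
            device.ImmediateContext.ClearRenderTargetView(renderView,
                showImage ? Color.White : Color.Black);
            showImage = !showImage;

            // Sync interval 1: Present blocks until the next vertical blank,
            // so each iteration of this loop lasts exactly one monitor refresh.
            swapChain.Present(1, PresentFlags.None);
        });

        renderView.Dispose();
        backBuffer.Dispose();
        swapChain.Dispose();
        device.Dispose();
    }
}

As far as I understand, in windowed mode Present(1, ...) synchronizes with desktop composition rather than the raw hardware vertical blank, but the effective cadence is still the monitor's refresh rate.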
UPD. To clarify what is needed: the program does not display a stream of frames that has to move smoothly and without flicker, as in most graphics tasks. Only two frames need to be shown: one frame with the image, displayed during the period between two hardware refreshes of the monitor, and on the next refresh the image is erased with the background color. This gives a constant, minimal duration of the image on the screen. And for that, vertical sync has to work for each of the two frames.
Double buffering, as far as I know, does not solve this problem.
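The closest thing I have found while staying with GDI+ is DwmFlush from dwmapi.dll, which blocks until the next desktop-composition pass (and the compositor runs at the monitor's refresh rate when composition is enabled). I am not sure it gives a hard guarantee, because GDI+ drawing still goes through the window's redirection surface, but a minimal sketch of the two-frame flash, with a placeholder test image, might look like this:

using System;
using System.Drawing;
using System.Runtime.InteropServices;
using System.Windows.Forms;

class FlashForm : Form
{
    // DwmFlush blocks until the next DWM composition pass, which roughly follows
    // the monitor's refresh rate while desktop composition is enabled.
    [DllImport("dwmapi.dll")]
    static extern int DwmFlush();

    readonly Bitmap image;

    FlashForm()
    {
        // Placeholder test image: a solid white square.
        image = new Bitmap(200, 200);
        using (var g = Graphics.FromImage(image)) g.Clear(Color.White);
        Click += (s, e) => FlashOnce();
    }

    // Show the image for (roughly) one refresh interval, then erase it.
    void FlashOnce()
    {
        using (var g = CreateGraphics())
        {
            DwmFlush();               // wait for the next composition pass
            g.DrawImage(image, 0, 0); // frame 1: the image
            DwmFlush();               // wait for the following pass
            g.Clear(BackColor);       // frame 2: the background color
        }
    }

    [STAThread]
    static void Main()
    {
        Application.Run(new FlashForm());
    }
}

Note that composition must be enabled for DwmFlush to be of any use (it always is on Windows 8 and later); with Aero disabled on older systems the call does not help.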