How do I extend the desktop to 3 monitors across 2 video adapters?
Good day, everyone! Please help me or point me in the right direction. I moved from Windows to Linux Mint, and the graphical session works with varying success. The bottom line: I have 3 monitors:
- LG w2241s
- SAMSUNG 226sw
which are connected to an ASUS GTX 560;
- BENQ FP93G
which is plugged into the motherboard's HDMI port and previously ran off the Core i5-4690's integrated Intel HD Graphics.
While the OS boots, the Mint logo is shown on the BenQ monitor, and if I press Ctrl+Alt+F1 the text console also appears there. But how do I make this monitor part of the desktop?
~ $ lspci -v | grep VGA
01:00.0 VGA compatible controller: NVIDIA Corporation GF114 [GeForce GTX 560] (rev a1) (prog-if 00 [VGA controller])
~ $ lspci -v | grep Display
00:02.0 Display controller: Intel Corporation Xeon E3-1200 v3/4th Gen Core Processor Integrated Graphics Controller (rev 06)
To combine several outputs into one desktop (each display shows a slice of a single output frame buffer, which is "cut" across the separate monitors), the two GPUs have to be synchronized.
Previously this meant a dedicated interconnect standard (SLI, for example); now, with the advent of hybrid graphics right on the CPU die, manufacturers manage, with considerable effort, to combine dissimilar GPUs into one array, but as a rule it is of little use.
The process can work in one of two ways:
1) The more powerful GPU renders the entire frame buffer, drives two displays itself, and hands the third slice off to the integrated GPU. In that case control has to sit with the more powerful (or the discrete) GPU; read this as "digging around in the drivers" (see the sketch after this list).
2) Control comes from the chipset or directly from the responsible unit (is there one?) in the CPU... Look for Lucidlogix Virtu support in your motherboard's description; you can read about it, for example, on NIKS.
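A minimal sketch of option 1 via RandR's provider mechanism, assuming the iGPU is enabled in the BIOS (often called "iGPU Multi-Monitor") and the Intel/modesetting driver has picked it up; the provider indices below are illustrative, so check them with the first command:
~ $ xrandr --listproviders                 # expect one NVIDIA and one Intel provider
~ $ xrandr --setprovideroutputsource 1 0   # provider 1 (Intel, the sink with the HDMI port) displays images rendered by provider 0 (NVIDIA)
~ $ xrandr --auto                          # enable the outputs that just became available
Whether this works at all depends on the driver: nouveau generally supports it, while older proprietary NVIDIA driver branches for cards of this generation may not.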
I'm not sure the 560 implements the ability to split the buffer with an external consumer in mind, and even if it does, I doubt that a card whose clocks, memory size, and memory characteristics differ from the CPU's video core can hand the buffer off without it tearing into lines or breaking up into amusing artifacts.
I also have an ASUS DirectCU II GTX 560; it has two DVI outputs and a mini-HDMI. Doesn't yours? Maybe it's better to plug into the card's own port?
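To check which of the card's outputs the X server sees with a monitor attached (output names vary by driver), one can run:
~ $ xrandr | grep ' connected'   # lists only outputs that currently have a display plugged in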
Don't you think it would be more logical, and easier on the nerves, to use a single card with three ports to drive the separate outputs (up to 6 with AMD)? An NVIDIA GT 640, for example, is also rich in ports, with 4 of them. I can't recall a ready-made solution for your mixed setup off the top of my head... I would change the card.
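If you do go the single-card route, spreading the desktop across three outputs is a one-liner; a sketch with illustrative output names (check the real ones with xrandr first):
~ $ xrandr --output DVI-I-0 --auto \
           --output DVI-I-1 --auto --right-of DVI-I-0 \
           --output HDMI-0  --auto --right-of DVI-I-1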