Hi,
I have a DVI port on my display adapter and a DVI port on my LCD display, but when I connect the DVI cable between them I get nothing! Everything is brand new, and I have tried multiple cables! Any ideas on what I am doing wrong? VGA works fine!
Do not use ConnectedMonitor. Try using Option "UseDisplayDevice" "CRT" instead. This tells the driver to use a CRT when one is detected, and to fall back to its normal logic when there isn't one. Then you should be able to use the display configuration page in nvidia-settings to switch display devices.
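In case it helps, here is a minimal sketch of where that option would go in xorg.conf, assuming the NVIDIA proprietary driver; the Identifier string is just a placeholder for whatever your file already uses:

    Section "Device"
        Identifier "Videocard0"    # placeholder; keep your existing identifier
        Driver     "nvidia"
        # Prefer the analog (CRT/VGA) output when one is detected;
        # the driver falls back to its normal detection logic otherwise.
        Option     "UseDisplayDevice" "CRT"
        # Note: do not combine this with Option "ConnectedMonitor",
        # which overrides detection entirely.
    EndSection

Restart X after editing the file so the driver picks up the change.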
DVI output is better quality than VGA output simply because DVI is digital while VGA is analog. High-definition TVs generate their pictures digitally, matched pixel for pixel to their panels, and they can accept those pictures directly through their DVI ports. A computer's graphics card reads and generates an image digitally and converts it to an analog signal when it is sent through a VGA output. Even a card with only a VGA output can benefit: high-definition television pictures become clearer when the signal is fed through a VGA-to-DVI converter box.
Make sure that you have the right drivers for the display adapter. Windows will let you use the VGA port without them, but to use the DVI port, right-click on the desktop, click Properties, then Settings, then Advanced, then Adapter. If that page lists a generic adapter instead of your card, the video drivers have not been installed; get your disc out and install them.