Nvidia GeForce 210 limitation?

Organground

Active member
Hiya, I need three separate displays for my work, so I specified a PC with two Nvidia GeForce 210 graphics cards. I have one monitor that is 1920x1200 and two monitors that are 1920x1080. Each shows a different image, but the images are fairly static; I don't need a high refresh rate like for gaming or anything.

I can get 1920x1200 out of the DVI port on the first card and 1920x1080 out of the DVI port on the second card, no problem. However, if I add the third monitor (via VGA), the maximum resolution I can get is 1600x1024. (On one occasion I did briefly manage to get the third monitor to display 1920x1080, after spending ages clicking on resolutions, trying the VGA cable in each card, trying a different VGA cable, etc., but I've never been able to repeat it after a reboot, and I can't get VGA at the right resolution regardless of which card or cable I try.) I've reinstalled both Windows and the graphics drivers, with no success. What am I doing wrong? Or are Nvidia just wrong when they say the 210 supports multiple monitors and a maximum VGA resolution of 2048 by 1536, which on paper is way in excess of my needs? Do I really need to fit a third graphics card for this?

Nvidia technical support very helpfully just tells me I have to contact the supplier of my PC - which is PC Specialist...
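
For diagnosis, a quick Win32 program along these lines (just a rough sketch, compiled as ANSI with cl against user32.lib; the device names will vary per machine) lists each display adapter Windows sees and the mode it is currently running:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DISPLAY_DEVICEA dd;
    DWORD i;
    ZeroMemory(&dd, sizeof(dd));
    dd.cb = sizeof(dd);

    /* Walk every display adapter Windows knows about */
    for (i = 0; EnumDisplayDevicesA(NULL, i, &dd, 0); i++) {
        DEVMODEA dm;
        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize = sizeof(dm);

        /* Ask for the mode this output is currently running */
        if (EnumDisplaySettingsA(dd.DeviceName, ENUM_CURRENT_SETTINGS, &dm))
            printf("%s (%s): %lux%lu @ %luHz\n", dd.DeviceName, dd.DeviceString,
                   dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);
        else
            printf("%s (%s): not active\n", dd.DeviceName, dd.DeviceString);

        ZeroMemory(&dd, sizeof(dd));
        dd.cb = sizeof(dd);
    }
    return 0;
}

That at least shows which output is being capped without clicking through the control panel each time.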
 

Gibbs

Gold Level Poster
Are all the monitors the same make/model? And is the monitor on the VGA port recognised as the correct model, etc.?
And finally, have you been able to achieve 1920x1080 on the VGA port when it is the only monitor on the card? If it works at the desired res on its own but not with the other DVI monitor connected, that would suggest a limitation of the card.
 

Organground

Active member
Two of the monitors are HP 2310ti (portrait, touchscreen, 1920x1080); the third is an HP ZR24x (landscape, 1920x1200). The ZR24x is the main monitor, and the desktop is extended onto the other two.

Interesting that you ask whether the monitor is recognised as the correct make on the VGA port. If I connect both 2310ti monitors via VGA, one will be recognised as a 2310ti but the other just says "analogue display". Is that significant?

Most of the time, even if I click on 1920x1080 when selecting my resolutions, after the monitors flicker and the new settings apply, the "analogue display" just goes back to 1600x1024, even though 1920x1080 is selectable.
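
That fallback makes me wonder whether the driver is even offering 1920x1080 on that output when it shows up as "analogue display". A sketch like this (\\.\DISPLAY3 is only a guess at the VGA output's name; substitute whatever the listing in my first post shows) would walk the mode list the driver exposes for one output and report whether 1920x1080 is actually in it:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Placeholder - replace with the real device name from the earlier listing */
    const char *dev = "\\\\.\\DISPLAY3";
    DEVMODEA dm;
    DWORD i;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Walk every mode the driver exposes for this output */
    for (i = 0; EnumDisplaySettingsA(dev, i, &dm); i++)
        if (dm.dmPelsWidth == 1920 && dm.dmPelsHeight == 1080)
            printf("1920x1080 offered at %luHz (%lu-bit)\n",
                   dm.dmDisplayFrequency, dm.dmBitsPerPel);
    return 0;
}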

If I connect both 2310s by VGA, select whichever one chooses to display at the right resolution, then disconnect the other and reconnect it by DVI, I can occasionally get the second monitor to stay at 1920x1080 over VGA while the third goes automatically to 1920x1080 over DVI, i.e. exactly what I want. But when I tried saving those settings, the Nvidia profile just reported as corrupted when I reloaded it after a reboot, and I can't be faffed spending half the day randomly pulling cables in and out of different sockets until the magic combination comes along.

So it can't be a hard limitation of the graphics cards, as I can occasionally persuade them to display all three monitors at the correct resolutions at the same time. But it's hit and miss. Surely there must be a way of retaining the resolutions? Or should I cut my losses and get a DisplayPort-to-DVI cable and run all three monitors on DVI? Or buy a third graphics card?
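
One thing I might try before buying more hardware: forcing the mode programmatically and asking Windows to persist it. A rough sketch (again, \\.\DISPLAY3 is just a placeholder for the VGA output's actual device name) using ChangeDisplaySettingsEx with CDS_UPDATEREGISTRY, which is supposed to write the mode to the registry so it survives a reboot:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Placeholder - replace with the VGA output's device name */
    const char *dev = "\\\\.\\DISPLAY3";
    LONG r;
    DEVMODEA dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);
    dm.dmPelsWidth = 1920;
    dm.dmPelsHeight = 1080;
    dm.dmDisplayFrequency = 60;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    /* CDS_UPDATEREGISTRY saves the mode to the registry so it persists */
    r = ChangeDisplaySettingsExA(dev, &dm, NULL, CDS_UPDATEREGISTRY, NULL);
    if (r == DISP_CHANGE_SUCCESSFUL)
        printf("Mode set and saved.\n");
    else
        printf("Driver refused the mode (code %ld).\n", r);
    return 0;
}

No idea yet whether the Nvidia driver honours the saved mode on this card, but it would at least take the control panel profile out of the equation.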
 