GeForce FX5500 won’t display 1680 X 1050
As mentioned briefly in a previous post (http://www.computer-aid.com.au/blog/2008/09/14/geforce-7300gt-sometimes-blanks-for-2-seconds/), I had installed an AGP Galaxy GeForce FX5500 into my PC, but had a few problems with it.
I thought I’d give a few more details, and explain how I fixed it (by coincidence, the solution is the same as the 7300GT problem from my previous post).
Anyway, after installing the video card, I could only get the screen resolution to go as high as 1440 X 900.
If I tried 1680 X 1050, then I’d get something like a virtual 1680 X 1050 desktop on a smaller 1280 X 1024 display. I could scroll around the “larger” desktop, but the monitor would not display the native resolution.
I tried all of the monitor’s OSD settings, but nothing really worked.
None of the Nvidia display settings (in the XP control panel) worked either.
It seems the DVI output on the FX5500 is rather limited… apparently the maximum DVI resolution is something like 1600 X 1200 (while the analogue VGA output can go up to 2048 X 1536).
So I connected the analogue VGA cable, and the card can now display 1680 X 1050.
Grrr.
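By the way, if you’re curious whether the driver is even offering 1680 X 1050 on the connection you happen to be using, you can list every mode it exposes with the standard Win32 EnumDisplaySettings call. This is just a rough sketch of my own, not anything official; compile it with any Windows C compiler and compare the output you get over DVI with the output over VGA:

    /* Rough sketch: list every display mode the current driver exposes.
     * Compare the list with the card on DVI versus VGA. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DEVMODE dm;
        DWORD i;

        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize = sizeof(dm);

        /* NULL = current display device; the index counts up through its mode list */
        for (i = 0; EnumDisplaySettings(NULL, i, &dm); ++i) {
            printf("%4lu x %4lu   %2lu-bit   %lu Hz\n",
                   dm.dmPelsWidth, dm.dmPelsHeight,
                   dm.dmBitsPerPel, dm.dmDisplayFrequency);
        }
        return 0;
    }

If 1680 X 1050 never shows up in the DVI list, the standard display settings simply have nothing to offer you.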
Hi there,
I tried posting at http://www.computer-aid.com.au/blog/2008/09/17/geforce-fx5500-wont-display-1680-x-1050/
However, I get an error message about a temporary server problem, so I thought I’d try posting here.
That’s an odd situation. I’m not sure if you tried this or not, but a utility called Powerstrip gives you the ability to tweak your GPU endlessly. Of course, leaving it attached to a VGA cable would be the easiest solution, but the result would be intriguing to know. John
Hi John D,
Yep, something’s weird with my hosting (the perils of shared hosting, I suppose).
I used PowerStrip a long time ago, but stopped using it when I stuffed up my CRT monitor with a wrong setting 🙁
Anyway, I ain’t gonna bother trying DVI until my next upgrade to a PCI-e system… then I’ll look at getting an even higher-end graphics card which might actually be able to handle DVI.
Oh dear, stuffing up a CRT monitor is not good. Yeah, most GPUs today will handle 1680 X 1050 through DVI no problem. Well, at least the PC I built that had a 640MB 8800GTS didn’t give me those problems.
All,
I have the same card (Nvidia GeForce FX5500), and have just upgraded from a CRT to the Dell SP2208WFP (a 22″ LCD with a native 1680×1050 resolution).
Like “Computer Help”, it will work over VGA but not DVI. This is extremely frustrating. I did some research online and discovered that Nvidia cards have had this exact problem dating back to 2004 (that is the date of a forum post I found describing the same issue, obviously at a lower resolution). Apparently the Nvidia cards do not communicate properly with some flat panel monitors’ EDID data. The result is a 4:3 aspect ratio image being stretched (overscanned) beyond the boundaries of the monitor’s display area.
The author back then recommended bypassing the EDID by going into the create-custom-resolution function and forcing a resolution with true 16:10 settings, rather than letting the card simply stretch the 5:4 1280×1024 setting that it defaults to.
I have tried this numerous times, and the closest I get is the opposite of what I have now: actual 1680×1050 resolution, but scaled into a 4:3 area in the centre of the screen (i.e. black bars on either side).
Bottom line: I will never buy Nvidia rubbish again. The ATI cards apparently do not have this problem, according to the research I have done online. I realise that this is compiled by other self-taught novices like myself, but when the suppliers of these products refuse to help, what other option do we have?
I hope this post helps others.
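For what it’s worth, you can also check what the panel’s EDID is actually advertising over DVI. On XP the raw 128-byte EDID block can be exported from the registry (it sits under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY\…\Device Parameters\EDID as a binary value), and the native mode lives in the first detailed timing descriptor. Here’s a rough C sketch, assuming you paste your own dump into the array (the zeroed bytes below are only a placeholder):

    /* Rough sketch: pull the preferred/native mode out of an EDID dump.
     * The first 18-byte detailed timing descriptor starts at byte 54. */
    #include <stdio.h>

    static void print_native_mode(const unsigned char edid[128])
    {
        const unsigned char *dtd = edid + 54;                  /* first detailed timing descriptor */
        unsigned pixclock_khz = (dtd[0] | (dtd[1] << 8)) * 10; /* stored in 10 kHz units */
        unsigned hactive = dtd[2] | ((dtd[4] & 0xF0) << 4);    /* low 8 bits + high 4 bits */
        unsigned vactive = dtd[5] | ((dtd[7] & 0xF0) << 4);

        if (pixclock_khz == 0) {
            printf("First descriptor holds no timing data.\n");
            return;
        }
        printf("Panel's preferred mode: %u x %u (pixel clock %u kHz)\n",
               hactive, vactive, pixclock_khz);
    }

    int main(void)
    {
        unsigned char edid[128] = {0};   /* placeholder: paste your 128-byte EDID dump here */
        print_native_mode(edid);
        return 0;
    }

If the panel really is reporting 1680×1050 as its preferred mode and the card still stretches a 1280×1024 picture, that points the finger at the card and its driver rather than the monitor.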
What a fantastic posting! Thank you to all of you. I have the Galaxy GeForce FX5500 nVidia AGP card also and could not get it to run my LG 21.5″ 1920×1024 screen in any mode but 1280×1024 or lower without the same stretched, off-the-screen result described above. I have been searching the internet everywhere to find out whether the problem was with the screen, the drivers, the card or some other setting. No luck, though plenty of people are asking the same question and very knowing people are giving the wrong answers.
I changed cables from DVI to analogue VGA as suggested and was able to set the resolution in the nVidia manager to 1920 by 1024 with no problems. It looks great. I am not sure whether there is any advantage in a DVI digital connection over analogue, but at least I know now that I would need an ATI card or a later nVidia card if I wanted to go digital.