

GeForce FX5500 won’t display 1680 X 1050 — 6 Comments

  1. Hi there,

    I tried posting at http://www.computer-aid.com.au/blog/2008/09/17/geforce-fx5500-wont-display-1680-x-1050/

    However, I get an error message about a temporary server problem, so I thought I’d try posting here.

    That’s an odd situation. I’m not sure if you’ve tried this or not, but a utility called PowerStrip gives you the ability to tweak your GPU endlessly. Of course, leaving it attached to a VGA cable would be the easiest solution, but it would be intriguing to know the result. John

  2. Hi John D,

    Yep, something’s weird with my hosting (the perils of shared hosting, I suppose).

    I used PowerStrip a long time ago, but stopped using it when I stuffed up my CRT monitor with a wrong setting 🙁

    Anyway, I ain’t gonna bother trying DVI until my next upgrade to a PCI-E system… then I’ll look at getting an even higher-end graphics card which might actually be able to handle DVI.

  3. Oh dear, stuffing up a CRT monitor is not good. Yeah, most GPUs today will handle 1680 x 1050 through DVI no problem. Well, at least the PC I built with a 640MB 8800GTS didn’t give me those problems.

  4. All,

    I have the same card (Nvidia GeForce FX5500), and have just upgraded from a CRT to the Dell SP2208WFP (a 22″ LCD with a native resolution of 1680×1050).

    Like “Computer Help”, it will work on VGA but not DVI. This is extremely frustrating. I did some research online and discovered that Nvidia cards have had this exact problem dating back to 2004 (that is the date of a forum post I found describing the same problem, obviously at a lower resolution). Apparently the Nvidia cards do not communicate properly with some flat-panel monitors’ EDID data (the first sketch at the end of this thread shows the sort of native-mode information the EDID handshake is supposed to report). The result is a 4:3 aspect ratio image being stretched (overscan) beyond the edges of the monitor’s display area.

    The author back then recommended bypassing the EDID by going into the “create custom resolution” function and forcing a resolution with true 16:10 settings, rather than letting the card simply stretch the 5:4 1280×1024 mode it defaults to (the second sketch at the end of the thread shows the kind of timing numbers such a custom mode normally uses).
    I have tried this numerous times, and the closest I get is the opposite of what I have now: an actual 1680×1050 resolution, but scaled into a 4:3 box in the centre of the screen (i.e. black bars on either side).

    Bottom line: I will never buy Nvidia rubbish again. According to the research I have done online, the ATI cards apparently do not have this problem. I realise that this is compiled by other self-taught novices like myself, but when the suppliers of these products refuse to help, what other option do we have?

    I hope this post helps others.

  5. What a fantastic post! Thank you to all of you. I also have the Galaxy nVidia GeForce FX5500 AGP card and could not get it to run my LG 21.5″ 1920×1024 screen in any mode but 1280×1024 or lower without the same stretched, off-the-screen result described above. I have been searching the internet everywhere to find out whether the problem was with the screen, the drivers, the card or some other setting. No luck, though there were lots of people asking the same question and plenty of very knowing people giving the wrong answers.

    I changed cables from DVI to analogue VGA as suggested and was able to set the resolution in the nVidia manager to 1920 by 1024 with no problems. It looks great. I am not sure whether there is any advantage to a DVI digital connection over analogue, but at least I now know that I would need an ATI card or a later nVidia card if I wanted to go digital.
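
A note on the EDID issue raised in comment 4: over DVI the card is supposed to read the panel’s EDID block and take its native mode from the first detailed timing descriptor. The snippet below is only a rough sketch of that step, written in Python against a raw 128-byte EDID dump; the `monitor.edid` filename is a placeholder, not a file mentioned anywhere above. It simply shows the kind of native-mode information the FX5500 apparently fails to act on.

```python
# Rough sketch only: decode the native (preferred) mode from a raw EDID dump.
# Assumes a 128-byte EDID block saved to a file; "monitor.edid" is a
# placeholder name, not something taken from this thread.

def native_mode_from_edid(edid: bytes) -> tuple[int, int]:
    """Return (width, height) from the first detailed timing descriptor."""
    if len(edid) < 128 or edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not a valid EDID block")
    d = edid[54:72]                            # first 18-byte descriptor
    if d[0] == 0 and d[1] == 0:                # pixel clock of 0 = not a timing
        raise ValueError("no detailed timing descriptor in slot 1")
    h_active = d[2] | ((d[4] & 0xF0) << 4)     # horizontal active pixels
    v_active = d[5] | ((d[7] & 0xF0) << 4)     # vertical active lines
    return h_active, v_active

if __name__ == "__main__":
    with open("monitor.edid", "rb") as f:      # placeholder dump file
        width, height = native_mode_from_edid(f.read())
    print(f"Panel reports a native mode of {width}x{height}")  # e.g. 1680x1050
```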
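
On the “create custom resolution” workaround itself: those dialogs are essentially asking for a full set of mode timings rather than just a width and height. The values below are the standard CVT timing for 1680×1050 at 60 Hz as produced by the common `cvt` calculation, shown purely to illustrate what the fields mean; they are not taken from the posts above, so check them against your own panel’s documentation before applying anything.

```python
# Standard CVT timing for 1680x1050 @ 60 Hz (non-reduced-blanking),
# included only to illustrate what a "custom resolution" dialog asks for.
# Verify against your own monitor's specifications before applying.
cvt_1680x1050_60 = {
    "pixel_clock_mhz": 146.25,  # dot clock
    "h_active": 1680,           # visible pixels per line
    "h_sync_start": 1784,       # end of horizontal front porch
    "h_sync_end": 1960,         # end of horizontal sync pulse
    "h_total": 2240,            # total pixels per line, including blanking
    "v_active": 1050,           # visible lines per frame
    "v_sync_start": 1053,       # end of vertical front porch
    "v_sync_end": 1059,         # end of vertical sync pulse
    "v_total": 1089,            # total lines per frame, including blanking
    "h_sync_polarity": "-",     # negative horizontal sync
    "v_sync_polarity": "+",     # positive vertical sync
}

# Sanity check: refresh rate = pixel clock / (h_total * v_total)
refresh_hz = cvt_1680x1050_60["pixel_clock_mhz"] * 1e6 / (2240 * 1089)
print(f"Refresh rate ≈ {refresh_hz:.2f} Hz")  # ≈ 59.95 Hz
```

If the card accepts a mode like this directly, the panel should receive a true 16:10 image instead of a stretched 5:4 one.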