New nVidia drivers & NWN
Moderator: Event DM
- Manuel the White
- Team Member; Retired with Honors
- Posts: 7567
- Joined: Wed Mar 05, 2003 6:45 pm
- Timezone: CST
- DM Avatar: Ra-Ghul
New nVidia drivers & NWN
This from the Vault:
New & Faster NVIDIA Drivers
NVIDIA has released new ForceWare drivers (Release 55) for their GeForce brand of graphics cards. According to the release, it "includes many new features, tools and enhancements, including customised application and game profile settings, and support for PCI Express." Most interestingly, they claim GPU performance increases of as much as 48% specifically in NWN.
Here's the excerpt:
Thanks to the NVIDIA unified driver architecture (UDA), NVIDIA ForceWare Release 55 delivers a free performance upgrade for NVIDIA GPUs, both past and present. With the new driver, GeForce FX 5900 XT GPU performance in F1 Challenge '99-'02 can increase as much as 120 per cent. In Gunmetal, GeForce FX 5200 GPU performance increases up to 22 per cent, and GeForce FX 5950 Ultra GPU performance in Neverwinter Nights goes up as much as 48 per cent.
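To put those marketing numbers in perspective, here's a quick illustrative calculation of what the claimed "up to" gains would mean for frame rates. The 30 fps baseline is a made-up figure for the sake of the example, not a measured benchmark:

```python
# Illustrative only: projects frame rates from NVIDIA's claimed "up to"
# percentage gains. The baseline fps is hypothetical; real results vary.
claimed_gains = {
    "F1 Challenge '99-'02 (FX 5900 XT)": 1.20,
    "Gunmetal (FX 5200)": 0.22,
    "Neverwinter Nights (FX 5950 Ultra)": 0.48,
}

baseline_fps = 30.0  # hypothetical starting frame rate

for title, gain in claimed_gains.items():
    projected = baseline_fps * (1 + gain)
    print(f"{title}: {baseline_fps:.0f} -> {projected:.1f} fps")
```

So even taking the press release at face value, a 48% gain would only turn 30 fps into about 44 fps, and only on that one card.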
- Lafferty
- Scholar
- Posts: 1129
- Joined: Tue Feb 11, 2003 5:08 pm
- Location: look at my hands... they are HUGE. And they cant touch themselves...
- Contact:
Is the performance gain for NWN the same on other NVIDIA GPUs (e.g. the GeForce4 Ti)?
Tool for crafters Do you want some human to your salt? nomanisanisland.monolar.de
- FunkOdyssey
- Sage
- Posts: 1954
- Joined: Wed Jan 22, 2003 2:46 pm
- Location: Newington, CT (GMT -5)
- Contact:
- Brock Fanning
- Apprentice Scholar
- Posts: 720
- Joined: Tue Mar 04, 2003 6:19 pm
- Location: Baltimore
- Contact:
- Uvatha
- Apprentice Scholar
- Posts: 584
- Joined: Tue May 27, 2003 8:57 pm
- Location: Valencia, Spain (GMT+2)
Re: New nVidia drivers & NWN
Manuel the White wrote:... and GeForce FX 5950 Ultra GPU performance in Neverwinter Nights goes up as much as 48 per cent.
Seems it's only for the new cards, not the old ones.
"The destiny of every person is already written. Every decision, even little, is a piece of a pre-arranged drawing."
- Lafferty
- Scholar
- Posts: 1129
- Joined: Tue Feb 11, 2003 5:08 pm
- Location: look at my hands... they are HUGE. And they cant touch themselves...
- Contact:
Re: New nVidia drivers & NWN
Manuel the White wrote:This from the Vault:
NVIDIA ForceWare Release 55 delivers a free performance upgrade for NVIDIA GPUs, both past and present.
I asked because I assume the 48% statement is verified for that particular GPU simply because Maximus of NWVault tested it. I suppose.
Tool for crafters Do you want some human to your salt? nomanisanisland.monolar.de
- Manuel the White
- Team Member; Retired with Honors
- Posts: 7567
- Joined: Wed Mar 05, 2003 6:45 pm
- Timezone: CST
- DM Avatar: Ra-Ghul
People actually own the new nVidia cards?
... why?
<Sili> I've seen septic tanks with less shit in them than Fuzz.
<Ronnin> damm not even a kiss??
<Chasmania> Kiss Fuzz? I'd rather fellate a goat.
<Chasmania> there are many roads to Rome..they just picked a shit filled alley full of scabby hookers and bums.
The shape of things to come...
- Themicles
- CoPaP Ambassador
- Posts: 2673
- Joined: Wed Jan 29, 2003 10:45 pm
- Location: Wolverine Lake, MI
- Contact:
Fuzz wrote:People actually own the new nVidia cards?
... why?
Because it only cost me $50 for my 5900 XT. (Long story.)
And IT WORKS.
Not a single bug in any game I play, unlike my stepfather's ATI cards.
-Themicles
A wise man does not dwell on his past. He learns from it, he grows from it, and then moves ahead into his future.
And some wise words from a wise man.

Orleron wrote:You have to excuse Themi. Tact, diplomacy, and softness are not his best traits, but he does not mean anything by his writing. He's a nice guy. You just get used to it after a while because he doesn't seem to learn.
- FunkOdyssey
- Sage
- Posts: 1954
- Joined: Wed Jan 22, 2003 2:46 pm
- Location: Newington, CT (GMT -5)
- Contact:
w00t! "ForceWare Release 55 delivers awesome new features which are only available on NVIDIA GPUs," said Dan Vivoli, executive vice president of marketing at NVIDIA. "Along with these new features, every customer using an NVIDIA-based graphics card manufactured in the last six years (all the way back to the NVIDIA TNT) will get another free performance upgrade simply by installing this software."
-
- Squire of Babble
- Posts: 39
- Joined: Thu Jul 24, 2003 5:49 pm
- Location: San Clemente, CA
-
- Prince of Bloated Discourse
- Posts: 183
- Joined: Mon Mar 15, 2004 7:04 am
- Location: S. NJ, USA (GMT-5/-4DST)
- Contact:
What I'd like to know is how badly they screwed up originally to get that much extra performance out of their hardware. If they are doing this now, why not when the card was released?
I can think of only two things: 1) the driver lets the card work beyond the original factory-approved conditions (like overclocking, but not quite as bad), or 2) it pushes some of the GPU work back to the CPU, which shouldn't be a big deal unless your system gets taxed easily like mine does.
- Themicles
- CoPaP Ambassador
- Posts: 2673
- Joined: Wed Jan 29, 2003 10:45 pm
- Location: Wolverine Lake, MI
- Contact:
Kreetogg wrote:What I'd like to know is how badly they screwed up originally to get that much extra performance out of their hardware. If they are doing this now, why not when the card was released?
I can think of only two things: 1) the driver lets the card work beyond the original factory-approved conditions (like overclocking, but not quite as bad), or 2) it pushes some of the GPU work back to the CPU, which shouldn't be a big deal unless your system gets taxed easily like mine does.
It's not an issue of screw-ups.
It's an issue of optimization.
I mean, think about it. Their driver architecture is damn near universal.
Inherently, that won't be 100% optimized. Software advances over time.
Would you call the first version of Linux a screw-up just because today's versions are better?
When AGP 8x first came out, it was known that most video cards didn't use the interface to its full ability. Perhaps the optimizations are related to that?
Or perhaps they learned something while working on the PCI Express standard? nVidia is one of the many companies involved in that.
No one company has infinite knowledge. When new knowledge is gained, there are bound to be improvements.
I'm sure they could squeeze out even more performance if they were willing to make a buggy release.
-Themicles
A wise man does not dwell on his past. He learns from it, he grows from it, and then moves ahead into his future.
And some wise words from a wise man.

Orleron wrote:You have to excuse Themi. Tact, diplomacy, and softness are not his best traits, but he does not mean anything by his writing. He's a nice guy. You just get used to it after a while because he doesn't seem to learn.
-
- Prince of Bloated Discourse
- Posts: 183
- Joined: Mon Mar 15, 2004 7:04 am
- Location: S. NJ, USA (GMT-5/-4DST)
- Contact:
Themicles wrote:It's not an issue of screw-ups.
It's an issue of optimization.
I mean, think about it. Their driver architecture is damn near universal.
Inherently, that won't be 100% optimized. Software advances over time.
Optimizations would certainly be part of it. I could see how one would get a few percent increase in performance that way, but not those numbers.
If they used a one-size-fits-all driver architecture that was previously preventing them from leveraging the full power of their GPUs, "screw up" wouldn't be the fitting words, but lazy, cheap, etc. certainly would be.
One could parallel this situation to that of your average car. Most cars can be tweaked (adjusting timing and values from the on-board computer, mostly) to get a few extra HP out of the engine. This is without adding or replacing hardware. The performance gain is in the neighborhood of 4-6% or so from what I've seen. But something like 22%-120% (which could be something like 20-300 horsepower) is unheard of.
If there really is that much room for improvement, they pretty much did screw up in the beginning. They didn't botch anything, but they certainly did fail from many points of view.
Then one also has to consider the past practices of NVIDIA (and ATI, too). Both vendors (NVIDIA more than ATI, if I remember correctly) included special code in their drivers that would artificially inflate critical benchmark stats. They did this by dropping quality. The numbers looked good, but the rendered scenes did not.
- Themicles
- CoPaP Ambassador
- Posts: 2673
- Joined: Wed Jan 29, 2003 10:45 pm
- Location: Wolverine Lake, MI
- Contact:
My rendered scenes, in benchmarks and games, looked great before, and look great now.
You really need to read their data, though.
Many of the increases are NOT OVERALL.
They are primarily game-specific.
I got a good boost in performance with my 5900 XT in America's Army (Unreal Tournament 2003 engine), but almost NO increase in performance in NWN.
The "one size fits all" approach, as you make it sound, is not about laziness.
It's about supporting all cards from a certain point on. This is for user convenience, but also... Do you see any other company actively updating drivers for cards as old as the TNT2? The unified architecture makes that possible.
Why do people have to take something like this and continually bash a company?
Do you think BioWare is a shithole too?... when they've done more than just about any other gaming studio to date.
I see mindless bashing of companies that work hard.
If you're an ATI fanboy trying to bash nVidia, then take it to the ATI forums on any number of hardware sites.
People have preferences.
I admit, ATI's cards are fast. But they also tend to be buggy from game to game.
I buy nVidia because I have yet to install an nVidia driver that caused problems in any game I play.
And I play a lot.
Out of 160 total gigs of storage on my computer, 120 gigs is games.
Am I an nVidia fanboy? No. I'm just sick of people bashing companies or brands.
I'll be the first one to buy a powerful ATI card when they get their drivers consistently straight.
-Themicles
A wise man does not dwell on his past. He learns from it, he grows from it, and then moves ahead into his future.
And some wise words from a wise man.

Orleron wrote:You have to excuse Themi. Tact, diplomacy, and softness are not his best traits, but he does not mean anything by his writing. He's a nice guy. You just get used to it after a while because he doesn't seem to learn.
- tindertwiggy
- Legacy DM
- Posts: 6905
- Joined: Tue Jul 16, 2002 12:20 am
- Location: Newish England
- Contact:
- Themicles
- CoPaP Ambassador
- Posts: 2673
- Joined: Wed Jan 29, 2003 10:45 pm
- Location: Wolverine Lake, MI
- Contact:
My monitor does that with any video card.
It does it whenever there's a resolution change, and then kicks back in.
Check your refresh rate overrides and set them all to default.
It might be trying to use a refresh rate that your monitor doesn't support.
-Themicles
A wise man does not dwell on his past. He learns from it, he grows from it, and then moves ahead into his future.
And some wise words from a wise man.

Orleron wrote:You have to excuse Themi. Tact, diplomacy, and softness are not his best traits, but he does not mean anything by his writing. He's a nice guy. You just get used to it after a while because he doesn't seem to learn.
-
- Prince of Bloated Discourse
- Posts: 183
- Joined: Mon Mar 15, 2004 7:04 am
- Location: S. NJ, USA (GMT-5/-4DST)
- Contact:
I completely understand the overall thing. Some games rely on the graphics card more than others. NWN has a whole lot of graphics features that rely almost exclusively on the host system, not the graphics card.
One size fits all IS laziness, or cheapness, or one of several other things. NVIDIA COULD maintain independent drivers for a lot of their cards, and users would probably benefit greatly from an approach like that. How is it user convenience? Unless users keep swapping out one line of NVIDIA cards for another, recycling the same driver, I don't see any convenience.
First, I don't think NVIDIA is a shithole, so I can't "think BioWare is a shithole too". BioWare has neglected to do some things they really should have done, but that is a different discussion.
You misunderstand my intentions in this thread. I'm no ATI fanboy, and I have nothing against NVIDIA either. I was not suggesting anyone use one product over another. However, I am a programmer. When I see really stupid programming issues, or what I suspect to be really stupid programming issues, I comment and discuss. That is the point of discussion boards, is it not?
Would you want a 42-inch 16:9 HDTV in your house that could only display a 24-inch diagonal image at an aspect ratio of 4:3?
- tindertwiggy
- Legacy DM
- Posts: 6905
- Joined: Tue Jul 16, 2002 12:20 am
- Location: Newish England
- Contact:
- Dirk Cutlass
- Elder Sage
- Posts: 4691
- Joined: Mon Jan 27, 2003 9:42 am
- Location: GMT
Well, I tried it. No detectable effect on my old GeForce video card. I guess if I were clever (and less impatient) I would have checked the fps before and after to see if it made any improvement! Maybe I'll wind back to the old version and try it out.
Even if there is no improvement, I always like to keep drivers up to date, just in case they've fixed any bugs or something.
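The before/after comparison described above only needs an average fps over a fixed interval. Here's a minimal sketch of that measurement; the workload is a stand-in, since NWN itself would be measured externally (e.g. with a frame-counting overlay like FRAPS) before and after the driver update:

```python
import time

def average_fps(render_frame, seconds=1.0):
    """Call render_frame repeatedly for `seconds` and return the mean fps."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < seconds:
        render_frame()
        frames += 1
    elapsed = time.perf_counter() - start
    return frames / elapsed

# Stand-in workload in place of a real frame render. To evaluate a driver
# update, record this number before updating, update, and record it again.
fps = average_fps(lambda: sum(range(1000)), seconds=0.2)
print(f"average fps: {fps:.1f}")
```

Comparing the two averages (ideally over the same scene and settings) is what would confirm or refute a claimed percentage gain.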