
44.03 Nvidia Dets

Started by RichardWhitely, May 16, 2003, 02:12:51 PM


RichardWhitely

Heard some good things about these, dunno if they're true or not as I'm just about to give them a go myself :D (They're WHQL approved as well, but that usually means sod all as some WHQL drivers are total pants 8O).

Grab them from here:- http://download.guru3d.com/detonator/

Don't forget to uninstall the old drivers first ;-) (use the Add/Remove Programs thingy in Control Panel)
I play games on a:
ZX Spectrum 48K | Grundig C410 cassette recorder (adjustable head) | 20BT TV Philips Multistandard Color V37cm | ZX Interface 2 | New Kempston Compatible Competition Pro Switched Joystick | Sinclair BASIC OS

Undergrid

You might want to read this article at slashdot and the original report at Extreme Tech before benchmarking the 44.03 drivers.
Quote: In the force if Yoda's so strong, construct a sentence with words in the proper order then why can't he?

powered_by_guinness

Ooohoo, caught with their pants down? That's dirty play :!:

Not likely to make much difference on my uber elite GF2 Ultra tho ;)

RichardWhitely

And from http://www.hardocp.com/

Quote: 3DMark Invalid?
Two days after Extremetech was not given the opportunity to benchmark DOOM3, they come out swinging with heavy charges of NVIDIA intentionally inflating benchmark scores in 3DMark03. What is interesting here is that Extremetech uses tools not at NVIDIA's disposal to uncover the reason behind the score inflations. These tools are not "given" to NVIDIA anymore as they will not pay the tens of thousands of dollars required to be on the "beta program" for 3DMark "membership".


nVidia believes that the GeForceFX 5900 Ultra is trying to do intelligent culling and clipping to reduce its rendering workload, but that the code may be performing some incorrect operations. Because nVidia is not currently a member of FutureMark's beta program, it does not have access to the developer version of 3DMark2003 that we used to uncover these issues.

I am pretty sure you will see many uninformed sites jumping on the news reporting bandwagon today with "NVIDIA Cheating" headlines. Give me a moment to hit this from a different angle.

First off, it is heavily rumored that Extremetech is very upset with NVIDIA at the moment, as they were excluded from the DOOM3 benchmarks on Monday, and a bit of angst might have precipitated the article at ET, as I was told about their research a while ago. They have made this statement:


We believe nVidia may be unfairly reducing the benchmark workload to increase its score on 3DMark2003. nVidia, as we've stated above, is attributing what we found to a bug in their driver.

Finding a driver bug is one thing, but concluding motive is another.

Conversely, our own Brent Justice found an NVIDIA driver bug last week using our UT2K3 benchmark that slanted the scores heavily towards ATI. Are we to conclude that NVIDIA was unfairly increasing the workload to decrease its UT2K3 score? I have a feeling that ET has some motives of their own that might make a good story.

Please don't misunderstand me. ET has done some good work here. I am not in a position to conclude motive in their actions, but one thing is for sure.

3DMark03 scores generated by the game demos are far from valid in our opinion. Our reviewers have now been instructed not to use any of the 3DMark03 game demos in card evaluations, as those are the sections of the test that would be focused on for optimizations. I think this just goes a bit further in showing how worthless the 3DMark bulk score really is.

The first thing that came to mind when I heard about this was to wonder whether NVIDIA was doing it on purpose to invalidate the 3DMark03 scores by showing how easily they could be manipulated.

Thanks for reading our thoughts; I wanted to share a bit different angle than all those guys that will be sharing their in-depth "NVIDIA CHEATING" posts. While our thoughts on this will surely upset some of you, especially the fanATIics, I hope that it will at least let you look at a clouded issue from a different perspective.

Further on the topic of benchmarks: we addressed them earlier this year, which you might find to be an interesting read.

We have also shared the following documentation with ATI and NVIDIA while working with both of them to hopefully start getting better and more in-game benchmarking tools. Please feel free to take the documentation below and use it as you see fit. If you need a Word document, please drop me a mail and let me know what you are trying to do.


Benchmarking Benefiting Gamers

Objective: To gain reliable benchmarking and image quality tools placed in upcoming retail games and demos, thus allowing for more valuable hardware analysis. This in turn should impact the games' sales through "free" publicity that will reach millions of advanced computer owners. Also, better driver performance, hardware performance, and compatibility should be realized through this, as the major GPU/VPU companies will give games and demos with benchmarks more technical attention.

Please see our editorial at this link for more perspective on how we should be "benchmarking right".

The Basics: id Software has outlined the "Benchmarking Must Haves" in their previous tools very well. Their tool set has everything hardware analysts absolutely need but none of the things that "normal" gamers need in order to easily use the tool set. Making this tool set easy to use with a GUI interface has been successful for many companies in getting their games noticed, although a GUI is not "needed".

I have tried to convey my thoughts as quickly as possible so as not to make this a novel. Please do not hesitate to send any questions to: Kyle@HardOCP.com.


Benchmark Must Haves:

1. A "timedemo" scenario running a built in demo (not using bots or other non-static AI) that gives us an average Frames Per Second score.

2. The ability to change AA, AF, Resolution and any other Quality Settings important to the particular gaming experience. I think it is important that we be able to set different levels of AA and Anisotropic Filtering forced by the game, as the drivers are getting a bit ambiguous as to what they do nowadays. This helps facilitate "apples to apples" benchmarking across different manufacturers' cards.

3. The ability to turn off or override any video card autodetection done by the gaming engine that sets image quality levels on the fly. Case in point: Serious Sam requires us to run custom scripts to set filtering levels, as it detects the video card being used and auto-initiates settings that best fit the user's system. This makes it difficult for end users to get comparable scores from gamer to gamer.

4. Ability to take screen shots from identical frames in the demo in .tif or .bmp formats. This would allow for much more proficient evaluation of image quality as this would remove many variables and points of failure when having to take screen shots manually.

5. Ability to toggle sound on and sound off.
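To make "must have" 1 concrete, here is a minimal sketch of how a timedemo harness could produce the single average-FPS number we are after. It is an illustration only: runDemoFrame() is a hypothetical engine hook (stubbed here with a sleep so the program actually runs), not part of any real engine.

#include <chrono>
#include <cstdio>
#include <thread>

// Hypothetical engine hook: render the next pre-recorded demo frame.
// Returns false once the demo ends. Stubbed with a sleep for illustration.
static bool runDemoFrame() {
    static int frame = 0;
    std::this_thread::sleep_for(std::chrono::milliseconds(5)); // fake "render"
    return ++frame < 500; // pretend the demo is 500 frames long
}

int main() {
    using Clock = std::chrono::steady_clock;
    long frames = 0;
    const auto start = Clock::now();
    while (runDemoFrame()) // fixed demo playback: no bots, no non-static AI
        ++frames;
    const std::chrono::duration<double> sec = Clock::now() - start;
    std::printf("timedemo: %ld frames, %.2f s, %.1f average fps\n",
                frames, sec.count(), frames / sec.count());
}

The point is simply total frames divided by total wall-clock time over a fixed, repeatable demo; anything dynamic (bots, random AI) would make runs incomparable.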

Benchmark Wants:

There are certainly some items here that are specific to the content developers and others to the engine programmers. I would think some of these could be much more easily implemented by the content creator if a bit of forethought by the programmer was applied. (Asterisk indicates more importance)

1. ** GUI control for all benchmarking options. This of course is a big draw as ease of use will facilitate more people using your technology and downloading it for their own use in evaluating their current hardware.

2. A setting to drastically increase or decrease polygons to illustrate any fill-rate handicap on a card and how it might affect overall system performance in the engine. Possibly the ability to turn LOD off?

3. ** Ability to force Pixel Shaders to a lower subset. (I am not sure this is possible, but it seems that most GameDevs would be coding for this to give them a larger install base.)

4. Demos specific to Lighting techniques, PS and VS instructions.

5. ** Detailed Frame Rate Reports. Ideally the engine would generate a .txt log of average frames per second on a per-second interval (see the sketch after this list). This gives us the ability to look at highs and lows as well as median scores.

6. Detailed Reports for polygons drawn, texture sizes, and PS/VSs being used. It would be great to be able to see per-second data on this as well.

7. Ability to record our own demos to be used for benchmarking.

8. ** Filtering identification tools like "r_colormiplevels" in Quake3 and Serious Sam.
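On "want" 5, a rough sketch of the per-second frame-rate log we have in mind follows; all names (including the fps_log.txt file name) are hypothetical, and the render loop is faked with a sleep so the example is self-contained. Each second's frame count is appended to a .txt file, one number per line, so highs, lows and medians can be pulled out afterwards.

#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using Clock = std::chrono::steady_clock;
    std::FILE* log = std::fopen("fps_log.txt", "w"); // the per-second report
    if (!log) return 1;                              // couldn't create the file
    int framesThisSecond = 0;
    auto windowStart = Clock::now();

    for (int i = 0; i < 1000; ++i) {                 // stand-in render loop
        std::this_thread::sleep_for(std::chrono::milliseconds(4)); // "render"
        ++framesThisSecond;
        if (Clock::now() - windowStart >= std::chrono::seconds(1)) {
            std::fprintf(log, "%d\n", framesThisSecond); // one sample per line
            framesThisSecond = 0;
            windowStart += std::chrono::seconds(1);  // keep 1s windows aligned
        }
    }
    std::fclose(log);
}

A log like this is what lets a reviewer report minimum and median frame rates instead of just the single average a timedemo spits out.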


Thanks for your time and for reviewing our thoughts on better benchmarking. Surely these tools could help out the hardware evaluation community. The bottom line is that we want gamers to have the best experience possible, and that means for them to be able to reach a high level of immersion in the virtual worlds you create.

A gaming experience that delivers more impact will be good for the gaming and hardware industries as a whole. We think tools of this nature will help us help the gamers have an overall better experience.
I play games on a:
ZX Spectrum 48K | Grundig C410 cassette recorder (adjustable head) | 20BT TV Philips Multistandard Color V37cm | ZX Interface 2 | New Kempston Compatible Competition Pro Switched Joystick | Sinclair BASIC OS

Undergrid

Also, just to note, the WHQL certification only applies when the drivers are used with a card based on the FX chipset, not with any other card. Check out the small print (note the * next to "WHQL certified") at nVidia's download page.
Quote: In the force if Yoda's so strong, construct a sentence with words in the proper order then why can't he?

icky

Well, I've downgraded to ME and I've got Windows-certified drivers, which are 44.03.

RichardWhitely

I play games on a:
ZX Spectrum 48K | Grundig C410 cassette recorder (adjustable head) | 20BT TV Philips Multistandard Color V37cm | ZX Interface 2 | New Kempston Compatible Competition Pro Switched Joystick | Sinclair BASIC OS

Undergrid

Wonder how much nVidia paid them  :roll:

Edit:

Quote
"In the light of this, FutureMark now states that Nvidia's driver design is an application specific optimisation and not a cheat."

Different name, same meaning.

Why the heck would any company put "application specific" optimisations into a driver? And the only reason for doing it with 3DMark03 is to get higher scores, which to me is the same as cheating.


Edit 2:

Quote
Nvidia may now re-join FutureMark's beta programme, which it quit earlier this year.

There ya go, loads of money.
Quote: In the force if Yoda's so strong, construct a sentence with words in the proper order then why can't he?

P-Dub

Seems UG has become quite the fanATIc since his move to the dark side........ wonder how much ATI are paying him to say all these nice things about their POS drivers and POS cards......

Bend thou'st knees for the true king, the NV35 shalt be upon us soon, fanATIc's should feel ph34r for FX5900 shalt take no prisoners......

Wolfey

Calibrax

Typical Nvidia cheating bar stewards...

ATI rocks, and doesn't cheat on scores. And they don't leak dodgy drivers left, right and centre either. :)

And Wolverine, when my hearing finally goes, then I will buy a GeForce FX. Apparently the fan on it is louder than a Boeing 747 at takeoff. And it still ain't nowhere near as good as a 9700 Pro when using 16x FSAA.

/me puts on flame-proof suit  :twisted:

Steve  :D  :D


|fury|

I thought ATI had got caught cheating not too long ago? :D

Anyhow... anyone who puts too much faith in these benchmark apps deserves a POS card :D

Hmm.. FSAA.. still never tried that.. 1600x1200 looks reasonable enough without it!

Undergrid

Quote from: Wolverine
Seems UG has become quite the fanATIc since his move to the dark side........ wonder how much ATI are paying him to say all these nice things about their POS drivers and POS cards......

Bend thou'st knees for the true king, the NV35 shalt be upon us soon, fanATIc's should feel ph34r for FX5900 shalt take no prisoners......

Wolfey

Sorry, what's that? Couldn't quite hear you over that industrial hoover you seem to be using...

Anyway, where did I say anything in this thread about ATI? And as for POS card and drivers, I've not had a single problem with my card (apart from a bug with the mobo which affects all video cards) since I got it, and the Catalyst drivers are just as good as nVidia's Detonators (they really have turned the drivers round in the last year, good job ATI).

Anyways, better a fanATIc than an nvIDIOT  :lol:
Quote: In the force if Yoda's so strong, construct a sentence with words in the proper order then why can't he?

SCoob [NL]

It wasn't only NVIDIA that was cheating with their drivers; ATI also had a big hand in this.

P-Dub

Quote from: Calibrax
Wolverine, when my hearing finally goes, then I will buy a GeForce FX. Apparently the fan on it is louder than a Boeing 747 at takeoff. And it still ain't nowhere near as good as a 9700 Pro when using 16x FSAA.

Sorry Steve, all I hear is blah blah blah.... nothing of any substance..... just like ATI's bullshit FSAA thing, who actually uses it in game?? No one, cos the hit on performance is so high it's not worth the drop in resolution......

Quote from: Undergrid
I've not had a single problem with my card (apart from a bug with the mobo which affects all video cards) since I got it, and the Catalyst drivers are just as good as nVidia's Detonators (they really have turned the drivers round in the last year, good job ATI).

Anyway's better a fanATIc than an nvIDIOT

Again, I didn't really hear anything but blah blah blah I'm a fanATIc who is happy to say he has had no problems, yet spent almost 40 mins moaning on guild chat in EnB about the "jaggies" when he first upgraded his drivers. You notice me moaning about the Dets?? Nope, cos they usually give a performance increase and a distinct lack of jaggies....

Quote from: |Fluffy|
Hmm.. FSAA.. still never tried that.. 1600x1200 looks reasonable enough without it!

Ahhh, finally a bloke I can understand..... yer right, 1600x1200 looks the bomb on most games, in fact all the games I currently play look great at 1600x1200...... including the 450 fps I am getting in MCM2 with everything at max detail..... :D

Wolfey

Undergrid

Quote from: Wolverine
Sorry Steve, all I hear is blah blah blah.... nothing of any substance..... just like ATI's bullshit FSAA thing, who actually uses it in game?? No one, cos the hit on performance is so high it's not worth the drop in resolution......

Spoken like a true nVidiot. May I point out that ATI cards aren't hit nearly as hard by FSAA and AF as nVidia cards? Maybe if you had an ATI card you'd actually use what you paid for.

Quote from: Wolverine
Again, I didn't really hear anything but blah blah blah I'm a fanATIc who is happy to say he has had no problems, yet spent almost 40 mins moaning on guild chat in EnB about the "jaggies" when he first upgraded his drivers. You notice me moaning about the Dets?? Nope, cos they usually give a performance increase and a distinct lack of jaggies....

Yup, I did have problems, because the drivers turned on fast writes, which are known to cause problems with my mobo (some random data corruption under some circumstances). What did the Dets do when I first installed them? Surprise surprise, exactly the same thing.


Quote from: Wolverine
Ahhh, finally a bloke I can understand..... yer right, 1600x1200 looks the bomb on most games, in fact all the games I currently play look great at 1600x1200...... including the 450 fps I am getting in MCM2 with everything at max detail..... :D

MCM2? I presume you mean Motocross Madness 2? A game so old you can get 100+ fps on a TNT2, and turning on all the features means you can actually see your wheels!
Quote: In the force if Yoda's so strong, construct a sentence with words in the proper order then why can't he?