NVIDIA caught cheating....AGAIN!


Asestar

Anyone who ever owned a Kyro or Kyro2-based graphics card and looked into
the registry will know that some settings are used for individual games,
to make them work properly.
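
(For illustration, a rough sketch of what reading such a per-game value
through the Win32 registry API could look like - the key path and value
name below are made up for the example, not the actual Kyro driver
entries:)

    #include <windows.h>
    #include <stdio.h>

    /* Sketch: look up a hypothetical per-game tweak in the registry.
       The key path and value name are invented for illustration. */
    int main(void)
    {
        HKEY key;
        DWORD value = 0, size = sizeof(value);

        if (RegOpenKeyExA(HKEY_LOCAL_MACHINE,
                          "SOFTWARE\\ExampleVendor\\Profiles\\farcry.exe",
                          0, KEY_READ, &key) == ERROR_SUCCESS) {
            if (RegQueryValueExA(key, "ForceSetting", NULL, NULL,
                                 (LPBYTE)&value, &size) == ERROR_SUCCESS)
                printf("Per-game setting found: %lu\n", (unsigned long)value);
            RegCloseKey(key);
        }
        return 0;
    }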

If renaming "farcry.exe" removes those settings, thus making it slower, it's
natural. If nvidia can gain speed by making game specific settings, *as long
as it does not reduce the IQ* it should be acceptable, or even more
applausable!

However, if these speed enhancement mean reducing the IQ, shame on them!
 

John Russell

Asestar said:
Anyone who ever owned a Kyro or Kyro2-based graphics card and looked into
the registry will know that some settings are used for individual games,
to make them work properly.

At least Nvidia are taking problems with specific games seriously, with
their SoundStorm APU as well. So what do they do if they can't come up
with a "generic" driver which solves all the problems without causing new
problems? Do they just say "tough"? If you have Nvidia hardware and a
problem game, you want a solution which works for you, and if that means
game-specific settings then so be it. After all, even Microsoft has
game-specific settings in its XP game compatibility mode. We have to live
in the Real World, not the Utopian world which Microsoft implies DirectX
is, where games, drivers and hardware work seamlessly together on any PC
setup without a problem.

If these "fix's" where targeted at game benchmarks then I would say to
Nvidia they should focus of fixing real problems, which I believe this story
is saying they are.
 

Joachim Trensz

John Russell wrote:
....
If these "fix's" where targeted at game benchmarks then I would say to
Nvidia they should focus of fixing real problems, which I believe this story
is saying they are.

The (German original) story says that if you rename Farcry.exe to
something else, it runs a lot slower on Nvidia cards. I take it that the
fps optimization occurs by reducing the graphics quality.

Of course, this is a valid means of increasing fps, but I'd rather the
decision were left to the user.

And of course, it does have an effect in benchmarks...

Achim
 

redTed

If these "fix's" where targeted at game benchmarks then I would say to
The (German original) story says that if you rename Farcry.exe to
something else, it runs a lot slower on Nvidia cards. I take it that the
fps optimization occurs by reducing the graphics quality.

Of course, this is a valid means of increasing fps, but I'd rather the
decision were left to the user.

And of course, it does have an effect in benchmarks...

Explain why anybody would rename the farcry.exe file? I don't get it.
Sounds like an April Fools' joke that missed the boat.
 

Andrew

Explain why anybody would rename the farcry.exe file? I don't get it.
Sounds like an April Fools' joke that missed the boat.

The point is that Nvidia drivers are maybe being written specifically to
target certain games to make them look better in benchmarks, as they got
caught doing with Q3 a couple of years back. The drivers are written to
modify their behaviour depending on the name of the game being run, and
if you change the name of the executable, the drivers get fooled and
don't "cheat".
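
(For illustration, a minimal sketch of how a driver DLL loaded into the
game's process could check which executable it is running under - the
Win32 calls are real, but the profile function at the bottom is purely
hypothetical:)

    #include <windows.h>
    #include <string.h>
    #include <stdbool.h>

    /* Sketch: a driver DLL runs inside the game's process, so it can ask
       Windows for the host executable's path and compare the file name. */
    static bool running_under_exe(const char *target_name)
    {
        char path[MAX_PATH];
        if (GetModuleFileNameA(NULL, path, sizeof(path)) == 0)
            return false;

        const char *base = strrchr(path, '\\');   /* strip the directory */
        base = base ? base + 1 : path;

        return _stricmp(base, target_name) == 0;  /* case-insensitive match */
    }

    void apply_app_profiles(void)
    {
        if (running_under_exe("farcry.exe")) {
            /* enable_farcry_specific_path();  <- hypothetical per-game tweak */
        }
    }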
 

Brian Gregory [UK]

Joachim Trensz said:
The (German original) story says that if you rename Farcry.exe to
something else, it runs a lot slower on Nvidia cards. I take it that
the fps optimization occurs by reducing the graphics quality.

The theinquirer.net article says that renaming the file turns off Z
culling. That's not going to affect the quality. Presumably one is free
to try turning it on in any game or program to see if it speeds things up
without causing problems. Nvidia just happen to know it works fine in
Farcry.
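
(Background, in case "Z culling" sounds mysterious: it just means skipping
work on pixels that are already hidden behind something closer. A rough
conceptual sketch follows - this is an illustration of the idea, not how
Nvidia's hardware actually implements it:)

    #include <stddef.h>

    typedef struct { int x, y; float depth; } Fragment;

    static void shade_fragment(const Fragment *f)
    {
        (void)f;  /* stand-in for the expensive per-pixel shading work */
    }

    /* If a fragment is farther away than what the depth buffer already
       holds for that pixel, cull it and skip the shading entirely. */
    static void rasterize(const Fragment *frags, size_t count,
                          float *depth_buffer, int width)
    {
        for (size_t i = 0; i < count; ++i) {
            const Fragment *f = &frags[i];
            float *stored = &depth_buffer[f->y * width + f->x];

            if (f->depth >= *stored)   /* already occluded: cull */
                continue;

            *stored = f->depth;        /* closer: keep, then shade */
            shade_fragment(f);
        }
    }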
 

redTed

Explain why anybody would rename the farcry.exe file? I don't get it.
The point is that Nvidia drivers are maybe being written specifically to
target certain games to make them look better in benchmarks, as they got
caught doing with Q3 a couple of years back. The drivers are written to
modify their behaviour depending on the name of the game being run, and
if you change the name of the executable, the drivers get fooled and
don't "cheat".

I see. Kind of. Uhhh... So what you're saying is, if a different company
uses the Crytek engine to produce a game that, to all intents and
purposes for this theoretical discussion, is exactly the same as Far Cry,
then it would not run as well as Far Cry?
 

cK-Gunslinger

No one is going to rename farcry.exe. The point is that many people use
benchmark data to help determine which card they will purchase. So by
"cheating" on any standard benchmark test, you are deceiving the comsumers.

Consider benchmarking a car. Say there is a common stretch of road
typically used to determine a car's fuel use (miles per gallon, mpg).
If a certain manufacturer realized that a strong wind blowing in a
favorable direction every Tuesday afternoon at 3:45pm would raise fuel
efficiency from 28mpg to 36mpg (yeah, it's a strong wind =P), and so
*always* performed its tests at that time and reported that the car gets
36mpg, would this be fair? No. Perhaps if you lived on this particular
stretch of road and had to commute every Tuesday at 3:45, then you might
consider it. Otherwise, for all intents and purposes, the car gets 28mpg,
not 36, as they would have you believe.

Likewise, if all you play is Farcry, then the benchmark is valid, but
you cannot use the Farcry results to extrapolate how any future game
will perform. Therefore, the benchmark data is invalid and the company
is *cheating.*

If they want to enable an optimization for a particular game, then that
should be a *user option* in the drivers. That is, you should be able
to enable/disable the feature with a few mouse clicks. Benchmarking for
comparative purposes should be performed with all
(quality-reducing/modifying) optimizations off. You have to at least
strive for apples-to-apples comparisons.

That's the way I see it, anyway.
 

cK-Gunslinger

Wait, I thought it was ATi that optimized for Quake? A quick Google for
"quake quack ATi" backs this up.
 

Asestar

The theinquirer.net article says that renaming the file turns off Z
culling. That's not going to affect the quality. Presumably one is free
to try turning it on in any game or program to see if it speeds things up
without causing problems. Nvidia just happen to know it works fine in
Farcry.


Exactly what I was talking about. I think it's a good thing, though.
Nvidia should get some credit for that. However, their drivers should
kind of auto-detect, so that not every major game needs optimisations to
run faster.
 

Les

redTed said:
Explain why anybody would rename the farcry.exe file? I don't get it.
Sounds like an April Fools' joke that missed the boat.

There is an article somewhere where they force XP to recognise a 6800U as
a 9800XT. The IQ goes up and the FPS go down. Similarly, when they forced
(somehow) a 9800XT to be recognised as a 6800U, the IQ went down and the
FPS went up.

--
Les
AMD64 3200+
2x512 MB corsair platinum 3500
Gigabyte GA-K8VNXP
Herc 9700 Pro
SB Audigy
 

cK-Gunslinger

Brian said:
The theinquirer.net article says that renaming the file turns off Z
culling. That's not going to affect the quality. Presumably one is free
to try turning it on in any game or program to see if it speeds things up
without causing problems. Nvidia just happen to know it works fine in
Farcry.

Well, that's all good for *end users.* Let them put a check box in the
drivers that says "Make Farcry run better? (y/n)" But for a reviewer
running benchmarks, it is imperative that they know *exactly* what
functionality is enabled/disabled in order to make valid comparisons.

Like most people, I don't make hardware purchase decisions based on a
*single* game. I don't use Farcry benchmark data to determine how well
a card plays Farcry, but rather as an indicator of how well the card
performs in a typical, retail, DX9-feature-using game. When you start
optimizing for a single title, those performance numbers don't mean as
much anymore. I can't speak for everyone else, but I plan to use my
video card for longer than the 4-5 days I'll be playing Farcry.
 

Joachim Trensz

Brian Gregory [UK] wrote:
....
The theinquirer.net article says that renaming the file turns off Z
culling. That's not going to affect the quality. Presumably one is free
to try turning it on in any game or program to see if it speeds things up
without causing problems. Nvidia just happen to know it works fine in
Farcry.

The original article says that an Nvidia spokesperson explained that in
the 61.11 driver, a driver bug was fixed (having to do with z-culling,
he didn't say 'disabled') and that this 61.11 driver was further
optimized for the game.

The article points out that by renaming the exe as described, the frame
rate dropped by ~10 fps.

With the optimization ON (exe name FarCry), the Nvidia card was faster
than the X800 XT; when they renamed the exe, the ATI card was faster.

That's what the original article says.

Achim
 

Joachim Trensz

Joachim Trensz wrote:

....
The original article says that an Nvidia spokesperson explained that in
the 61.11 driver, a driver bug was fixed (having to do with z-culling,
he didn't say 'disabled') and that this 61.11 driver was further
optimized for the game.

The article points out that by renaming the exe as described, the frame
rate dropped by ~10 fps.

With the optimization ON (exe name FarCry), the Nvidia card was faster
than the X800 XT; when they renamed the exe, the ATI card was faster.

That's what the original article says.

Achim

Short addendum - another article on that magazine's homepage says that
even with the optimization on, the 6800 was still slightly slower than
the X800 XT, for whatever it's worth.

Achim
 

John Russell

If they want to enable an optimization for a particular game, then that
should be a *user option* in the drivers. That is, you should be able
to enable/disable the feature with a few mouse clicks. Benchmarking for
comparative purposes should be performed with all
(quality-reducing/modifying) optimizations off. You have to at least
strive for apples-to-apples comparisons.

That's the way I see it, anyway.

If you use XP compatibility mode to get an old game working, you have no
idea what game-specific OS tweaks Microsoft has implemented without
delving very deeply. If you don't use compatibility mode you won't get
the game to work at all, so as a user you have no choice. Is Microsoft
cheating?

I have no objection to game-specific fixes if, as a user of that game, I
want to run the game without problems on my hardware. I am not interested
in my gameplay being ruined so that others can make a better choice when
buying a graphics card. The purpose of a game graphics card is to run
games, not tests, and the purpose of games is entertainment, not testing
cards!
 

Sleepy

There's ppl dying in Iraq still - an AIDS epidemic in Africa -
and ppl here getting in a tiz about graphics card drivers
- GROW THE FECK UP.
 

cK-Gunslinger

John said:
If you use XP compatibility mode to get an old game working, you have no
idea what game-specific OS tweaks Microsoft has implemented without
delving very deeply. If you don't use compatibility mode you won't get
the game to work at all, so as a user you have no choice. Is Microsoft
cheating?

With all due respect, that's a stupid comparison. Who is Microsoft
competing against? Who are they cheating? Are they trying to benchmark
WinXP versus Win95 in order to make WinXP look better? Again,
irrelevant comparison.

I have no objection to game-specific fixes if, as a user of that game, I
want to run the game without problems on my hardware. I am not interested
in my gameplay being ruined so that others can make a better choice when
buying a graphics card. The purpose of a game graphics card is to run
games, not tests, and the purpose of games is entertainment, not testing
cards!

Agreed that gaming performance is the purpose of hardware, not tests.
But hardware also serves a greater purpose: to behave as the user
requests. Would it be acceptable to set your options for a game to
1280x960x32 with 8X AA, but have the hardware say "Bah, we'll just let
him *think* that's what he's running, but I'll just go ahead and lower
the res to 800x600 and run in 16-bit color with no AA, without telling
him"? I don't think so. And *that's* why *some* driver optimizations
are indeed *cheating.* They let you request one thing, but they deliver
another and tell you that's what you asked for.

It's like having a deli that advertises "fresh, made-to-order"
sandwiches, ready in 3 seconds, but really just reaches under the
counter and pulls out a plain ham sandwich that was made 3 days ago, and
tells you that it's what you ordered.

Companies are here to serve consumers. So are their products. Don't
let them shit in a bag and tell you that's what you want. Demand better.
 
