New 3DMark patch out, Nvidia still cheating in newest drivers

Andrey Tarasevich

rms said:
Nvidia is still cheating in the latest drivers. Why why why haven't they
learned yet?

http://www.beyond3d.com/forum/viewtopic.php?t=8952
...

Learned what? Modern video-hardware manufacturers _learned_ to use
cheats/optimizations in their drivers several years ago. The approach
pioneered by ATI is now accepted as a legitimate "optimization"
technique and is used by virtually everyone these days. The only
difference is in the stance each particular manufacturer takes when its
"optimization" techniques receive some bad publicity. For example, the
relatively recent bad publicity around ATI's Futuremark cheats caused
ATI to remove these particular cheats from their Catalyst drivers
(which was publicly announced). However, other ATI cheats which
received less public attention are still there in their latest drivers.
nVidia, on the other hand, seems to be calmer about these issues and
prefers not to make any sudden moves.
 
jeffc

Andrey Tarasevich said:
Learned what? Modern video-hardware manufacturers _learned_ to use
cheats/optimizations in their drivers several years ago. The approach
pioneered by ATI is now accepted as a legitimate "optimization"
technique and is used by virtually everyone these days. The only
difference is in the stance each particular manufacturer takes when its
"optimization" techniques receive some bad publicity.

Isn't it obvious? They'll have "learned" when they stop receiving bad publicity.
 
Lenny

The approach
pioneered by ATI is now accepted as a legitimate "optimization"

What approach? What are you talking about? Provide evidence of ATi
"pioneering" application cheating, thank you.

By the way, S3, I think, was caught red-handed cheating in Winbench by
not rendering all the frames in that program's 3D test (which was
terribly lame even by those days' standards); that should give you a
hint about how long ago THAT was. A further hint: it was pre-Savage
era, too.

So I would suggest you take those lies of yours and shove 'em.

For example, the relatively recent bad publicity around ATI's
Futuremark cheats caused ATI to remove these particular cheats from
their Catalyst drivers

Which cheatS in particular are you talking about? The only
application-specific optimization ATi did for 3DMark 2003 was to
re-order a shader for the Mother Nature test. It still produced the
same output (differing in about four pixels out of a 1024*768 screen);
the only difference was that it ran better on their hardware.

Instruction re-ordering is not a cheat. Main processors have done
re-ordering of instructions for over a decade now; it's a common enough
procedure. The only real difference is that GPUs lack the necessary
hardware to do it in real time (it is extremely costly not only in
transistors and die area, but also in research and development), so it
has to be done in software, in the driver's shader compiler.
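
[Editor's note: to make the re-ordering point concrete, here is a
minimal Python sketch, a toy interpreter rather than real
shader-compiler code; the instruction names and register model are
invented for illustration. It shows that swapping two independent
instructions cannot change the result.]

    # Toy illustration of instruction re-ordering. A real shader
    # compiler works on GPU instructions; this only demonstrates the
    # principle that re-ordering independent instructions preserves
    # the output.

    def run(program, registers):
        # Each instruction is (dest, op, src1, src2).
        for dest, op, a, b in program:
            x, y = registers[a], registers[b]
            registers[dest] = x + y if op == "add" else x * y
        return registers

    original = [
        ("r2", "mul", "r0", "r1"),  # r2 = r0 * r1
        ("r3", "add", "r0", "r1"),  # r3 = r0 + r1 (independent of r2)
        ("r4", "add", "r2", "r3"),  # r4 depends on both, so it stays last
    ]

    # Neither of the first two instructions reads the other's output,
    # so a scheduler is free to swap them.
    reordered = [original[1], original[0], original[2]]

    inputs = {"r0": 3.0, "r1": 4.0}
    assert run(original, dict(inputs)) == run(reordered, dict(inputs))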
However, other ATI cheats which received less
public attention are still there in their latest drivers.

Which cheats are those, exactly?

Are you suggesting ATi is untruthful in their statements that they have no
application-specific optimizations in their drivers?
 
Don Burnette

rms said:
Nvidia is still cheating in the latest drivers. Why why why haven't
they learned yet?

http://www.beyond3d.com/forum/viewtopic.php?t=8952

rms


Do folks really take 3DMark results that seriously in determining
what brand of video card they buy?
Personally, I only use it when making tweaks on the same hardware -
checking out drivers, overclocking - to see what might give me the best
performance with what I have.

When I decide on purchasing a video card, I pay more attention to
reviews and these forums to help me make my decision. And so far, I
have not been disappointed in any of my purchasing decisions over the
years.
 
Andrey Tarasevich

Lenny said:
What approach? What are you talking about? Provide evidence of ATi
"pioneering" application cheating, thank you.

The "evidence" is that ATI was first caught on cheating in good old
3DMark2001 a couple years ago. Nobody really paid much attention to the
discovery because at that time this topic wasn't as hot as it is now.
You can either do some Google searches yourself or simply test even the
_current_ ATI's drivers for the presence of 3DMark2001 cheats - you'll
be surprised.
By the way, S3, I think, was caught red-handed cheating in Winbench by
not rendering all the frames in that program's 3D test (which was
terribly lame even by those days' standards); that should give you a
hint about how long ago THAT was. A further hint: it was pre-Savage
era, too.

So I would suggest you take those lies of yours and shove 'em.


Which cheatS in particular are you talking about? The only
application-specific optimization ATi did for 3DMark 2003 was to
re-order a shader for the Mother Nature test. It still produced the
same output (differing in about four pixels out of a 1024*768 screen);
the only difference was that it ran better on their hardware.

Exactly. The Nature test. This link leads to a high-resolution picture
showing the difference between the real (cheats disabled) and the
"optimized" (cheats enabled) picture produced by ATI cards for the
Nature test:

http://www.ixbt.com/video2/images/antidetect/r300-difference.rar

Sorry to rain on your delusions, but this is a lot more than "four pixels".

And this link leads to the same type of picture produced by an nVidia card:

http://www.ixbt.com/video2/images/antidetect/nv25-difference.rar

The similarities are striking. While I can't deduce all the
"optimizations" used by ATI just by looking at this picture, it is
rather likely that in both the nVidia and ATI cases they include a
forced reduction in the precision of trigonometric calculations,
activated for this particular test.
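
[Editor's note: to illustrate what a "reduced precision" substitution
means in practice, here is a small Python sketch. It is purely a
hypothetical stand-in - the actual approximations in either driver are
not public - comparing an exact sine against a cheap low-order
polynomial of the kind a driver might swap in.]

    import math

    def fast_sin(x):
        # Hypothetical cheap approximation: a 3rd-order Taylor
        # polynomial. Accurate near zero, increasingly wrong toward pi/2.
        return x - x**3 / 6.0

    # Worst-case error over [0, pi/2] is about 0.075 - easily enough
    # to shift 8-bit colour values and show up in a difference image.
    worst = max(abs(math.sin(x) - fast_sin(x))
                for x in (i * math.pi / 200.0 for i in range(101)))
    print("worst-case error:", worst)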
Instruction re-ordering is not a cheat. Main processors have done
re-ordering of instructions for over a decade now; it's a common enough
procedure. The only real difference is that GPUs lack the necessary
hardware to do it in real time (it is extremely costly not only in
transistors and die area, but also in research and development), so it
has to be done in software, in the driver's shader compiler.


Which cheats are those, exactly?

Are you suggesting ATi is untruthful in their statements that they have no
application-specific optimizations in their drivers?

Which statements are you talking about? I hope you remember that when
ATI was publicly confronted with the facts showing that they do use
Futuremark cheats in 3DMark2003, ATI responded with a new version of
their drivers and _publicly_ _stated_ _in_ _their_ _press-release_
_that_ _ATI_ _drivers_ _did_ _actually_ _contain_ _these_ _cheats_ and
that they were now removed. I hope the fact that ATI publicly
acknowledged the presence of Futuremark cheats in their drivers answers
your "which cheats are those" question.

A simple experiment shows that ATI did indeed remove _these_
_particular_ cheats from their drivers, while the older 3DMark2001
cheats are still there in their full glory. You could easily repeat all
these experiments at home, if you weren't that ignorant.
 
phobos

Andrey said:
Learned what? Modern video-hardware manufacturers _learned_ to use
cheats/optimizations in their drivers several years ago. The approach
pioneered by ATI is now accepted as a legitimate "optimization"
technique and is used by virtually everyone these days. The only
difference is in the stance each particular manufacturer takes when its
"optimization" techniques receive some bad publicity. For example, the
relatively recent bad publicity around ATI's Futuremark cheats caused
ATI to remove these particular cheats from their Catalyst drivers
(which was publicly announced). However, other ATI cheats which
received less public attention are still there in their latest drivers.
nVidia, on the other hand, seems to be calmer about these issues and
prefers not to make any sudden moves.

I have a question which I think nobody has ever really brought up --

Do you really believe that the only way to optimize a game is by pure
mathematical optimizations?

With something like lossless compression you have no choice, but with
video cards the end product is HIGHLY qualitative. The result depends
as much on differences between the hardware implementations themselves
as on which methods of producing the final image are considered valid.

Some believe that the hardware was built to be used one way and one way
only ("the best image will only be produced when used in this manner"),
but since GPUs are becoming so programmable, their maximum capabilities
are not fixed.

If the image can be approximated or resolved with visually no difference
or minor imperfections, I say all the better.

But to think that all speed increases through driver updates have come
at the expense of IQ is pessimistic, to say the least.
 
Mike B

lol, I don't even use benchmarks to tell me what my hardware is capable
of, because I don't trust them. I only run them to see some cool 3D
demonstrations. I couldn't care less about my synthetic score
 
Cyclone Owner

Lenny said:
What approach? What are you talking about? Provide evidence of ATi
"pioneering" application cheating, thank you.


Was it not ATI who was caught detecting the Quake executable years ago
and then turning off internal features to get better performance?
This was one of the very first times that video card drivers were
detecting the software and adjusting to get better results. In this
case, a simple renaming of the executable gave very different results.
Today these types of "cheats" are much more clever and harder to
detect. So I think that is why he said ATI pioneered the driver
"optimization" cheat.
 
John Lewis

Nvidia is still cheating in the latest drivers. Why why why haven't they
learned yet?

http://www.beyond3d.com/forum/viewtopic.php?t=8952

rms

Who cares?

Except the benchmarking narcissii who haunt this newsgroup.

Get a life and enjoy USING your computer, either for entertainment
or for high-performance pro software with a useful end product.

I am quite happy for Ati/nVidia to customize their drivers for each
video-performance-demanding game/application out there. I only use
the benchmarks to verify that nothing is grossly underperforming or
broken compared to a previous release, and that any new driver
features are properly implemented.

John Lewis
 
Nick

The benchmarks are not aimed at people who already have h/w and want
to know its performance.

The benchmarks are not aimed at you.


The benchmarks are there for people who are looking to make a purchase
and want some kind of comparison in order to make the correct
decision.
 
Nada

Who cares?

That's naive. A lot of them, and of us, do. Dads reading PC hardware
magazines before Christmas take note of what they see benchmarked,
whether it's a synthetic or a real-game benchmark. A lot of people do
buy cards based on benchmark results.
 
xyzzy

phobos said:
I have a question which I think nobody has ever really brought up --

Do you really believe that the only way to optimize a game is by pure
mathematical optimizations?

The problem is when the graphics card/drivers behave in a manner
that's undocumented. Sure, there's nothing wrong with card drivers
detecting, say, Quake and invoking a different set of optimizations
tailored for that game, *as long as* this behaviour is documented. That
said, the best place for that kind of "optimization" is within the
game itself. However, there's no legitimate reason to artificially
boost performance by sacrificing quality when a benchmarking program
is detected. It's not even a matter of whether you can tell the quality
difference or not; it's the principle that renders that benchmark
misleading. If a manufacturer is going to make these tradeoffs, it
should make them across the board. Otherwise, it's cheating and
stealing your money, plain and simple.
 
Ben Pope

Andrey said:
Exactly. The Nature test. This link leads to a high-resolution
picture showing the difference between the real (cheats disabled) and
the "optimized" (cheats enabled) picture produced by ATI cards for the
Nature test:

http://www.ixbt.com/video2/images/antidetect/r300-difference.rar

Sorry to rain on your delusions, but this is a lot more than "four
pixels".

Indeed... there are quite a few.
And this link leads to the same type of picture produced by an nVidia
card:

http://www.ixbt.com/video2/images/antidetect/nv25-difference.rar

The similarities are striking. While I can't deduce all the
"optimizations" used by ATI just by looking at this picture, it is
rather likely that in both the nVidia and ATI cases they include a
forced reduction in the precision of trigonometric calculations,
activated for this particular test.


I thought I'd do a little test to see how much difference there is in
the images.

I downloaded all the images from http://www.ixbt.com/video2/antidetect.shtml,
loaded them into MATLAB, trimmed 64 pixels from the top, and computed
the PSNR between them.

Between r300-antidetect and nv25-antidetect ~37dB
Between r300 and nv25 ~31dB

So I guess you have to set the bar at 37dB, since even with antidetect
on for both cards there is still a 37dB PSNR difference between them.

Between r300 and r300-antidetect ~37dB
Between nv25 and nv25-antidetect ~32dB

Clearly nVidia is altering the image more. As to whether the images
look better or worse between ATI and nVidia, that's still a tough one
to call... especially as the two antidetect images are not equal.

Ben
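
[Editor's note: for anyone without MATLAB who wants to reproduce Ben's
measurement, here is a rough Python equivalent - a sketch assuming
8-bit RGB screenshots and the same 64-pixel top crop; the file names
are placeholders for the images from the iXBT page.]

    import numpy as np
    from PIL import Image

    def psnr(path_a, path_b, trim_top=64):
        # Load both screenshots, drop the top 64 rows as in the MATLAB
        # test described above, and compute the peak signal-to-noise
        # ratio in dB (peak value 255 for 8-bit channels).
        a = np.asarray(Image.open(path_a), dtype=np.float64)[trim_top:]
        b = np.asarray(Image.open(path_b), dtype=np.float64)[trim_top:]
        mse = np.mean((a - b) ** 2)
        return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

    # Placeholder file names; use the screenshots from the iXBT article.
    print(psnr("r300.png", "r300-antidetect.png"))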
 
John Lewis

That's naive. A lot of them, and of us, do. Dads reading PC hardware
magazines before Christmas take note of what they see benchmarked,
whether it's a synthetic or a real-game benchmark. A lot of people do
buy cards based on benchmark results.

Ever been a passive observer of video-card purchasers at Fry's?

I have spent a very amusing spare hour or so doing so.

Quite instructive. Come out of your ivory tower and try it
yourself.

John Lewis
 
phobos

xyzzy said:
The problem is when the graphics card/drivers behave in a manner
that's undocumented. Sure, there's nothing wrong with card drivers
detecting, say, Quake and invoking a different set of optimizations
tailored for that game, *as long as* this behaviour is documented. That
said, the best place for that kind of "optimization" is within the
game itself. However, there's no legitimate reason to artificially
boost performance by sacrificing quality when a benchmarking program
is detected. It's not even a matter of whether you can tell the quality
difference or not; it's the principle that renders that benchmark
misleading. If a manufacturer is going to make these tradeoffs, it
should make them across the board. Otherwise, it's cheating and
stealing your money, plain and simple.

Heh, which is one reason I never thought of buying 3dmark Pro :)
 
Ian Carmichael

What approach? What are you talking about? Provide evidence of ATi
"pioneering" application cheating, thank you.

By the way, S3, I think, was caught red-handed cheating in Winbench by
not rendering all the frames in that program's 3D test (which was
terribly lame even by those days' standards); that should give you a
hint about how long ago THAT was. A further hint: it was pre-Savage
era, too.

So I would suggest you take those lies of yours and shove 'em.


Which cheatS in particular are you talking about? The only
application-specific optimization ATi did for 3DMark 2003 was to
re-order a shader for the Mother Nature test. It still produced the
same output (differing in about four pixels out of a 1024*768 screen);
the only difference was that it ran better on their hardware.

Instruction re-ordering is not a cheat. Main processors have done
re-ordering of instructions for over a decade now; it's a common enough
procedure. The only real difference is that GPUs lack the necessary
hardware to do it in real time (it is extremely costly not only in
transistors and die area, but also in research and development), so it
has to be done in software, in the driver's shader compiler.


Which cheats are those, exactly?

Are you suggesting ATi is untruthful in their statements that they have no
application-specific optimizations in their drivers?

Another knobbing ATI employee. Go suck your nose, twat.
 
Mark Leuck

John Lewis said:

Ever been a passive observer of video-card purchasers at Fry's?

I have spent a very amusing spare hour or so doing so.

Quite instructive. Come out of your ivory tower and try it
yourself.

John Lewis

Been there and seen that. From what I could tell, most bought the card
with the best-looking artwork on the box.
 
