rms
Nvidia is still cheating in the latest drivers. Why why why haven't they
learned yet?
http://www.beyond3d.com/forum/viewtopic.php?t=8952
rms
rms said:Nvidia is still cheating in the latest drivers. Why why why haven't they
learned yet?
http://www.beyond3d.com/forum/viewtopic.php?t=8952
...
Andrey Tarasevich said:Learned what? Modern video-hardware manufacturers _learned_ to use
cheats/optimizations in their drivers several years ago. The approach
pioneered by ATI is accepted now as a legitimate "optimization"
technique and is used by virtually everyone these days. The only difference
is in the stance each particular manufacturer takes when its
"optimization" techniques receive some bad publicity.
For example, the
relatively recent bad publicity around ATI's Futuremark cheats caused ATI
to remove those particular cheats from their Catalyst drivers (which was
publicly announced). However, other ATI cheats, which received less
public attention, are still there in their latest drivers. nVidia, on the
other hand, seems to be more calm about these issues and prefers not to
make any sudden moves.
Lenny said:What approach? What are you talking about? Provide evidence of ATi
"pioneering" application cheating, thank you.
By the way, S3 I think, was caught red-handed by cheating in Winbench by not
rendering all the frames in that program's 3D test (which was terribly lame
even by those days' standards), that should give you a hint about how long
ago THAT was. A further hint is that it was pre-Savage era too.
So I would suggest you take those lies of yours and shove 'em.
Which cheats, in particular, are you talking about? The only
application-specific optimization ATi did for 3DMark 2003 was to re-order a
shader for the Mother Nature test. It still produced the same output
(differing in about four pixels out of a 1024*768 screen); the only difference
was that it ran better on their hardware.
Instruction re-ordering is not a cheat. Main processors have done
re-ordering of instructions for over a decade now; it's a common enough
procedure. The only real difference is that GPUs lack the necessary hardware to
do it in real time (it is extremely costly not only in transistors and die
area, but also in research and development), so it has to be done in
software in the driver's shader compiler.
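To make the re-ordering point concrete: what a driver's shader compiler does here is dependency-safe scheduling, not a change in results. A toy sketch (not any vendor's actual compiler; register names and the greedy heuristic are invented for illustration) that separates a producer from its consumer so the pipeline isn't stalled:

```python
# Toy list scheduler: re-order instructions so a result is not consumed
# on the very next step, hiding latency. Every data dependency is
# respected, so the computed values -- the rendered output -- are unchanged;
# only the execution order differs. This is an illustrative sketch only.

def reorder(instrs):
    """instrs: list of (dest, srcs) tuples in program order.
    srcs not produced by any instruction (inputs/constants) are always ready."""
    produced = {dest for dest, _ in instrs}
    done, scheduled, last_dest = set(), [], None
    remaining = list(instrs)
    while remaining:
        # an instruction is ready once all its register sources are computed
        ready = [i for i in remaining
                 if all(s in done or s not in produced for s in i[1])]
        # prefer one that does NOT consume the previous result (avoids a stall)
        pick = next((i for i in ready if last_dest not in i[1]), ready[0])
        remaining.remove(pick)
        scheduled.append(pick)
        done.add(pick[0])
        last_dest = pick[0]
    return scheduled
```

On a sequence like r0 = f(v0), r1 = f(r0), r2 = f(v1), r3 = f(r2, r1), the scheduler slides the independent r2 in between r0 and r1, exactly the kind of transformation a CPU's out-of-order hardware performs and a GPU driver must do at compile time.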
Which cheats are those, exactly?
Are you suggesting ATi is untruthful in their statements that they have no
application-specific optimizations in their drivers?
Andrey said:Learned what? Modern video-hardware manufacturers _learned_ to use
cheats/optimizations in their drivers several years ago. ...
Lenny said:What approach? What are you talking about? Provide evidence of ATi
"pioneering" application cheating, thank you.
rms said:Nvidia is still cheating in the latest drivers. Why why why haven't they
learned yet?
Who cares?
phobos said:I have a question which I think nobody has ever really brought up --
Do you really believe that the only way to optimize a game is by pure
mathematical optimizations?
Andrey said:Exactly. The Nature test. This link leads to the high-resolution
picture showing the difference between the real (cheats disabled) and
the "optimized" (cheats enabled) picture produced by ATI cards for the
Nature test:
http://www.ixbt.com/video2/images/antidetect/r300-difference.rar
Sorry to rain on your delusions, but this is a lot more than "four
pixels".
And this link leads to the same type of picture produced by an nVidia
card:
http://www.ixbt.com/video2/images/antidetect/nv25-difference.rar
The similarities are striking. While I can't deduce all the
"optimizations" used by ATI just by looking at this picture, it is
rather likely that in both the nVidia and ATI cases they include forceful
reduction of the precision of trigonometric calculations, which is
activated for this particular test.
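Difference pictures like the ones linked above are straightforward to reproduce: subtract the two renders per pixel and count anything non-zero. A minimal, dependency-free sketch (the images here are plain 2-D lists of RGB tuples; in practice you would load two same-sized screenshots with an image library first):

```python
# Minimal sketch: per-pixel absolute difference between two renders,
# the way the linked "antidetect" comparisons are made. A non-black pixel
# in the result marks a spot where the "optimized" render diverges.

def diff_pixels(ref, opt):
    """ref, opt: same-sized 2-D lists of (r, g, b) tuples.
    Returns (difference image, count of pixels that differ at all)."""
    diff = [[tuple(abs(a - b) for a, b in zip(p, q))
             for p, q in zip(row_r, row_o)]
            for row_r, row_o in zip(ref, opt)]
    changed = sum(px != (0, 0, 0) for row in diff for px in row)
    return diff, changed
```

Counting `changed` against the 1024*768 = 786,432 total is exactly the "four pixels vs. a lot more" question being argued over in this thread.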
That's naive. A lot of them, and of us, do. Dads reading PC hardware
magazines before Christmas take note of what they see benchmarked,
whether it's a synthetic or a real game benchmark. A lot of
people do buy cards based on what the benchmark results are.
xyzzy said:The problem is when the graphics card/drivers behave in a manner
that's undocumented. Sure, there's nothing wrong with card drivers
detecting, say, Quake and invoking a different set of optimizations
tailored for that game *as long as* this behaviour is documented. That
said, the best place for that kind of "optimization" is within the
game itself. However, there's no legitimate reason to artificially
boost performance by sacrificing quality when a benchmarking program
is detected. It's not even a matter of whether you can tell the quality
difference or not; it's the principle that renders that benchmark
misleading. If a manufacturer is going to make these tradeoffs, it
should make them across the board. Otherwise, it's cheating and stealing
your money, plain and simple.
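The detection xyzzy describes is mechanically simple, which is why it keeps happening. A toy sketch of the idea (the executable names and profile settings below are invented for illustration; real drivers key on executable names, file checksums, or shader fingerprints in much this way):

```python
# Toy sketch of application detection: pick an optimization profile from
# the name of the running executable. All names/settings are hypothetical.
import os

PROFILES = {
    "quake3.exe":   {"texture_lod_bias": -0.5, "reordered_shaders": True},
    "3dmark03.exe": {"reduced_precision": True},   # the disputed benchmark case
}
DEFAULT = {"texture_lod_bias": 0.0}

def profile_for(exe_path):
    """Return the optimization profile the 'driver' would apply."""
    return PROFILES.get(os.path.basename(exe_path).lower(), DEFAULT)
```

xyzzy's point maps directly onto this table: a documented per-game entry is arguably fine, but an undocumented entry keyed to a benchmark's executable changes what the benchmark measures.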
rms said:Nvidia is still cheating in the latest drivers. Why why why haven't they
learned yet?
Why not use a decent benchmark program like AquaMark 3?
John Lewis said:On 12 Nov 2003 04:03:21 -0800, (e-mail address removed) (Nada) wrote:
Ever been a passive observer of video-card purchasers at Fry's?
I have spent a very amusing spare hour or so doing so.
Quite instructive. Come out of your ivory tower and try it
yourself.
John Lewis
Ian said:Another knobbing ATI employee. Go suck your nose twat.