[OT] More Nvidia cheating?


rms

This is not OT, it's directly relevant to this newsgroup.
Just noticed this thread about the performance difference in Far Cry when the
Nvidia 6800's Device ID is set to that of the ATI R300:
http://www.driverheaven.net/showthread.php?s=&threadid=44253. Haven't had a
chance to read it in depth yet, and I do hope Nvidia isn't cheating.
Again.

"the Radeon 9800XT is very close in [FarCry] performance to the
6800Ultra at the same IQ levels. Only 7fps in average framerates separate
the cards which is a phenomenal result for the Radeon."

What does this mean? My guess is that FarCry uses a mix of PS1.1 & PS2
shaders for the GeforceFX, and these settings are being carried over
unchanged to the 6800u, while the Radeon has always used PS2.0 alone. If
true, present FarCry 6800u benchmarks are pretty much worthless, especially
with the reported 6800u artifacts.

rms
 

The Mighty MF

rms said:
This is not OT, it's directly relevant to this newsgroup.
"the Radeon 9800XT is very close in [FarCry] performance to the
6800Ultra at the same IQ levels. Only 7fps in average framerates
separate the cards which is a phenomenal result for the Radeon."

What does this mean? My guess is that FarCry uses a mix of PS1.1
& PS2 shaders for the GeforceFX, and these settings are being carried
over unchanged to the 6800u, while the Radeon has always used PS2.0
alone. If true, present FarCry 6800u benchmarks are pretty much
worthless, especially with the reported 6800u artifacts.

rms

I said OT only because this involves NVIDIA more so than ATI; at least it
sounded good in my head =). What disappoints me is that I was kinda *hoping*
nvidia would perform that well (~60fps) in Far Cry. Instead they only get
~47fps, which isn't that much better than the 9800XT (~41). Now I really
can't wait to see what ATI will offer as their next-gen.

Khabs,
Mark
 

Nada

rms said:
This is not OT, it's directly relevant to this newsgroup.
Just noticed this thread about the performance difference in Far Cry when
the Nvidia 6800's Device ID is set to that of the ATI R300:
http://www.driverheaven.net/showthread.php?s=&threadid=44253. Haven't had
a chance to read it in depth yet, and I do hope Nvidia isn't cheating.
Again.

"the Radeon 9800XT is very close in [FarCry] performance to the
6800Ultra at the same IQ levels. Only 7fps in average framerates separate
the cards which is a phenomenal result for the Radeon."

What does this mean? My guess is that FarCry uses a mix of PS1.1 & PS2
shaders for the GeforceFX, and these settings are being carried over
unchanged to the 6800u, while the Radeon has always used PS2.0 alone. If
true, present FarCry 6800u benchmarks are pretty much worthless, especially
with the reported 6800u artifacts.

rms

To me, the real question is, what exactly can be achieved with PS2.0
and PS3.0 over PS1.1? I just didn't see a whole lot of difference
between my old GeForce 4200 and the newer 5900XT. Will the fish
shine? Do the birds change from fancy parrots to dirty seagulls when
using an old card? If the comparisons done in Geneva two weeks ago
pitted low-quality PS1.1 against the highest-quality PS3.0, then
that's really the wrong way to market the next-generation card. If
the thugs have shining snot dribbling from their noses with PS3.0,
maybe I don't really want to see it!
 

The Mighty MF

Nada said:
To me, the real question is, what exactly can be achieved with PS2.0
and PS3.0 over PS1.1? I just didn't see a whole lot of difference
between my old GeForce 4200 and the newer 5900XT. Will the fish
shine? Do the birds change from fancy parrots to dirty seagulls when
using an old card? If the comparisons done in Geneva two weeks ago
pitted low-quality PS1.1 against the highest-quality PS3.0, then
that's really the wrong way to market the next-generation card. If
the thugs have shining snot dribbling from their noses with PS3.0,
maybe I don't really want to see it!

Good question. From what I understand, Farcry is only using PS2.0 for
lights/lighting effects. This comes from an article recently posted at
FiringSquad: http://www.firingsquad.com/hardware/far_cry_nvidia/. It's an
interesting read if you haven't checked it out yet. I believe it also
mentions the PS3.0 vs PS1.1 comparison.
 

rms

To me, the real question is, what exactly can be achieved with PS2.0
Good question.

The biggest new feature seems to be Displacement Mapping, which could
make surfaces look more real. Bump-mapped surfaces look flat close up, but
DM should improve on that.

Other than that, PS3.0 seems to be mostly about internal improvements.

rms
 

noman

The biggest new feature seems to be Displacement Mapping, which could
make surfaces look more real. Bump-mapped surfaces look flat close up, but
DM should improve on that.

Other than that, PS3.0 seems to be mostly about internal improvements.

Displacement mapping isn't really tied to Shader Model 3.0 (and is in
no way related to Pixel Shader 3.0).

In SM3.0, texture lookups can be done directly by the "Vertex Shader"
unit. In theory, VS3.0 can do them in parallel with its other tasks,
resulting in speed improvements.

ATI cards have had displacement mapping since the 9700 but they
certainly can't do it using VS3.0 unless the new ATI cards support
SM3.0.

The few pictures that nVidia released (from Far Cry) don't actually
use displacement mapping. They have offset/parallax mapping in there,
using pixel shaders, to give a perception of depth on the stone walls.
It can be done easily by cards such as the GeForceFX or Radeon 9500
and above, and it gives you much better results than regular bump
mapping.

By the way, "Displacement Mapping" is a technique, where a flat
surface is tweaked - that is, certain portions of it are raised or
lowered automatically based on a map in a texture. Unlike, bump
mapping or offset mapping, there is no trick involved there. The 3D
surface is really modified, instead of just giving viewers a depth
perception. Designers can then set up a flat surface in 3D with a
displacement map and the 3D card can then add details to that surface
in real time.

I can't think of a game that has used real displacement mapping
though.
 

Asestar

Displacement mapping isn't really tied to Shader Model 3.0 (and is in
no way related to Pixel Shader 3.0)
ATI cards have had displacement mapping since the 9700 but they
certainly can't do it using VS3.0 unless the new ATI cards support
SM3.0.

If in any doubt, check out that Ferrari demo for the Radeon 9700. It has
displacement mapping on the wheels. Or check out the ATI website for more
info on how DX9 works; DM is also discussed there.
 

Mr. Grinch

Good question. From what I understand, Farcry is only using PS2.0 for
lights/lighting effects. This comes from an article recently posted
at FiringSquad: http://www.firingsquad.com/hardware/far_cry_nvidia/.
It's an interesting read if you haven't checked it out yet. I believe
it also mentions the PS3.0 vs PS1.1 comparison.

I'm still wondering what the real differences between PS2.0 and 3.0
will be.

From what ATI has said, I get the impression they look the same.

Listening to what nVidia has said in one interview about the 6800, what
I'm hearing is not that it looks any different, but that it has more
branching / decision-making ability, which makes things easier for
developers and, in theory, allows the shader code to be smarter / faster.
So not necessarily better looking, but faster.

Of course, then I would expect ATI to claim they can be just as fast with
PS2.0, although I don't know if they can claim to have the same
flexibility.

So is it a question of Image Quality? Speed? Or ease of use and
flexibility for the developer?

In the past, developers have definitely had complaints about both
vendors, from driver issues to code paths. They've had to make
exceptions and write vendor-specific paths. Most recently, it seemed ATI
was the easier card to develop for: they had buggy drivers but in the
end required less in the way of specific optimization for a given code
path. Maybe nVidia has learned and intends to pull off a win this time
around.

I'm still waiting for Doom 3 to come out before I buy a new system. The
wait is killing me.
 

Darthy

To me, the real question is, what exactly can be achieved with PS2.0
and PS3.0 over PS1.1? I just didn't see a whole lot of difference
between my old GeForce 4200 and the newer 5900XT. Will the fish
shine? Do the birds change from fancy parrots to dirty seagulls when
using an old card? If the comparisons done in Geneva two weeks ago
pitted low-quality PS1.1 against the highest-quality PS3.0, then
that's really the wrong way to market the next-generation card. If
the thugs have shining snot dribbling from their noses with PS3.0,
maybe I don't really want to see it!

In a game like FarCry, my 9800 looks a lot nicer than my friend's
Ti4200 (mine is on a shelf now; a 5900 replaced it) - the PS and other
effects do come into play.

BTW: FarCry runs better with 1GB of RAM than with 512MB on the PC.
 

Blig Merk

We are going to have to wait for both the Geforce 6800 and X800 to
start shipping, and for DirectX 9.0c to become available, before we
see some reputable benchmark results and know what the real answers
are. Even then, we have the Geforce 6800 Ultra coming out first,
followed by the Geforce 6800 Pro a few weeks later, and the X800 coming
out first followed by the X800XT a month after that, so there is going
to be a time gap before we have both sets of the highest-end cards to
compare with each other.

ATI thinks they have the edge in core and clock speed, and they are
saying that is going to make up for some admitted feature omissions;
Nvidia is saying they have the additional features that are going to
make up for a slightly slower core and clock speed. Right now, we have
about two sites that have done some sketchy benchmarks on
pre-production Geforce 6800 cards with basically beta drivers and beta
DirectX 9.0c, with about two dozen sites linking to those tests and
drawing all kinds of conclusive assumptions from those results. How
can anybody come up with any meaningful conclusions from that? It does
seem like 3DMark needs to get cooking on 3DMark04, and Aquamark needs
an 04 update as well.
 

Blig Merk

rms said:
The biggest new feature seems to be Displacement Mapping, which could
make surfaces look more real. Bump-mapped surfaces look flat close up, but
DM should improve on that.

Other than that, PS3.0 seems to be mostly about internal improvements.

No, this is what people in the ATI camp are saying. The strange thing
about this issue is that Nvidia has produced several videos of the
new features of Shader Model 3.0 (PS 3.0 and VS 3.0 have been wrapped
up into SM 3.0), while ATI has been strangely absent, with only verbal
announcements that PS 2.0 is the same as PS 3.0. There are NV40 tech
demos detailing more SM 3.0 features, but they aren't gee-whiz-wowee
scenes (maybe because those take a lot of work to put together). They
do illustrate more features of SM 3.0, though. Here is one (6MB)
showing multiple lighting effects in SM 3.0:

ftp://download.nvidia.com/developer/Movies/NV40-LowRes-Clips/Many_Lights.avi

The fact that SM 3.0 supports HDRI SSS (Sub-Surface Scattering) at
many frames per second presents some extremely interesting future
prospects.
 

Nada

Mr. Grinch said:
I'm still wondering what the real differences between PS2.0 and 3.0
will be.

From what ATI has said, I get the impression they look the same.

Listening to what nVidia has said in one interview about the 6800, what
I'm hearing is not that it looks any different, but that it has more
branching / decision-making ability, which makes things easier for
developers and, in theory, allows the shader code to be smarter /
faster. So not necessarily better looking, but faster.

Of course, then I would expect ATI to claim they can be just as fast
with PS2.0, although I don't know if they can claim to have the same
flexibility.

So is it a question of Image Quality? Speed? Or ease of use and
flexibility for the developer?

In the past, developers have definitely had complaints about both
vendors, from driver issues to code paths. They've had to make
exceptions and write vendor-specific paths. Most recently, it seemed ATI
was the easier card to develop for: they had buggy drivers but in the
end required less in the way of specific optimization for a given code
path. Maybe nVidia has learned and intends to pull off a win this time
around.

I'm still waiting for Doom 3 to come out before I buy a new system. The
wait is killing me.

I'm already a withering corpse waiting for "Duke Nukem Forever". It's
almost like waiting for Van Halen to reunite with David Lee Roth.

I think nVidia's ace is more the excellent speed and power of the
6800 Ultra and its little brothers than the visual tricks in upcoming
games that still haven't shown up. "Half Life 2" is constantly being
pushed back, and it will take at least a year before "Unreal 3" is out.
 

noman

ATI thinks they have the edge in core and clock speed, and they are
saying that is going to make up for some admitted feature omissions;
Nvidia is saying they have the additional features that are going to
make up for a slightly slower core and clock speed.

Actually, the improvements that SM3.0 brings (for gamers) are almost
all related to performance and speed, and even that needs to be fully
tested, because a) some of the SM3.0 programming features, like
branches and texture look-ups in VS3.0, may carry a performance hit,
and b) the hardware design (number of registers, FP24 or FP32, etc.)
will dictate which hardware can run a certain shader faster.

The issue right now is that nVidia, with a likely slower core/clock
speed, will try to stay competitive with ATI because 6x00 cards may be
able to do certain shader operations in a single pass that will take
longer on X800 cards.

Don't forget that you can't just arbitrarily pick huge shaders that'll
put the GPU under a lot of pressure. Most of the FarCry shaders (the
ones demoed by nVidia to show PS3.0 effects, which turned out to be
PS2.0 effects anyway) are hardly 10 or so instructions long.

You can read this interview with CryTek's CEO:
http://www.pcper.com/article.php?aid=36

I am just a little bit wary of nVidia's marketing tactics. With NV3x
they came out touting the CineFX and FP32 features (which ATI still
doesn't have, even with R420), but it turned out that the performance
hit was non-trivial on any NV3x card, and they had to drop the
precision or fall back to DX8 shaders (in games like Far Cry or the
upcoming HL2), making those features practically worthless for the
NV3x series.

I really hope that this time nVidia has brought technology that
doesn't prove to be a handicap. My ultimate interest in this debate is
to see fierce and lively competition between ATI and nVidia, so the
prices go down and I can get the card with the best performance-to-price
ratio, whether it's from ATI or nVidia or someone else.

My 2¢
 

Asestar

prices go down and I can get the card with the best performance-to-price
ratio, whether it's from ATI or nVidia or someone else.

Someone else... ? If only there was someone else... SOB.. 3dfx ... sob..
 
