NVIDIA Announces 16x Anti-Aliasing For SLI


Guest

http://www.anandtech.com/video/showdoc.aspx?i=2442
NVIDIA Announces 16x AA For SLI
In response to ATI's CrossFire launch, NVIDIA has revisited a few points
about their own solution, as well as revealed some sneak peeks into the
future of their SLI technology.

Of course, one of the key points NVIDIA wanted to make was that their game
support is no less than ATI's. Initially, games either needed profiles set
up in order to run, or users had to know how to hack the NVIDIA XML file.
NVIDIA is now offering the ability to enable user-selected SLI modes for
games that do not have profiles. Profiles will take precedence over
user-selected modes, but even games whose profiles disable SLI will allow
the user to force it on.

Their other real point of contention is ATI's claim to add quality options
where NVIDIA does not. As we know, ATI is enabling 10xAA and
14xAA options for games that don't see any real benefit from SLI otherwise.
In order to top the announcement that ATI made, NVIDIA has revealed that
they are planning on bringing out a 16xAA mode via SLI in a driver to be
launched in early July.

We haven't gotten as much detail about this implementation as we currently
have on ATI's AA modes. We don't know what the final sample point pattern
will look like, but NVIDIA has said that they will provide this detail when
they finalize it themselves. We do know that, regardless of what NVIDIA
decides, their 16x mode will be a combination of 4x multisampling and 4x
supersampling. The current debate is whether to implement
supersample AA via an increased resolution or by rendering the scene 4 times
with each rendering being slightly shifted. The advantage of the latter
method is that rotated grid SSAA can be used, but the disadvantage is that
the geometry load would be increased. NVIDIA has told us that they can do
either method but haven't decided which to settle on.
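The jittered-render approach can be sketched in a few lines. This is a toy Python illustration of the idea only, not NVIDIA's implementation: the "scene" here is a hypothetical 1-D coverage function, and each "render" samples it at a slightly shifted position before the results are averaged.

```python
# Toy sketch of supersampling via shifted re-renders (illustrative only;
# the function names and offsets are made up, not NVIDIA's sample pattern).

def coverage(x, edge=0.5):
    """Hypothetical 1-D scene: 1.0 inside the polygon, 0.0 outside."""
    return 1.0 if x < edge else 0.0

def supersample(pixel_center, offsets):
    """Render the scene once per offset and average the results."""
    renders = [coverage(pixel_center + dx) for dx in offsets]
    return sum(renders) / len(renders)

# Four shifted renders; a rotated-grid pattern would use offsets that do
# not line up on the pixel axes, which is the advantage noted above.
offsets = [-0.375, -0.125, 0.125, 0.375]
print(supersample(0.5, offsets))  # edge pixel: 0.5 instead of a hard 0 or 1
```

The cost noted in the article shows up here too: each extra offset is a full re-render of the scene, geometry included.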

Why does 4xSS plus 4xMS equal 16xAA? Because each supersample point
contains 4 multisample points, giving us 4 times the multisample points. The
other advantage is that SSAA applies to the entire scene, so we get 4xSSAA
applied to parts of the scene that would see no benefit from multisampling
(the interior of polygons and textures).
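The multiplication works out as simple arithmetic, which a short sketch (illustrative only, not driver code) makes explicit:

```python
# Why 4x supersampling combined with 4x multisampling is billed as 16x:
# every supersample position is itself multisampled, so the counts multiply.
ss = 4                  # supersample positions (whole-scene re-renders)
ms = 4                  # multisample points per supersample position
edge_samples = ss * ms  # effective samples on polygon edges
print(edge_samples)     # 16
# Polygon interiors and textures benefit only from the 4 supersamples,
# since multisampling affects polygon edges alone.
```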

This mode will not be a simple combination of two scenes rendered with the
current 8xAA, but rather each card will render 4xSS + 4xMS. For alternate
and split frame rendering, each card will be doing full 16xAA. This may also
give us a glimpse into the future as each generation of graphics cards
continues to increase in power. Doing full 16xAA on each card means we
could see this
order of AA running on a single card in a generation or so.

We are definitely interested in testing this mode when it comes along early
next month.

Mike B

16x AA?? how stupid is that? who in hell needs 16xAA? 4x is quite enough for
any resolution, and no more is needed at high resolutions over 1152x864.
what a wasted marketing ploy, when they could be working on some real
innovations. even to this day, what we have are effects that only better
what already existed with the r300. when are we going to get something
totally new and exciting again?
 
Non_Sequitur

Mike said:
16x AA?? how stupid is that? who in hell needs 16xAA? 4x is quite enough for
any resolution, and no more is needed at high resolutions over 1152x864.
what a wasted marketing ploy, when they could be working on some real
innovations. even to this day, what we have are effects that only better
what already existed with the r300. when are we going to get something
totally new and exciting again?
Why stop at 16? Why not 512x AA? I guess we have to wait for 32x, 64x,
128x, and 256x first, huh?
 
Gordon

Skybuck Flying said:
Why not use a blurry monitor instead of Anti Aliasing ? ;)
For me, just using an LCD monitor in a non-native resolution has a similar
effect but without the performance hit.
 
Cory Dunkle

Why not just run a higher resolution? I've hated AA from the start. It ruins
image quality. If you are that concerned about image quality as to buy an
insanely expensive SLI graphics card setup and a system to support it
properly then you can surely afford a good monitor that supports a decent
resolution, or you could only buy 1 video card, spend the money on a better
monitor, and forget about the AA. It's all a marketing ploy to make people
think nVidia is 'bigger and badder' since they have a higher number. I think
there are probably better ways nVidia could spend their developing time.

Raymond Martineau

Why not just run a higher resolution?
Monitor and/or system limitations. If you have more pixels on a physical
screen, it's more data per second that has to be transferred to the monitor
(unless you cut refresh rates) - I'm not sure which limit will be hit
first, but it will be a problem sooner or later.
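The pixels-per-second point is easy to quantify with a rough calculation (this sketch ignores blanking intervals, so real scan-out rates run somewhat higher):

```python
# Back-of-the-envelope scan-out rate: more pixels or a higher refresh rate
# means more data per second to the monitor (blanking intervals ignored).
def pixel_rate(width, height, refresh_hz):
    return width * height * refresh_hz

print(pixel_rate(1152, 864, 85))   # 84,602,880  (~85 Mpixels/s)
print(pixel_rate(1600, 1200, 85))  # 163,200,000 (~163 Mpixels/s, nearly 2x)
```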
I've hated AA from the start. It ruins image quality.
"Ruining" image quality can be a good thing. While AA may look worse in
still shots, the jagged staircase and moire effects generally cause a much
greater problem when attempting animation.

There are some people who still notice jaggies at 1600x1200. Usually, they
have at least a 19" monitor.
It's all a marketing ploy to make people
think nVidia is 'bigger and badder' since they have a higher number.
Well, yes. Right now the bottleneck is the video card, and maxing out
performance will fix this problem for a while.