Nvidia SLI, SLI's back with a vengeance


SLIisBACK

http://media.hardwareanalysis.com/articles/large/11206.jpg
http://media.hardwareanalysis.com/articles/large/11208.jpg
http://media.hardwareanalysis.com/articles/large/11207.jpg

http://www.hardwareanalysis.com/content/article/1728/

____________________________________________________________________________
Nvidia SLI, SLI's back with a vengeance Jun 28, 2004, 07:30 AM

By: Sander Sassen

I'm sure many of you remember the days of 3dfx: the first Voodoo
Graphics card back in 1996 and, about a year later, the introduction of
the Voodoo2. The Voodoo2 ensured that 3dfx reigned supreme for quite
some time, as two cards could be combined in a so-called SLI, Scan Line
Interleave, configuration. Each card rendered half of the image's scan
lines, which resulted in double the performance of a single board and
the ability to play OpenGL games such as Quake 2 at 1024x768. To date
no manufacturer has come up with a similar concept, simply because
modern graphics accelerators are all AGP based: there are no dual-AGP
motherboards, and PCI simply doesn't have the bandwidth to handle
modern graphics accelerators. With the arrival of PCI-E things have
changed, though; a number of workstation motherboards featuring the
Tumwater chipset will have dual PCI-E x16 slots, making dual graphics
accelerators a possibility again. Nvidia steps up to the plate today
with the re-introduction of the SLI concept on the GeForce 6800 series,
again using the SLI moniker but now with a different approach to the
same principles that made Voodoo2 SLI a huge success.




Two PCI-E GeForce 6800 Ultra graphics cards running in an SLI
configuration.

Whereas Voodoo2 SLI used a ribbon cable internally between the two
Voodoo2 cards and a pass-through VGA cable externally to distribute the
analog signal, Nvidia's implementation is done entirely in the digital
domain. Both 6800 series PCI-E cards are connected by means of an SLI,
Scalable Link Interface, dubbed the MIO port: a high-speed digital
interconnect that attaches to a connector on top of both cards. Through
this MIO port the two cards communicate with each other and distribute
the workload, aided by dynamic load-balancing algorithms. In essence
the screen is split into two sections: one graphics card renders the
upper section and the second renders the lower section. The
load-balancing algorithms, however, allow the workload to shift between
the graphics processors; initially both start out at 50%, but this
ratio can change depending on the load. Although Nvidia has remained
tight-lipped about what exactly makes their SLI implementation tick, it
is clear that both hardware and software contribute to making SLI work.
Most of the dynamic load balancing between the two graphics processors
is handled in software, and thus SLI needs driver support, drivers
which are as yet unreleased, to work.




The MIO port connector that is used to connect two PCI-E GeForce 6800s
together in SLI.
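Nvidia hasn't said how the dynamic load balancing actually works, but the behavior described above, both cards starting at a 50/50 split with the ratio shifting under load, can be sketched as a simple feedback loop. The sketch below is purely illustrative: the per-scan-line cost estimate, the 0.5 gain and the clamping limits are our assumptions, not anything from Nvidia's drivers.

```python
def rebalance(split, t0, t1, gain=0.5):
    """Nudge the scan-line split toward the point where both GPUs
    finish at the same time.

    split  -- fraction of the screen given to GPU 0 (upper section)
    t0, t1 -- measured render times of GPU 0 and GPU 1 last frame
    """
    c0 = t0 / split          # estimated cost per scan line, upper section
    c1 = t1 / (1.0 - split)  # estimated cost per scan line, lower section
    target = c1 / (c0 + c1)  # split at which both times would be equal
    new_split = split + gain * (target - split)
    return min(max(new_split, 0.1), 0.9)  # always keep both GPUs busy

# Toy scene: the lower section costs twice as much per scan line as the
# upper one, so the balanced split gives GPU 0 two thirds of the lines.
split = 0.5  # both cards start out at 50%
for _ in range(20):
    t0 = split * 1.0          # GPU 0 renders the cheap upper lines
    t1 = (1.0 - split) * 2.0  # GPU 1 renders the expensive lower lines
    split = rebalance(split, t0, t1)

print(round(split, 3))  # settles around 0.667
```

In the real implementation most of this logic lives in the driver, which is why SLI won't work until those drivers ship.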

Exact performance figures are not yet available, but Nvidia's SLI
concept has already been shown behind closed doors by one of the
companies working with Nvidia on the SLI implementation. On early
driver revisions, which offered only non-optimized dynamic
load-balancing algorithms, their SLI configuration performed 77% faster
than a single graphics card. Nvidia has told us, however, that final
performance numbers should show an increase closer to 90% over a single
graphics card. There are a few things to take into account when you're
considering buying an SLI configuration, though. First off, you'll need
a workstation motherboard featuring two PCI-E x16 slots, which also
means using the more expensive Intel Xeon processors. Secondly, you'll
need two identical, same brand and type, PCI-E GeForce 6800 graphics
cards. For workstation users it is a nice extra that an SLI
configuration can drive a total of four monitors off the respective DVI
outputs on the graphics cards, a feature we'll undoubtedly see pitched
as a major selling point for the Quadro version of the GeForce 6800
series SLI configuration.




The high-speed digital MIO port bridge connecting the two PCI-E cards
together.
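To put those scaling percentages in perspective, here is a quick back-of-the-envelope calculation; the 60 fps single-card baseline is a made-up example, not a measured number.

```python
# Hypothetical single-card baseline (example figure only).
baseline_fps = 60.0

early_drivers = baseline_fps * 1.77  # 77% faster on early drivers
target_fps    = baseline_fps * 1.90  # ~90% faster promised by Nvidia
ideal_fps     = baseline_fps * 2.00  # perfect Voodoo2-style doubling

print(f"early drivers: {early_drivers:.1f} fps")  # 106.2 fps
print(f"target:        {target_fps:.1f} fps")     # 114.0 fps
print(f"ideal:         {ideal_fps:.1f} fps")      # 120.0 fps
```

Even the early, non-optimized drivers get within about 12% of a perfect doubling, which is why two cheaper cards in SLI can outrun any single card.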

The dual PCI-E x16 motherboard will, however, mean a significant
investment; two PCI-E GeForce 6800GT cards could make more sense than a
single PCI-E GeForce 6800 Ultra or Ultra Extreme, as the performance
increase will be much larger. Also, workstation motherboards run at a
hefty price premium over consumer products; fortunately they do not
require dual Xeons, as a single Xeon will work just as well. All in
all, Nvidia's SLI implementation brings back fond memories of the 3dfx
days and has all the right ingredients to once again revolutionize 3D
graphics, provided you're willing and able to pay the hefty price tag
associated with it. Unlike Voodoo2, there's no simple upgrade to double
your 3D performance; apart from a second PCI-E GeForce 6800 you'll need
a new motherboard, memory and CPU(s). That doesn't do much to dampen
our spirits though; the best 3D performance available comes at a price,
much like driving a Porsche or Ferrari, and it doesn't come cheap.
Kudos to Nvidia for once again raising the bar and making the hearts of
many gamers rejoice; SLI is back, and with a vengeance.

Sander Sassen.
____________________________________________________________________________


I can't wait to see ATI's response to this. MAXX could be back!

Don't forget that ATI has had the ability to scale up to 256 R300
Radeon 9700 VPUs since 2002. Both E&S and SGI have taken advantage of
this. Now hopefully consumers can get in on the fun.
 

Unzar Jones

SLIisBACK said:
concept on the GeForce 6800 series

What CPU(s) can keep up with it?
Should the electrician connect it straight to Edison's substation, or
will any gas-powered generator do?
What will this over-the-top money pit be worth a year after purchase?

I guess I'll pass.
 

chrisv

SLIisBACK said:
I can't wait to see ATI's response to this. MAXX could be back!

don't forget that ATI has had the ability to scale upto 256 R300-Radeon 9700
VPUs
since 2002. Both E&S and SGI have taken advantage of this. now hopefully
consumers can get in on the fun.

Learn how to post, dorky.
 

Dark Avenger

<snip>


I can't wait to see ATI's response to this. MAXX could be back!

don't forget that ATI has had the ability to scale upto 256 R300-Radeon
9700 VPUs
since 2002. Both E&S and SGI have taken advantage of this. now
hopefully consumers can get in on the fun.

The military already uses hooked-up ATI cards for their simulators...
yup, multiple R300 cores together for their simulators!

ATI can do it... but it's damn expensive... thus only the army has
bought it!
 

FatDaddy

THIS IS A JOKE

3DFX?

REMEMBER VOODOO 6000

NO ONE WILL MAKE A 16X PCI DUAL SLOT MOTHER BOARD OTHER THEN SERVER SYSTEMS
AND YOU WILL NEED 800 WATTS POWER
O WOW
NVIDA HAS LOST i ALREADY HAVE MY ATI X800 XT PRO INSTALLED AND RUNNING FOR 2
WEEKS AND NOONE CAN EVEN GET AN ULTRA FROM NVIDIA
http://public.fotki.com/Tejas/
 

FatDaddy

I was being loud sorry

Mickey Johnson said:
I think your ati card has corrupted your keyboard. It seems that when the
ati driver is installed, the shift key is automatically pressed.



--
Mickster

Visit my website and see my arcade!!

http://mickster.freeservers.com

____________________________________________________________________________
 

Dirk Dreidoppel

I was being loud sorry

And you are wrong. There will be dual PCI-E 16X boards. Alienware are
preparing to launch machines with this very setup.
 

J. Clarke

Dirk said:
And you are wrong. There will be dual PCI-E 16X boards. Alienware are
preparing to launch machines with this very setup.

I'm still waiting for my PCI/Microchannel board from Zeos. Believe one of
these small vendors will deliver an "innovative" product when you see it.

Note by the way that the Alienware board is not going to support arbitrary
video boards--according to their press release it is going to be tied to
specific models and combines the video using a third board. In other words
they've done something nonstandard.
 

Redbrick

____________________________________________________________________________
Nvidia SLI, SLI's back with a vengeance Jun 28, 2004, 07:30 AM

<snip>

I never did like the SLI configuration...took up two PCI slots
and kept you from installing anything else....all for what???

....1024x768?? You've got to be kidding???

Why not make a long PCI card with multiple snap-on GPU daughter cards,
like a RAM extender?

That would save us the few and precious PCI-E slots.

So I guess with so many extra GPU cards...they'll all need two dedicated
molex connectors?

I think I'll pass on that...


Redbrick..who Loves his CLK
 

John Lewis

THIS IS A JOKE

3DFX?

REMEMBER VOODOO 6000

NO ONE WILL MAKE A 16X PCI DUAL SLOT MOTHER BOARD OTHER THEN SERVER SYSTEMS
AND YOU WILL NEED 800 WATTS POWER
O WOW
NVIDA HAS LOST i ALREADY HAVE MY ATI X800 XT PRO INSTALLED

Where ?

Does it fit ?

Your mouth seems too big for it, since you have kept repeating the
glorious news that you have some sort of X800 for the past 2 weeks.

Ati claim to make the following:-
X800 Pro
X800 XT
X800 XT Platinum.

Don't see a X800 XT Pro anywhere.............

Anyway, if the X800 ?? does not fit your mouth, there are other
quite suitable apertures

John Lewis
 

Larry Roberts

By the time this setup's price would come into most game
players' budgets, the next great videocard chipset would hit the
market, boosting performance to rival or surpass the dual-card setup.
Unless the mainboard manufacturers start producing desktop mainboards
with dual PCI-X slots, I don't see this as something that will take
off in the home user market.
Hell... by the time dual 12MB Voodoo2 SLI cards came into my
price range, the GF2 cards were dominating, and the GF3 cards were
about to hit the market.
 

Icer

That would save us the few and precious PCI-E slots.

AFAIK the PCI-X slots will only be fully utilized by video cards
anyway, which is why there would normally only be one slot. With
today's motherboards I only use the AGP slot and *maybe* one PCI slot
for a high-end sound card; everything else is built in.
So I guess with so many extra GPU cards...they'll all need two dedicated
molex connectors?

Two cards, one connector between them... that's it.

G Patricks
 

Folk

<snip>

I never did like the SLI configuration...took up two PCI slots
and kept you from installing anything else....all for what???

...1024x768?? You've got to be kidding???

Why not make a long PCI card with multiple GPU daughter card snap-on like a
ram extender.

That would save us the few and precious PCI-E slots.

So I guess with so many extra GPU cards...they'll all need two dedicated
molex connectors?

I think I'll pass on that...


Redbrick..who Loves his CLK

What are you saving those PCI slots for anyway? The only thing
currently occupying my five PCI slots is a sound card. In these days
of onboard NICs, onboard RAID, etc., there is less of a need for PCI
slots.
 

R420

Larry Roberts said:
By the time this setup's price would come into most of the
game player's budget, the next great videocard chipset would hit the
market, boosting performance to rival, or surpass the dual card setup.
Unless the mainboard manufactuers start producing desktop mainnboards
with dual PCI-X slots, I don't see this as something that will take
off in the home user market.
Hell... By the time dual 12MB Voodoo2 SLI cards came into my
price range, the GF2 cards where dominating, and the GF3 cards where
about to hit the market.

I don't see why I should spend $1000 or more for two GeForce 6800s
when I will be able to get an NV50 next year with similar or better
performance and newer features for about half the price.
 

J. Clarke

Folk said:
What are you saving those PCI slots for anyway? The only thing
currently occupying my five PCI slots is a sound card. In these days
of onboard NIC's, onboard RAID, etc., etc., there is less of a need
for PCI slots.

It's not a "PCI slot", it's a "PCI _Express_" slot. There's a difference.
 

John Lewis

<snip>


See the new Anandtech article:
http://www.anandtech.com/video/showdoc.html?i=2097

Not the SLI that we knew and loved, but a totally different
patented technique for sharing GPU loads in an intelligent manner.
They have the rights, inherited from 3dfx, to the name "SLI" for
GPU load-sharing.

It is pretty obvious that nVidia's current dual-board "SLI" exercise
is a Trojan horse to test the functionality of the GPU shared port and
build optimizing drivers/compilers in preparation for multiple GPUs on
one board, a la the 3dfx V5 5500. This will become a reality with the
current round of design exercises at both nVidia (and ATI) to shrink
the 6800 (and X800) onto a smaller-geometry process, probably 110nm,
and later 65nm. For nVidia, the resulting reduction of the power per
chip will certainly make a dual GPU per board a reality, or maybe even
a dual GPU on a hybrid substrate.
Parallel graphics processing is becoming a mandatory requirement.
Process shrinks are not occurring fast enough to keep up with
the signal-processing demands. Parallelism is the way to keep
the chip yield up and the on-chip dissipation per unit area within
reasonable bounds.

Intel is actively working on dual-core parallel CPU architectures
too, beyond HT. They have realised that they are getting near
the limits of pushing clock rate, and have to come up with other
solutions.

John Lewis
 

Dark Avenger

By the time this setup's price would come into most of the
game player's budget, the next great videocard chipset would hit the
market, boosting performance to rival, or surpass the dual card setup.
Unless the mainboard manufactuers start producing desktop mainnboards
with dual PCI-X slots, I don't see this as something that will take
off in the home user market.
Hell... By the time dual 12MB Voodoo2 SLI cards came into my
price range, the GF2 cards where dominating, and the GF3 cards where
about to hit the market.

Well, that indeed is a point... why spend so much money... I guess it's
for bragging rights... to say that you are the fastest on the market
FOR THAT MOMENT.

Yup, I know people like that... gamers... hardcore gamers... go for
nothing less than the fastest... and see their money's worth plummet
in mere months...

Myself, well, I buy products when they are at a good price... a price I
am willing to pay... I guess the most expensive thing about my rig is
the memory... 1 GB of PC3200 DDR... Kingston also (not HyperX
though)... so even there I didn't invest too much.
 

K

AFAIK the PCI-X slots will only be fully utilized by video cards
anyway, which is why there would only be one slot normally. With
todays motherboards I only use the AGP slot and *maybe* one PCI slot
for a high end sound card, everything else is built in..

Be careful, PCIe and PCI-X are entirely different, though I can easily see
why people would want to call it PCI-X.

K
 
