SLI is back

Tim O

Chip said:
How true!!!

I can't believe how obsessed the industry has become with resolution. And
yet it's almost completely irrelevant.

Take your average "The Matrix" DVD. A mere 852x480 pixels. And yet did
anyone ever say it didn't look fantastic?

What we need is more polygons and better lighting effects, not more
pixels. 1280x1024 exceeds even HDTV standards and is *more* than enough!

This has some merit (motion blur is one of the ingredients to making
lower resolutions look good), but you also have to consider that you
sit 10 to 15 feet from a TV and less than two feet from a PC monitor.
It's fairly easy to pick out artifacting and other issues with DVDs
by either playing them on the PC or moving in closer to the TV. A 1080i
HDTV broadcast also makes most DVDs look pretty poor if viewed back
to back.

I still agree with what you're saying. I'm still playing Far Cry at
1024x768 with all details on, preferring the lower resolution and more
detail to cranking the resolution up and having to sacrifice world detail.
 
Asestar

My line goes like:
some S3 pci, Matrox Mystique, Voodoo2, Voodoo2SLI, Ati RageXL, Voodoo3
3000agp, Kyro2 64mb, Gf2 gts 64mb (died), radeon 8500le (died), mobility
9000, 9600se, mobility 9600 pro..
 
JLC

Destroy said:
Depends a lot on the hardware it's being viewed on, however. HUGE difference
between 1600x1200 and 800x600 on a 21 inch monitor. Resolution is very
important.

Absolutely. A TV screen and a PC monitor are not the same thing at all. I
only have a 19" monitor but 800x600 looks bad on it. I can see the dot pitch
or whatever you'd call it when I run a low res. 1024x768 is what I run all
my games at, and I run my desktop at 1280x1024. JLC
 
Destroy

It sounds like an experiment alright, a marketing experiment. Being a high
end vendor I'm sure Alienware is always looking for something to distinguish
themselves from the pack. It doesn't have to be revolutionary, just unique
and intriguing enough to justify their inflated price point. There are
always people who will blindly spend more because they think they're getting
something for it. To me it just sounds like something Alienware is using to
enhance their image as an innovative, exotic PC vendor.

Here is more on how it works, ripped from
http://www.pcper.com/article.php?aid=43
--------
Although the first ALX systems certainly seem impressive, it is the
Video Array system found on the following model which has the potential
to take performance to a whole new level. In short, the patent pending
Video Array is a system which utilizes two PCI-Express graphics cards to
maximize performance. Although seeming alarmingly similar to the famous
SLI technology found on Voodoo cards of old, the new Video Array system
takes a totally new approach. Here, each graphics card is responsible
for rendering a specific portion of the screen. Typically, the screen is
divided into two horizontal halves. The job of rendering each portion is
then dictated to the appropriate card by a “video merger hub”. This
hardware component is able to take signals from the default video
drivers and can allocate responsibilities to either graphics card.
Overall, the separation of workload is hoped to increase performance
more than 40% over typical platforms in most applications. This
performance advantage is said to increase according to how taxing the
application may be. Doom 3 and Half-Life 2 fans rejoice as this new
system seems custom tailored to these titles.
---------
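
Just to make the split-frame idea concrete, here is a rough Python sketch of
how a scheme like the one described above could divide the work between two
cards and stitch the halves back together. It's purely illustrative - a toy
with a fixed 50/50 split - and not Alienware's actual Video Array code.
--------
# Toy sketch of split-frame rendering: two "cards" each draw one horizontal
# band of the frame and a merger recombines the halves.

WIDTH, HEIGHT = 1600, 1200

def render_band(card_id, y_start, y_end):
    """Pretend to render scanlines y_start..y_end-1 on one card."""
    # Each "pixel" is just a tag saying which card produced it.
    return [[card_id] * WIDTH for _ in range(y_start, y_end)]

def merge(top_half, bottom_half):
    """The 'video merger hub' step: stitch the two bands into one frame."""
    return top_half + bottom_half

split = HEIGHT // 2
frame = merge(render_band(0, 0, split),        # card 0: top half
              render_band(1, split, HEIGHT))   # card 1: bottom half

assert len(frame) == HEIGHT and len(frame[0]) == WIDTH
--------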
 
Allan Sheely

Destroy said:
Here is more on how it works, ripped from
http://www.pcper.com/article.php?aid=43
--------
Although the first ALX systems certainly seem impressive, it is the
Video Array system found on the following model which has the potential
to take performance to a whole new level. In short, the patent pending
Video Array is a system which utilizes two PCI-Express graphics cards to
maximize performance. Although seeming alarmingly similar to the famous
SLI technology found on Voodoo cards of old, the new Video Array system
takes a totally new approach. Here, each graphics card is responsible
for rendering a specific portion of the screen. Typically, the screen is
divided into two horizontal halves. The job of rendering each portion is
then dictated to the appropriate card by a “video merger hub”. This
hardware component is able to take signals from the default video
drivers and can allocate responsibilities to either graphics card.
Overall, the separation of workload is hoped to increase performance
more than 40% over typical platforms in most applications. This
performance advantage is said to increase according to how taxing the
application may be. Doom 3 and Half-Life 2 fans rejoice as this new
system seems custom tailored to these titles.

Oh yeah, I'm going to buy two X800s to gain 40% - NOT.
 
Chip

JLC said:
Absolutely. A TV screen and a PC monitor are not the same thing at all. I
only have a 19" monitor but 800x600 looks bad on it. I can see the dot pitch
or whatever you'd call it when I run a low res. 1024x768 is what I run all
my games at, and I run my desktop at 1280x1024. JLC

Where did you guys get 800x600 from????? I said 1280x1024 was plenty, not
800x600. Of course 800x600 will look like crap on a 19" monitor.

Chip
 
Chip

Tim O said:
This has some merit (motion blur is one of the ingredients to making
lower resolutions look good), but you also have to consider that you
sit 10 to 15 feet from a TV and less than two feet from a PC monitor.
It's fairly easy to pick out artifacting and other issues with DVDs
by either playing them on the PC or moving in closer to the TV. A 1080i
HDTV broadcast also makes most DVDs look pretty poor if viewed back
to back.

I still agree with what you're saying. I'm still playing Far Cry at
1024x768 with all details on, preferring the lower resolution and more
detail to cranking the resolution up and having to sacrifice world detail.

I agree. I think 1280x1024 is enough, and many of today's cards will run that
resolution pretty well with 4xAA. 1600x1200 is over the top, imho.

Don't get me wrong, in an ideal world we'd all have cards that ran games
with all the features on at 1600x1200 or higher. Why not? It's just that I
think screen resolution is no longer the most important feature. When
we were all struggling to play Quake at 400x300 and it needed a Pentium Pro
to make it run OK at 640x480, the resolution made a *world* of
difference. But we've gone beyond that now. Does 1600x1200 look so much
better than 1280x1024? With 4xAA and 8xAF? I would say not.

Does HL2 look so much better than HL1? YES. Does Unreal3 look so much
better? YES. And why? Because these new games use more sophisticated
lighting effects and push many more polygons. HL2 at 1280x1024 will look
*miles* better than HL1 at 1600x1200!

Chip
 
Chip

I have no problem with the "separation of the work" idea.

What I have a problem with is the "display of the results"! Given that you
can only attach the monitor to one card, how are you supposed to get the
output from the other card onto the screen? The only way is back across the
PCI-Express bus, whose bandwidth is nowhere near large enough to compete
with the 256-bit, 1000MHz+ bandwidth of the cards' video memory.
PCI-Express will be a big bottleneck!

Chip
 
Andrew

Chip said:
What I have a problem with is the "display of the results"! Given that you
can only attach the monitor to one card, how are you supposed to get the
output from the other card onto the screen? The only way is back across the
PCI-Express bus, whose bandwidth is nowhere near large enough to compete
with the 256-bit, 1000MHz+ bandwidth of the cards' video memory.
PCI-Express will be a big bottleneck!

But it doesn't need to keep up with the video memory. Each card only
has to render half a frame, and the output of one card goes to
the other, where the data is combined. By my calculations, based on
1600x1200x32 @ 100fps, that requires ~288MB/sec, which is a fraction of
the PCI Express bandwidth.
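
A quick back-of-the-envelope check on that transfer (a rough sketch only -
the ~4GB/s figure is the nominal bandwidth of a first-generation PCI Express
x16 slot, and the per-pixel sizes are assumptions: 24bpp matches the
~288MB/sec quoted above, while a full 32bpp comes out a bit higher):
--------
# Rough bandwidth estimate for sending half a frame per refresh across the
# bus. Frame size and fps are taken from the post above.

width, height, fps = 1600, 1200, 100
half_frame_pixels = width * (height // 2)

for bits_per_pixel in (24, 32):
    bytes_per_sec = half_frame_pixels * (bits_per_pixel // 8) * fps
    print(f"{bits_per_pixel}bpp: ~{bytes_per_sec / 1e6:.0f} MB/s")

# 24bpp: ~288 MB/s    32bpp: ~384 MB/s
# Either way it is a small fraction of PCIe x16's ~4000 MB/s per direction.
--------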
 
Xocyll

Chip said:
How true!!!

I can't believe how obsessed the industry has become with resolution. And
yet it's almost completely irrelevant.

Take your average "The Matrix" DVD. A mere 852x480 pixels. And yet did
anyone ever say it didn't look fantastic?

What we need is more polygons and better lighting effects, not more
pixels. 1280x1024 exceeds even HDTV standards and is *more* than enough!

The thing is, they aren't just getting obsessed with it now; they always
have been.

It's just that in recent years video cards (and CPUs) have become fast enough
to run those resolutions at decent speed, where it would have been a slide
show or nothing on older hardware.

Combine that with the trend toward bigger monitors, where low resolutions
look blocky.

Combine that with the fact that displaying a higher resolution _without_
AA may yield a better and smoother picture than low res with AA, and at
better framerates.

I know games like Diablo (640x480) looked great on a 14" monitor, OK on
a 17" monitor, and like crap on a 22".

Lord only knows how bad something like one of the early Wing Commander
games in 320x200 MCGA would look on a 22".


TV is a bad comparison, since you sit 6+ feet away from the TV, but only
2 feet from the monitor. Get too close to a TV and it usually looks
pretty crappy, and more importantly, TV is non-interactive.

Xocyll
 
Xocyll

Allan Sheely said:
Personally, I would never want dual vid cards, just as I wouldn't want
dual CPUs. And there are two reasons for that: two means twice as much
heat and twice as much noise.

If you can effectively silence one, you can silence two.

There are lots of effective yet quiet cooling solutions, but they aren't
the cheaper ones.

Having as much heat as a modern video card produces, doubled and in close
quarters, could be a problem, but that could be dealt with easily enough
by good case/airflow design.

Xocyll
 
Xocyll

Chip said:
I have no problem with the "separation of the work" idea.

What I have a problem with is the "display of the results"! Given that you
can only attach the monitor to one card, how are you supposed to get the
output from the other card onto the screen? The only way is back across the
PCI-Express bus, whose bandwidth is nowhere near large enough to compete
with the 256-bit, 1000MHz+ bandwidth of the cards' video memory.
PCI-Express will be a big bottleneck!

Presumably it's going to go something like this:

The game sends its output to the video system.
The video merger hub splits the work and sends half to each card.
Each card processes its share and outputs back to the hub - or to a second
"hub" that does nothing but merge the halves and keep things in sync.

Whether that happens through the bus or through some kind of adaptor that
links the normal card outputs into a merger hub with an output to the
monitor, we'll have to wait and see.
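
As a sketch of that split/render/merge-and-sync flow (threads standing in
for the two cards and a queue standing in for whatever link feeds the merger
hub - none of this reflects Alienware's real design):
--------
# Toy pipeline: split the frame, render the halves in parallel, then have a
# "merger hub" wait for both before stitching them together.

import threading, queue

HEIGHT = 1200
results = queue.Queue()

def card(card_id, rows):
    # "Render" a band of rows and hand it back to the merger.
    results.put((card_id, [f"row {y} from card {card_id}" for y in rows]))

def merger_hub():
    # Wait for both halves - this is the sync step - then merge them in order.
    halves = dict(results.get() for _ in range(2))
    return halves[0] + halves[1]

top = threading.Thread(target=card, args=(0, range(0, HEIGHT // 2)))
bottom = threading.Thread(target=card, args=(1, range(HEIGHT // 2, HEIGHT)))
top.start(); bottom.start()
frame = merger_hub()
top.join(); bottom.join()
assert len(frame) == HEIGHT
--------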

It sounds like it could work, but I doubt it will ever work well enough
to make it worth paying the extra cash for a system with it and dual
cards.

Might be worth it later on when prices drop, so you can use last-generation
video cards and keep up with current-generation stuff.

Xocyll
 
Arnaldo Horta

Allan Sheely said:
Personally, I would never want dual vid cards, just as I wouldn't want
dual CPUs. And there are two reasons for that: two means twice as much
heat and twice as much noise.

If you read the description of the Alienware setup, you will see that
they are liquid-cooling their two-card rigs, so there's no noise and
little heat....
 
JLC

Arnaldo Horta said:
If you read the description of the Alienware setup, you will see that
they are liquid-cooling their two-card rigs, so there's no noise and
little heat....

You guys should check out this link: http://tinyurl.com/22pt3. Not only do
they have an Alienware box with the two-card setup, they also have a
short vid clip from E3 showing it in action. So for all the guys who think
it's not going to work (I was one of them), it does. Now how well it's
really going to work is another matter. JLC
 
Allan Sheely

Arnaldo Horta said:
If you read the description of the Alienware setup, you will see that
they are liquid-cooling their two-card rigs, so there's no noise and
little heat....

OK, thx. Man, these rigs are going to cost big bucks though, so I still
won't be getting one.
 
Asestar

Not to forget that even the Voodoo5 was a kind-of SLI setup, two GPUs and
all.. The Voodoo5 6000 remains, as of yet, the only card ever built with
4 GPUs!!

As for sync, I'm more in doubt about this so-called "Alienware-developed
software". Think about having to install two display drivers, plus this
thing. Too risky, as there are a thousand possible config conflicts..
 
