ATi --- Crossfire "Mark II" will finally abandon the Master-Slave implementation.

John Lewis

ATi is finally doing what they should have done before Crossfire first shipped - integrate the compositor into the silicon of every high-end Crossfire-capable GPU and (a la SLI) symmetrically data-link the GPUs on identical boards or modules.

See:-

http://www.dailytech.com/article.aspx?newsid=909

Seems not a good time to 'invest' in any of the current Crossfire implementations. Orphaned products with very low sales volumes (the current Master/dongle cards) normally get poor long-term technical/software support.

John Lewis
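For readers unfamiliar with what the compositor being discussed actually does: in both SLI and the revised Crossfire, a compositing step merges the output of the two symmetrically linked GPUs, whether they split each frame into bands or alternate whole frames. A toy sketch of the two common modes follows; the function names and data shapes are purely illustrative, not ATi's or nVidia's actual implementation (real compositing happens in hardware):

```python
# Toy illustration of the two common dual-GPU work-division modes.
# Everything here is illustrative; names and structure are invented.

def alternate_frame_rendering(frames, gpu_count=2):
    """AFR: whole frames are dealt out round-robin across GPUs."""
    return {i: frames[i::gpu_count] for i in range(gpu_count)}

def split_frame_rendering(frame_rows, gpu_count=2):
    """SFR: each frame is cut into horizontal bands, one per GPU,
    and the 'compositor' stitches the bands back together."""
    band = -(-len(frame_rows) // gpu_count)  # ceiling division
    bands = [frame_rows[i * band:(i + 1) * band] for i in range(gpu_count)]
    return [row for b in bands for row in b]  # the compositing step

frames = ["f0", "f1", "f2", "f3"]
print(alternate_frame_rendering(frames))  # -> {0: ['f0', 'f2'], 1: ['f1', 'f3']}
```

The point of moving the compositor into every GPU is that either card can perform the stitching step, so the boards can be identical rather than Master/Slave.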
 
Walter Mitty

"John" risked the wrath of Usenet weenies mastering
mommies computer when he ventured forth on 2006-02-21, commmitted
his life to the whims of Google, and spluttered:
ATi is finally doing what they should have done before Crossfire first shipped - integrate the compositor into the silicon of every high-end Crossfire-capable GPU and (a la SLI) symmetrically data-link the GPUs on identical boards or modules.

See:-

http://www.dailytech.com/article.aspx?newsid=909

Seems not a good time to 'invest' in any of the current Crossfire implementations. Orphaned products with very low sales volumes (the current Master/dongle cards) normally get poor long-term technical/software support.

John Lewis

It is a rum day when a company incurs the wrath of Mr Lewis. Like a
dog with a bone he is.
 
John Lewis

"John" risked the wrath of Usenet weenies mastering
mommies computer when he ventured forth on 2006-02-21, commmitted
his life to the whims of Google, and spluttered:


It is a rum day when a company incurs the wrath of Mr Lewis.

Nope. Not the company. Just the asinine management and marketing.

I'm sure that ATI engineering prototyped Crossfire in its present form, showed it to management, and asked, "Please can we integrate the compositor into the GPUs and emulate SLI with identical boards and symmetrical connections?" And marketing/management said, "NO, just ship it for now; some idiots are bound to buy!"

ATi running after nVidia reminds me of today's Intel CPU and chipset groups running behind AMD as the innovator.

Gabe's rear end would be a much easier and juicier target for my dog.... :) :)

John Lewis
 
Magnulus

Is that the biggest problem you could dig up with ATI? Because it
sounds like a non-issue to me. It's not as if SLI is a particularly
sensible deal, either.
 
John Lewis

Is that the biggest problem you could dig up with ATI? Because it
sounds like a non-issue to me.

Sure is an issue if you are somebody not particularly rich but a PC gaming enthusiast: pony up $1000 and three months later find that the dual-card implementation you invested in has become a totally obsolete <architecture> nine months after its introduction. Past history of low-volume obsolete architectures has not been kind in terms of technical support and software updates.
It's not as if SLI is a particularly
sensible deal, either.

Agreed, but 4 million SLI motherboards, 10 million SLI-capable video cards, 500k-1 million dual-card SLI gaming rigs, plus a unified driver architecture do have some weight in terms of long-term support.

John Lewis
 
First of One

For those looking for an eventual dually X1900 setup, the Crossfire master
card, already available, can work as a standalone. Price premium is about
$50. Finding a slave card in the future should not be problematic.
 
Tim O

ATi is finally doing what they should have done before Crossfire first shipped - integrate the compositor into the silicon of every high-end Crossfire-capable GPU and (a la SLI) symmetrically data-link the GPUs on identical boards or modules.

See:-

http://www.dailytech.com/article.aspx?newsid=909

Seems not a good time to 'invest' in any of the current Crossfire implementations. Orphaned products with very low sales volumes (the current Master/dongle cards) normally get poor long-term technical/software support.

John Lewis

Is there anyone that spends a fortune on new video cards that doesn't
realize they're a sucker deal? Something twice as fast is always a
year away.
 
Tim

Is there anyone that spends a fortune on new video cards that doesn't
realize they're a sucker deal? Something twice as fast is always a
year away.
... and doing so on a single card.

Now Dell is releasing a PC with four 7800GTX GPUs. It reminds me of the
razor companies with their number-of-blade wars. Did Gillette buy out Dell
recently?
 
Folk

Sure is an issue if you are somebody not particularly rich but
a PC gaming-enthusiast. pony up $1000 and 3 months later
find that the dual-card implementation that you invested in
had become a totally obsolete <architecture> nine months
after its introduction.

That's kind of how I felt after paying $400 for a 6800 GT (back in the
day) and six months later AGP went the way of the dinosaur.

Not quite the same issue, but painful nonetheless...
 
Magnulus

Tim said:
Now Dell is releasing a PC with four 7800GTX GPUs. It reminds me of the
razor companies with their number-of-blade wars. Did Gillette buy out
Dell recently?

Yes, it is nuts. I've got a 7800 GT (not the GTX) that runs just about
any game out there just fine with ungodly amounts of anti-aliasing (more
than I really need) at 1280x1024. Why would I even need two of them, let
alone four? And certain things, like anti-aliasing of normal maps, just are
not going to be fixed by throwing more hardware at it. Unless you've got a
huge widescreen monitor running games at 1900x1200 or whatever, what's the
need?

It all comes down to software. The ultra-high-end graphics stuff is basically dying on the PC beyond tech demos and the occasional game. So you can get your super-duper SLI graphics and find there is absolutely no reason to own them beyond bragging rights.
 
McGrandpa

Magnulus said:
Yes, it is nuts. I've got a 7800 GT (not the GTX) that runs just about
any game out there just fine with ungodly amounts of anti-aliasing (more
than I really need) at 1280x1024. Why would I even need two of them, let
alone four? And certain things, like anti-aliasing of normal maps, just
are not going to be fixed by throwing more hardware at it. Unless you've
got a huge widescreen monitor running games at 1900x1200 or whatever,
what's the need?

It all comes down to software. The ultra-high-end graphics stuff is basically dying on the PC beyond tech demos and the occasional game. So you can get your super-duper SLI graphics and find there is absolutely no reason to own them beyond bragging rights.

*I'll* be seeing about that one, friend! I can only go up to 1280x1024, as that's my monitor's native res. All of them.

I'm running an X2 4800+, 2 gigs of PC3200, and one (01) 7800GTX 256 (for the moment).
HL2 running the Lost Coast video stress test at 1280x1024 with everything on and highest, with highest image quality, 8x AA, 16x AF, comes back consistently with 70.84 fps.
FEAR, in game, at 1280x968 (the highest mode I can do) with everything cranked up gives me just in the 40s. Hmpfh.
Doom3. Um, I didn't slow down long enough to do a demo for framerates. I set it to Nightmare and went for it. Half a day solid non-stop and I'm over halfway through! Never seen it look so good or play so fast. I have Ultra quality and everything cranked up to highest. I'd guess in the 60s fps.
Then I had a hard drive crap out on me last night. 160 GB worth of data, just flushed. Everything is back UP except for the bad drive. Happily it was just a data drive, and most stuff is backed up already.
Whew.... I can't wait to see how SLI will work for some of the games! Oh, did I mention all this is in XP Pro 64-bit edition? :) The one game that's surprised me the most so far is FEAR. It does *NOT* 'give as much as it gets'.... meaning I think the engine code is not well optimised for either 32-bit mode or 64-bit mode, because it simply does not look as good in-game as HL2 64-bit, Quake4, or Doom3.
I think I should see some serious improvement in framerates in all of these major games with two GTX 256-meg cards (twins! I'm having twins!) in SLI.
McG. :blush:)))
 
McGrandpa

John Lewis said:
Please elaborate with all details. A curious mind wants to know.



John Lewis

Now that in itself has ME curious. Why?

This drive had a SMART error, is 3 years old, and has been run quite hot at times until I realized that it, like all the stuff inside the case, needs cooling. It 'lost' all partition data. I shouldn't have been using it for anything critical. Now it has a different SMART error, something about RAW data rate. It took over 6 hrs to reformat, full drive, single partition, NTFS.
And, it was only half full. Most of my stuff was backed up. Lost a few
days worth of emails.
Well?
McG.
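The "raw read rate" error described above refers to the SMART Raw_Read_Error_Rate attribute. Each SMART attribute carries a vendor-normalized value (higher is healthier) and a failure threshold, and a pre-failure attribute has tripped once its value drops to or below that threshold. A minimal sketch of that check, with hypothetical sample numbers (not the actual readout from this drive):

```python
# Minimal sketch of how SMART pre-failure checks work: each attribute has
# a vendor-normalized "value" and a "threshold"; an attribute has tripped
# when value <= threshold. The sample data below is hypothetical.

def failing_attributes(attributes):
    """Return names of attributes whose normalized value has dropped
    to or below the vendor threshold."""
    return [name for name, (value, threshold) in attributes.items()
            if value <= threshold]

# Hypothetical readout, loosely modeled on smartctl-style attribute tables.
smart_data = {
    "Raw_Read_Error_Rate":   (48, 51),    # below threshold: failing
    "Spin_Up_Time":          (97, 21),
    "Reallocated_Sector_Ct": (200, 140),
}

print(failing_attributes(smart_data))  # -> ['Raw_Read_Error_Rate']
```

A tripped pre-failure attribute like this is exactly the condition drive vendors treat as grounds for a warranty replacement, which matches the advice later in the thread.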
 
John Lewis

Now that in itself has ME curious. Why?

This drive had a SMART error, is 3 years old, and has been run quite hot at times until I realized that it, like all the stuff inside the case, needs cooling. It 'lost' all partition data. I shouldn't have been using it for anything critical. Now it has a different SMART error, something about RAW data rate. It took over 6 hrs to reformat, full drive, single partition, NTFS.
And, it was only half full. Most of my stuff was backed up. Lost a few
days worth of emails.
Well?
McG.

The most interesting pieces of info were left out... manufacturer,
model number, manufacturing date. I have an interest
in collecting such information, as I build and repair PCs
quite frequently. Thanks in advance.

John Lewis
 
McGrandpa

John Lewis said:
The most interesting pieces of info were left out... manufacturer,
model number, manufacturing date. I have an interest
in collecting such information, as I build and repair PCs
quite frequently. Thanks in advance.

John Lewis
Western Digital WD1600JB-00DUA0, IDE 100, 160gb Manufactured 04JAN2003.

Hm. WD site says it's still under warranty. Through March. Seems I
registered it almost 3 years ago when I bought it. Thing is, it is still
running although it's *slow* and has that SMART raw read rate error. And it
forgot all about my partition.
HTH
McG.
 
John Lewis

Western Digital WD1600JB-00DUA0, IDE 100, 160gb Manufactured 04JAN2003.

Hm. WD site says it's still under warranty. Through March. Seems I
registered it almost 3 years ago when I bought it. Thing is, it is still
running although it's *slow* and has that SMART raw read rate error. And it
forgot all about my partition.
HTH
McG.

Thanks for the info.
Seems like a warranty return to me.
Just Do It.

John Lewis
 
Magnulus

McGrandpa said:
HL2 running Lost Coast video stress test in 1280x1024 with everything on
and highest, with highest image quality, 8x aa 16x af comes back
consistently with 70.84 fps.

I was playing the training level in Deus Ex with 8XS antialiasing at
1280x1024 with the supersampling transparent antialiasing (what a mouthful)
and it didn't slow down at all. Of course, I just have a "lowly" 7800 GT
card.
The one game that's surprised me the most so far is FEAR. It does *NOT* 'give as much as it gets'.... meaning I think the engine code is not well optimised for either 32-bit mode or 64-bit mode.

This doesn't surprise me; Monolith engines have never been particularly
good.

Generally, though, there's no point in having SLI unless you have a big HDTV or widescreen monitor. Especially because I see the industry moving in two directions: a few tech demos that push technology hard, while the rest becomes more indie-game oriented. The PC gaming industry just cannot sustain huge budgets like that forever; the cost of art assets in games is getting higher and higher, and that won't change until we have some faster/easier way to do 3D modelling.
 
McGrandpa

Magnulus said:
I was playing the training level in Deus Ex with 8XS antialiasing at
1280x1024 with the supersampling transparent antialiasing (what a
mouthful) and it didn't slow down at all. Of course, I just have a
"lowly" 7800 GT card.


This doesn't surprise me; Monolith engines have never been particularly
good.

Generally, though, there's no point in having SLI unless you have a big HDTV or widescreen monitor. Especially because I see the industry moving in two directions: a few tech demos that push technology hard, while the rest becomes more indie-game oriented. The PC gaming industry just cannot sustain huge budgets like that forever; the cost of art assets in games is getting higher and higher, and that won't change until we have some faster/easier way to do 3D modelling.

Yeah, that's a real humble card ya got there, friend ;)

I be grandpa, I be hmmm .. 'independent' .. now. I saw it, I wanted it .... and I got it. :) SLI

I see that along with the technology advances there will be gaming advances as well. I think the industry is spreading out to cover what it wants to of the expanded technology. There are hills and valleys in this landscape; there are a lot of developers out there looking for the right spot to develop something to hit. This entire landscape has grown. The populace on it has too, both in 'shops' and buyers (gamers). I guess I'm agreeing with your view in a sense. I think there is more to look forward to as a gamer than you seem to, looking at this picture. See, there will still be some shops working on the big artistic productions. And all the little hot shots will be more numerous, looking for that 'sweet spot' to hit just like everyone else is. When the ops in the other NG said 'PC gamers never had it so good!' they knew what they were talking about.
Ten years ago it was a really tedious thing to port a console game to the PC. Now if a dev house likes something they see on a console someplace, it's almost an instant port to the PC. It's easier and faster. Plus, consoles have grown hugely too. We have more to choose from, I feel.
McG.
 
