Nvidia's 'Nalu' vs ATI's 'Ruby'


duralisis

NV55 said:
http://karpo.org/images/Dunno/Ruby.jpg

Nvidia's GeForce 6800 (NV40) 'Nalu' demo consists of 300,000 polygons.
Very, very impressive shit!

ATI's Radeon X800 (R420) 'Ruby' demo consists of only 80,000 polygons.
However, it is also rendering a background with other elements, closer
to what would be in an actual game.

Now, console gamers, keep in mind: whatever ATI has been able to come
up with at this point (re: the R420 / Radeon X800) is a stepping stone
to what they'll be able to have in 2005 for Xbox Next, and in 2006 for
GCNext :)

Nvidia's characters seem to have more "life" to them, more subtle
nuances and personality. ATI's always seem flat and "anime"-like.
 

Bagpuss

I was talking about home consoles, not handhelds.


True, but I said mainstream consoles. I know there were some old consoles
that had backwards compatibility, but the Atari 7800 was barely more than a
blip in the history of game machines. And it sure as heck didn't do anything
to help the 7800, did it?

But how much of a radical change was the 7800 compared to the 2600? I
suspect considerably less than the XBox -> XBox 2. Atari had a
tendency to do very small increments when releasing computers &
consoles.

Personally I don't think anyone will give a flying feck if it's
compatible or not within a few months of release. Except for the
journalists, of course, who make a career out of whinging about pointless
things that no one else gives a stuff about.
 

Bagpuss

HDTV does display higher res but very few people have those TVs.
Current NTSC TV is capable of a max of 720x480, so no amount of tweaking
on the system would get past that, and that type of TV is still in
just about every home.

This presents a dilemma for game developers. If they try to use
the newer TV standard with higher resolution, very few will buy
because not everyone's ready to blow $2000 on a new HDTV. So the
game developers would have to stick with the older standard with lower
graphic resolution.

On top of that, America is not the center of the universe and not every
country runs on NTSC (thankfully), so you have to support other video
formats too.

In terms of game development, the higher resolution (given the console
has enough rendering headroom to do it) is little more than allowing
the player an option to select a resolution, then calling the library
function to change resolution with different parameters at the
appropriate points. It's not that difficult to do; the only
consideration is that higher resolutions take up more rendering time.
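
Something along these lines, as a minimal C sketch. The mode table and
the vid_set_mode() call are made-up stand-ins for whatever the console's
video library actually provides, not any particular SDK:

    #include <stdio.h>

    struct display_mode {
        int width, height;
        int interlaced;
    };

    /* Modes the player can pick from the options menu. */
    static const struct display_mode modes[] = {
        { 640,  480, 1 },   /* standard interlaced TV output */
        { 720,  480, 0 },   /* 480p over component           */
        { 1280, 720, 0 },   /* 720p for HD sets              */
    };

    /* Stand-in for the real video library call; a real version would
     * reprogram the video encoder and resize the render targets. */
    static void vid_set_mode(const struct display_mode *m)
    {
        printf("switching output to %dx%d%s\n",
               m->width, m->height, m->interlaced ? "i" : "p");
    }

    int main(void)
    {
        int choice = 1;                 /* chosen in the options menu     */
        vid_set_mode(&modes[choice]);   /* called at an appropriate point,
                                           e.g. before entering a level   */
        return 0;
    }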
 

Impmon

But how much of a radical change was the 7800 compared to the 2600? I
suspect considerably less than the XBox -> XBox 2. Atari had a
tendency to do very small increments when releasing computers &
consoles.

The 7800 has a completely different CPU (backward compatible with the 6507
used by the 2600), additional RAM, and a separate video chip. 2600
mode used the Stella chip while 7800 mode used the Maria chip.

Probably the least radical of all backward-compatible consoles, and on
par with the Game Boy Color (different CPU but worked with older games).
 

Andrew Ryan Chang

Zackman said:
The "wily consumer" would probably love to have more electronics that are
exactly as functional and relevant and non-obsolete five years after their
launch as they were the first day they came out. A PS2 game you buy next
year will play on the exact same hardware as the PS2 game you bought three
years ago, and it will run exactly as intended by the developers. How many
other pieces of consumer electronics have a five year lifespan during which
time owners don't need to worry about a better version of the hardware
coming out? Certainly not PC components.

Consoles win out vs PCs, yes. I think every other home appliance
or consumer electronic device has less obsolescence than consoles. But
then again, that's the nature of computing progress.
 

Asestar

Mr. Brian Allen said:
Hell, ATI can't even write drivers that work well with current software!

Now this is just plain meaningless. Have you actually tried the Catalyst
drivers? And if so, which software did they not work on?
 

Asestar

Well, Nvidia has a whole team devoted to making demos. ATI has just a few
people who do a lot of other things as well, so yeah. But I think Nvidia
has to pay its demo development team out of its graphics card sales, hence
the high prices :)
 

Nada

http://karpo.org/images/Dunno/Ruby.jpg

Nvidia's GeForce 6800 (NV40) 'Nalu' demo consists of 300,000 polygons.
Very, very impressive shit!

ATI's Radeon X800 (R420) 'Ruby' demo consists of only 80,000 polygons.
However, it is also rendering a background with other elements, closer
to what would be in an actual game.

Now, console gamers, keep in mind: whatever ATI has been able to come
up with at this point (re: the R420 / Radeon X800) is a stepping stone
to what they'll be able to have in 2005 for Xbox Next, and in 2006 for
GCNext :)

The one thing that nVidia might win is the demo department. If ATI
releases another multi-billion-polygon monkey demo, it'll hurt their
sales. When you can see the pink color of Nalu's nipples, who the
hell wants to see a monkey peeling a banana?
 

Andrew Ryan Chang

Asestar said:
I mean there is a limit on how good graphics can be enjoyed on a TV set and
I must say Xbox and PS2 are damn near that limit. If TVs could display

I would say they are /nowhere near/ that limit. Consider how
not-real even the best (realistic-looking) games look compared to real
scenes broadcast on TV. The lighting, detail, and sheer amount of stuff
and crowds that can be easily filmed is far, far beyond what an XBox or
PS2 can do. Resolution is not the limiting factor.
 

Asestar

Andrew Ryan Chang said:
I would say they are /nowhere near/ that limit. Consider how
not-real even the best (realistic-looking) games look compared to real
scenes broadcast on TV. The lighting, detail, and sheer amount of stuff
and crowds that can be easily filmed is far, far beyond what an XBox or
PS2 can do. Resolution is not the limiting factor.

Well, keep telling that to Nvidia and ATI, who are aiming at fluid gaming @
1600x1200. Also, a TV broadcast is interlaced. Watch how blurry and faded
most of the images look.
It's not just the resolution that's limiting it, it's the overall quality
itself! Sure enough you don't notice anything while playing on a 30" LCD TV,
but on most other TVs the picture is just plain bad. Colors are not that
sharp, hence there's little difference between 16-bit dithered and 32-bit
textures, while this difference can be spot-on clear on a monitor.

As for being realistic, that is a game issue rather than a hardware
issue.
Tell me, how many developers are willing to put huge effort into a game that
may be sold for only 2-3 months? Not many. Look at Splinter Cell. It can
deliver some amazing light and shadow effects on the Xbox. Does that mean all
Xbox games have the same quality of visuals? No.
Nowadays, it comes down to the games themselves, not the consoles.
Also, no matter what, real-time games will never look completely real
without higher res.
 

Asestar

When you can see the pink color of Nalu's nipples, who the
hell wants to see a monkey peeling a banana?

Eh.. all of the ATI owners? But you're right. They don't *want* to see it,
they're just stuck with it.
Sad though, Nvidia is now using cheap tricks! Adding sex appeal to their
product line. First Dawn, then Dusk and now Nalu... hmm... what would be
next, Lola? :)
 

Eric Pobirs

Asestar said:
Eh.. all of the ATI owners? But you're right. They don't *want* to see it,
they're just stuck with it.
Sad though, Nvidia is now using cheap tricks! Adding sex appeal to their
product line. First Dawn, then Dusk and now Nalu... hmm... what would be
next, Lola? :)

Would that be a suspicious bump mapping demonstration?
 

Eric Pobirs

Impmon said:
The 7800 has a completely different CPU (backward compatible with the 6507
used by the 2600), additional RAM, and a separate video chip. 2600
mode used the Stella chip while 7800 mode used the Maria chip.

Probably the least radical of all backward-compatible consoles, and on
par with the Game Boy Color (different CPU but worked with older games).


The difference was very substantial. The Atari 400/800 computers and
Atari 5200 (same chipset in console and computers) were a major jump in
power over the 2600, but the 7800 had distinct advantages over those. It had
the same 256-color palette as the GTIA (one of the co-processor pair in the
Atari 800) but supported modes allowing more colors onscreen before
resorting to special programming tricks (like changing the color registers
as the screen was drawing so that different horizontal bands could have
different groups of four colors), and it supported larger and much more
numerous sprites. Oddly, the sound in the 7800 relied on the same chip as
the 2600. Some 7800 games had better audio by including a POKEY chip from
the 400/800 in the cartridge.
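
(If that color-register trick sounds abstract, here's a rough C
simulation of the idea, not actual Atari code: the 2-bit pixel values
only ever index four registers, but rewriting those registers at chosen
scanlines gives each horizontal band its own group of four colors.)

    #include <stdint.h>
    #include <stdio.h>

    #define WIDTH  160
    #define HEIGHT 192

    static uint8_t color_reg[4];            /* the "hardware" registers   */
    static uint8_t screen[HEIGHT][WIDTH];   /* 2-bit source image (0..3)  */
    static uint8_t frame[HEIGHT][WIDTH];    /* resolved palette entries   */

    int main(void)
    {
        for (int y = 0; y < HEIGHT; y++) {
            /* "Racing the beam": load a new group of four colors every
             * 48 lines, as if done in a mid-frame interrupt routine.    */
            if (y % 48 == 0)
                for (int i = 0; i < 4; i++)
                    color_reg[i] = (uint8_t)((y / 48) * 4 + i);

            /* Each pixel value just picks one of the current registers. */
            for (int x = 0; x < WIDTH; x++)
                frame[y][x] = color_reg[screen[y][x] & 3];
        }

        printf("top band pixel uses color %u, lower band pixel uses color %u\n",
               frame[0][0], frame[100][0]);
        return 0;
    }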

It was a bit of a mystery why Atari didn't instead produce a more
powerful version of the ANTIC, GTIA, and POKEY chips so they could have made
a major upgrade to their computers as well as a new console platform. Further
down the road they could have done as Apple did with the ][GS, creating a
16-bit backward-compatible system using the 65816. (Also used in the SNES.)
Part of the reason may have been that Atari had made a substantial cash
investment in Amiga before Amiga was acquired by Commodore, and was
expecting the new system to be sold under an Atari label and eliminate most
interest in an improved 800 series system.

Not long after the 7800 was announced, Atari was sold to the Tramiel
family, who had founded Commodore and been ousted from that company. The
Tramiels weren't interested in putting up the capital for marketing a game
system, so the 7800, games, and everything else that had already been
manufactured just sat in a warehouse for years. They eventually dumped it
all on the retail market for a small fraction of what it had cost Time
Warner to make. There were a few third-party titles, but only a small number
of the available games ever made good use of the system.

The 7800 was doomed for reasons other than technical abilities. Even if
it had been released as scheduled and beaten the NES to market, it would have
lacked Nintendo's most important contribution to the console business: the
business model that made third-party publishers a source of revenue rather
than competitors. Interestingly, both game systems would have several of the
same titles, since Atari had licenses for several of Nintendo's hits like
Donkey Kong, Donkey Kong Jr., and the original Mario Bros. game. All of
these had very poor versions on the 7800 though, which was surprising
considering the very good versions on the Atari computers and other computers
through the Atarisoft brand.

I'd say the 7800 was somewhat less of a leap than that from the GBC to
the GBA. The GBA uses a completely different ISA, while all of the Atari
8-bit products used some form of 6502 or a derivative.
 

Eric Pobirs

Zackman said:
I was talking about home consoles, not handhelds.


True, but I said mainstream consoles. I know there were some old consoles
that had backwards compatibility, but the Atari 7800 was barely more than a
blip in the history of game machines. And it sure as heck didn't do anything
to help the 7800, did it?

To be fair, the 7800 was pretty much strangled in its cradle. At the
time it was intended to be launched, the 2600 was still being supported and
the installed base was among the largest by the day's measure. It made sense
at the time to give all those 2600 owners something significantly stronger
than everything that had been seen at that point.

It would have made even more sense to do a major upgrade to the Atari
800/5200 chipset and take advantage of the much larger base of experienced
programming talent and third-party support. I've never heard any explanation
of how the people running Atari then came to decide it was better to be
backward compatible with the 2600 than with the 800/5200.
 

Eric Pobirs

Impmon said:
HDTV does display higher res but very few people have those TVs.
Current NTSC TV is capable of a max of 720x480, so no amount of tweaking
on the system would get past that, and that type of TV is still in
just about every home.

This presents a dilemma for game developers. If they try to use
the newer TV standard with higher resolution, very few will buy
because not everyone's ready to blow $2000 on a new HDTV. So the
game developers would have to stick with the older standard with lower
graphic resolution.


HDTV is by no means a $2,000 investment any longer. There are good tube
models starting well under $1,000. (I remember when a good 17" SVGA monitor
was almost $1,000; now good units are under $100.)
http://www.bestbuy.com/site/olspage...ragments/product/olslinelistingsortfilter.jsp

These prices will only continue to drop, and analog models will offer a
decreasing range of choices as the point of price overlap with digital sets
keeps dropping. Already the bulk of new big-screen TV sales are for HD-capable
models. They are increasingly the only choice once you exit the
bargain basement. By the time the first of the next-gen consoles ships, the
installed base will have increased by well over a million units at the current
rate of sales.

Another big factor is who is buying the sets. Since video games are
luxury items, the companies are looking to target affluent (not necessarily
rich) shoppers who'll be able to indulge in frequent game purchases. Owning
an HD-capable screen is a good marker of affluence. Some people will
make sacrifices to have the best screen for their home, but for most
consumers it depends on it being a relatively casual purchase.

It's extremely unlikely that anyone will produce a game for the next-gen
consoles that exclusively requires a 1080i display, but expect all of the new
consoles to support HD and a substantial number of games to exploit that
capability, growing along with the installed base.
 

chrisv

Impmon said:
HDTV does display higher res but very few people have those TVs.
Current NTSC TV is capable of a max of 720x480, so no amount of tweaking
on the system would get past that, and that type of TV is still in
just about every home.

720x480 on a widescreen TV with component video input, which is not
NTSC. No chance for that with S-video or composite video.
This presents a dilemma for game developers. If they try to use
the newer TV standard with higher resolution, very few will buy
because not everyone's ready to blow $2000 on a new HDTV. So the
game developers would have to stick with the older standard with lower
graphic resolution.

There's no reason why they can't allow the user to switch resolutions.
I don't think that HDTV quality is even needed for games. IMO,
widescreen DVD-quality (720x480) is pretty good for now. This is
especially nice for multi-player games, where, on a widescreen TV,
each player can have half of the TV that's almost as large as a
normal TV.
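
Something like the following minimal C sketch: it carves a 720x480
widescreen framebuffer into two side-by-side player viewports. The
struct and function names are just made up for illustration, not any
real console API:

    #include <stdio.h>

    struct viewport { int x, y, w, h; };

    /* Compute per-player viewports for a vertical split of the screen. */
    static void split_screen(int fb_w, int fb_h, struct viewport vp[2])
    {
        vp[0] = (struct viewport){ 0,        0, fb_w / 2, fb_h };
        vp[1] = (struct viewport){ fb_w / 2, 0, fb_w / 2, fb_h };
    }

    int main(void)
    {
        struct viewport vp[2];
        split_screen(720, 480, vp);   /* widescreen DVD-resolution frame */
        for (int p = 0; p < 2; p++)
            printf("player %d: %dx%d at (%d,%d)\n",
                   p + 1, vp[p].w, vp[p].h, vp[p].x, vp[p].y);
        return 0;
    }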
 

chrisv

I would say they are /nowhere near/ that limit. Consider how
not-real even the best (realistic-looking) games look compared to real
scenes broadcast on TV. The lighting, detail, and sheer amount of stuff
and crowds that can be easily filmed is far, far beyond what an XBox or
PS2 can do. Resolution is not the limiting factor.

Of course, normal video is, by nature, already fully anti-aliased.
 

Noozer

Where are all these demos at?

I was able to find Lava Caves and the Chimp demo on the ATI site and that
was it.

: (
 

Ben Pope

Impmon said:
The 7800 has a completely different CPU (backward compatible with the 6507
used by the 2600),

If it's backwards compatible, it's hardly completely different, from a
compatibility standpoint :p

Ben
 
