holy smokes; AMD-ATI R600 (Radeon X2900XTX) GPU Technology Demo; "Ruby 4"

Radeon350

decent quality video
http://www.hexus.tv/show.php?show=66

low quality video

looks better than Nvidia's "Adrianne" tech demo for the G80 ~ GeForce
8800 GTX


R600 / Radeon X2900XTX is a friggin' 512-bit MONSTER that will
provide bandwidth of over 100 GB/s even with lower-end GDDR3 memory.
According to Wikipedia, R600's bandwidth ranges from 115.2 GB/s to 160
GB/s depending on the memory options.

http://en.wikipedia.org/wiki/Radeon_R600
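
Those figures fall straight out of bus width times effective memory data rate. A quick sanity check (the effective clocks below are back-solved from the quoted 115.2-160 GB/s range, not confirmed R600 specs):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective memory data rate).
# The clock figures below are inferred from the quoted 115.2-160 GB/s range,
# not confirmed specs.

def bandwidth_gb_s(bus_width_bits, effective_mhz):
    return bus_width_bits / 8 * effective_mhz * 1e6 / 1e9

print(bandwidth_gb_s(512, 1800))  # 115.2 GB/s (~1.8 GHz effective GDDR3)
print(bandwidth_gb_s(512, 2500))  # 160.0 GB/s (~2.5 GHz effective GDDR4)
```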


some of the technical details of the Ruby 4 demo:


___________________________________________________________________________

"AMD Ruby 4 (R600) demo is impressive

CeBIT 07 Ruby learns to ski with a nice triangle budget

By Theo Valich in Hanover: Friday 16 March 2007, 07:20
YOU CAN FIND R600 boards in hidden places of CeBIT, but we expected
that.

AMD is faithfully guarding against possible leakage of pictures by marking
all of the boards with the name of the partner each was intended for; the
only problem there was the colour of the markings.

We do not see the reason for this paranoid behaviour, since all of the
important R600 pictures ended up on websites a long time ago. We have
seen UFO, GDDR3 boards floating around with a "subject-to-change"
GF8800-looking triple-heatpipe cooler in red colour, but we'll leave
it at that. The theme of this story is Ruby, after all.

We have seen the demo running a couple of times, and this poor hack has
to say that it looks quite impressive, especially when it comes to the
snow itself or the fur on Ruby's winter outfit. There is also the matter
of the realistic blend of textures on her face and the skeleton
animation. You could easily imagine that Ruby is a real person, judging
by the insane amount of detail that went into the creation of this demo.

The main render target resolution is not full HD (1920x1080 or 1920x1200),
but rather a baseline 1280x720 with HDR in FP16 format and MSAA turned up
to 4x. Anisotropic filtering should be set to high in all cases, but this
resolution left us a bit confused. It seems that ATI will push the
CrossFire two-board package for full-HD resolution. This is mainly due to
the texture memory budget, since the current demo is eating 680 MB. Since
ATI only has 1 GB boards around, you can easily calculate that the
remaining 320 MB would not be sufficient for the 1080p frame buffer and
decent framerates. However, do not think that there are issues with the
GPUs themselves, since virtual memory addressing works just fine on both
G80 and R600 chips.

Scenes in the demo have between two and two and a half million
triangles, depending on scene complexity. Ruby herself is around 200K
triangles and uses 128 morph targets for facial animation and around 200
bones for skinning. When it comes to the face of ATI's bride, the
animation was done by filming the face of the real-world Ruby (an
actress) with a high-definition camera. After the filming, the plain
vanilla video was analysed and processed step by step, using highly
complex facial recognition software, in order to extract facial
animation data. After this video session, Ruby's face alone got layered
with 15 different textures and was placed in a scene filled with
procedurally generated snow. The snow simulation is processed entirely
on the GPU and can be dynamically melted or amplified to increase the
snow cover (and the snow effect while Ruby is doing a snowboard scene,
Janica Kostelic style).

When it comes to the fur on Ruby's collar, this is no longer simple
vertex-generated fur with predefined movement, but rather fur simulated
with a physics model, also done on the GPU."

http://www.theinquirer.net/default.aspx?article=38361
_________________________________________________________________________
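
For what it's worth, here is a rough, illustrative take on the article's memory-budget point. The per-pixel costs below (FP16 RGBA colour plus a 32-bit depth/stencil surface, both at 4x MSAA) are assumptions made for the sake of the arithmetic, not figures from ATI:

```python
# A rough sketch of why 680 MB of resident textures squeezes 1080p on a 1 GB board.
# The per-pixel sizes are assumptions, not official numbers.

def surface_mib(width, height, bytes_per_sample, samples):
    """Size of one multisampled render surface in MiB."""
    return width * height * bytes_per_sample * samples / 2**20

for w, h in [(1280, 720), (1920, 1080)]:
    color = surface_mib(w, h, 8, 4)   # FP16 HDR colour (8 bytes/px), 4x MSAA
    depth = surface_mib(w, h, 4, 4)   # 32-bit depth/stencil, 4x MSAA
    print(f"{w}x{h}: ~{color + depth:.0f} MiB for a single colour+depth target")

# With ~680 MB of textures resident on a 1 GB board, only ~320-340 MB remains
# for every render target, intermediate buffer and the swap chain combined,
# which is why 1080p gets tight long before the GPU itself runs out of steam.
```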
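And since the article mentions 128 morph targets driving the facial animation, here's a minimal sketch of how morph-target (blend-shape) blending is usually formulated; the array shapes and toy numbers are made up purely for illustration and are not taken from the demo:

```python
import numpy as np

def blend_morph_targets(base_verts, target_deltas, weights):
    """Blend-shape pose: base mesh plus a weighted sum of per-target offsets.

    base_verts:    (V, 3) rest-pose vertex positions
    target_deltas: (T, V, 3) per-vertex offsets, one set per morph target
    weights:       (T,) animation-driven weight for each target
    """
    return base_verts + np.tensordot(weights, target_deltas, axes=1)

# Toy example: 1,000 vertices and 128 targets (the demo reportedly uses 128).
V, T = 1000, 128
base = np.random.rand(V, 3)
deltas = np.random.randn(T, V, 3) * 0.01
weights = np.random.rand(T)   # in the demo these came from the filmed performance
posed = blend_morph_targets(base, deltas, weights)
print(posed.shape)            # (1000, 3)
```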
 
Radeon350

download the FLV version of the video:
http://www.hexus.tv/shows/cebit07/ATi-demo.flv

if you need an FLV player:
http://www.download.com/FLV-Player/3000-2139_4-10467081.html



R600 technology slides

http://mr-63596.v-mirror.spb.ru/news/img/07124551.jpg
http://mr-63596.v-mirror.spb.ru/news/img/07124607.jpg
http://mr-63596.v-mirror.spb.ru/news/img/07124618.jpg
http://mr-63596.v-mirror.spb.ru/news/img/07124631.jpg
http://mr-63596.v-mirror.spb.ru/news/img/07124640.jpg


"Ruby 4" demo screenshots

http://img6.picsplace.to/img.php?file=img6/27/Imgp5304.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5492.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5493.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5494.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5496.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5497.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5498.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5499.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5509.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5510.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5511.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5512.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5513.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5514.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5515.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5516.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5518.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5519.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5520.jpg
http://img6.picsplace.to/img.php?file=img6/27/Imgp5521.jpg
 
John Lewis

Radeon350 said:
[...]

Holy SMOKES is right. The part is in redesign again on 65nm. The 65nm
yield is pitiful. Rumor from CeBIT says that fewer than 20,000 total of
the 'current design' will be made available for shipment starting in
May ... it's not clear whether that is the 80nm heat-monster or the 65nm
poor-yield version.

And the date for full, unfettered production is unknown. Shades of the
X1800-series fiasco all over again. I'm sure that nVidia must be
rather amused by this hiccup. The nV partners have now shipped
500,000 8800-series graphics cards.

John Lewis
 
HockeyTownUSA

Radeon350 said:
[...]


GREAT! I can see it now: a 1 TB HDD and an HD-DVD drive required to handle all
the damn high-resolution textures. Not to mention needing a super-fast HDD to
cache the textures. I am assuming the bottom-end card will come with 1 GB of
RAM?
 
DRS

John Lewis said:
[...] The nV partners have now shipped 500,000 8800-series graphics cards.

Just as a matter of interest, where did that figure come from?
 
Magnulus

It is indeed impressive. The graphics look on par with the movie Final Fantasy.

Still, NVidia has the better hardware and products, and I doubt that ATI will
change this. ATI lost a lot of momentum with the X1900 flaws. NVidia's only
real issue is that their drivers aren't so great for Vista; they've fallen
behind in the driver race. Still, it is leaps better than what ATI was doing a
few years ago.
 
AirRaid Mach 2.5

Magnulus said:
[...]



While the demo does look impressive, in no way does it look even close to the
movie Final Fantasy: The Spirits Within. If GPUs were capable of FF:TSW in
realtime, there wouldn't be much point in developing ever-more powerful GPUs.
That level of graphics won't happen in realtime until probably the middle to
later part of the next decade, if that soon.
 
No One

Magnulus said:
[...]

Not to mention that AMD is getting their asses handed to them by Intel.
I like the competition, so I hope AMD and ATI can get better.

Drivers for Vista seem to be a huge problem with all cards. Microsoft
deserves a few lumps here.
 
