XBox 2 graphics & bandwidth



Radeon350

[from TeamXbox]

First Details: Inside the Xbox 2 - Part 1
By: Cesar
09.08.2003 @ 07:05 PM




Both ATI and Microsoft executives are absolutely refusing to answer
questions on the Xbox successor, but that didn't stop us in our
mission to be the "Insider's Choice for Xbox Information."

We're proud to bring you today the very first info on the Xbox 2 GPU.
Our highly placed source within the industry informed us that the
graphics technology powering the Xbox successor is a derivative of the
R500, the successor to the R420, which is due to be unveiled later this
year at Comdex.


This graphics chip has been in design for more than a year at ATI's
Marlborough, Mass. office, and much like the Xbox's nVIDIA GPU, the
Xbox 2 graphics chip will also be custom silicon with the R500 as its
core technology.

This graphics chip is aimed at the next version of the DirectX API,
most probably called DirectX 10, which is already in development and is
simply code-named DirectX/LH. LH stands for Longhorn, the next major
desktop Windows release, which will follow Windows XP.

This same source also told us: "Microsoft chose ATI not just because of
the publicly known problems with nVIDIA, but also because current
technology shows ATI is the real winner when it comes to pixel shader
performance." This checks out, as several publications have
demonstrated that ATI's Radeon 9800 Pro surpasses the GeForce FX
5900 Ultra in most Pixel Shader 2.0 benchmarks.



"And we all know graphics' future is all about pixel shaders" our
source added.

This VPU is being designed with the latest technologies in mind, such
as GDDR2 SDRAM provided by Samsung running at 1600 MHz. A 128-bit
configuration is capable of providing up to 25.6 GB/s of peak bandwidth,
while its 256-bit mode brings up to a shocking 51.2 GB/s of peak
bandwidth! Samsung's 256-megabit GDDR2 memory will enable graphics
cards with 512 MB of memory, although it is impossible to confirm whether
the Xbox 2 will feature that amount of system memory.
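For the curious, the quoted peak-bandwidth figures fall straight out of bus width times data rate. A quick illustrative Python sketch to check the arithmetic (the 1600 MHz figure is the article's effective data rate, taken at face value):

```python
# Peak bandwidth = bus width in bytes * transfers per second.
def peak_bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
    bytes_per_transfer = bus_bits / 8          # 128-bit bus -> 16 bytes
    transfers_per_sec = effective_mhz * 1e6    # MHz -> Hz
    return bytes_per_transfer * transfers_per_sec / 1e9

print(peak_bandwidth_gb_s(128, 1600))  # 25.6 GB/s
print(peak_bandwidth_gb_s(256, 1600))  # 51.2 GB/s
```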

Speculating that the Xbox 2 might ship for Christmas 2005, we can be sure
its graphics chip will support Pixel Shader 3.0, a model that is a
significant improvement over today's 2.0 version, as well as Vertex
Shader 3.0. This will make the Xbox 2, without a doubt, the most
powerful console when it comes to visual performance, with a graphics
chip that, in hardware terms, is two generations ahead of current
technology.


http://www.teamxbox.com/news.php?id=4811

Cute little article on Xbox 2, but there is something wrong with it. As
I've said before, the DX10 and R500 generation will have something beyond
Pixel Shader and Vertex Shader 3.0. The 3.0 standard is part of DX9 and
should be implemented in ATI's R420 and Nvidia's NV40. The Xbox 2, which
will use some derivative of the R500, should have VS/PS 4.0 or better.

Also, while 51 gigabytes/sec of bandwidth seems like a lot today, that
will look weak compared to the PS3's eDRAM bandwidth (for both CPU and
GPU), which is expected to be in the hundreds of GB/sec, even though the
PS3's main memory bandwidth (Rambus XDR) might only be 25 GB/sec. Unless
MS/ATI are designing the Xbox 2's GPU with ultra-fast eDRAM, the 51 GB/sec
(or whatever it is, it will "only" be in the dozens of GB) will have
to suffice for EVERYTHING: CPU, GPU, MCP, just like Xbox 1. It will
be interesting to see how MS/ATI go about tackling the PS3's reported
TFLOP performance and staggering on-chip memory bandwidth.
 

Leon Dexter

This will make the Xbox 2, without a doubt, the most
powerful console when it comes to visual performance with a graphic
chip that, in hardware terms, is two generations ahead of current
technology.

Ha! It's starting already. Statements like this are hilariously stupid
this far out. Claiming to be "the best" when almost nothing is known about
the competition is a new low in this industry. And if the three machines
come out as close together as they are supposed to, specs are going to be
even less important than they are this time around.
 

kevin getting

Radeon350 said:
Cute little article on Xbox 2, but there is something wrong with it. As
I've said before, the DX10 and R500 generation will have something beyond
Pixel Shader and Vertex Shader 3.0. The 3.0 standard is part of DX9 and
should be implemented in ATI's R420 and Nvidia's NV40. The Xbox 2, which
will use some derivative of the R500, should have VS/PS 4.0 or better.

PS 3.0 isn't part of the initial DX 9 release; it's more of a DX 9.5 type
of thing. I can see DX 10 with PS 4.0, but DX 10 isn't due to arrive for
some time. I think the wait is a good thing, since hardware needs to catch
up in performance as well as consumer adoption.
Also, while 51 gigabytes/sec of bandwidth seems like a lot today, that
will look weak compared to the PS3's eDRAM bandwidth (for both CPU and
GPU), which is expected to be in the hundreds of GB/sec, even though the
PS3's main memory bandwidth (Rambus XDR) might only be 25 GB/sec. Unless
MS/ATI are designing the Xbox 2's GPU with ultra-fast eDRAM, the 51 GB/sec
(or whatever it is, it will "only" be in the dozens of GB) will have
to suffice for EVERYTHING: CPU, GPU, MCP, just like Xbox 1. It will
be interesting to see how MS/ATI go about tackling the PS3's reported
TFLOP performance and staggering on-chip memory bandwidth.

I keep thinking that the PS3 is going to use some sort of multi-chip
module like the POWER4. I mean, everything gets squeezed into a package
that fits into your hand: the CPUs, the GPU and even the Rambus memory!
That would allow for hundreds of GB of bandwidth to main memory at
relatively low latencies. eDRAM is an expensive option with relatively
poor yields compared to separate processor and memory dies. I'm still
considering that Sony is going to push ray tracing on the PS3. If the
rumors of 8 PPC chips with 8 vector units per PPC hold true, they might be
able to do just that. The idea of 8 PPC chips would fit into the MCM
plan very nicely. The only problem would be cooling the beast.

eDRAM is a rumored addition to the R500/NV50 generation of chips in small
amounts. It wouldn't surprise me if ATI or nVidia goes to MCMs for their
high-end graphics chips. An MCM can easily be designed to accommodate a
512-bit-wide memory bus running at speeds greater than 1 GHz. The
performance gains would be the justification for the added expense.

The shader quality of the R500/NV50 might be comparable to ray tracing
done on CPUs. The chips may become just as programmable as a CPU in this
respect. In other words, the Xbox 2 is going to be doing stuff on its GPU
that the PS3 uses several CPUs to do.

It will be interesting to see how things play out in the future. I can't
wait till all the rumors and speculation stop when they start launching
the next generation of hardware and the facts are revealed.
 

Marshall

Leon Dexter said:
This will make the Xbox 2, without a doubt, the most

Ha! It's starting already. Statements like this are hilariously stupid
this far out. Claiming to be "the best" when almost nothing is known about
the competition is a new low in this industry. And if the three machines
come out as close together as they are supposed to, specs are going to be
even less important than they are this time around.

Hey, the paid propaganda hacks hafta have something to post here
to annoy us with, neh? ;-)
-Marshall
 

Felger Carbon

kevin getting said:
If the
rumors of 8 PPC chips with 8 vector units per PPC hold true, they might be
able to do just that. The idea of 8 PPC chips would fit into the MCM
plan very nicely. The only problem would be cooling the beast.

And _paying_ for the beast. 8 PPC CPUs in a cheapo game console?? ;-)
 

Allan Martin

It will be interesting to see how things play out in the future. I can't
wait till all the rumors and speculation stop when they start
launching the next generation of hardware and the facts are revealed.

Does it really matter? As soon as they come out we will be dreaming and
speculating about Xbox 3 and PS4.
 

Roger Squires

Is all this expensive Xbox 2 technology still intended to display on a
640x480 regular TV?

rms
 

wogston

Is all this expensive Xbox 2 technology still intended to display on a
640x480 regular TV?

Hopefully at least 1080i or 720p; otherwise they would be taking a step
backwards. ;-)
 

bariole

I keep thinking that the PS3 is going to use some sort of multi-chip
module like the POWER4. I mean, everything gets squeezed into a package
that fits into your hand: the CPUs, the GPU and even the Rambus memory!
That would allow for hundreds of GB of bandwidth to main memory at
relatively low latencies. eDRAM is an expensive option with relatively
poor yields compared to separate processor and memory dies. I'm still
considering that Sony is going to push ray tracing on the PS3. If the
rumors of 8 PPC chips with 8 vector units per PPC hold true, they might be
able to do just that. The idea of 8 PPC chips would fit into the MCM
plan very nicely. The only problem would be cooling the beast.

Do not expect wonders. There is no way that real-time ray tracing will be
possible with next-generation hardware. Eight Power4+ CPUs can't come
anywhere close to real-time ray tracing. If the PS3 uses multiple PPC
cores, then those cores will be scaled-down versions of the POWER series.
Anything else would make the PS3 chips just too big and too expensive to
be used in a toy. Pixar and other animation studios don't have equipment
which can do ray tracing in real time, and they spend millions of dollars
on their render farms. Military simulators don't do ray tracing.

Hardware capable of creating real-time ray-traced graphics in quality
comparable to modern renderers (Lightwave, Lightscape or Maya quality)
will not hit desktop PCs and consoles before 2016-2020.
 

xTenn

bariole said:
Do not expect wonders. There is no way that real-time ray tracing will be
possible with next-generation hardware. Eight Power4+ CPUs can't come
anywhere close to real-time ray tracing. If the PS3 uses multiple PPC
cores, then those cores will be scaled-down versions of the POWER series.
Anything else would make the PS3 chips just too big and too expensive to
be used in a toy. Pixar and other animation studios don't have equipment
which can do ray tracing in real time, and they spend millions of dollars
on their render farms. Military simulators don't do ray tracing.

Hardware capable of creating real-time ray-traced graphics in quality
comparable to modern renderers (Lightwave, Lightscape or Maya quality)
will not hit desktop PCs and consoles before 2016-2020.

I was with you, cheering you on, until you threw out (what I feel is) a
ridiculous timeframe. You should expect that quality on PC desktops no
later than 2010, at least from what I can find about the direction graphics
chips are going, along with new hurdles being overcome in 64-bit processing
for the masses. After all, we are talking about a kernel-mode address space
of 248 terabytes, paged and non-paged pools weighing in at 128 GB, and even
the System Cache getting 1 terabyte of address space, and the graphics guys
are planning on exploiting this, keeping their curve beyond even what this
chip (the AMD64 "Hammer" in this case) offers. The resulting marriage (and
the insane cross-chipset interfacing this kind of addressing allows) should
definitely push the bell curve to pull expectations back in line with the
dates I've pulled out of nether regions.

IMHO, of course. :)
 

bariole

I was with you, cheering you on, until you threw out (what I feel is) a
ridiculous timeframe. You should expect that quality on PC desktops no
later than 2010, at least from what I can find about the direction graphics
chips are going, along with new hurdles being overcome in 64-bit processing
for the masses. After all, we are talking about a kernel-mode address space
of 248 terabytes, paged and non-paged pools weighing in at 128 GB, and even
the System Cache getting 1 terabyte of address space, and the graphics guys
are planning on exploiting this, keeping their curve beyond even what this
chip (the AMD64 "Hammer" in this case) offers. The resulting marriage (and
the insane cross-chipset interfacing this kind of addressing allows) should
definitely push the bell curve to pull expectations back in line with the
dates I've pulled out of nether regions.

IMHO, of course. :)

2010!?
That is just six years from now. I don't believe that it will happen so
soon. It would be nice, but I don't believe it.

Lightwave 5.5 had an animation of a walking robot. The then top-of-the-line
P2 450 MHz needed 45 minutes to calculate one frame. The animation was
quite simple, nowhere close to cinema quality, ray tracing depth was set to
10, and resolution was 640x480. I guess that a P4 3.2 GHz would do the same
job in about 3 minutes (7 times higher clock and heavy optimization of
Lightwave for the P4). For real time, 30 FPS is needed. That means that
hardware 5400 times more powerful than the P4 3.2 is needed. Even with ray
tracing done in hardware, the year 2010 seems a little too early to me.
Lightscape uses a more precise algorithm and is a couple of times slower
than Lightwave. And I guess that Pixar's renderer is very slow (but its
results are astonishing).
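The 5400x figure above is easy to verify; a quick illustrative Python sketch of the same back-of-envelope math (the 3-minutes-per-frame starting point is the poster's own guess, not a measurement):

```python
# If one ray-traced frame takes ~3 minutes today, how much faster must
# hardware get to render that same frame 30 times per second?
seconds_per_frame_today = 3 * 60   # ~180 s per frame on a P4 3.2 GHz (guess)
target_fps = 30                    # "real time"

# speedup = old time per frame / new time per frame = 180 / (1/30)
speedup_needed = seconds_per_frame_today * target_fps
print(speedup_needed)  # 5400
```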

A 64-bit address space doesn't mean that in the year 2010 we will hit
terabyte-level memory capacity. And being a 64-bit architecture, POWER
already has a 64-bit address space; even the new member of the PowerPC
family is a 64-bit chip. And while 64-bit chips will simplify managing
large memory, their ability to do 64-bit calculations will not have a
significant impact on game performance. The N64 had a 64-bit CPU, so is it
better than the Xbox? No. Better game performance with the Athlon64 will
be the result of the integrated memory controller, a kick-ass FPU and
better IPC.
 

xTenn

bariole said:
2010!?
That is just six years from now. I don't believe that it will happen so
soon. It would be nice, but I don't believe it.

Lightwave 5.5 had an animation of a walking robot. The then top-of-the-line
P2 450 MHz needed 45 minutes to calculate one frame. The animation was
quite simple, nowhere close to cinema quality, ray tracing depth was set to
10, and resolution was 640x480. I guess that a P4 3.2 GHz would do the same
job in about 3 minutes (7 times higher clock and heavy optimization of
Lightwave for the P4). For real time, 30 FPS is needed. That means that
hardware 5400 times more powerful than the P4 3.2 is needed. Even with ray
tracing done in hardware, the year 2010 seems a little too early to me.
Lightscape uses a more precise algorithm and is a couple of times slower
than Lightwave. And I guess that Pixar's renderer is very slow (but its
results are astonishing).

A 64-bit address space doesn't mean that in the year 2010 we will hit
terabyte-level memory capacity. And being a 64-bit architecture, POWER
already has a 64-bit address space; even the new member of the PowerPC
family is a 64-bit chip. And while 64-bit chips will simplify managing
large memory, their ability to do 64-bit calculations will not have a
significant impact on game performance. The N64 had a 64-bit CPU, so is it
better than the Xbox? No. Better game performance with the Athlon64 will
be the result of the integrated memory controller, a kick-ass FPU and
better IPC.

Could not agree with you more about the end result being the combination of
the right integration, but from what I've seen the pieces are truly coming
together, in time. Could be wrong, but look how fast features per pixel
have come about...

The only point about the 64-bit processor is the resulting manipulations
made possible in a truly flat environment, which AMD is trying to bring
about, both in addressing and register manipulation - NOT 64-bit in
general. Can you tell I like this chip, and get antsy when I think about
the connections with true graphics hardware that this makes possible? Even
then, by that timeframe we will be in another generation altogether, but we
are talking about a break from the normal x86 evolution.

It is all conjecture, but I am anxious to see where it leads within the next
7 years. I will say that a look back at cutting edge video 7 years ago as
compared to 2 years ago makes me feel even more confident about that 2010
timeframe.

(Of course, the comment about the Nintendo64 will be promptly ignored <g> )
 

RusH

[cut]
2010!?
That is just six years from now. I don't believe that it will happen so
soon. It would be nice, but I don't believe it.

Lightwave 5.5 had an animation of a walking robot. The then top-of-the-line
P2 450 MHz needed 45 minutes to calculate one frame. The animation was
quite simple, nowhere close to cinema quality, ray tracing depth was set to
10, and resolution was 640x480. I guess that a P4 3.2 GHz would do the same
job in about 3 minutes (7 times higher clock and heavy optimization of
Lightwave for the P4). For real time, 30 FPS is needed. That means that
hardware 5400 times more powerful than the P4 3.2 is needed. Even with ray
tracing done in hardware, the year 2010 seems a little too early to me.
Lightscape uses a more precise algorithm and is a couple of times slower
than Lightwave. And I guess that Pixar's renderer is very slow (but its
results are astonishing).
[cut]

You are missing one point there: graphics engines evolve, they change.
Ray tracing will no longer be needed to do realistic real-time graphics in
2010. A few months ago some researcher from the Netherlands (a professor of
something, she was) came up with a new cloud engine (or was it a liquid
engine?) capable of rendering 1000x700 real-time clouds with the sun in the
background, looking just like they were ray traced - the math was done on
flat surfaces. It was on Slashdot; it was all over the place.
 

bariole

It is all conjecture, but I am anxious to see where it leads within the next
7 years. I will say that a look back at cutting edge video 7 years ago as
compared to 2 years ago makes me feel even more confident about that 2010
timeframe.

11.9.2010. Same place.
Then we shall see who was right and who was wrong.
 

xTenn

bariole said:
11.9.2010. Same place.
Then we shall see who was right and who was wrong.

LOL!

Deal - but only if my old, rusty circa 2008 3D holographic display is back
out of the shop... ;)

Coolness.
 

Radeon350

bariole said:
Do not expect wonders. There is no way that real-time ray tracing will be
possible with next-generation hardware. Eight Power4+ CPUs can't come
anywhere close to real-time ray tracing. If the PS3 uses multiple PPC
cores, then those cores will be scaled-down versions of the POWER series.
Anything else would make the PS3 chips just too big and too expensive to
be used in a toy. Pixar and other animation studios don't have equipment
which can do ray tracing in real time, and they spend millions of dollars
on their render farms. Military simulators don't do ray tracing.

Hardware capable of creating real-time ray-traced graphics in quality
comparable to modern renderers (Lightwave, Lightscape or Maya quality)
will not hit desktop PCs and consoles before 2016-2020.


I agree about ray tracing. It will not be done by the PS3-N5-Xbox2
generation. Nor will the PS4-Xbox3-N6 do ray tracing in real time by
2010-2012. Maybe a generation or two after that (in other words, 2 or 3
console cycles away) - pretty much what you said.

About the PS3 and its 8 (or 16) PowerPC or POWER CPUs: they will not be
doing the lion's share of the processing. The 8 Attached Processing
Units (APUs) *per* PPC CPU will be. There will be between 32~64 APUs
in the PS3's main Cell CPU alone. The PS3's GPU will have a further
16-32 APUs. So the lowest number of APUs for the *whole* PS3 is 48
(32 in the CPU, 16 in the GPU), while the highest number of APUs, I
guess, would be 96 (64 in the CPU, 32 in the GPU).

Each APU will have 4 floating point units (FPUs) and 4 integer units
(IUs). I don't think they'll be able to do both floating-point ops and
integer ops at the same time; not clear on that. But regardless, the
FPUs and IUs within the APUs will be doing the bulk of the processing
for the PS3. Therefore, the APU *is* the workhorse of the PS3. The
PowerPC or POWER CPU cores will only be managing what work the APUs have
to do, and controlling the APUs' access to memory. The PPC / POWER cores
are the generals, the APUs are the divisions, the FPUs and IUs are the
brigades / battalions. The PS3 will employ an army of processors to do
all the work to give us prettier, more intense games. :)
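The APU totals above can be sanity-checked with a little arithmetic. Note the 32-APU low end implies 4 PPC cores at 8 APUs each, which is an inference from the rumor, not something it spells out; a quick illustrative Python sketch:

```python
# Rumored Cell layout: 8 APUs per PPC core in the CPU, plus extra APUs
# on the GPU. Core and GPU-APU counts below are speculative.
APUS_PER_PPC_CORE = 8

def total_apus(ppc_cores: int, gpu_apus: int) -> int:
    return ppc_cores * APUS_PER_PPC_CORE + gpu_apus

low = total_apus(ppc_cores=4, gpu_apus=16)    # 32 CPU APUs + 16 GPU APUs
high = total_apus(ppc_cores=8, gpu_apus=32)   # 64 CPU APUs + 32 GPU APUs
print(low, high)  # 48 96
```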
 

Ali P

xTenn said:

LOL!

Deal - but only if my old, rusty circa 2008 3D holographic display is back
out of the shop... ;)

Coolness.

All this discussion about the future got me wondering what people were
speculating 10 years ago would be the state of gaming today. So I had a hunt
on Google and came across this thread from 1993 about the future of gaming
circa 2003...
http://tinyurl.com/n7pr

Some quotes:

"In the next decade (the 00's?) we'll have digitzed odours, holographic
VR and synthezised blood, but we'll still be blowing things up."

"Virtual Reality: That's the first thing that pops to mind nowadays when
anyone talks about
video games of the future. Cheap color eye goggles are certain to be
developed within a decade as already some good TV resolution LCD displays
are available."

"I think we would probably have virtual
reality machines in arcades (they will replace the popular video game
arcades one sees now - it has already begun to happen), and one could
probably see some kind of personal computer linked to a virtual reality
machine (sort of like they have done with combining a Sega Megadrive
with a PC)."
 

Strontium

Ali P stood up at show-n-tell, in [email protected], and
said:
All this discussion about the future got me wondering what people were
speculating 10 years ago would be the state of gaming today. So I had
a hunt on Google and came across this thread from 1993 about the
future of gaming circa 2003...
http://tinyurl.com/n7pr

Some quotes:

"In the next decade (the 00's?) we'll have digitized odours, holographic
VR and synthesized blood, but we'll still be blowing things up."

"Virtual Reality: That's the first thing that pops to mind nowadays when
anyone talks about video games of the future. Cheap color eye goggles are
certain to be developed within a decade, as already some good TV-resolution
LCD displays are available."

"I think we would probably have virtual reality machines in arcades (they
will replace the popular video game arcades one sees now - it has already
begun to happen), and one could probably see some kind of personal computer
linked to a virtual reality machine (sort of like they have done with
combining a Sega Megadrive with a PC)."


LOL! Stop making me feel old! :)
 
