Why ATI delayed the R600.....


TheSmokingGnu

John said:
... the REAL scoop...

Proprietary architectures make me a sad panda.

Let's just hope it's better than AMD's atrocious attempts at "quad" core
processing, else it'll be the end of both companies. Can't compete in
the market WITHOUT A FLANGING PRODUCT, guys!

TheSmokingGnu
 

chainbreaker

TheSmokingGnu said:
Proprietary architectures make me a sad panda.

Let's just hope it's better than AMD's atrocious attempts at "quad"
core processing, else it'll be the end of both companies. Can't
compete in the market WITHOUT A FLANGING PRODUCT, guys!

TheSmokingGnu

You would think that the market would have learned something about
proprietary from Packard Bell. Guess not.
 

HockeyTownUSA

chainbreaker said:
You would think that the market would have learned something about
proprietary from Packard Bell. Guess not.

Guys, you didn't take the red pill did you?

remember: 0104
 

FKS

TheSmokingGnu said:
Proprietary architectures make me a sad panda.

Let's just hope it's better than AMD's atrocious attempts at "quad" core
processing, else it'll be the end of both companies.

Before doing something radical, AMD needs to do better in the conventional
cpu market. AMD is trying to fly when it cannot even run fast.
 

Zaghadka

FKS said:
Before doing something radical, AMD needs to do better in the conventional
cpu market. AMD is trying to fly when it cannot even run fast.
Anyone can fly, any time. Landings are a bitch though. ;^)

--
Zag


"The Ends Justify The Means" ~Niccolo Machiavelli, c. 1550

"The Means Justify The Means" ~George W. Bush, c. 2000
 

Ham Pastrami

TheSmokingGnu said:
Proprietary architectures make me a sad panda.

Can you explain how this architecture is any more proprietary than the
existing ones that are currently being used? Or is Intel now considered a
standards body?
 

chainbreaker

HockeyTownUSA said:
Guys, you didn't take the red pill did you?

remember: 0104

GACK!

Well, everyone should *still* have learned from Packard Bell's fiasco. :)
 

First of One

Two R600 dies on a single CPU socket running off of system memory. Now
that's as good an April Fool's joke as any...
 

TheSmokingGnu

Ham said:
Can you explain how this architecture is any more proprietary than the
existing ones that are currently being used?

If, say, I purchase an Intel chip now, it's just an Intel chip. It only
does the CPU job, and when I buy a motherboard, I'm free to use
essentially any other competing product (even the motherboard!) with it.

Same thing with existing AMD chips. I can buy that chip and /only/ that
chip. Any other component is freely choosable by the consumer.

This architecture locks the consumer into a choice of video hardware,
and further subjugates them by forcing their upgrade path into, surprise
surprise, more of the same. If they wanted to switch mid-year from an
ATI setup to an nVidia one, they're SOL, thus proprietary.
Ham said:
Or is Intel now considered a standards body?

Thankfully no! ;)

TheSmokingGnu
 

J. Clarke

TheSmokingGnu said:
If, say, I purchase an Intel chip now, it's just an Intel chip. It
only does the CPU job, and when I buy a motherboard, I'm free to use
essentially any other competing product (even the motherboard!) with
it.

And having video integrated on the chip prevents you from choosing a
"competing motherboard" how?

Hint--AMD doesn't sell motherboards.
TheSmokingGnu said:
Same thing with existing AMD chips. I can buy that chip and /only/
that chip. Any other component is freely choosable by the consumer.

Nope, you are forced to buy AMD's memory manager. On Intel it's
separate.
TheSmokingGnu said:
This architecture locks the consumer into a choice of video hardware,
and further subjugates them by forcing their upgrade path into,
surprise surprise, more of the same. If they wanted to switch
mid-year from an ATI setup to an nVidia one, they're SOL, thus
proprietary.

So you're saying that that architecture would make it physically
impossible to stick an nvidia video board into a PCI Express slot?

If the integrated video gives you more performance than a separate board
can where's the problem? And if it doesn't then how long do you think
it's going to last in the market?
 

TheSmokingGnu

J. Clarke said:
And having video integrated on the chip prevents you from choosing a
"competing motherboard" how?

It doesn't, that's not my point. This architecture forces the pairing of
CPU and GPU, that's the sticky bit. The point was that an
existing-architecture CPU purchase is just that, a CPU, not a CPU/GPU
combination.
J. Clarke said:
Nope, you are forced to buy AMD's memory manager. On Intel it's
separate.

Oh, well consider the nits picked. Next I suppose we can argue about why
I'm forced to use Samsung L2 cache, or why I can only use an Award BIOS
on my Giga-byte mobo. :p

J. Clarke said:
So you're saying that that architecture would make it physically
impossible to stick an nvidia video board into a PCI Express slot?

My impression from the article was that the boards were radically
redesigned, and that the only upgrade path available was then either to
plug in another Fusion processor or else use a RenderX chip, and that
the board would not carry the necessary PCIe connections for video cards.
J. Clarke said:
If the integrated video gives you more performance than a separate board
can where's the problem? And if it doesn't then how long do you think
it's going to last in the market?

If it is faster (and if they're using eDRAM, it may well be), then
there's no issue at all. I welcome any increases in performance, but
stepping out on a limb like this, especially when neither AMD nor ATI
has put ANY competition on the market for the last 6 months and both
have little to no economic cushion, seems like a bad premise for working
on a radical redesign that essentially equates to a proprietary
architecture.

Of course, if everyone jumps on the bandwagon and puts out lots of
support for the system, and there aren't any other unforeseen issues with
it, it may well be the future of end-user computing. Or, it'll tank
really, really bad and leave both companies (both of which I have used
and recommended to friends) high and dry, with no "traditional" product,
no "new" product, and a whole lotta debt to pay off.

I just don't want to see them wasting time on something only they can
develop and/or support, which distracts them from their main job:
bringing some competitive pain to Intel/nVidia. Market share is not
won on promises and paper launches alone.

TheSmokingGnu
 

Xocyll

TheSmokingGnu <[email protected]> looked up from
reading the entrails of the porn spammer to utter "The Augury is good,
the signs say:
If, say, I purchase an Intel chip now, it's just an Intel chip. It only
does the CPU job, and when I buy a motherboard, I'm free to use
essentially any other competing product (even the motherboard!) with it.

Same thing with existing AMD chips. I can buy that chip and /only/ that
chip. Any other component is freely choosable by the consumer.

This architecture locks the consumer into a choice of video hardware,
and further subjugates them by forcing their upgrade path into, surprise
surprise, more of the same. If they wanted to switch mid-year from an
ATI setup to an nVidia one, they're SOL, thus proprietary.

Did you even read the article?

A quote;
"The theory is that once they get you hooked on the Excellon, you are
more likely to buy the RenderX than a GeForce 8800 GTX as the RenderX
chip would certainly cost less than a complete graphics card."

Does that say the user can't buy a GeForce card?
No, it says the user would be more likely to buy a RenderX as it would
be cheaper than buying an entire card from the competition.

Where's the lock-in?

Now if video cards still used AGP and none of those Excellon
motherboards came with an AGP slot, you might have had a point.

Xocyll
 

J. Clarke

TheSmokingGnu said:
It doesn't, that's not my point. This architecture forces the pairing
of CPU and GPU, that's the sticky bit. The point was that an
existing-architecture CPU purchase is just that, a CPU, not a CPU/GPU
combination.


Oh, well consider the nits picked. Next I suppose we can argue about
why I'm forced to use Samsung L2 cache, or why I can only use an
Award BIOS on my Giga-byte mobo. :p



My impression from the article was that the boards were radically
redesigned, and that the only upgrade path available was then either
to plug in another Fusion processor or else use a RenderX chip, and
that the board would not carry the necessary PCIe connections for
video cards.

The only way it could do that would be to not support an external bus,
which would make it damned difficult to attach peripherals.
TheSmokingGnu said:
If it is faster (and if they're using eDRAM, it may well be),

How about the ultra high bandwidth path between the CPU and GPU that is
made possible by having them both on the same die?
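
For anyone who wants rough numbers on that point, here is a minimal
back-of-the-envelope sketch in Python. The ~4 GB/s PCIe 1.x x16 peak is a
real figure for the era; the 100 GB/s on-die number is purely an assumed
placeholder, since neither the thread nor the article gives actual Fusion
link specs.

```python
# Rough, illustrative comparison of moving one frame's worth of data
# over an external bus versus a hypothetical on-die CPU<->GPU link.
# Both bandwidth figures are assumptions for the sake of argument,
# not actual Fusion or RenderX specifications.

FRAME_BYTES = 1600 * 1200 * 4  # one 32-bit 1600x1200 frame, ~7.3 MB

links_gb_per_s = {
    "PCIe 1.x x16 (theoretical peak)": 4.0,    # real-world era figure
    "Assumed on-die link": 100.0,              # pure assumption
}

for name, bandwidth in links_gb_per_s.items():
    microseconds = FRAME_BYTES / (bandwidth * 1e9) * 1e6
    print(f"{name}: {microseconds:.1f} us to move one frame")
```

Even granting PCIe its theoretical peak, the assumed on-die link moves the
same frame roughly 25x faster; the real gap would depend entirely on
numbers AMD hasn't published.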
 

John Lewis

To some of you, I apologise for the posting date, but I needed to
cater for reception in a bunch of different time zones :) :) :)

John Lewis

Taken the "Red Pill" yet?

John Lewis
 
