ZRAM L3 cache for FX 62?


The little lost angel

Not a chance in hell.

Why not though? Does ZRAM not work? Is yield very low? Is it subject
to very high error rates? Or do you just mean AMD won't put such a
large amount of L3 into the FX62?
 

YKhan

The only way I can see it happening already is if AMD was working on
the process long before they actually signed the deal with Innovative
Silicon. So perhaps, while their research was ongoing, they did a
patent search and found out it was already patented? It's not unusual
for different companies to end up coming up with the exact same thing
in parallel. So perhaps the timing of the license agreement was not the
signal of the start of the research, but simply the signal of the
conclusion of legal negotiations?

Yousuf Khan
 

David Kanter

YKhan said:
The only way I can see it happening already is if AMD was working on
the process long before they actually signed the deal with Innovative
Silicon. So perhaps, while their research was ongoing, they did a
patent search and found out it was already patented? It's not unusual
for different companies to end up coming up with the exact same thing
in parallel. So perhaps the timing of the license agreement was not the
signal of the start of the research, but simply the signal of the
conclusion of legal negotiations?

I guarantee that's not what happened. AMD just saw an interesting
technology and thought "maybe this will be useful, let's try it".

You're right though that the time frame is impossible unless they were
doing work before (which they weren't).

DK
 

David Kanter

Not a chance in hell.
Why not though? Does ZRAM not work? Is yield very low? Is it subject
to very high error rates? Or do you just mean AMD won't put such a
large amount of L3 into the FX62?

ZRAM has yet to be proven to work, and I have heard some compelling
reasons why it is not great. The SER is fine AFAIK. AMD probably will
use a cache, but it will be SRAM.
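
For a sense of why ZRAM is tempting for a big L3 in the first place,
here's a rough back-of-envelope (cell areas are illustrative
assumptions, not AMD or ISi figures; ISi's pitch is roughly 5x the
density of SRAM):

# Rough area comparison for a large L3 built from 6T SRAM vs 1T Z-RAM
# cells.  Cell areas are illustrative assumptions, not vendor figures:
# ~1.0 um^2 for a 90nm 6T SRAM cell and ~1/5 of that for Z-RAM
# (Innovative Silicon's claimed density advantage over SRAM).
SRAM_CELL_UM2 = 1.0
ZRAM_CELL_UM2 = SRAM_CELL_UM2 / 5.0

def array_area_mm2(capacity_mb, cell_um2, overhead=1.3):
    """Bit-cell area plus a crude 30% allowance for decoders/sense amps."""
    bits = capacity_mb * 8 * 2**20
    return bits * cell_um2 * overhead / 1e6  # um^2 -> mm^2

for cap in (4, 8, 16):
    print("%2d MB L3: ~%4.0f mm^2 in SRAM vs ~%3.0f mm^2 in Z-RAM"
          % (cap, array_area_mm2(cap, SRAM_CELL_UM2),
             array_area_mm2(cap, ZRAM_CELL_UM2)))

That's exactly why a fat ZRAM L3 sounds great on paper, and also why
nobody ships one before the cell is proven.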

AMD is a very conservative company; the last time they did something
risky, they totally got screwed and had to have IBM bail them out
(SOI). They sure as hell aren't going to design a mass market chip
based on an unproven technology. Moreover, even if they wanted to,
there's no way that they could turn it around within a year. YK
already pointed out that it probably takes at least 1.5 years or more
to get stuff into a chip.

Put it this way: AMD is probably 6 months from taping out the chips that
will be shipped in 1Q07; do you think they can integrate this stuff and
design an L3 controller in 6 months?

No, this just doesn't make sense. AMD is too risk averse and the time
frame is all wrong.

DK
 

George Macdonald

Why not though? Does ZRAM not work? Is yield very low? Is it subjected
to very high error rates? Or do you just mean AMD won't put such a
large amount of L3 into the FX62?

Toshiba has built "prototype" 128Mb chips and they say "we would be able to
put the LSI to use in three years if necessary".
http://www.eetimes.com/news/latest/showArticle.jhtml?articleID=174910622

Of course Innovative Silicon may find that VCs get err, restless at that
timeframe, so it's hard to say what the bottom line really is. AIUI there
*are* difficulties to get past with having to "pump" the cells to change
"state" from 1 to 0, which slows things down, and it may not be just a
drop-in replacement for DRAM.
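
To put that "pump" worry in concrete terms, a toy model (all the
numbers are invented purely for illustration, not Toshiba or ISi
figures): if clearing a cell to 0 needs an extra pumping step on top
of a normal write, the average write latency climbs with the fraction
of writes that have to do it.

# Toy model of the write-0 "pump" penalty.  Timings are made-up
# placeholders for illustration only.
BASE_WRITE_NS = 2.0   # assumed cost of an ordinary write
PUMP_EXTRA_NS = 3.0   # assumed extra cost to pump a cell from 1 to 0

def avg_write_ns(frac_pumped):
    """Average write latency when frac_pumped of writes need the pump."""
    return BASE_WRITE_NS + frac_pumped * PUMP_EXTRA_NS

for frac in (0.0, 0.25, 0.5):
    print("%3d%% of writes pumped -> avg write ~%.1f ns"
          % (frac * 100, avg_write_ns(frac)))

So even if reads are fine, the write path is where a straight swap for
DRAM timing could hurt.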
 

The little lost angel

AMD is a very conservative company; the last time they did something
risky, they totally got screwed and had to have IBM bail them out
(SOI). They sure as hell aren't going to design a mass market chip
based on an unproven technology. Moreover, even if they wanted to,
there's no way that they could turn it around within a year. YK
already pointed out that it probably takes at least 1.5 years or more
to get stuff into a chip.

Put it this way: AMD is probably 6 months from taping out the chips that
will be shipped in 1Q07; do you think they can integrate this stuff and
design an L3 controller in 6 months?

No, this just doesn't make sense. AMD is too risk averse and the time
frame is all wrong.

Obviously I don't know much about how these things are arranged so
pardon me for asking stupid questions.

But why is it not possible that AMD has been researching and
experimenting with ZRAM, found that it works well with what they have,
and only then decided, "OK, let's pay money for this stuff and announce
that we actually will be using it"?

After all, it's less risky and better than spending big cash for
something that might turn out not to work. Nobody at Innovative has to
know that AMD is quietly trying out the same stuff if AMD never
produces/sells anything with ZRAM.

Even if they do know AMD is trying out the technology, couldn't AMD
point out to them: "You let us try to make it work with our chip
first. If it works out into a successful implementation, we'll pay for
it and you get a major marketing boost. Or we look for something else
and you continue to look for your first big break."
 

David Kanter

Obviously I don't know much about how these things are arranged so
pardon me for asking stupid questions.

That's not a problem. I once asked many stupid questions and became
'smarter', or so I'd like to think : )
But why is it not possible that AMD has been researching and
experimenting with ZRAM, found that it works well with what they have,
and only then decided, "OK, let's pay money for this stuff and announce
that we actually will be using it"?

Look at what AMD said:

"We've looked at data from Innovative Silicon and it looks very
promising. We still need to assure ourselves that this will work in our
own application. We need to see how it scales and we need to make our
own test vehicles," he added.

The 'he' refers to Craig Sander, a VP at AMD. ISTM that he's implying
AMD hasn't done any testing of this.

Jones, an executive experienced in intellectual property licensing,
also declined to comment on AMD's timetable for introduction of Z-RAM
but offered a more general perspective. "In the past it has been two
years from when you sign a deal to when it is in production."

Also, if AMD were going into production they would be ponying up A LOT
more money. It's the sort of thing that wouldn't be buried in EETimes,
but would be picked up by financial analysts, etc.

Not only that, but Innovative Silicon would be crowing rather loudly
about their success if AMD were to go to mass market with ZRAM....or
possibly IPOing. AMD would need to pay them a lot of extra $$,$$$,$$$
to keep quiet about such things.
After all, it's less risky and better than spending big cash for
something that might turn out not to work. Nobody at Innovative has to
know that AMD is quietly trying out the same stuff if AMD never
produces/sells anything with ZRAM.

Gotcha, so you mean AMD looking internally at the same stuff. That's
not really going to happen:

1. AMD doesn't have spare engineers to look at stuff like this
2. They would have been talking about it if they had looked at it
before.
3. AMD doesn't really do fundamental R&D work like this (no offense to
AMD R&D employees)
4. It would probably be illegal as hell to do that (i.e. copy ZRAM
technology exactly), since ISi has patents on it for sure.
Even if they do know AMD is trying out the technology, couldn't AMD
point out to them: "You let us try to make it work with our chip
first. If it works out into a successful implementation, we'll pay for
it and you get a major marketing boost. Or we look for something else
and you continue to look for your first big break."

That's basically what this is about. AMD's not saying they are going
to use the technology, just committing to a preliminary investigation.

Does that answer most of your questions?

DK
 

The little lost angel

That's basically what this is about. AMD's not saying they are going
to use the technology, just committing to a preliminary investigation.

Does that answer most of your questions?

Yup, I must have missed the parts in the articles that mentioned that
they were only looking into it and needed to make test vehicles for
it. Thanks! ^^
 

George Macdonald

ZRAM has yet to be proven to work, and I have heard some compelling
reasons why it is not great. The SER is fine AFAIK. AMD probably will
use a cache, but it will be SRAM.

In December '05 Toshiba reported working prototype 128Mb chips using FBE --
admittedly with a three-year horizon for embedded implementation -- and in
March '05, Innovative claimed to have 90nm "silicon" in hand and megabit
chips being fabricated.
AMD is a very conservative company; the last time they did something
risky, they totally got screwed and had to have IBM bail them out
(SOI). They sure as hell aren't going to design a mass market chip
based on an unproven technology. Moreover, even if they wanted to,
there's no way that they could turn it around within a year. YK
already pointed out that it probably takes at least 1.5 years or more
to get stuff into a chip.

We know that AMD has a partnership with IBM; they work jointly on SOI, and
IBM has been looking at FBE memory cells for a while now. "Unproven" is as
hard to err, prove as the converse... based on current publicly available
info.
Put it this way: AMD is probably 6 months from taping out the chips that
will be shipped in 1Q07; do you think they can integrate this stuff and
design an L3 controller in 6 months?

No, this just doesn't make sense. AMD is too risk averse and the time
frame is all wrong.

I tend to agree that the timeframe is likely further out, but I'm not so
sure that things are as low on the curve as you'd like them to be. :)
 

George Macdonald

I guarantee that's not what happened. AMD just saw an interesting
technology and thought "maybe this will be useful, let's try it".

You're right though that the time frame is impossible unless they were
doing work before (which they weren't).

Forgive us if we prefer to overlook your "guarantee" and rely on the
evidence that IBM has been interested in FBE for several years and that the
AMD/IBM joint work just might be further up the curve. The signing of a
license does not necessarily mean that work is just starting... as with
e.g. the AMD/Rambus license. You don't need a license to do lab work.
 

David Kanter

George Macdonald said:
Forgive us if we prefer to overlook your "guarantee" and rely on the
evidence that IBM has been interested in FBE for several years and that the
AMD/IBM joint work just might be further up the curve. The signing of a
license does not necessarily mean that work is just starting... as with
e.g. the AMD/Rambus license. You don't need a license to do lab work.

OK George, let's put down money on this. I wager there's not a chance
in hell AMD will ship a part in volume using ZRAM before 2007.

We can have an independent individual (or a collection of 3) judge
this.

DK
 

YKhan

Still, there's no official word on when anything got started. Although
we can be pretty sure of relative time frames, we can't be sure of
absolute time frames. For example, we can be fairly certain that
something might take two years to develop, but we still don't know when
those two years started.

Yousuf Khan
 

George Macdonald

OK George, let's put down money on this. I wager there's not a chance
in hell AMD will ship a part in volume using ZRAM before 2007.

Hmmm, I don't see that anybody had mentioned "before 2007" until you pulled
it out of some, err, deep dark place.

All *I* am saying is that we can't know exactly where things are on the
curve and it is *possible* that they are already some ways up it. As noted
in another post in this thread, Toshiba estimates 3 years to get from where
they are to a working embedded implementation - it could be that long for
AMD or it could be less.
We can have an independent individual (or a collection of 3) judge
this.

Pick up that soiled gauntlet and... on your way!:p
 

David Kanter

George said:
Toshiba has built "prototype" 128Mb chips and they say "we would be able to
put the LSI to use in three years if necessary".
http://www.eetimes.com/news/latest/showArticle.jhtml?articleID=174910622

Of course Innovative Silicon may find that VCs get err, restless at that
timeframe, so it's hard to say what the bottom line really is. AIUI there
*are* difficulties to get past with having to "pump" the cells to change
"state" from 1 to 0, which slows things down, and it may not be just a
drop-in replacement for DRAM.

ZRAM is almost certainly much hotter than DRAM.

DK
 

George Macdonald

ZRAM is almost certainly much hotter than DRAM.

Why?... this current pumping? I believe I read that one of its advantages
is that it does not require as high an internal voltage as DRAM.
 

David Kanter

ZRAM is almost certainly much hotter than DRAM.
Why?... this current pumping? I believe I read that one of its advantages
is that it does not require as high an internal voltage as DRAM.

I'm not entirely sure why; this came up in discussion at ISSCC and it
was stated as a fact. At the time I didn't ask why; perhaps I should
have. ZRAM is roughly the same power as eDRAM, possibly slightly
lower.

Here's my understanding:

The problem, as I understand it, is basically the frequency. DRAM
(say, DDR) runs at an internal clock of 100-200MHz, with the I/Os
running at much higher speeds. So you are rate-limited in how often
you can probe the DRAM for data and fetch it.

ZRAM is being used in situations where you want to be hitting it with a
request every cycle...so it's at a higher frequency. Obviously, IBM's
36MB eDRAMs for the POWER5 burn a hell of a lot more power than 36MB of
Hynix DRAM...

So my guess is that ZRAM has higher heat/power requirements because of
the applications that it is targeted for. As to whether it has
intrinsically higher power, I am unsure of the answer, but perhaps
someone can comment here.
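
To make that guess concrete, here's a toy calculation (the energy per
access is a placeholder and assumed identical for both cases; only the
ratio of access rates matters):

# Toy dynamic-power comparison: power ~ energy-per-access * accesses/sec.
# The energy figure is a placeholder, not a measured Z-RAM/DRAM number.
ENERGY_PER_ACCESS_J = 1.0e-9     # assume 1 nJ per access for both cases

def dynamic_power_w(accesses_per_sec, energy_j=ENERGY_PER_ACCESS_J):
    return accesses_per_sec * energy_j

dram_rate  = 200e6   # commodity DRAM core clock, ~100-200 MHz
cache_rate = 2.8e9   # on-die cache probed every core cycle (FX-62-class)

print("DRAM-rate array : ~%.1f W" % dynamic_power_w(dram_rate))
print("Cache-rate array: ~%.1f W" % dynamic_power_w(cache_rate))
print("Ratio           : ~%.0fx" % (cache_rate / dram_rate))

Same cell, ~14x the access rate, roughly 14x the dynamic power -- which
fits "hotter because of what it's being used for" rather than anything
intrinsic to the cell.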

DK
 
