GPU sockets are coming


nv55

http://www.anandtech.com/video/showdoc.aspx?i=2570

G72 and G73 Info; Plus NVIDIA's Quest for a GPU Socket

We continue to hear new details about G72 and G73 here in Taiwan, and
the latest batch of info from our vendors is that G72 and G73 will be
pin compatible with NV40 and NV43. In other words, your next NVIDIA
video card might have the same PCB as the 6600GT, but with a
different GPU.

This means lower cost to the manufacturer - there is no need for new
R&D or board designs. It also means G72 and G73 can launch very quickly
once the decision comes from NVIDIA, as vendors can very easily switch
production from the older chips to the new ones. Two vendors confirmed
to us that they are already retooling their PCBs for six-pin 12V Molex
connectors in anticipation that G72 and G73 SLI might need the
additional power, but even NVIDIA won't comment to the manufacturers at
this point.

A lot seems to hinge on ATI's future choices with R580, X1600 and
X1300. As of now, the launch date for X1600 is still late November, and
NVIDIA isn't exactly hurting for new value and midrange SKUs given the
success of the 6600GT. The X800GTO and X800GTO2 really give the 6600GT
a run for its money, but we digress.

NVIDIA's Secret Flip Chip GPU

Manufacturers seem to think G72 and G73 will be an easy tool-over from
NV40/43, but another vendor claims NVIDIA has bigger plans. They claim
that NVIDIA is working on flip-chip GPU sockets for motherboards.
Apparently, NVIDIA's internal engineering teams have several prototypes
where the GPU, rather than the CPU, is the main focus of a motherboard
with two sockets: one for the GPU and another for the CPU. Whether or
not such a machine will ever see the light of day is difficult to say
right now. However, the idea of pin-compatible GPUs already suggests
that we are halfway there when it comes to buying GPUs the same way we
buy CPUs: as flip chips. We have plenty of questions, like how the
memory interface will work and how that will affect performance, but
GPU sockets are likely less a question of "if" than of "when".
 

Iain McClatchie

I'd like to see a GPU socket where the memories are attached to the
same substrate as the GPU die.

If you look at a modern CPU, the die sits under a heat spreader, attached
balls-down to the top of a substrate that looks a lot like a little
PC board. As of 3 years ago, one name for this substrate was ALIVH.
The possibilities are quite exciting here, and I'm not sure why
they've not been explored before:

- chip-to-substrate connection pitch is around 120 um, in both X
and Y.
- even a Pentium 3 had well over 1000 connections to the
substrate, mostly for power. It seems reasonable to have 800+
data pins today.
- the same substrate is used in cell phones, so it's cheap and can
be made reasonably large (2" x 3").

A 512b DRAM bus would be quite reasonable on this package, along
with a PCIe x16 interface. Best of all, the socket ends up with
just PCIe x16 (maybe HT x8), two DVI links, and power coming out,
which might make it cheaper than a CPU socket.
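
Those numbers invite a quick sanity check. Here is a rough
back-of-the-envelope sketch in Python; the die size and memory data
rate are illustrative assumptions of mine, not figures from the post:

# Rough sanity check on the package numbers above.
# Assumptions (illustrative, not from the post): a square bump array
# on an 18 mm die edge, and memory running at 1400 MT/s effective.

PITCH_UM = 120.0      # chip-to-substrate connection pitch, X and Y
DIE_EDGE_MM = 18.0    # hypothetical die edge length

bumps_per_edge = (DIE_EDGE_MM * 1000.0) / PITCH_UM
total_bumps = bumps_per_edge ** 2
print(f"~{total_bumps:,.0f} bumps on an {DIE_EDGE_MM:.0f} mm die "
      f"at {PITCH_UM:.0f} um pitch")

# Peak bandwidth of a 512b DRAM bus at the assumed data rate.
BUS_BITS = 512
DATA_RATE_MTPS = 1400  # illustrative: 700 MHz DDR
bandwidth_gb_s = BUS_BITS * DATA_RATE_MTPS * 1e6 / 8 / 1e9
print(f"{BUS_BITS}-bit bus at {DATA_RATE_MTPS} MT/s: "
      f"~{bandwidth_gb_s:.0f} GB/s peak")

That works out to roughly 22,500 bumps and about 90 GB/s peak, so even
with most connections going to power and ground there is comfortable
headroom for the 800+ data pins mentioned above.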

Downsides:

- Does NVidia take over the graphics board biz, or do they ship
bare die? If the former, they get to handle memory chip
inventory, which is generally a money loser.
- Need a rev of the package for every DRAM supplier/version.
- Form factor is horizontal rather than vertical.

The last thing might be turned into an advantage, if NVidia
wanted to get *really* aggressive. They could define a
motherboard with all the usual bits (they're nearly all on the
NVidia southbridge these days anyway), with an Athlon+memory on
one side and the GPU on the other. Boot from flash (required),
stick a hard drive on top and a fan on one side, blowing through
the PSU, and the entire PC form factor could get down to
4" x 6" x 4" and 100W, like that little Apple cube, but smaller
and with less complex packaging.
 

zzipper

This would be really good for us but... it would probably really piss off the
motherboard mfgs :) Imagine being able to defeat the six-month obsolescence
we live with now by just upgrading a CPU and GPU. I think this would be
great; hopefully it happens.
 

Alfie [UK]

zzipper said:
This would be really good for us but... it would probably really piss off the
motherboard mfgs :) Imagine being able to defeat the six-month obsolescence
we live with now by just upgrading a CPU and GPU. I think this would be
great; hopefully it happens.

It wouldn't bother the MoBo manufacturers, only the GPU board makers.
ATI/nVidia really don't care who buys their chips; they just make chips,
not GPU boards. The MoBo guys would probably love it, as they could then
compete on who has the best GPU chip interface, or support the most GPU
plug-ins.
 

Magnulus

They ought to be moving towards integrating the graphics onto the
motherboard in some fashion. In this day and age, there is simply no good
reason to have the video as an add-in card; heck, a good deal of the sound
out there is onboard now. It would also allow for much better thermal
solutions for the graphics chip - imagine being able to stick a copper
heatsink and an 80mm fan on your graphics chip, for instance. You could
clock it higher and it would run quieter.
 

Michael W. Ryder

Magnulus said:
They ought to be moving towards integrating the graphics onto the
motherboard in some fashion. In this day and age, there is simply no good
reason to have the video as an add-in card; heck, a good deal of the sound
out there is onboard now. It would also allow for much better thermal
solutions for the graphics chip - imagine being able to stick a copper
heatsink and an 80mm fan on your graphics chip, for instance. You could
clock it higher and it would run quieter.
I hope they never build motherboards with only on-board graphics. The
add-in video card lets one put in a video card that meets their needs
and budget. If you need a more powerful video card than the one
currently in the computer, you just replace the card, not the motherboard
and memory - not that the computer makers wouldn't love that. The same
goes for the built-in sound you talk about: a lot of people can
live with that sound, while others spend $200 or more for a replacement.
You can already add new fans and heat sinks to most video cards, and
much more easily than trying to fit another heat sink and fan on an
already crowded motherboard. The only advantage I can see for an
on-board GPU is that the signal traces would be shorter.
 

Andy Glew

Magnulus said:
They ought to be moving towards integrating the graphics onto the
motherboard in some fashion. In this day and age, there is simply no good
reason to have the video as an add-in card; heck, a good deal of the sound
out there is onboard now. It would also allow for much better thermal
solutions for the graphics chip - imagine being able to stick a copper
heatsink and an 80mm fan on your graphics chip, for instance. You could
clock it higher and it would run quieter.


Most systems already have graphics integrated into the motherboard and
chipset, with Intel being the biggest vendor.

Separate graphics cards are for the bleeding edge market - gamers,
etc.


--
---
Andy Glew
PREFERRED EMAIL: (e-mail address removed)
Although I am trying to quit Outlook,
I fetchmail all of my Outlook/Exchange email
to UNIX to read
FALLBACK EMAIL: (e-mail address removed)

503-264-4119

Potential bias: employed now by Intel
past by AMD, Intel, Motorola, Gould ...
This post is personal, and is not the opinion of
any of my employers, past or present.
 

chrisv

Magnulus said:
They ought to be moving towards integrating the graphics onto the
motherboard in some fashion. In this day and age, there is simply no good
reason to have the video as an add-in card; heck, a good deal of the sound
out there is onboard now. It would also allow for much better thermal
solutions for the graphics chip - imagine being able to stick a copper
heatsink and an 80mm fan on your graphics chip, for instance. You could
clock it higher and it would run quieter.

How would you make a socket that fits the various graphics chips
available? How about the video memory and its sockets?
 

abc

chrisv said:
How would you make a socket that fits the various graphics chips
available? How about the video memory and its sockets?

The same way manufacturers agree on what socket a memory module etc. should
be.

The industry would get together and agree on a standard (e.g. AGP).
Although of course there is no reason each VGA manufacturer could not have
their own configuration (like AMD/Intel CPUs), they would need to weigh up
the pros and cons.

I can't see much benefit in this proposal (except to move new mobos); you
could probably make a flatter PC, but integrated components aren't usually
much good for overclocking etc. At the moment, memory module sizes are
probably a hurdle - probably not so much in the future though.
 

chrisv

abc said:
The same way manufacturers agree on what socket a memory module etc. should
be.

The industry would get together and agree on a standard (e.g. AGP).

And what would the video-memory bus-width be? 128b? 256b? See the
problem?
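
To put rough numbers on that gap (the memory data rate here is an
assumed figure, purely for illustration, not anything a standard
specifies):

# Illustrative peak-bandwidth spread across candidate bus widths,
# assuming the same 1000 MT/s effective memory data rate for each.
DATA_RATE_MTPS = 1000  # assumption for the example

for bus_bits in (64, 128, 256):
    gb_s = bus_bits * DATA_RATE_MTPS * 1e6 / 8 / 1e9
    print(f"{bus_bits:>3}-bit bus: ~{gb_s:.0f} GB/s peak")

A fixed socket pinout would have to commit to one of those widths for
everything from value to enthusiast parts.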
 

abc

chrisv said:
And what would the video-memory bus-width be? 128b? 256b? See the
problem?

Well that's what they decide when they get together! - sheesh!

How do they determine PCI or AGP bus? Well, same way.
 

chrisv

abc said:
Well that's what they decide when they get together! - sheesh!

The point is, there's no "correct" answer to that question. Sheesh!

abc said:
How do they determine PCI or AGP bus? Well, same way.

Bad analogy.

The current method of plugging a card into a bus does not put massive
restrictions on the performance or price of the video solution. They
are free to use whatever bus-width and memory type/size they want.
Having sockets for the video chip and the memory would severely limit
flexibility compared to the current situation.

It's a bad idea.
 

Zak

chrisv said:
And what would the video-memory bus-width be? 128b? 256b? See the
problem?

The sheer width is an issue, as are the different types of RAM.

But the changing width isn't: put on two DIMM sockets, and if the
customer only wants to use one, that's fine.

However, seeing how closely the video chip and memory are tied together,
it is better to place them together on a board... which is suspiciously
like what we have now.

I also have the feeling that mainstream video is not making much
progress fill-rate wise; you cannot buy an entry-level card and have it
outperform a Ti4200 - which is different from the situation with CPUs.

That said, a GPU module would look suspiciously like a video card...


Thomas
 

Dan Koren

Andy Glew said:
Separate graphics cards are for the
bleeding edge market - gamers, etc.

To say nothing about such irrelevant
pursuits as CAD/CAE, image processing,
graphics, animation, scientific and
medical visualization, financial
modeling, etc...

;-)


dk
 

abc

Dan Koren said:
To say nothing about such irrelevant
pursuits as CAD/CAE, image processing,
graphics, animation, scientific and
medical visualization, financial
modeling, etc...

;-)

They are still niche markets though, and often have different performance
requirements than games do.
 

greenaum

Dan Koren said:
To say nothing about such irrelevant
pursuits as CAD/CAE, image processing,
graphics, animation, scientific and
medical visualization, financial
modeling, etc...

How would one financial-model on a graphics card?

True, medical visualisation stuff does like to use a lot of VR. Though
most of what I've seen on TV is Gouraud shaded anyway. They can get
away with doing that on-CPU, I'm sure. As 3D cards get more powerful,
they also seem to be shrinking into narrower and narrower niches as
to what they can do.
 

Iain McClatchie

greenaum said:
How would one financial-model on a graphics card?

The financial models themselves run on the CPU(s). The data
generated can be difficult to interpret, and 3D visualizations can
make it easier to understand, since a lot of the data is
multidimensional. Consider put and call options, which at any
point in time for a particular commodity exist as prices for a
cross product of strike prices and future dates. If you are
considering derivatives based on relationships between those
prices, it can help to visualize those relationships.
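
As a concrete sketch of that cross product, here is a minimal Python
example that lays placeholder call prices over a strike-by-expiry grid
and draws them as a 3D surface; the pricing formula is made up for
illustration and is not a real model:

# Minimal sketch: option "prices" over the cross product of strike
# prices and expiry dates, drawn as a 3D surface.
import numpy as np
import matplotlib.pyplot as plt

strikes = np.linspace(80, 120, 25)      # strike prices
expiries = np.linspace(0.1, 2.0, 25)    # years to expiry
K, T = np.meshgrid(strikes, expiries)

spot = 100.0
# Placeholder surface: call intrinsic value plus a crude time-value bump.
prices = (np.maximum(spot - K, 0.0)
          + 4.0 * np.sqrt(T) * np.exp(-((K - spot) / 15.0) ** 2))

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(K, T, prices)
ax.set_xlabel("strike")
ax.set_ylabel("years to expiry")
ax.set_zlabel("price")
plt.show()

Relationships between neighboring points on a surface like that are much
easier to spot visually than in a flat table of numbers.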

It's the same problem the numerical aerodynamics people had
in the 1990s, which led to them surrounding their Crays with
SGI workstations.
 
