Intel-Nvidia merger or acquisition: unlikely, but possible?


J. Clarke

The said:
AFAIK, locally M$ have been pushing for businesses to adopt a 3-year
subscription/payment plan for Windows XP licenses instead of a once-off
payment. Seems like it's a good hint of things to come: get them used
to the idea of paying smaller amounts over a fixed number of years,
then slowly come the perpetual annual subscriptions.

Is this a 3-year subscription that ends with a working product, or one in
which the product expires at the end of 3 years? They tried one in which
the product expired in Australia and I seem to recall that the result was
not at all desirable for Microsoft.
 

chrisv

Judd said:
"The competition"? There was only 1 real player and that was MS who bought
the rights to the technology and the team.

WTF are you talking about? Intel's "competition" in the high-end
video market was nvidia and 3dfx. Sheesh!
 

The little lost angel

Beg to differ. Even with the high turnover in electronics you are still
going to have a large number of the same people and procedures for Q.C.,
testing, and programming. I mean, Microsoft is a good example of decades
of poor performance and buggy software: it happens.

My last experience with a desktop ATI card was similarly painful,
about 4 years ago I think. However, after that, about 2 years back, my
experience with the ATI mobile driver on my Thinkpad was pretty OK... but
that might have to do with the fact that IBM probably checked it first.
 

Garrot

Rick said:
Of course if you have knowledge to the contrary, i.e. ATI fired the
entire crew and started over from scratch, I would accept it.

Yeah, they've changed their driver model drastically since 1999. Their
hardware quality and engineering capabilities have changed
drastically since then too. With the acquisition by AMD it's going to
get even better.

I am using Nvidia on both my PCs right now, but having an ATI X1900XTX
would make me quite happy too.
 

Garrot

EDM said:
That's what happened to Matrox, except the "crew" quit
instead of being fired. The company never did recover
in the 3D market.

Matrox was never a force in the 3D market anyway.
 

Tony Hill

Beg to differ. Even with the high turnover in electronics you are still
going to have a large number of the same people and procedures for Q.C.,
testing, and programming. I mean, Microsoft is a good example of decades
of poor performance and buggy software: it happens.

Of course if you have knowledge to the contrary, i.e. ATI fired the
entire crew and started over from scratch, I would accept it.

With ATI things have definitely improved. It was a long and hard
process, but they do seem to have their drivers under control now. I
have had rather poor experiences in the past with ATI cards, but my
current one is mostly working well. My biggest complaint right now is
that their options for image scaling on my widescreen LCD are
completely ass-backwards (it ONLY scales images that do *NOT* scale to
the right aspect ratio).
 

Yousuf Khan

Rick said:
For me too, as of ~1999, which I admit is a long time in computer years;
I had huge problems which were never resolved. Gamma was the main
problem for me at the time, but I had some other issues with 3D games
that forced me to buy a new video card after a lack of response from
ATI. Thing is, ~7 years isn't that long for being P.O.ed at a manufacturer.

Great hardware, but problems with drivers/running in anything other than
common modes. It was like they only tested 800x600 in 256-color mode and
left everything else from EGA to other VGA modes for users to figure out.
Serious lack of Q.C. and testing IMO.

I too had many problems with ATI drivers, which turned me off of their
hardware, although the hardware was not bad. For example at one time
ATI drivers were extremely hard to remove, just doing an uninstall was
a hit and miss proposition. I had all kinds of ATI hardware ranging
from cheap to All-In-Wonders with TV functions built in. I loved those
TV tuners, but it wasn't enough to make me want to stay with that
driver set. ATI's quality control problems could be easily taken care
of if they'd only not integrate their stupid applications in with their
drivers. Keep it simple, and things will go well.

I'm hoping that with the acquisition by AMD, ATI will finally
open-source their drivers and that way we can see proper support in
Debian-based Linux distros. Also the open source drivers will show how
much better the hardware is than their closed-source drivers.

Yousuf Khan
 

Yousuf Khan

Luis said:
I absolutely agree with this analysis.
I also guess Nvidia won't develop more great chipsets for motherboards
supporting AMD CPUs, since AMD now has absolute control (and preference)
over its own (new) production through ATI.

Not likely that Nvidia will stop developing for AMD platforms; they get
to develop for this platform without paying any royalties to AMD, as
AMD has made its interconnect, HyperTransport, open. They have to pay big
bucks to Intel to develop chipsets for Intel's FSB.

However, we might now see Nvidia developing more platforms for Intel
than it has in the past. Until now, it has always put the priority on
its AMD platforms.

Yousuf Khan
 

Yousuf Khan

David said:
I don't really think so. They have more transistors and functional
units, but if you look at the design cycle times, it quite obviously
takes a hell of a lot more time and effort to design an x86 MPU. Also,
a lot of the stuff in a GPU is replicated, and you can 'fix' a lot of
stuff in the drivers. Now MPUs obviously use a lot of cache, which is
replicated, but I suspect that the logic part of MPUs is much more
complex than that of GPUs.

One takes 5 years and hundreds of engineers to design. The other takes
2-3 years to design and maybe 100 engineers or so.

And yet, we see issues crop up like they did at ATI a few years ago, when
their designs were too complicated and the fabs they had contracted were
not able to produce them on time or in great numbers. So it
would seem that these GPUs are complicated enough to require some major
back-and-forth between the designers and the process engineers.
Intel is hardly barred from acquisitions. Also, AMD will be
scrutinized because they will now be working very closely with a
competitor (NV) and any possibility of market share allocation or
collusion will have to be dealt with. That being said, it's hardly an
insurmountable problem.

Well, we'll see how closely they do work with Nvidia anymore after
this. Nvidia didn't seem very pleased with this development. But I
guess when all is said and done, it's much easier to partner with AMD
than with Intel. They might withdraw a bit from AMD for a few months,
calm down a bit and eventually get back to normal relations with them.
Depends on how good AMD diplomats are. Right now, everyone is doing
their jilted spouse routines. Intel is pissed at ATI for its
infidelity, and Nvidia is pissed at AMD for its infidelity, so Intel
and Nvidia are putting on a show of hand-holding with each other.

Yousuf Khan
 

David Kanter

Yousuf said:
And yet, we see issues crop up like they did at ATI a few years ago, when
their designs were too complicated and the fabs they had contracted were
not able to produce them on time or in great numbers. So it
would seem that these GPUs are complicated enough to require some major
back-and-forth between the designers and the process engineers.

Way to backtrack on your claim. You initially claimed "High-end GPUs
are more complex than the CPUs from either Intel or AMD."

Besides, ANY device requires coordination between designers and process
engineers. Sometimes it may just be sending someone a SPICE deck and
design parameters, sometimes it is a lot more. You are welcome to
think that GPUs are more complex than CPUs, but you are quite obviously
wrong by almost any reasonable metric that matters.

Transistors and die size don't count, because automated tools can alter
that balance. The number of engineer-years, TTM (time to market), and the
degree of custom design do count. In that regard, you will find that GPUs
are significantly easier to design than CPUs.

You could also think about the likelihood of errata/design bugs. In a
CPU, you cannot really hide a lot of bugs. In a GPU, the driver
totally abstracts the hardware and can work around many bugs in ways that
CPUs cannot. The options for bug fixes in CPUs usually involve BIOS
updates, but BIOSes have a limited amount of room.
Well, we'll see how closely they do work with Nvidia anymore after
this. Nvidia didn't seem very pleased with this development. But I
guess when all is said and done, it's much easier to partner with AMD
than with Intel.

I think I agree, up to a certain point. There are rather significant
concerns regarding roadmaps, design info, etc. that will be problematic
for AMD and NV's partnership.

As much as someone from AMD can say, "We will have a Chinese wall
between the GPU designers and chipset designers," nobody in their right
mind will believe that. It's just like those Chinese walls in those
Taiwanese foundries or ODMs...
They might withdraw a bit from AMD for a few months,
calm down a bit and eventually get back to normal relations with them.
Depends on how good AMD diplomats are.

And lawyers.
Right now, everyone is doing
their jilted spouse routines. Intel is pissed at ATI for its
infidelity, and Nvidia is pissed at AMD for its infidelity, so Intel
and Nvidia are putting on a show of hand-holding with each other.

I agree that this is not likely to be the permanent state of affairs;
it will be interesting to see what the steady state ends up as.

DK
 

Rick Cortese

Yousuf said:
I too had many problems with ATI drivers, which turned me off of their
hardware, although the hardware was not bad. For example at one time
ATI drivers were extremely hard to remove, just doing an uninstall was
a hit and miss proposition. I had all kinds of ATI hardware ranging
from cheap to All-In-Wonders with TV functions built in. I loved those
TV tuners, but it wasn't enough to make me want to stay with that
driver set. ATI's quality control problems could be easily taken care
of if they'd only not integrate their stupid applications in with their
drivers. Keep it simple, and things will go well.

I owned two ATI video cards and one <an old Rage II, IIRC> is still in use in
a computer I put together for a friend. About all he uses it for is
researching wooden boats, wine making, and porn. Works fine in those
applications. <sic>

I had trouble with both ATIs I owned. Burned out one monitor from
setting the brightness and contrast too high to correct for it. The DEC
monitor I used in the system I gave to my friend above, I later saw sitting
outside next to his house in the dirt, waiting for the garbage man. He
had the same complaint about brightness and ended up getting a different
monitor after frying the DEC.

One of my favorite errors with the ATI was the "No 3D available" error from
games. When I did get the system to acknowledge I had a 3D card, everything that
was supposed to be shaded by lighting came out pitch black.

I was building a lot of low-end systems at the time and happened across
an Epox MVP4 motherboard for ~$40 in a sale bin. The MVP4 was an old
Socket 7 board with built-in Trident video and sound. I stuck some RAM
and a 500 MHz K6 on that thing and it ran the same game, albeit slowly,
without a single protest or glitch. The motherboard cost half what the ATI
video card did, but they still went to the trouble of getting the video
right. By then I had switched to a Voodoo on my main system, which not
only ran everything, but did it quicker than the ATI.

Then there was that thing the computer mags did where they found out ATI
cheated on the Quake frame rate test. Hard to forget that kind of stuff.

Then there was the whole open source/Linux fiasco.

These weren't just programmer problems, they went right to the corporate
culture. I hope this changes with AMD in the picture too.
 

Yousuf Khan

David said:
Way to backtrack on your claim. You initially claimed "High-end GPUs
are more complex than the CPUs from either Intel or AMD."

And I still am, where's the back-tracking? I'm counterpointing your
claim that GPUs *aren't* more complicated than CPUs. If they were so
much simpler than CPUs, then we wouldn't see these CPU-like production
problems crop up. CPUs tend to be hard to produce because of their
complicated circuitry, and so are GPUs.
Besides, ANY device requires coordination between designers and process
engineers. Sometimes it may just be sending someone a SPICE deck and
design parameters, sometimes it is a lot more. You are welcome to
think that GPUs are more complex than CPUs, but you are quite obviously
wrong by almost any reasonable metric that matters.

Whatever, I've watched the industry long enough to know that there is a
direct correlation between circuit complexity and manufacturing
problems. You have almost no difficulties producing SRAM chips, but
tons of problems with CPUs and GPUs. That's why CPU manufacturers
usually turn out to be IDMs (integrated device manufacturers, meaning
they own their own fabs). They need control over their own
manufacturing process, and they usually can't contract it out
(Chartered being the lone exception so far). I see GPUs are still
running at sub-1 GHz these days; if they have to go above 1 GHz, I think
it makes perfect sense that they join up with IDMs. Their speed is
being limited by their lack of control over their own production
process.

I think I agree, up to a certain point. There are rather significant
concerns regarding roadmaps, design info, etc. that will be problematic
for AMD and NV's partnership.

As much as someone from AMD can say, "We will have a Chinese wall
between the GPU designers and chipset designers," nobody in their right
mind will believe that. It's just like those Chinese walls in those
Taiwanese foundries or ODMs...

Obviously AMD can't claim it's going to keep its CPU and GPU groups
separate, when everybody suspects that they're buying the company to
integrate a CPU with a GPU.

However, as far as chipsets go, what's left inside them anymore?
The memory controller is integrated into the CPU now. We're left with
wireline and wireless networking, USB/FireWire, PCIe, and not much
else. Nvidia, as well as VIA, SiS, and Broadcom, can all design those
things just as well as ATI and hook them up via HyperTransport too.
I can't see how ATI will have any advantage here other than the marketing
advantage of being able to say that it's an AMD chipset for an AMD
processor, in those markets where it might possibly matter, like
corporate laptops.
I agree that this is not likely to be the permanent state of affairs;
it will be interesting to see what the steady state ends up as.

I think we'll know the steady state relationships within one or two
quarters.

Yousuf Khan
 

Yousuf Khan

AirRaid said:
PC hardware industry: If AMD-ATI decides to build high-speed extensions
to the AMD CPU's (or support chips) that enables ATI GPU's to run much
faster using AMD CPU's then it may give AMD-ATI a strategic advantage
in the high-end PC games hardware space. And it may force NVIDIA to
work with Intel to create a competitive alternative. This would
basically split the market into two camps: AMD-ATI and Intel-NVIDIA
(even if Intel and NVIDIA doesn't merge).

This suggests that it would've cost $10B to buy out Nvidia.

Nvidia would cost $10 billion to buy
"You would have to spend at least eight billion dollars to buy the
currently-available Nvidia shares. Of course, you have to add some
incentive on top of the price so we are talking about close to 10
billion dollars for this green-loving graphics company."
http://www.theinquirer.net/default.aspx?article=33298
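
As a rough back-of-the-envelope check on those figures, a quick sketch in
Python (the ~25% takeover premium here is just an assumption for
illustration; only the $8 billion and ~$10 billion numbers come from the
article):

# hypothetical premium math, not from the article
share_value_billion = 8.0    # currently-available Nvidia shares, per the article
assumed_premium = 0.25       # assumed incentive on top of the share price
total_cost_billion = share_value_billion * (1 + assumed_premium)
print(total_cost_billion)    # -> 10.0, roughly the $10 billion quoted above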

However, I get the feeling that if AMD had asked nicely, Nvidia
could've been coaxed to merge with AMD for a lot less than what AMD
paid to buy out ATI. However, that would have gotten messy in the boardroom.
Who would run the show afterwards? Too many cooks in the kitchen. Here with
ATI, everybody knows ATI is the subordinate, and they can't expect to
get the equal responsibilities that a merger would have brought.

Yousuf Khan
 

David Kanter

Yousuf said:
And I still am, where's the back-tracking? I'm counterpointing your
claim that GPUs *aren't* more complicated than CPUs.

Not effectively.
If they were so
much simpler than CPUs, then we wouldn't see these CPU-like production
problems crop up.

Having to talk to a process tech is hardly an unusual problem. I
suspect the vast majority of ASIC designs end up having to deal with
process issues.
CPUs tend to be hard to produce because of their
complicated circuitry, and so are GPUs.

GPUs are semi-custom devices; most CPUs are full custom. There's a hint
for you.
Whatever, I've watched the industry long enough to know that there is a
direct correlation between circuit complexity and manufacturing
problems.

Correlation isn't causation. Manufacturing problems are also caused by
using inaccurate SPICE decks, bad circuit design, and smaller process
nodes. GPUs might be more complex than hardware RAID controllers, but
that's not saying much. They are most assuredly less complex than CPUs
by almost any measure.
You have almost no difficulties producing SRAM chips, but
tons of problems with CPUs and GPUs.

Do you have any proof that the number of problems encountered in GPUs
and CPUs is anywhere close to equivalent? Maybe you counted errata?
That's why CPU manufacturers
usually turn out to be IDMs (integrated device manufacturers, meaning
they own their own fabs).

Yes, like those guys at Sun, Transmeta, MIPS, or ARM. You got this one
backwards again...
They need control over their own
manufacturing process, and they usually can't contract it out
(Chartered being the lone exception so far).

This is a performance issue, not a QC one. People contract out CPU
foundry work to IBM...
I see GPUs are still
running at sub-1 GHz these days; if they have to go above 1 GHz, I think
it makes perfect sense that they join up with IDMs.

Last time I checked, Sun was shipping well above 1 GHz and they are
fabless. The fact of the matter is that Nvidia doesn't want a fab. If
they need cutting-edge process tech, they can use IBM... which they
tried, and gave up on because of high defect rates.
Their speed is
being limited by their lack of control over their own production
process.

No, their speed is being limited by their design goals. When you have
data parallelism, there is no reason to try to hit high clock speeds;
that just makes your life difficult. Heat ~ Frequency^3...
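
To put a number on that "Heat ~ Frequency^3" point, here is a rough sketch
in Python (my own illustrative figures; it assumes supply voltage has to
rise roughly linearly with frequency, which is where the cube comes from):

def relative_dynamic_power(freq_ratio):
    # dynamic power ~ C * V^2 * f; assume V scales ~linearly with f,
    # so power grows roughly with f^3
    voltage_ratio = freq_ratio
    return (voltage_ratio ** 2) * freq_ratio

print(relative_dynamic_power(2.0))  # -> 8.0: double the clock, ~8x the power
# Doubling the number of parallel units instead costs roughly 2x the power,
# which is why a data-parallel GPU chases width rather than clock speed.
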
Obviously AMD can't claim it's going to keep its CPU and GPU groups
separate, when everybody suspects that they're buying the company to
integrate a CPU with a GPU.

Then how are they going to work with Nvidia again? NV's executives
have a duty to ensure that competitors don't have their roadmaps,
errata, etc. etc.
However, as far as chipsets go, what's there left inside them anymore?
The memory controller is integrated into the CPU now. We're left with
wireline and wireless networking, USB/Firewire, PCIe, and not much
else. Nvidia, as well as VIA, SIS and Broadcom can all design those
things just as well as ATI and hook them up via Hypertransport too.
Can't see how ATI will have any advantage here other than the marketing
advantage of being able to say that it's an AMD chipset for an AMD
processor in those markets where it might possibly matter, like
corporate laptops.

Q: What would happen if the VP of marketing at Nvidia gave NV roadmaps
to someone at ATI?
A: He'd be fired and there would be a shareholder lawsuit.

Why don't you go ask someone who designs stuff for a living whether
GPUs or CPUs are more complex. I'm sure they will have some
enlightening commentary such as:

1. GPUs use simpler processes (TSMC bulk versus AMD SOI, or Intel
bulk)
2. GPUs don't have a fixed ISA with nasty legacy baggage
3. GPUs don't use full custom design
4. GPUs aren't OOO (out-of-order)
5. GPUs are only beginning to show aggressive circuit design...and
still nothing on the scale of what Intel or DEC does/did
6. CPUs take 5 years to design, GPUs take 3 or less
7. GPU problems can often be 'fixed' in the driver; BIOS fixes have to
be prioritized due to limited space, whereas a driver has effectively
unlimited space

Do you have any counter arguments or reasons why GPUs are more complex?
I'm waiting to hear.

DK
 

Yousuf Khan

Rick said:
These weren't just programmer problems, they went right to the corporate
culture. I hope this changes with AMD in the picture too.

Yup, I'm absolutely certain that AMD will bring a level of
professionalism to ATI's corporate culture. It's helped by the fact that
in this acquisition there is a clear boss here -- AMD. There's no
pretense that this is a merger of equals; this is an outright
acquisition. If any sort of changes are required, then one guy will have
the final say. If AMD had gone with Nvidia, then the lines of command
would've been a bit blurred.

Yousuf Khan
 

Jason Lane

Never happen. No one at Intel could control Jen-Hsun. Then again, if
they replaced Otellini with him, Intel would be far better off. :)
 

David Kanter

Yousuf said:
Yup, I'm absolutely certain that AMD will bring a level of
professionalism to ATI's corporate culture. It's helped by the fact that
in this acquisition there is a clear boss here -- AMD. There's no
pretense that this is a merger of equals; this is an outright
acquisition. If any sort of changes are required, then one guy will have
the final say. If AMD had gone with Nvidia, then the lines of command
would've been a bit blurred.

I certainly hope so. I was on a conference call with some folks regarding the
merger, and one of my questions was: "Does this mean I can look forward
to ATI drivers for Linux that don't suck ass?" The response: "You
know, that's the first time anyone has asked us about that." I think it is
one of those issues that may not be readily apparent to an exec, but
that they will get around to fixing.

DK
 
