Which video card for Mac G5?


woodsie

wondering if anyone can help this video card newbie. i'll be getting a G5
dual 2 GHz Mac. doing mainly 2d & 3d graphics and some video editing. No
game playing.

the above mac comes with NVIDIA Geforce FX 5200 Ultra which i understand
to be a 'basic' card. next available up is the ATI Radeon 9600XT. will
either card be enough or should i get something better?
 

Augustus

woodsie said:
wondering if anyone can help this video card newbie. i'll be getting a G5
dual 2 GHz Mac. doing mainly 2d & 3d graphics and some video editing. No
game playing.

the above mac comes with NVIDIA Geforce FX 5200 Ultra which i understand
to be a 'basic' card. next available up is the ATI Radeon 9600XT. will
either card be enough or should i get something better?

ATI Radeon 9800 Pro 128MB for G5 systems. Twice the card of a 9600XT.
http://www.welovemacs.com/109a14400.html
 

Scotter

Since you do video editing and 3D graphics, you may want to look into cards
that are specifically made for that kind of work. I know nVidia has a line
of cards for this and ATi may, as well.
Here is the link to nVidia's "Quadro" line:
http://www.nvidia.com/page/workstation.html
If you can't find what you need with nVidia or ATi, last ditch would be
Matrox, I guess.
 

Benjamin Gawert

woodsie said:
wondering if anyone can help this video card newbie. i'll be getting
a G5 dual 2 GHz Mac. doing mainly 2d & 3d graphics and some video
editing. No game playing.

the above mac comes with NVIDIA Geforce FX 5200 Ultra which i
understand to be a 'basic' card. next available up is the ATI Radeon
9600XT. will either card be enough or should i get something better?

I have a G5 1.8DP with the FX5200, and it's more than enough for what you
want to do. The card is slow at games but more than sufficient for
everything else. No other card will bring you any improvement in 2D
(something at which all cards of the last ~5 years are equally good), and
it's also more than enough for 3D work...

The other cards only bring you some benefit if you're into gaming, but you
already said that this isn't the case. So save the money and get some
additional RAM instead...

Benjamin
 

Benjamin Gawert

Scotter said:
Since you do video editing and 3D graphics, you may want to look into
cards that are specifically made for that kind of work. I know nVidia
has a line of cards for this and ATi may, as well.
Here is the link to nVidia's "Quadro" line:
http://www.nvidia.com/page/workstation.html
If you can't find what you need with nVidia or ATi, last ditch would
be Matrox, I guess.

All nice ideas, but neither the Quadros nor the Matrox cards you mentioned
work with a Mac, and that's what the OP has...

Benjamin
 

Tony DiMarzio

Benjamin Gawert said:
No other card will bring you any improvements at 2D (something on which
all cards of the last ~5 years are equally good)

Not the case at all.

There IS a strong variance between the 2D capabilities of chipsets made by
players other than ATI and NVidia (like VIA, SiS, Intel, etc.) over the
past 5 years. To be accurate you'd have to say "something at which all
NVidia, ATI, and Matrox cards of the last, say, 3 years are equally
good".

Tony
 

woodsie

"Benjamin Gawert" said:
I have a G5 1.8DP with the FX5200, and it's more than enough for what you
want to do. The card is slow at games but more than sufficient for
everything else. No other card will bring you any improvement in 2D
(something at which all cards of the last ~5 years are equally good), and
it's also more than enough for 3D work...

The other cards only bring you some benefit if you're into gaming, but you
already said that this isn't the case. So save the money and get some
additional RAM instead...

Benjamin

thanks for your input.

bit hard to work out what to do when so many people have opposite
opinions. guess i'll start off with someone lower on the list and see how
that goes.
 

Benjamin Gawert

woodsie said:
thanks for your input.

bit hard to work out what to do when so many people have opposite
opinions. guess i'll start off with someone lower on the list and see
how that goes.

Well, you should know that in PeeCee-Land the FX5200 has a bad reputation
simply because it's too slow for most modern games. It is a full DirectX9
card which supports all the nice goodies but doesn't perform well enough
to be useful for DX9 games. So that's one reason for the negative
feedback.

The second reason is probably that most people would never buy an FX5200
for themselves, because it's a low-end card. And most of these people would
also never recommend an FX5200 to other people, despite the fact that not
everyone does gaming and that the FX5200 (or any other card in that price
range) is more than enough for the intended tasks.

But you're not a gamer, and you're also buying a Mac, for which the list of
available cards is quite short compared to PC-Land. The FX5200 Ultra Apple
sells with the Powermac G5 isn't the fastest card, and if you were into
serious gaming you'd probably be much better off with e.g. the
Geforce 6800 DDL, but for the applications you listed, none of the other
available cards will bring you _any_ benefit. Just a higher price...

The Apple FX5200U also has very good analog signal quality (better than
Apple's Radeon 9600), and it's passively cooled. A nice thing if you
consider that the Powermac G5 is extremely quiet...

Benjamin
 

Benjamin Gawert

Tony said:
Not the case at all.

There IS a strong variance between the 2D capabilities of chipsets
made by players other than ATI and NVidia (like VIA, SiS, Intel,
etc.) over the past 5 years.

Not really. Even the UMA gfx chips made by VIA, SiS and Intel during the
last 5 years aren't really slower than today's top-end cards when it comes
to 2D. Differences usually are barely measurable, and certainly not
noticeable in real work...

Benjamin
 

Tony DiMarzio

I'm going to have to disagree with that.

The chipsets in question are integrated into the Thin Clients that I
benchmark with every subsequent hardware or software release (one of my
duties as a test/software engineer) by the company I work for (Neoware).
There are significant performance variances between these chipsets with
respect to 2D rendering. The benchmarks speak for themselves... then again
so do the hardware specs that we have acquired straight from the
manufacturers (Intel, VIA, SiS). I'd provide them for you to look at
yourself but they're under NDA.

Tony
 

J. Clarke

Tony said:
I'm going to have to disagree with that.

The chipsets in question are integrated into the Thin Clients that I
benchmark with every subsequent hardware or software release (one of my
duties as a test/software engineer) by the company I work for (Neoware).
There are significant performance variances between these chipsets with
respect to 2D rendering. The benchmarks speak for themselves... then again
so do the hardware specs that we have acquired straight from the
manufacturers (Intel, VIA, SiS). I'd provide them for you to look at
yourself but they're under NDA.

Any chip that can't render fast enough to function in a thin client is
broken. Are you sure your benchmarks reflect the video chip and not some
other aspect of performance? Or is your definition of "thin client"
different from that of the rest of the industry?
 

woodsie

"Benjamin Gawert" said:
Well, you should know that in PeeCee-Land the FX5200 has a bad reputation
simply because it's too slow for most modern games. It is a full DirectX9
card which supports all the nice goodies but doesn't perform well enough
to be useful for DX9 games. So that's one reason for the negative
feedback.

The second reason is probably that most people would never buy an FX5200
for themselves, because it's a low-end card. And most of these people would
also never recommend an FX5200 to other people, despite the fact that not
everyone does gaming and that the FX5200 (or any other card in that price
range) is more than enough for the intended tasks.

But you're not a gamer, and you're also buying a Mac, for which the list of
available cards is quite short compared to PC-Land. The FX5200 Ultra Apple
sells with the Powermac G5 isn't the fastest card, and if you were into
serious gaming you'd probably be much better off with e.g. the
Geforce 6800 DDL, but for the applications you listed, none of the other
available cards will bring you _any_ benefit. Just a higher price...

The Apple FX5200U also has very good analog signal quality (better than
Apple's Radeon 9600), and it's passively cooled. A nice thing if you
consider that the Powermac G5 is extremely quiet...

thanks again. but i've ordered the 9600. lol.

oh well guess it won't kill me.
 

Tony DiMarzio

J. Clarke said:
Any chip that can't render fast enough to function in a thin client is
broken. Are you sure your benchmarks reflect the video chip and not some
other aspect of performance? Or is your definition of "thin client"
different from that of the rest of the industry?

No, my definition of "thin client" isn't different from that of the rest of
the industry, considering the company I work for, Neoware, basically is the
rest of the industry. We're the leading Thin Client provider in the world
next to Wyse Technologies, but anyone who knows the industry intimately
knows that Wyse is approaching its last days (we almost bought them out last
quarter). That said, look at the units offered at www.neoware.com if you're
curious. My group develops the Linux version of our products. I used to work
on the WinCE and XPe versions as well but NeoLinux (the custom Linux distro
we use on our Thin Clients) is much more fun to engineer.

Anyway... "Any chip that can't render fast enough to function in a thin
client is broken." - True, but I didn't say the chipsets didn't render fast
enough to function. They all function. However, when the CPU of the TC is
not the bottleneck, apparent differences can be seen (in 2D graphics
benchmarks) between the various graphics chipsets. Either way though
"rendering fast enough" is completely relative.

A specific example would be to compare two hypothetical hardware platforms
both running 800MHz VIA C3 processors, one unit using a VIA graphics
chipset, the other using an S3 chipset. In an ICA or RDP benchmark the
800MHz unit with the VIA chipset would complete a sequence of 2D X drawing
directives 30% faster than the S3 chipset. There ya have it.

Tony
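[Editor's note: Tony can't share Neoware's actual benchmark (it's under NDA), but the shape of the comparison he describes is simple: run the same fixed sequence of drawing directives on each platform, time it, and report the relative difference. A minimal Python sketch of that harness, with a software framebuffer standing in for real hardware; all names here are illustrative, not Neoware tooling:]

```python
import time

def bench(draw_ops, repeats=3):
    """Time one pass over a fixed sequence of drawing directives; best of `repeats` runs."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        for op in draw_ops:
            op()
        best = min(best, time.perf_counter() - start)
    return best

def relative_speedup(t_fast, t_slow):
    """How much faster (percent) the fast platform completed the same directives."""
    return (t_slow - t_fast) / t_slow * 100.0

# Stand-in workload: rectangle fills into an 8-bit software framebuffer.
W, H = 320, 240
fb = bytearray(W * H)

def fill_rect(x, y, w, h, color):
    def op():
        for row in range(y, y + h):
            fb[row * W + x : row * W + x + w] = bytes([color]) * w
    return op

ops = [fill_rect(10, 10, 100, 80, c % 256) for c in range(200)]
t = bench(ops)
print(f"workload time: {t:.4f}s")
# Tony's 30% figure is this arithmetic: if the VIA unit finishes in 0.7 s and
# the S3 unit takes 1.0 s for the same sequence, the VIA unit is 30% faster.
print(f"{relative_speedup(0.7, 1.0):.0f}% faster")
```

On real hardware the `draw_ops` would be X11 drawing requests (or an ICA/RDP session replay) rather than Python loops, but the timing and the relative-difference calculation are the same.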
 

Benjamin Gawert

Tony said:
I'm going to have to disagree with that.

The chipsets in question are integrated into the Thin Clients that I
benchmark with every subsequent hardware or software release

Thin Clients? Well, ok, if you want to extend it to embedded devices and
thin clients you're probably right.

I was speaking of what's available for PCs and Macs, which is probably
what's important for most people here.

I have a thin client here (HP t5500 with ATI RageXC 8MB), and even though
it's brand new, its gfx core is way older than just five years. Such cores
are in use in a lot of appliances, but it makes almost zero sense when
discussing PC and Mac gfx...

Benjamin
 

Tony DiMarzio

Benjamin Gawert said:
Thin Clients? Well, ok, if you want to extend it to embedded devices and
thin clients you're probably right.

I was speaking of what's available for PCs and Macs, which is probably
what's important for most people here.

I have a thin client here (HP t5500 with ATI RageXC 8MB),

I had to do some devel work on the HP t5500. I know it well :) Not a bad
device, but it pales in comparison to our competing devices. HP needs to do
some serious restructuring and re-planning in their Thin Client department
if they expect to make it.
and even though it's brand new, its gfx core is way older than just five
years. Such cores are in use in a lot of appliances, but it makes almost
zero sense when discussing PC and Mac gfx...

Agreed... it doesn't make much sense when discussing PC and Mac mainstream
graphics chipsets.
 

Minotaur

Tony said:
Not the case at all.

There IS a strong variance between the 2D capabilities of chipsets made by
players other than ATI and NVidia (like VIA, SiS, Intel, etc.) over the
past 5 years. To be accurate you'd have to say "something at which all
NVidia, ATI, and Matrox cards of the last, say, 3 years are equally
good".

Tony

Definitely a difference here between the XFX 6800GT I have on the shelf
and this PowerColor X800XT PE. With the ATI card, 1920x1440 is now
crystal clear and 205?x15?? is now usable! Seems ATI is still ahead at
delivering quality images on screen..

Minotaur (8*
 

J. Clarke

Tony said:
No, my definition of "thin client" isn't different from that of the rest
of the industry, considering the company I work for, Neoware, basically is
the rest of the industry. We're the leading Thin Client provider in the
world next to Wyse Technologies, but anyone who knows the industry
intimately knows that Wyse is approaching its last days (we almost bought
them out last quarter). That said, look at the units offered at
www.neoware.com if you're curious. My group develops the Linux version of
our products. I used to work on the WinCE and XPe versions as well but
NeoLinux (the custom Linux distro we use on our Thin Clients) is much more
fun to engineer.

Anyway... "Any chip that can't render fast enough to function in a thin
client is broken." - True, but I didn't say the chipsets didn't render
fast enough to function. They all function. However, when the CPU of the
TC is not the bottleneck, apparent differences can be seen (in 2D graphics
benchmarks) between the various graphics chipsets. Either way though
"rendering fast enough" is completely relative.

A specific example would be to compare two hypothetical hardware platforms
both running 800MHz VIA C3 processors, one unit using a VIA graphics
chipset, the other using an S3 chipset. In an ICA or RDP benchmark the
800MHz unit with the VIA chipset would complete a sequence of 2D X drawing
directives 30% faster than the S3 chipset. There ya have it.

Geez, I thought you were talking about a video chip that somebody might
actually use in a PC.

Now, do you see a difference between an FX5200 and a 6800 Ultra in that
application?
 

Benjamin Gawert

Minotaur said:
Tony DiMarzio wrote:
Definitely a difference here between the XFX 6800GT I have on the shelf
and this PowerColor X800XT PE. With the ATI card, 1920x1440 is now
crystal clear and 205?x15?? is now usable! Seems ATI is still ahead
at delivering quality images on screen..

Fine! But we talked about differences in 2D performance, not image quality.
And image quality has nothing to do with whether there's an ATI or Nvidia
GPU on the board; it depends on how much the gfx board manufacturer invests
in the output filters. So in your case it's not "ATI has better image
quality than Nvidia", it's simply "the ATI board from PowerColor has better
image quality than the Geforce from XFX". I also had a PowerColor gfx card
with great analog signal quality, but it had a Geforce4 Ti4200 on it.
Looks like PowerColor cares more about filter quality than XFX does...

There are enough cards out there that have piss-poor image quality with an
ATI GPU, and there also are Nvidia cards which provide very good images.
The manufacturer of the GPU has nothing to do with that...

Benjamin
 

Tony DiMarzio

J. Clarke said:
Geez, I thought you were talking about a video chip that somebody might
actually use in a PC.
Nope


Now, do you see a difference between an FX5200 and a 6800 Ultra in that
application?

Rhetorical question, I'm assuming? In case it's not - I am unable to test
that because there is no "Thin Client" in the world that could accommodate
either an FX5200 or a 6800 Ultra. Even if there were Thin Clients that
used those chipsets, the bottleneck would be the low-power, low-tech CPUs
of the TCs and not the graphics chipsets.

Even without testing, I highly doubt there is an appreciable 2D performance
difference between the FX5200 and 6800U, or between any recent ATI or
NVidia offering for that matter. The performance differences are really
only there between the low-end chips from S3, VIA, SiS, etc.
 
