Crytek: Next-Gen Console Hardware Locked Down Next Year, Out In 2012-2013, Epic Agrees

YKhan

I am not one to refuse to use ATI; they used to be the best a few years
back. But if Nvidia drags its feet, I will have no problem spending the
money on the latest ATI DX11 cards. I am just afraid ATI doesn't offer
the support it used to. Many game developers support Nvidia instead, and
I think that's sad, but it is what it is.

Apparently that's no longer the case: the game developers want support
for DX11 right now, and ATI is the only one capable of providing it
right now.

Asus Voltage Tweaks Radeon HD 5870 To A 38% Performance Boost -
HotHardware
"We tell you, ATI has a hit on its hands. We haven't seen a Radeon
card generate this much buzz and industry support in what feels like
years, but the powerful Radeon HD 5870 is doing just that."
http://hothardware.com/News/Asus-Voltage-Tweaks-Radeon-HD-5870-To-A-38-Performance-Boost/
 
Yousuf Khan

Tom said:
Yes, I have been looking at them, and they have been selling on Newegg
but selling out. I read a few tech sites for reviews and, surprisingly,
the GTX295 still outperforms the single 5870. Granted, the 5870 has 1GB
of memory to the 295's 1.7GB, but the differences are still substantial.
I think I will wait until the 5870X2s come out, or see if Nvidia takes
the cake yet again. ATI still smokes Nvidia on pricing; even the single
5870 is much cheaper than the GTX295.

Here are two reviews from two very reliable sites.

http://www.techspot.com/review/198-ati-radeon-hd-5870-review/page1.html

http://www.anandtech.com/video/showdoc.aspx?i=3643

Yes, and what's your point? The HD5870 is a single-GPU solution, whereas
the GTX295 is a dual-GPU solution. Even the older-model dual-GPU 4870X2
beats the new HD5870 in some cases, thanks to its dual GPUs. The
HD5870's real competition is the GTX285, which it massacres. What you
should be impressed by is not the times the 5870 got beaten by
older-technology dual-GPU cards, but the times it beat those dual-GPU
cards.
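
To put rough numbers on the single-vs-dual-GPU point, here is a quick
back-of-the-envelope sketch in Python. The framerates and the ~75%
CrossFire/SLI scaling factor are made-up assumptions for illustration,
not benchmark results:

  # Illustrative only: why an older dual-GPU card can still edge out a
  # newer single-GPU card. All numbers are assumptions, not benchmarks.

  def dual_gpu_fps(single_gpu_fps, scaling=0.75):
      """Estimate dual-GPU framerate, assuming the second GPU adds only
      a fraction of its standalone performance."""
      return single_gpu_fps * (1 + scaling)

  old_gen_single = 40.0   # hypothetical fps for one older-generation GPU
  new_gen_single = 60.0   # hypothetical fps for one newer-generation GPU

  print(dual_gpu_fps(old_gen_single))  # 70.0 -- the old dual-GPU card wins
  print(new_gen_single)                # 60.0 -- despite the newer chip

Even generous scaling leaves the dual-GPU card ahead only while the
per-GPU gap between generations stays small, which is why the cases
where the 5870 won anyway are the impressive ones.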

Yousuf Khan
 
Yousuf Khan

Tom said:
In any case, I was only comparing the 295 anyway; not once did I mention
the 285 or lower, so I don't see your point, and your point is moot when
even those cards beat the 5870 in a few cases. As I said before, the
price is what makes ATI more palatable, but Nvidia cards (as noted even
by these sites) typically run games better, as they get more dev support
for optimization. I may spend the extra money this time or I may not;
timing is the issue now. I am just undecided whether I want to wait
(probably) until next year for the GTX300 series, or just go with the
5870X2 that is supposed to come out in early November. I want power and
the latest, so I am not against spending the money, but if ATI gets
these 5870X2s out the door, I would probably even forget about any 300
series in the future, because I know no game is going to tax a card like
that for a long time to come.


Anyways, it's much more impressive that the single-GPU HD5870 beat out
the dual-GPU GTX295 or HD4870X2 in five or six tests, respectively. You
expect the dual-GPU cards to usually beat the single-GPU ones.

Well, it may all be academic pretty soon. According to this article,
Nvidia is on the verge of cancelling its GTX260, 275, and 285, and the
295 is likely to follow.

SemiAccurate :: Nvidia kills GTX285, GTX275, GTX260, abandons the mid
and high end market
"NVIDIA IS KILLING the GTX260, GTX275, and GTX285 with the GTX295 almost
assured to follow as Nvidia (Nvidia: NVDA) abandons the high and mid
range graphics card market. Due to a massive series of engineering
failures, nearly all of the company's product line is financially under
water, and mismanagement seems to be killing the company.

Not even an hour after we laid out the financial woes surrounding the
Nvidia GTX275 and GTX260, word reached us that they are dead. Normally,
this would be an update to the original article, but this news has
enough dire implications that it needs its own story. Nvidia is in
desperate shape, whoop-ass has turned to ash, and the wagons can't be
circled any tighter."
http://www.semiaccurate.com/2009/10...x275-gtx260-abandons-mid-and-high-end-market/

Yousuf Khan
 
YKhan

Tom said:
But you really need to look at those benchmark tests in terms of what
stress and game(s) they were run under; that's what I noticed. I found
it astounding that a few times the 285 and 280 beat the 5870. I am
psyched about the 5870: Nvidia can put out something comparatively more
powerful, the way the 295 was to the 4870X2, but for the money the newer
ATI comes close to the challenge and can still be much cheaper.

Can you point out where the 285 beat the 5870? I've seen very few
examples. In the first article, TechSpot, they ran just single cards,
and the only things beating the 5870 with any consistency were the two
dual-GPU solutions, the 4870X2 and the 295. In the Anandtech article,
they ran mixes of single and dual cards, and again the only things that
beat the 5870 were either single-card/dual-GPU or dual-card/single-GPU
setups.

Tom said:
If this is true, and it seems credible enough, Nvidia just may be on the
ropes. ATI cards are (and have been) shipping with GDDR5, while Nvidia
hasn't ever mentioned its cards supporting it, so I wonder if they are
going to that spec on their next cards. I don't think, as ATI comes out
with these cards that are more than adequate to run even Crysis, that
Nvidia can keep pumping out these high-end cards at these price levels
and expect people to pay for them while not offering anything more than
what the ATI cards can handle. I honestly do not see games in the near
future requiring anything more than what the 5870X2 will offer.

I don't think it's got anything to do with technology like GDDR5 vs.
GDDR4 per se; it's more to do with cost to build. The low- and mid-price
Nvidia cards cost more to build than they sell for, and that likely has
a lot to do with Nvidia's extremely large die sizes. ATI has been
concentrating on lowering costs for itself, and hence for its customers,
and the way to lower costs is to reduce the size of the GPU die.
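
To put a rough sketch behind the die-size argument: a bigger die means
fewer candidate dies per wafer and a lower yield, so the cost per good
die climbs quickly with area. The wafer price, defect density, and die
areas below are illustrative guesses (the areas are only in the rough
ballpark of Cypress and GT200b), not actual foundry figures:

  import math

  # Back-of-the-envelope die-cost model; every constant is an assumption.

  def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
      """Approximate gross dies per wafer, with a standard edge-loss term."""
      wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
      edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
      return int(wafer_area / die_area_mm2 - edge_loss)

  def yield_rate(die_area_mm2, defects_per_mm2=0.002):
      """Poisson yield model: a bigger die is more likely to catch a defect."""
      return math.exp(-defects_per_mm2 * die_area_mm2)

  def cost_per_good_die(die_area_mm2, wafer_cost=5000.0):
      good_dies = dies_per_wafer(die_area_mm2) * yield_rate(die_area_mm2)
      return wafer_cost / good_dies

  print(cost_per_good_die(334))  # smaller die: roughly $56 each
  print(cost_per_good_die(470))  # ~40% larger die: roughly $108 each

So under these assumptions a die about 40% larger comes out at nearly
double the cost per good chip, which is exactly the squeeze the article
describes.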

Yousuf Khan
 
