Far Cry, NV40 and PS3.0


John Lewis

The first and latest patch (V1.1) for the just-released game Far Cry
adds support for the NV40 chipset and for PS3.0 shader
profiles. See the patch readme. I wonder who has alpha/beta samples
of the NV40 under NDA?

nVidia publicly announced the release of their Shader Model 3.0
development kit a couple of days ago.

The readme doesn't say anything about future ATI support in this patch at all.
Presumably that will come when ATI makes tools and GPU samples available
to Crytek.

Also, there will be a free-update version of Far Cry released
optimised for the Athlon 64, able to handle larger maps etc.

Go to http://www.amd.com/farcry

Click on the link in the right-hand column to register
for the free 64-bit Far Cry download when it is available.

Intel, are you reading ??

Remember that the retail copy of FC comes with the most complete and
most easily-accessible FPS game Editor since Build, and it can handle a
game area of 2000 sq. km.....if the processor(s) and memory can hack
it :) :) The 180-page PDF user manual for the Editor can be
downloaded from:

www.farcryHQ.com

--- go to the "Other" page under "Files"


John Lewis
 

DreamMaker

The first and latest patch (V1.1) for the just-released game Far Cry
adds support for the NV40 chipset and for PS3.0 shader
profiles. See the patch readme. I wonder who has alpha/beta samples
of the NV40 under NDA?


Which card uses the NV40 technology?
I thought that the FX5950U had the highest NV number.

What's the difference between the two?
 

NightSky 421

John -
I fear that all of the things you speak about in your post could be the
final undoing of gaming on the PC. Maybe not right away, but I see the
potential. It is a mistake for hardware makers, regardless of who they
are, to manipulate the consumer public by having partnerships with game
developers and publishers so that a certain product *deliberately* is
better on one piece of hardware than another. I mean, what are we
supposed to do? Run multiple gaming computers so that we can have every
possible combination between Nvidia, ATI, Intel and AMD hardware and
install games on the appropriate machine to satisfy the results of a
particular partnership? And then upgrade all of those computers on a
semi-regular basis to keep up with the latest games? You'd think these
companies, as well as game publishers and developers, could see the
potential damage to be done here in the long run.
 

Chubben

Well said.

Even if I haven't noticed any problems running games on my rig, I fear that
it is just a matter of time before they will appear.....
 

DaveL

I wonder who has alpha/beta samples of the NV40 under NDA?


I saw an article recently where John Carmack of id Software said he has an
NV40 and said it is awesome. But what else could he say, as his company is
in an agreement to bundle Doom3 with Nvidia cards.

I'm guessing many NV40 cards are in the hands of developers under NDA.

Dave
 

John Lewis

Well said.

Even if I haven't noticed any problems running games on my rig, I fear that
it is just a matter of time before they will appear.....

Far Cry V1.1 works just as before on existing machines, in spite of
the added NV40/PS3.0 support. I have noticed a few graphics
bugs with the latest patch that are not present in V1.0, but they have
nothing directly to do with the support additions. The video bugs are
much cruder than that - occasional graphics flashing etc. Game-play is
fine and existing eye-candy is actually enhanced slightly. I'm sure
these obviously blatant video hiccups will be taken care of in the
next patch update.

Crytek ADDED graphics support in the patch; they did not take away
any existing functions. The upcoming next-gen nVidia, Ati GPUs
( NV4x, Ati 4xx ) will add more hardware functionality for eye-candy
(PS3.0 is improved eye-candy..) while still keeping frame-rates high.
The typical penalty for older cards is that you have to run the video
options below the maximum upgraded "eye-candy" settings for a
decent frame-rate; the game will still run fine and will have as much
eye-candy as the existing card was ever physically able to produce
while maintaining a decent frame-rate. Nothing new here, this has
been the norm for years. You may have problems on the advanced
games with bottom-of-the-barrel older cards, but it is in both the
developer and publisher's financial interest to support as wide a
range of existing video cards as is practically feasible.

Simple technical synopsis:

DX8 and DX9 have eye-candy functions built into their specs. DX9 is
continuing to evolve those functions. Implementation of these
functions requires either emulation using the CPU (very slow) or
equivalent hardware functionality in the GPU (very fast). Each
generation of GPUs always adds more hardware functionality than the
predecessor generation and the top end of each GPU family implements
more parallel-processing of these hardware functions than the lower
ones in the same family, so they are faster (and more power-hungry and
cost more money). However, the new GPU families are also functionally
upward and downward compatible with the previous generation. So older
software will run on new GPUs and new software will run on old GPUs --
however in the latter case, if you turn on ALL of the eye-candy in the
graphics options of a game which has implemented the eye-candy to the
limits of the latest GPU, the display may be reduced to a slide-show
on an older GPU, because the poor exhausted CPU ( not GPU) is
emulating all of the new eye-candy functions. So, the developers of
these top-end games put the eye-candy selection in the user's hands by
providing a huge range of user-selectable video options. The user has
the free choice of living within the limits of his video hardware and
adjusting the eye-candy appropriately, or upgrading --- depending on
how important eye-candy is to his/her enjoyment of the game.
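
To make that concrete, here is a minimal sketch (my own illustration, not
Crytek's code) of how a DirectX 9 application can query the device caps to
see whether the installed GPU supports a given pixel shader profile in
hardware, and pick a rendering path accordingly:

#include <cstdio>
#include <windows.h>
#include <d3d9.h>   // link against d3d9.lib

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) {
        std::printf("Direct3D 9 not available\n");
        return 1;
    }

    D3DCAPS9 caps = {};
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        std::printf("No hardware (HAL) device found\n");
        d3d->Release();
        return 1;
    }

    // Each shader version step roughly maps to a hardware generation:
    // PS1.x (DX8-class), PS2.0 (DX9-class), PS3.0 (NV40-class and later).
    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        std::printf("PS3.0 path available in hardware\n");
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        std::printf("PS2.0 path available in hardware\n");
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        std::printf("PS1.x path only; newest eye-candy must be simplified or skipped\n");
    else
        std::printf("No pixel shaders; fixed-function or CPU fallback only\n");

    d3d->Release();
    return 0;
}

If the caps fall short, the game simply drops to a simpler shader path or a
fixed-function/CPU fallback, which is exactly the slide-show scenario
described above, and why a patch can add new paths without breaking older
cards.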

BTW, Crytek has also been very clever in another area --- they
have been smartly innovative in improving the efficiency of existing
graphics operations. One example: their Polybump (patented)
technology has managed to render objects with 250 polygons that
previously required 1500 polygons, with little noticeable
difference in the end result.
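
Polybump itself is Crytek's proprietary tooling, so the following is only a
generic sketch of the underlying idea (normal mapping): detail from a dense
mesh is baked into a per-texel normal map, and the low-polygon model is then
lit per pixel with those baked normals. The tiny hard-coded "normal map" and
light direction below are made up purely for illustration:

// Generic normal-mapping sketch (not Crytek's Polybump pipeline): light a flat,
// low-polygon surface with per-texel normals baked from a more detailed model.
#include <array>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

int main() {
    // Stand-in "normal map": four texels whose normals encode bumps that the
    // simplified geometry no longer has.
    const std::array<Vec3, 4> normalMap = {{
        {  0.0f,  0.2f, 0.98f },
        {  0.1f, -0.1f, 0.99f },
        { -0.2f,  0.0f, 0.98f },
        {  0.0f,  0.0f, 1.00f },
    }};
    const Vec3 lightDir = normalize({ 0.3f, 0.5f, 0.8f });

    // Per-texel diffuse term (N dot L): the flat surface shades as if the baked-in
    // bumps were real geometry, which is why the polygon reduction is hard to notice.
    for (std::size_t i = 0; i < normalMap.size(); ++i) {
        float diffuse = std::fmax(0.0f, dot(normalize(normalMap[i]), lightDir));
        std::printf("texel %zu: diffuse %.3f\n", i, diffuse);
    }
    return 0;
}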

And Far Cry is unusual in another respect. Besides the usual
Low/Medium/High etc. Video Option adjustments in the Option
Menus, the advanced user can toggle individual functions within
each option, without even touching the console commands.
The Far Cry Configuration Tool available in the Start Menu has
a Customize selection under Video Options (advanced)
for individual eye-candy selection -- such as a perfect-water-
reflection/blurry-water-reflection toggle (the default is blurry), with an
associated frame-rate penalty depending on GPU hardware......
So the passionately-interested user can completely customize all
graphics functions for the most pleasing compromise between
frame-rate and eye-candy for his/her specific video (and CPU)
hardware.
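
If you want to put a number on that frame-rate penalty rather than eyeball
it, the usual trick is to average frame times over a fixed window with each
setting. A minimal stand-alone sketch (hypothetical, not part of Far Cry or
its tools; the 16 ms sleep merely stands in for rendering a frame):

#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const int frames = 120;     // sample window
    double totalMs = 0.0;

    for (int i = 0; i < frames; ++i) {
        auto start = clock::now();
        // Stand-in for rendering one frame with the video option under test.
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
        auto end = clock::now();
        totalMs += std::chrono::duration<double, std::milli>(end - start).count();
    }

    double avgMs = totalMs / frames;
    std::printf("average frame time: %.2f ms (%.1f FPS)\n", avgMs, 1000.0 / avgMs);
    return 0;
}

Run it once with the option off and once with it on; comparing the two
average frame times in milliseconds is more reliable than comparing
instantaneous FPS readings, since FPS exaggerates differences at high
frame rates.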

John Lewis
 

John Lewis

Which card uses the NV40 technology?
I thought that the FX5950U had the highest NV number.

What's the difference between the two?


The NV40 is the next-gen after the nVidia FX59xx series. Specs
have not been publicly released yet. There are a lot of rumors.
Presumably alpha-samples of the parts or detailed emulation
models are now available to Crytek and other developers
under Non-Disclosure Agreement.

John Lewis.
 

Ricardo Delazy

John -
I fear that all of the things you speak about in your post could be the
final undoing of gaming on the PC. Maybe not right away, but I see the
potential. It is a mistake for hardware makers, regardless of who they
are, to manipulate the consumer public by having partnerships with game
developers and publishers so that a certain product *deliberately* is
better on one piece of hardware than another. I mean, what are we
supposed to do? Run multiple gaming computers so that we can have every
possible combination between Nvidia, ATI, Intel and AMD hardware and
install games on the appropriate machine to satisfy the results of a
particular partnership? And then upgrade all of those computers on a
semi-regular basis to keep up with the latest games? You'd think these
companies, as well as game publishers and developers, could see the
potential damage to be done here in the long run.

As long as it runs OK on the majority of popular graphics cards, I'm not
particularly worried about bundled offers.

Don't forget that it is in the game developer's best interest ($) to have
a game running on as many machines as possible.

I doubt that there will be any major differences in the way DOOM 3 or
other new releases will play on next gen ATI or NVIDIA accelerators,
although I am hoping that ATI will boot NVIDIA's butt. :)

Ricardo Delazy
 

Aki Peltola

Ricardo Delazy said:
I doubt that there will be any major differences in the way DOOM 3 or
other new releases will play on next gen ATI or NVIDIA accelerators,
although I am hoping that ATI will boot NVIDIA's butt. :)

I think DOOM3 won't be anything so special anymore when
it's finally released... Such graphics have already become standard,
and what has DOOM3 got besides that, a few different monsters and
the usual guns? Boring.
 

NightSky 421

Ricardo Delazy said:
As long as it runs OK on the majority of popular graphics cards, I'm not
particularly worried about bundled offers.

Don't forget that it is in the game developer's best interest ($) to have
a game running on as many machines as possible.


This is true. I hope developers and publishers can keep this fact in
focus.

I doubt that there will be any major differences in the way DOOM 3 or
other new releases will play on next gen ATI or NVIDIA accelerators,
although I am hoping that ATI will boot NVIDIA's butt. :)


I'm very much looking forward to Doom 3!
 

Nada

NightSky 421 said:
John -
I fear that all of the things you speak about in your post could be the
final undoing of gaming on the PC. Maybe not right away, but I see the
potential. It is a mistake for hardware makers, regardless of who they
are, to manipulate the consumer public by having partnerships with game
developers and publishers so that a certain product *deliberately* is
better on one piece of hardware than another. I mean, what are we
supposed to do? Run multiple gaming computers so that we can have every
possible combination between Nvidia, ATI, Intel and AMD hardware and
install games on the appropriate machine to satisfy the results of a
particular partnership? And then upgrade all of those computers on a
semi-regular basis to keep up with the latest games? You'd think these
companies, as well as game publishers and developers, could see the
potential damage to be done here in the long run.

I've always thought that it's the world-wide PC hardware conspiracy
that keeps the PC market constantly evolving, at the expense of the
consumer's purse.
 

Nada

Aki Peltola said:
I think DOOM3 won't be anything so special anymore when
it's finally released... Such graphics have already become standard,
and what has DOOM3 got besides that, a few different monsters and
the usual guns? Boring.

And besides, who the hell wants to crawl through Freddy Krueger's oily
boiler room when "Far Cry" offers a virtual Hawaii?

Actually, I will, even though it will make me cry and crap my pants.
 

Nada

NightSky 421 said:
This is true. I hope developers and publishers can keep this fact in
focus.

That may be true to a point, but the first "Doom" back in the day made
many 286 owners upgrade to 386s and 486s. And the game's harsh demands
didn't stop it from ending up in millions of homes. The
key element, to me, is to have wide scalability in the
game engine and the option to have as many graphics settings as
possible, a la "Serious Sam". But the question is, who wants to play
"Far Cry" with a Duron 800MHz and a GeForce 2? "Far Cry" is the first
game in years that really made my average rig slow to a toffee-like
crawl at "high" settings (and there are even "higher"
settings available in this game). Pissed me off, but hopefully the
5900XT waiting at the post office will save my month.
 

Nada

So, the developers of
these top-end games put the eye-candy selection in the user's hands by
providing a huge range of user-selectable video options. The user has
the free choice of living within the limits of his video hardware and
adjusting the eye-candy appropriately, or upgrading --- depending on
how important eye-candy is to his/her enjoyment of the game.

This is something that would be nice to see with every upcoming PC
game release.

The options menu rarely gets as wide or long as it is in "Far
Cry" and the old "Serious Sam". "Medal of Honor - Allied Assault" also had
an excellent selection for the end user in its audio-video
options menus. Unfortunately, not all end users even know of their
existence unless they have a guru or a gamer in the family.
They purchase the new hardware and more often than not don't even
realize they could boost the graphics settings. I'm giving my old
Titanium 4200 to my cousins, and it's sort of a shame that they have
played half of "No One Lives Forever 2" with an old DX7 card, since
the game offers a lot more settings for visual enjoyment. While I
think it sucks to play a game on a machine below the recommended spec
and get intolerable framerates that drop into their early teens, I
wouldn't want to change the ideology to where developers stop
developing straight for the PC and only stick with console hardware
specs. What I don't like about the XBox-like games such as "Deus Ex -
Invisible War" is the dominant hardware baseline that is clearly set by the
standards seen on the XBox version of the game. The green Hulk-box
dominates the PC too much. "Far Cry", the upcoming "Half Life 2" and
"Doom 3" will once again put the PC on the top shelf among gamers, no
matter what the little 10-year-old snot-drooling PlayStation 2 guru
of your block says.

Goddammit PC kicks ass, or what!
 

John Lewis

That may be true to a point, but the first "Doom" back in the day made
many 286 owners upgrade to 386s and 486s. And the game's harsh demands
didn't stop it from ending up in millions of homes. The
key element, to me, is to have wide scalability in the
game engine and the option to have as many graphics settings as
possible, a la "Serious Sam". But the question is, who wants to play
"Far Cry" with a Duron 800MHz and a GeForce 2? "Far Cry" is the first
game in years that really made my average rig slow to a toffee-like
crawl at "high" settings (and there are even "higher"
settings available in this game). Pissed me off, but hopefully the
5900XT waiting at the post office will save my month.

.....for about a month. Until FC gives you the upgrade itch again :)
On my 5900/128, I cannot set everything to max without unacceptable
frame-rate hits (not messing with the pixel-shaders either, which are
set by default to PS1.1 for nVidia in FC). Does not detract from my
enjoyment of the game one bit. Crytek have done an excellent job
of scaling the average-user-accessible graphics settings. And their
auto-detection of hardware works like a dream. Most of the eye-candy
additions that I am "missing" are visible only to those with magnifying
glasses.......

However, with the addition of NV40/PS3.0 support in the V1.1 patch,
the itch to upgrade further when the next-generation video cards
arrive has definitely become more persistent. A frame-rate of 30-50
FPS is perfectly playable, but how about bragging about 50-100 with all
the eye-candy on?

Advice to those who have not bought (or pirated) FC yet. Don't! The
game will cost you at least $540: $40 for the game itself and $500
(at least) for the sudden upgrade itch.........

John Lewis
 

Nada

....for about a month. Until FC gives you the upgrade itch again :)
On my 5900/128, I cannot set everything to max without unacceptable
frame-rate hits (not messing with the pixel-shaders either, which are
set by default to PS1.1 for nVidia in FC).

Oh no. I'd hate to upgrade again in the summer. I'm expecting to
have "Thief 3" working better than "Deus Ex 2", but I have a feeling
it's going to be a somewhat slow DX8.1 experience even with the FX line
of cards.

Does not detract from my
enjoyment of the game one bit. Crytek have done an excellent job
of scaling the average-user-accessible graphics settings. And their
auto-detection of hardware works like a dream. Most of the eye-candy
additions that I am "missing" are visible only to those with magnifying
glasses.......

I'd like to see a definition of the DX8 and DX9 effects. I'm still
testing the "Far Cry" island with a DX8 card, and from what I've seen,
the surroundings of the game have been very pleasing. Almost relaxing,
to the point where the "goals" of the game itself have been secondary
to just plain tourism.
However, with the addition of NV40/PS3.0 support in the V1.1 patch,
the itch to upgrade further when the next-generation video cards
arrive has definitely become more persistent. A frame-rate of 30-50
FPS is perfectly playable, but how about bragging about 50-100 with all
the eye-candy on?

I hate to say this, but perhaps I should have waited a month longer for
an upgrade. I have a feeling April will bring new gear to the end
users.
Advice to those who have not bought (or pirated) FC yet. Don't! The
game will cost you at least $540: $40 for the game itself and $500
(at least) for the sudden upgrade itch.........

John Lewis

What's really funny about "Far Cry" is that there was no hype over it,
and it totally surprised everyone from behind "Doom 3" and "Half Life 2",
climbed up to front-runner status and set a new audio-visual standard for
first person shooters. How many times have we seen oily maps with
techno-infested, dark spaceships without hope? Not that there's
anything wrong with them, if they succeed in having the same atmosphere
as "Alien" and "Aliens". I've never seen grass grow in a game
environment so "fresh" and alive before. Heck, even the mobs are
fishing there and taking a break from shooting you. For some reason,
while I was exploring the island, I felt like there was a Predator
stalking me or something. Felt like Schwarzenegger hiding in the mud.
(Mod creators, take note!)
 

Manuel

Goose said:
According to http://www.nvnews.net/#1080315857 he said:
"I just got an nv40 and put it in my dev machine, and it is spectacular!"

So basically, with a P4 2.4, 1024MB RAM and a GeForce4 Ti4200 I can't expect
anything? What the hell do I need to buy then?

Even on medium settings the slideshow is present... and BTW, how do you
show the FPS in FC?

Manuel
 

Pluvious

On 30 Mar 2004 21:53:38 -0800, (e-mail address removed) (Manuel) wrote:

||>
||> > I saw an article recently where John Carmack of id Software said he has an
||> > NV40 and said it is awesome.
||>
||> According to http://www.nvnews.net/#1080315857 he said:
||> "I just got an nv40 and put it in my dev machine, and it is spectacular!"
||
||So basically, with a P4 2.4, 1024MB RAM and a GeForce4 Ti4200 I can't expect
||anything? What the hell do I need to buy then?
||
||Even on medium settings the slideshow is present... and BTW, how do you
||show the FPS in FC?
||
||Manuel

\r_displayinfo 1

0 to turn it off.

Just wait a few weeks for the announcement of the new generation video
cards and get one. That 4200 is old school.

Pluvious
 
