XBOX 2 GDC CHATTER REACHES MELTDOWN

R420

http://www.computerandvideogames.com/news/news_story.php?id=102527

_______________________________________________________________________________
Wednesday 17th March 2004


XBOX 2 GDC CHATTER REACHES MELTDOWN

Microsoft engages in semantic warfare as speculation over new hardware
at the San Jose event spirals out of control

18:46 The games industry was today thrown into turmoil as mixed
messages emerged regarding the potential unveiling of Xbox 2 at next
week's Games Developers Conference in San Jose.
In response to reports from the UK trade press that no details of Xbox
2 will be revealed at the event, a Microsoft spokesperson commented to
us a few moments ago: "What we're not talking about [at GDC] is
Longhorn, we're not announcing the next Xbox and Bill Gates won't be
there.

"At GDC we are going to talk about the role software (including
DirectX) can play in mitigating some of the daunting issues our
industry faces - soaring consumer expectations, discontinuous
technology shifts, the intense amount of time it takes to develop
cross platform etc."

Do the three statements in the above quote mean that Xbox 2 will not
feature at (and, crucially, around) GDC? No, and Microsoft has been
studiously careful in its choice of words. Whilst it would be patently
ridiculous to expect the company to pull out the box and go "Check out
our awesome console, man!", there is a precedent - Microsoft's original
unveiling of Xbox hardware at GDC, and previous years in which technical
demonstrations of future hardware were on display.

While Microsoft itself refused to elaborate on the above statement,
senior sources have stated they will be amazed if "Xbox 2" is not
present and correct in some shape or form - at some location or other
- in San Francisco next week. Whatever happens, we'll keep you fully posted on
all developments as and when they happen.

Johnny Minkley
_______________________________________________________________________________
 

Julian Cassin

"At GDC we are going to talk about the role software (including
DirectX) can play in mitigating some of the daunting issues our
industry faces - soaring consumer expectations, discontinuous
technology shifts, the intense amount of time it takes to develop
cross platform etc."

The sad thing is that the latest incarnations of DirectX provide very little
extra benefit (if any) over earlier versions for 2D game programmers. I doubt
they will have added anything extra this time.

Seems they have pretty much neglected 2D games since they got rid of
DirectDraw. Now you have to set up flat polygons to simulate sprites.
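
For illustration, a minimal sketch of what "flat polygons to simulate sprites"
typically looks like in Direct3D 9. The DrawSprite helper and its arguments are
invented for the example; render states, BeginScene/EndScene and error handling
are left out.

#include <d3d9.h>

// A pre-transformed (screen-space) vertex with one set of texture coordinates.
struct SpriteVertex
{
    float x, y, z, rhw;   // screen position; rhw = 1.0 for pre-transformed vertices
    float u, v;           // texture coordinates
};
#define SPRITE_FVF (D3DFVF_XYZRHW | D3DFVF_TEX1)

// Draw one "sprite": a textured rectangle made of two triangles.
void DrawSprite(IDirect3DDevice9* device, IDirect3DTexture9* texture,
                float x, float y, float w, float h)
{
    SpriteVertex quad[4] =
    {
        { x,     y,     0.0f, 1.0f, 0.0f, 0.0f },
        { x + w, y,     0.0f, 1.0f, 1.0f, 0.0f },
        { x,     y + h, 0.0f, 1.0f, 0.0f, 1.0f },
        { x + w, y + h, 0.0f, 1.0f, 1.0f, 1.0f },
    };

    device->SetTexture(0, texture);
    device->SetFVF(SPRITE_FVF);
    device->DrawPrimitiveUP(D3DPT_TRIANGLESTRIP, 2, quad, sizeof(SpriteVertex));
}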

Julian
 

Eric Pobirs

Julian Cassin said:
The sad thing is that the latest incarnations of DirectX provide very little
extra benefit (if any) over earlier versions for 2D game programmers. I doubt
they will have added anything extra this time.

Seems they have pretty much neglected 2D games since they got rid of
DirectDraw. Now you have to set up flat polygons to simulate sprites.

Julian

It isn't as though there is any market driving development of improved
technology for 2D graphics. Even if there was, what would you ask for? What
technological limitation is a 2D programmer facing today that isn't likely
to be alleviated by improvements also used by 3D tasks, as in using polygons
to simulate those things once done with sprites and priority planes? If your
Metal Slug clone is composed entirely of textures painted on polygons but
nobody can see the difference between that and hardware defined sprites, who
cares?

The barriers for new 2D games are not technological. They are solely a
matter of the lack of a large market for such products other than on
handhelds. Even those are increasingly shifting towards 3D as larger screens
with higher resolution become economically practical.
 

Julian Cassin

It isn't as though there is any market driving development of improved
technology for 2D graphics. Even if there was, what would you ask for? What
technological limitation is a 2D programmer facing today that isn't likely
to be alleviated by improvements also used by 3D tasks, as in using polygons
to simulate those things once done with sprites and priority planes? If your
Metal Slug clone is composed entirely of textures painted on polygons but
nobody can see the difference between that and hardware defined sprites, who
cares?

The barriers for new 2D games are not technological. They are solely a
matter of the lack of a large market for such products other than on
handhelds. Even those are increasingly shifting towards 3D as larger screens
with higher resolution become economically practical.

It isn't just that; at least they could get rid of the "tearing" and the
intermittent "pauses" associated with DirectX-based games. Why Microsoft
refuses to allow synchronisation with frame flyback is beyond me.
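
For reference, Direct3D 9 does expose a presentation interval that asks the
runtime to hold buffer swaps until the vertical retrace; a minimal sketch of
requesting it at device creation (window handling and error checking are
simplified, and the helper name is invented):

#include <windows.h>
#include <d3d9.h>

// Create a device whose Present() calls wait for the vertical retrace.
// D3DPRESENT_INTERVAL_IMMEDIATE, by contrast, presents as soon as possible
// and is the setting that typically produces visible tearing.
IDirect3DDevice9* CreateVsyncedDevice(IDirect3D9* d3d, HWND window)
{
    D3DPRESENT_PARAMETERS pp = {0};
    pp.Windowed             = TRUE;
    pp.hDeviceWindow        = window;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat     = D3DFMT_UNKNOWN;           // use the current desktop format
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;  // sync presents to frame flyback

    IDirect3DDevice9* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, window,
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, &device);
    return device;  // NULL if creation failed
}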

Julian
 

Eric Pobirs

Julian Cassin said:
It isn't just that; at least they could get rid of the "tearing" and the
intermittent "pauses" associated with DirectX-based games. Why Microsoft
refuses to allow synchronisation with frame flyback is beyond me.

Julian


Are you still working in DX3? I haven't heard complaints of that sort in
several years. Even then the problems were almost always in third party
non-WHQL drivers and inadequate hardware resources. That hasn't been a
problem for 2D stuff for a long, long time.
 

Julian Cassin

Are you still working in DX3? I haven't heard complaints of that sort in
several years. Even then the problems were almost always in third party
non-WHQL drivers and inadequate hardware resources. That hasn't been a
problem for 2D stuff for a long, long time.

Unfortunately I haven't seen a 2D Xbox game, so I don't know if the Xbox's
version of DirectX solves the problems, but I have yet to find a *single*
Windows DirectX program with no tearing, whether it is 2D or 3D (on a
GeForce4, Athlon 1800XP, 512 MB RAM, Win98SE or WinXP Pro).
 

Eric Pobirs

Julian Cassin said:

Unfortunately I haven't seen a 2D Xbox game, so I don't know if the Xbox's
version of DirectX solves the problems, but I have yet to find a *single*
Windows DirectX program with no tearing, whether it is 2D or 3D (on a
GeForce4, Athlon 1800XP, 512 MB RAM, Win98SE or WinXP Pro).


Something is seriously wrong with your rig. If this was a DX problem it
would be the #1 rant in gaming forums everywhere. Instead it seems to be
just your experience on a single system.

Doesn't that suggest something to you?
 

Julian Cassin

Something is seriously wrong with your rig. If this was a DX problem it
would be the #1 rant in gaming forums everywhere. Instead it seems to be
just your experience on a single system.

Doesn't that suggest something to you?

Definitely NOT! I regularly go to gaming sessions with between 10-20 people,
and *ALL* their systems - the slowest of which is probably an Athlon 1800XP -
have tearing, *EVERY* single one of them. Even with a Radeon 9600 card playing
3D games such as UT2003.

Move quickly to the left and right and guess what? You see tearing. Just
like in Windows 98 or XP when you drag windows around quickly, you see
tearing (of course Windows isn't using DX for its window drawing). I have
*YET* to see Windows animate anything with absolutely *NO* tearing.

The funny thing is some people only notice it when you point it out to them
and they think it is normal.

Julian
 

Eric Pobirs

Julian Cassin said:
Definitely NOT! I regularly go to gaming sessions with between 10-20 people,
and *ALL* their systems - the slowest of which is probably an Athlon 1800XP -
have tearing, *EVERY* single one of them. Even with a Radeon 9600 card playing
3D games such as UT2003.

Move quickly to the left and right and guess what? You see tearing. Just
like in Windows 98 or XP when you drag windows around quickly, you see
tearing (of course Windows isn't using DX for its window drawing). I have
*YET* to see Windows animate anything with absolutely *NO* tearing.

The funny thing is some people only notice it when you point it out to them
and they think it is normal.


Then I'd suggest a visit to the eye doctor to have those peepers
checked. If that shows nothing make an appointment with a neurologist.
Because this problem is definitely in your head. It's just a matter of
where.

Seriously, telling people they should be seeing something isn't a very
dependable measure. Any first-year law student learns that the hard way. You
claim this is an endemic problem that is apparent at a mere glance, yet there
is no widespread complaint. That says everything to me.
 

Julian Cassin

Then I'd suggest a visit to the eye doctor to have those peepers
checked. If that shows nothing make an appointment with a neurologist.
Because this problem is definitely in your head. It's just a matter of
where.

Seriously, telling people they should be seeing something isn't a very
dependable measure. Any first-year law student learns that the hard way. You
claim this is an endemic problem that is apparent at a mere glance, yet there
is no widespread complaint. That says everything to me.

Yeah right, same as everyone else's head at our network sessions.

It's not like you are just plain blind to see it.
 

Eric Pobirs

Julian Cassin said:
Yeah right, same as everyone else's head at our network sessions.

It's not like you are just plain blind to see it.

Have I ever seen tearing in a polygonal display? Sure.

Has it been notably more pronounced in DirectX apps? Nope.

Tearing is an inherent problem in 3D rendering systems. Avoiding it is a
matter for developers when creating applications and gauging performance of
the target platform. Since the PC is an extremely variable platform, even
for machines of similar specs, it is nightmarish to try to assure perfect
rendering while also trying to produce a game that appears better than what
came before.

Unless you set your sights very low in the visual aspects of the game
there are going to be numerous PCs that fail to display the scene without
flaws like tearing. This is true regardless of the software layers lying
between the game and the hardware, be it OpenGL, DirectX, a custom API, etc.
I've never seen a real-time rendering system that did not produce some flaws
when under a major load, up to and including systems that had eight digit
price tags in their day. There are certain elements that are unpredictable
when dealing with heavy loads in real-time interaction, because the system has
to make quick compromises to maintain framerate, and those compromises differ
with player input. This is not a factor when you're rendering a movie and can
spend a month making sure a single minute of screen time is perfect.

If you're convinced this is purely a DirectX issue you're in a very
small minority.
 

Julian Cassin

Tearing is an inherent problem in 3D rendering systems. Avoiding it is a
matter for developers when creating applications and gauging performance of
the target platform. Since the PC is an extremely variable platform, even
for machines of similar specs, it is nightmarish to try to assure perfect
rendering while also trying to produce a game that appears better than what
came before.

Unless you set your sights very low in the visual aspects of the game
there are going to be numerous PCs that fail to display the scene without
flaws like tearing. This is true regardless of the software layers lying
between the game and the hardware, be it OpenGL, DirectX, a custom API, etc.
I've never seen a real-time rendering system that did not produce some flaws
when under a major load, up to and including systems that had eight digit
price tags in their day. There are certain elements that are unpredictable
when dealing with heavy loads in real-time interaction, because the system has
to make quick compromises to maintain framerate, and those compromises differ
with player input. This is not a factor when you're rendering a movie and can
spend a month making sure a single minute of screen time is perfect.

If you're convinced this is purely a DirectX issue you're in a very
small minority.

I never said the cause was due to DirectX, but DirectX doesn't do anything
to get rid of the problem, does it?

As I said, even my 8-bit computers have no tearing. Tearing is unbearable.

Julian
 

Eric Pobirs

Julian Cassin said:

I never said the cause was due to DirectX, but DirectX doesn't do anything
to get rid of the problem, does it?

As I said, even my 8-bit computers have no tearing. Tearing is unbearable.

Julian

Your original post certainly sounded to me as if you specifically blamed
DirectX.

How many filled polygon games did you ever see on an 8-bit system? My
recollection of the systems offers an extremely short list. WayOut... and
that other one...
 

Julian Cassin

Eric Pobirs said:

Your original post certainly sounded to me as if you specifically blamed
DirectX.

How many filled polygon games did you ever see on an 8-bit system? My
recollection of the systems offers an extremely short list. WayOut... and
that other one...

The problem now with the latest DirectX is that DirectDraw has been removed,
so to make 2D games you have to use polygons to simulate 2D graphics.
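
That said, the DirectX 9 D3DX library does ship a sprite helper that wraps the
quad setup; a rough sketch follows, with the caveat that the exact ID3DXSprite
method signatures changed between SDK updates, so treat it as illustrative
rather than definitive:

#include <d3d9.h>
#include <d3dx9.h>

// Draw a whole texture at a fixed screen position using the D3DX sprite helper.
// Assumes the caller is inside BeginScene/EndScene; error handling omitted.
void DrawWithD3DXSprite(IDirect3DDevice9* device, IDirect3DTexture9* texture)
{
    ID3DXSprite* sprite = NULL;
    D3DXCreateSprite(device, &sprite);

    D3DXVECTOR3 position(100.0f, 100.0f, 0.0f);   // arbitrary example position

    sprite->Begin(D3DXSPRITE_ALPHABLEND);
    sprite->Draw(texture, NULL, NULL, &position, 0xFFFFFFFF);  // whole texture, no tint
    sprite->End();

    sprite->Release();
}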

Also, there are many, many polygon games on 8-bit systems - ever played
Elite, Starion, Mercenary, or Cholo, to name a few?

One other thing: polygons don't seem to be the reason - didn't you notice
that Windows itself suffers from tearing when you drag windows around?

Or are you going to claim once more that the problem is in my mind, like
before, until you suddenly admit to the problem?


Julian
 

Eric Pobirs

Julian Cassin said:
The problem now with the latest DirectX is that DirectDraw has been removed,
so to make 2D games you have to use polygons to simulate 2D graphics.

Also, there are many, many polygon games on 8-bit systems - ever played
Elite, Starion, Mercenary, or Cholo, to name a few?

First of all, DirectDraw games still work. The API functionality is
still there in a backward compatible form just like many other portions of
Windows that still exist solely for supporting older software.

DirectDraw is not considered part of the current generation of the API
because it was decided by all concerned, both at Microsoft and at the game
developers who have considerable influence on how DirectX evolves, that the
methodology behind DirectDraw was not in tune with where the hardware was
going nor with how developers wanted to address that hardware.
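
For what it's worth, the legacy interfaces can still be created on a machine
with DirectX 9 installed; a minimal sketch (error handling omitted):

#define INITGUID
#include <windows.h>
#include <ddraw.h>

// Ask for the old IDirectDraw7 interface; it is still serviced for
// backward compatibility even though the current SDK no longer promotes it.
IDirectDraw7* CreateLegacyDirectDraw()
{
    IDirectDraw7* dd = NULL;
    DirectDrawCreateEx(NULL, (void**)&dd, IID_IDirectDraw7, NULL);
    return dd;  // NULL if DirectDraw could not be created
}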

Secondly, I said 'filled polygons.' All of the games you list are
wireframe exercises. Filled polygons were very rare on 8-bit systems due
both to limited color palettes and processing power. WayOut was a rarity in
that it used filled polygons in a very simple fashion to allow real-time
movement through a 3D maze. Think Wolfenstein 3D without any enemies except
a strong wind that prevented movement across certain areas.

Filled polygons didn't really become commonplace on home consumer
hardware until the 16-bit generation, most notably with Starglider II by the
same team who went on to produce the SNES Star Fox game for Nintendo.

If you're trying to suggest that the displays of 8-bit systems were
without flaw you'll have to prepare to be answered with laughter. I spent a
lot of time in game testing back when 8-bit systems were still a market. A
lot of time was spent isolating and minimizing all sorts of display issues.
Usually the problems had to do with the video hardware for scrolling and
sprites. While tearing as seen on PCs wasn't as common, there were instead
constant issues of flicker, both due to managing activity during the VBI and
attempting to get more sprites on screen than the hardware was intended to
support. The problems set in when you needed more sprites on a single
horizontal portion of the screen. The same sprite could be made to appear
multiple times through the vertical scan and you just had to be careful to
track which occurrence of that sprite was to be acted on when a collision
occurred.
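
A schematic sketch of that bookkeeping, in modern C++ rather than period 6502
code (all names invented): the point is simply recording which game object each
reuse of a hardware sprite slot represents, so a collision report can be traced
back to the right object.

#include <vector>

// One placement of a (reused) hardware sprite during the current frame.
struct Occurrence
{
    int spriteSlot;   // which hardware sprite was reused
    int screenY;      // vertical position it was given this frame
    int objectId;     // the game object it stands for at that position
};

std::vector<Occurrence> occurrences;   // rebuilt every frame, top to bottom

// Called as the frame is built: position the slot for another object lower
// down the screen and remember which object it now represents.
void PlaceSprite(int spriteSlot, int screenY, int objectId)
{
    Occurrence o;
    o.spriteSlot = spriteSlot;
    o.screenY    = screenY;
    o.objectId   = objectId;
    occurrences.push_back(o);
    // ...on a real 8-bit machine the sprite registers would be poked here...
}

// When the hardware reports "sprite N collided near scanline Y", find the
// occurrence of slot N closest to that scanline to identify the object hit.
int ResolveCollision(int spriteSlot, int scanline)
{
    int best = -1;
    int bestDist = 1 << 30;
    for (size_t i = 0; i < occurrences.size(); ++i)
    {
        const Occurrence& o = occurrences[i];
        if (o.spriteSlot != spriteSlot)
            continue;
        int dist = scanline > o.screenY ? scanline - o.screenY : o.screenY - scanline;
        if (dist < bestDist) { bestDist = dist; best = o.objectId; }
    }
    return best;   // -1 if that slot was not used this frame
}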

Which isn't to say you couldn't get tearing on 8-bit machines. If your email
address is an indicator you probably didn't see much of the Apple ][ system
but they had a major game market here. (You may have encountered some
popular Apple ports that treated machines like the C-64 as Apples, ignoring
most of the hardware advantages.) These systems had essentially no hardware
assistance for video functions. It all had to be done by hand. Any game that
needed to do a lot of fast scrolling, like the popular Choplifter, would
show a great deal of tearing in the display. It was simply beyond the means
of a sub-2 MHz 6502 to update the display quickly enough to avoid the
problem. We Atari 800 fans were quite annoyed at the number of games that
failed to take advantage of the chipset and just treated it as an Apple.

This problem prevailed also in the PC world. Video hardware acceleration
really didn't come into its own on the PC until a unified API became
standard. A 16 MHz 286 with a 256K VGA card could replicate the Apple ][
version of Choplifter (which was essentially black & white with artifacted
color) with a much smoother display but by then the audience would expect
much better graphics. Inevitably the improved graphics would be enough to
swamp the system and the same kinds of display flaws were back in evidence.

Much the same could be seen in games produced for both the Amiga and
Atari ST. The Amiga had hardware scrolling, hardware sprites and hardware
support for more complex graphic objects called MOBS in Amiga-ese, as well
as a bunch of other really useful things from a game developer's
perspective. The Atari lacked all this, being more price-conscious. (A
blitter chip was added to later models but poorly supported by game
developers who didn't want to lose the installed base of older ST models.)
Games written first for the Amiga and ported to the Atari often lost quite a
bit in the process, not only due to the lesser color range and depth on the
Atari but also due to the greater difficulty doing all the display
manipulation in software. And of course, games ported quickly from the ST to
Amiga lacked the full splendor of native Amiga games.

That didn't stop them from being good games on their own merits but
people like to see their choice of system favored. The moral is that no
matter how much power you add to your video platform game developers will
soon operate at its limits.

Plus there is a further issue of desktop PCs I'll get to below.

One other thing: polygons don't seem to be the reason - didn't you notice
that Windows itself suffers from tearing when you drag windows around?

Or are you going to claim once more that the problem is in my mind, like
before, until you suddenly admit to the problem?

No, I'm just going to suggest that you're desperately ignorant of the
way the Win32 APIs work. That, and you need to be more clear about the
nature of your complaint.

The routines used for the GDI are designed to work independently of
hardware acceleration and are completely separate from the DirectX suite.
They'll use it if available but otherwise they get by on the most minimal of
systems for the generation in question. The level of hardware acceleration
used by Windows is an easily accessed control panel setting. Perhaps you
should check yours.

On this fairly ancient machine in front of me (dual Celeron 533, Voodoo
3 3000 16 MB, 256 MB RAM, Win2K) I can grab the window containing this
message in progress and move it around while having the text remain completely
readable. The edges have a bit of redraw ugliness but only if I move the
window so quickly its contents are no longer legible. It was just a few
years earlier that this was a big deal for any windowing system and many
would not even attempt to preserve the display until the system thought you
were done moving things around. At one point they added a control panel
checkbox to Windows to allow user control. Older machines with weaker CPU
and/or video hardware could slow to a crawl attempting to keep up with the
redraw task. On such a machine it was better to simply allow the window to
remain blank during movement since it would be unusual to be manipulating
any data while moving the window containing it.

Compare this window, an object created with little hardware
acceleration, to any hardware sprite on an 8-bit system. In terms of data
volume the window is orders of magnitude greater and has few of the
constraints, such as size or color depth, typical of hardware sprites. The
simple text I'm looking at now could just as easily be a Photoshop image
measured in tens of megabytes. That window would bring this system to its
knees just as moving a Word window could on a typical Win95-generation machine.

Now, if you want rock solid windows with full hardware acceleration on a
consumer system, check out the current generation Mac OSX systems. Due to
the close control they have of their hardware Apple was able to dictate that
all machines from a certain date forward would have a specified minimum of
video hardware functionality. (Essentially the DirectX 7 generation although
Apple would put it in terms of an OpenGL version.) This became part of the
minimum specs for OSX, specifically the Quartz rendering system. (OSX
running as a headless server doesn't care about video hardware, of course.)
Microsoft is doing this also in the next major release of Windows, currently
referred to as Longhorn. If you look around you can find some video clips
for demonstrations of the fully hardware accelerated GDI that integrates
with DirectX.

It looks great but the funny thing is that the windows are no longer
treated as a sort of hardware sprite. They're going to be polygons with the
window controls and contents painted on. The plan is to treat DirectX 7
class video chips as entry level without all the whiz-bangs and DirectX 9
class chips as the top of the heap, at least until somebody can think of a
must-have application for more advanced video hardware for desktop apps.

Now, you might wonder, why is this coming so long after the hardware has
become common? Because the hardware is less common than you think. Microsoft
needs to sell new versions of Windows to companies that have vast numbers of
systems and don't change them out that often. Those corporate desktops
typically have minimal video hardware compared to consumer systems. Recently
the embedded solutions in Intel chipsets have come to match the minimal
requirements for Longhorn, so the single most important customer finally
became ripe to run a fully hardware driven desktop OS.

Does this mean displays in games will become perfect? Nope. Just as a
window that could bring an older machine to a dead halt while being moved is
now trivial, game developers are going to find the limits of future PCs. The
desktop environment for Longhorn is fairly predictable and should come
nowhere near taxing a system released any time in the near future since
Longhorn will have to behave reasonably well on older machines.

Game developers are driven by a different set of motivations. While they
don't want to restrict their potential audience they also need to make use
of the newer hardware to make products more dazzling than their predecessors
already available in the bargain bins. Players will have the standard
options for lowering the game's power needs but most will try to get as much
as they can before the display becomes completely useless even if it means
putting up with some flaws like tearing.

The worst part is that this is pretty much impossible to avoid on a PC.
On a console developers can depend on unit #1 through unit #10,000,000
behaving exactly the same. This allows a great deal of fine tuning and is
why, even though it has much in common with a PC, you don't typically see
tearing on an Xbox game from a company with a good QA operation.

The same cannot be said for a PC. You can have two machines of almost
exactly the same specs but one has an Intel chipset and the other a VIA
chipset. These machines will produce slightly different behavior when you
push them to any extent. This can be largely invisible except on benchmarks
and that area where humans are very sensitive, visual pattern recognition.

It isn't just on IBM descended PCs. We ran into big problems way back
when the Amiga 500 and 2000 came out. For most purposes a first generation
512K 500/2000 was the same platform as the earlier 1000 but there were little
differences that caused timing nightmares. On one game with a tactical map
display (Lords of the Rising Sun) the update moved at only about 25% of the
1000's speed on a 500/2000 system. It became necessary to test for which
type of Amiga the game was loading on and adjust accordingly. (I'm not sure
if this was ever implemented because I left the company before the game
shipped and QA testing left me with no interest in playing it on my own
time.)

This can be avoided if you can convince developers and gamers to settle
for games that treat four-year-old PCs as the current pinnacle. Don't
program for 2004 PCs until 2008. Not very likely, you think?
 

Julian Cassin

First of all, DirectDraw games still work. The API functionality is
still there in a backward compatible form just like many other portions of
Windows that still exist solely for supporting older software.

DirectDraw is not considered part of the current generation of the API
because it was decided by all concerned, both at Microsoft and at the game
developers who have considerable influence on how DirectX evolves, that the
methodology behind DirectDraw was not in tune with where the hardware was
going nor with how developers wanted to address that hardware.

Secondly, I said 'filled polygons.' All of the games you list are
wireframe exercises.

Have you ever played StarStrike 2? Elite? (Yes, some versions have *FILLED*
polygons.) I am sure I can dig up more titles if I tried...

Filled polygons were very rare on 8-bit systems due both to limited color
palettes and processing power. WayOut was a rarity in that it used filled
polygons in a very simple fashion to allow real-time movement through a 3D
maze. Think Wolfenstein 3D without any enemies except a strong wind that
prevented movement across certain areas.

Yes, more rare than vector graphics games, but not non-existent.
Filled polygons didn't really become commonplace on home consumer
hardware until the 16-bit generation, most notably with Starglider II by the
same team who went on to produce the SNES Star Fox game for Nintendo.

Yep, and neither have any tearing.

If you're trying to suggest that the displays of 8-bit systems were
without flaw you'll have to prepare to be answered with laughter. I spent a
lot of time in game testing back when 8-bit systems were still a market. A
lot of time was spent isolating and minimizing all sorts of display
issues.

Of course they have flaws, often of frame rate, but still no tearing.

Usually the problems had to do with the video hardware for scrolling and
sprites. While tearing as seen on PCs wasn't as common, there were instead
constant issues of flicker, both due to managing activity during the VBI and
attempting to get more sprites on screen than the hardware was intended to
support. The problems set in when you needed more sprites on a single
horizontal portion of the screen. The same sprite could be made to appear
multiple times through the vertical scan and you just had to be careful to
track which occurrence of that sprite was to be acted on when a collision
occurred.

Which isn't to say you couldn't get tearing on 8-bit machines. If your email
address is an indicator you probably didn't see much of the Apple ][ system
but they had a major game market here. (You may have encountered some
popular Apple ports that treated machines like the C-64 as Apples, ignoring
most of the hardware advantages.) These systems had essentially no hardware
assistance for video functions. It all had to be done by hand. Any game that
needed to do a lot of fast scrolling, like the popular Choplifter, would
show a great deal of tearing in the display. It was simply beyond the means
of a sub-2 MHz 6502 to update the display quickly enough to avoid the
problem. We Atari 800 fans were quite annoyed at the number of games that
failed to take advantage of the chipset and just treated it as an Apple.

No, I haven't seen any games on an Apple ][ system, but I did have a C64, an
Amstrad CPC (no hardware sprites and still no tearing, although sometimes
jerkiness), a ZX Spectrum (attribute clash, but no tearing), a PC Engine
(absolutely no tearing), NES, SMS, SC3000... all good.

I guess they didn't (or weren't able to for a technical reason) synchronise
the display with the frame flyback signal?

This problem prevailed also in the PC world. Video hardware acceleration
really didn't come into its own on the PC until a unified API became
standard. A 16 MHz 286 with a 256K VGA card could replicate the Apple ][
version of Choplifter (which was essentially black & white with artifacted
color) with a much smoother display but by then the audience would expect
much better graphics. Inevitably the improved graphics would be enough to
swamp the system and the same kinds of display flaws were back in evidence.

Much the same could be seen in games produced for both the Amiga and
Atari ST. The Amiga had hardware scrolling, hardware sprites and hardware
support for more complex graphic objects called MOBS in Amiga-ese, as well
as a bunch of other really useful things from a game developer's
perspective. The Atari lacked all this, being more price conscious. (A
blitter chip was added to later models but poorly supported by game
developers who didn't want to lose the installed base of older ST models.)

And even the Atari ST doesn't have tearing (although they do sometimes have
jerky sprites and jerky scrolling).

Games written first for the Amiga and ported to the Atari often lost quite a
bit in the process, not only due to the lesser color range and depth on the
Atari but also due to the greater difficulty doing all the display
manipulation in software. And of course, games ported quickly from the ST to
Amiga lacked the full splendor of native Amiga games.

That didn't stop them from being good games on their own merits but
people like to see their choice of system favored. The moral is that no
matter how much power you add to your video platform game developers will
soon operate at its limits.

Plus there is a further issue of desktop PCs I'll get to below.


No, I'm just going to suggest that you're desperately ignorant of the
way the Win32 APIs work. That, and you need to be more clear about the
nature of your complaint.

The routines used for the GDI are designed to work independently of
hardware acceleration and are completely separate from the DirectX suite.
They'll use it if available but otherwise they get by on the most minimal of
systems for the generation in question. The level of hardware acceleration
used by Windows is an easily accessed control panel setting. Perhaps you
should check yours.

Of course it says Full Hardware Acceleration. I can see that sometimes a
game might need to make trade-offs because different systems may be set up
for different screen refresh rates - the higher the ratio between the refresh
rate and the CPU speed, the less time a game has to prepare the next frame -
but that is only if they want to spend lots of time making screenshots that
would look nice if they were static; when they move, they tear. Why couldn't
they make the games in question perhaps less colourful? (Yes, some games have
options such as turning down texture resolution, and double/triple buffering
sometimes helps a bit.) But they seem unable to get around the problem fully.
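
For what it's worth, in DirectX 9 terms the two knobs being talked about here
are the back buffer count and the presentation interval; a minimal sketch
(values invented):

#include <d3d9.h>

// "Triple buffering" plus vsync: two back buffers let the game keep rendering
// while a finished frame waits for the vertical retrace to be shown.
void ConfigureBuffering(D3DPRESENT_PARAMETERS* pp)
{
    pp->BackBufferCount      = 2;                         // two back buffers + front buffer
    pp->SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp->PresentationInterval = D3DPRESENT_INTERVAL_ONE;   // only swap on the retrace
}
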
On this fairly ancient machine in front of me (dual Celeron 533, Voodoo
3 3000 16 MB, 256 MB RAM, Win2K) I can grab the window containing this
message in progress and move it around while having the text remain completely
readable. The edges have a bit of redraw ugliness but only if I move the
window so quickly its contents are no longer legible. It was just a few
years earlier that this was a big deal for any windowing system and many
would not even attempt to preserve the display until the system thought you
were done moving things around. At one point they added a control panel
checkbox to Windows to allow user control. Older machines with weaker CPU
and/or video hardware could slow to a crawl attempting to keep up with the
redraw task. On such a machine it was better to simply allow the window to
remain blank during movement since it would be unusual to be manipulating
any data while moving the window containing it.

Compare this window, an object created with little hardware
acceleration, to any hardware sprite on an 8-bit system. In terms of data
volume the window is orders of magnitude greater and has few of the
constraints, such as size or color depth, typical of hardware sprites. The
simple text I'm looking at now could just as easily be a Photoshop image
measured in tens of megabytes. That window would bring this system to its
knees just as moving a Word window could on a typical Win95-generation machine.

But they are trying to move more than the machine can obviously handle
without glitches.

Now, if you want rock solid windows with full hardware acceleration on a
consumer system, check out the current generation Mac OSX systems. Due to
the close control they have of their hardware Apple was able to dictate that
all machines from a certain date forward would have a specified minimum of
video hardware functionality. (Essentially the DirectX 7 generation although
Apple would put it in terms of an OpenGL version.) This became part of the
minimum specs for OSX, specifically the Quartz rendering system. (OSX
running as a headless server doesn't care about video hardware, of course.)
Microsoft is doing this also in the next major release of Windows, currently
referred to as Longhorn. If you look around you can find some video clips
for demonstrations of the fully hardware accelerated GDI that integrates
with DirectX.

It looks great but the funny thing is that the windows are no longer
treated as a sort of hardware sprite. They're going to be polygons with the
window controls and contents painted on. The plan is to treat DirectX 7
class video chips as entry level without all the whiz-bangs and DirectX 9
class chips as the top of the heap, at least until somebody can think of a
must-have application for more advanced video hardware for desktop apps.

Now, you might wonder, why is this coming so long after the hardware has
become common? Because the hardware is less common than you think. Microsoft
needs to sell new versions of Windows to companies that have vast numbers of
systems and don't change them out that often. Those corporate desktops
typically have minimal video hardware compared to consumer systems. Recently
the embedded solutions in Intel chipsets have come to match the minimal
requirements for Longhorn, so the single most important customer finally
became ripe to run a fully hardware driven desktop OS.

Maybe for corporate, but for home systems they could have made it nicer from
the start, or at least as an option; there are so many less important options
(to me at least) that have made it into Windows long ago.

Does this mean displays in games will become perfect? Nope. Just as a
window that could bring an older machine to a dead halt while being moved is
now trivial, game developers are going to find the limits of future PCs. The
desktop environment for Longhorn is fairly predictable and should come
nowhere near taxing a system released any time in the near future since
Longhorn will have to behave reasonably well on older machines.

Game developers are driven by a different set of motivations. While they
don't want to restrict their potential audience they also need to make use
of the newer hardware to make products more dazzling than their predecessors
already available in the bargain bins. Players will have the standard
options for lowering the game's power needs but most will try to get as much
as they can before the display becomes completely useless even if it means
putting up with some flaws like tearing.

I classify tearing as an annoyance like jerky or flickery sprites, jerky
scrolling, etc. In all seriousness, I play Windows games a lot less often
than I would if the problem wasn't there. Maybe it is just me and a couple of
other friends (not all of them care), but it definitely isn't a pleasant
sight. Of course, graphics don't make a good game.

The worst part is that this is pretty much impossible to avoid on a PC.
On a console developers can depend on unit #1 through unit #10,000,000
behaving exactly the same. This allows a great deal of fine tuning and is
why, even though it has much in common with a PC, you don't typically see
tearing on an Xbox game from a company with a good QA operation.

The same cannot be said for a PC. You can have two machines of almost
exactly the same specs but one has an Intel chipset and the other a VIA
chipset. These machines will produce slightly different behavior when you
push them to any extent. This can be largely invisible except on benchmarks
and that area where humans are very sensitive, visual pattern recognition.

It isn't just on IBM descended PCs. We ran into big problems way back
when the Amiga 500 and 2000 came out. For most purposes a first generation
512K 500/2000 was the same platform as the earlier 1000 but there were little
differences that caused timing nightmares. On one game with a tactical map
display (Lords of the Rising Sun) the update moved at only about 25% of the
1000's speed on a 500/2000 system. It became necessary to test for which
type of Amiga the game was loading on and adjust accordingly. (I'm not sure
if this was ever implemented because I left the company before the game
shipped and QA testing left me with no interest in playing it on my own
time.)

This can be avoided if you can convince developers and gamers to settle
for games that treat four year old PCs as the current pinnacle. Don't
program for 2004 PCs until 2008. Not very likely, you think?

But even the most up-to-date, currently fastest consumer hardware has the
same problem; it has nothing to do with it being 4 years old. Of course my
system is about 4 years old, but the problem isn't only on my system. I have
yet to see a system without the problem.

You may consider what I have discussed unimportant, but if you remember the
days of the ZX Spectrum, for every game reviewed or released, attribute clash
was an important point of discussion. As for technical limitations, depending
on the amount of colour required and the type of game, it was in some
instances impossible to avoid that problem totally, but PC hardware is many,
many, many times more advanced than a ZX Spectrum, and yet we get a different
type of graphical glitch that seems to be as common on PC hardware as the
attribute problem is on a ZX Spectrum. Surely if Microsoft wants people to
develop good, glitch-free games for Windows (as long as the developers are
actually capable) they would attempt to solve the problem in their desired
game API - which currently is DirectX.

Regards, Julian
 
