Is there a technical reason why there's no DirectX 10 & 11 for XP?

  • Thread starter: Doc

Doc

Is there a genuine technical reason or is it just Micro$oft pushing you to their next offering?
 
Doc said:
Is there a genuine technical reason or is it just Micro$oft pushing you to their next offering?

There are some differences, but you could argue they were introduced
for business reasons rather than for technical ones. There is no
particular reason to keep compositing windows while your favorite
3D game owns the whole screen, but that's how they made it work.
They could have just paged out the desktop when a user wanted to
game.

http://en.wikipedia.org/wiki/DirectX

"Direct3D 9Ex, Direct3D 10, and Direct3D 11 are only available
for Windows Vista and Windows 7 because each of these new versions
was built to depend upon the new Windows Display Driver Model
that was introduced for Windows Vista.

The new Vista/WDDM graphics architecture includes a new
video memory manager supporting virtualization of graphics hardware
for various applications and services like the Desktop Window Manager."

http://en.wikipedia.org/wiki/Desktop_Window_Manager

"Architecture

The Desktop Window Manager is a compositing window manager. This means
that each program has a buffer that it writes data to; DWM then
composites each program's buffer into a final image. By comparison,
the stacking window manager in Windows XP and earlier (and also Windows
Vista and Windows 7 with Windows Aero disabled) comprises a
single display buffer to which all programs write."
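
To make that contrast concrete, here's a toy C++ sketch of the two
models. The 4-character "screens" and the half-covering window are
just invented simplifications, nothing DirectX-specific:

#include <array>
#include <cstdio>

// Toy 4-pixel "screens", just to contrast the two models.
using Buffer = std::array<char, 4>;

// Compositing (DWM/Aero): each window renders into its OWN buffer...
void render_window(Buffer& buf, char glyph) { buf.fill(glyph); }

// ...and the window manager combines the buffers into the final
// image. Here the top window is assumed to cover the right half.
Buffer composite(const Buffer& below, const Buffer& above) {
    Buffer out = below;
    out[2] = above[2];
    out[3] = above[3];
    return out;
}

int main() {
    // Compositing model: per-window buffers, combined at the end.
    // The desktop's covered pixels still exist in its own buffer.
    Buffer desktop{}, game{};
    render_window(desktop, 'D');
    render_window(game, 'G');
    Buffer screen = composite(desktop, game);
    std::printf("composited: %.4s\n", screen.data());  // DDGG

    // Stacking model (XP, or Vista/7 with Aero off): all programs
    // draw into ONE shared buffer, so the desktop pixels under the
    // game no longer exist anywhere.
    Buffer shared{};
    render_window(shared, 'D');    // desktop paints everything
    shared[2] = shared[3] = 'G';   // game overwrites its region
    std::printf("stacked:    %.4s\n", shared.data());  // DDGG, no backing copy of 'D'
}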

I think what that means is that video card memory is used to hold
each program's window, whereas in the earlier graphics
implementations each program window is held in system memory, and
video card memory only holds the final (frame buffer) image. And
as far as I know, when you play a 3D game on a DWM-enabled system,
the desktop composition is still held in video card memory while
the game plays, even though you can't see the desktop at the time.
That probably ties up less than 128MB of video memory. (So if
you're gaming with a 512MB card on a DX10 system, some of that
memory is being "wasted".)
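
If you're curious what your card actually reports, here's a minimal
Windows-only C++ sketch using DXGI, the Vista-era interface that
sits underneath D3D10/11. It only shows capacity, not the DWM's
live usage, so the 128MB figure above is still just a guess:

// Enumerate adapters via DXGI and print the memory each reports.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        // DedicatedVideoMemory is the pool the WDDM memory manager
        // virtualizes among the DWM, your game, and everything else.
        std::wprintf(L"%s\n  dedicated VRAM: %llu MB\n  shared system:  %llu MB\n",
                     desc.Description,
                     (unsigned long long)(desc.DedicatedVideoMemory >> 20),
                     (unsigned long long)(desc.SharedSystemMemory >> 20));
        adapter->Release();
    }
    factory->Release();
    return 0;
}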

On a DirectX 9 system, when a game starts to play, I think the
game basically "owns a display channel". In the DirectX
10/WDDM/DWM world, the access is no longer exclusive, and the game
actually shares resources with other things on the computer. Your
desktop display is "alive"; you just can't see it. This is a good
reason to have a bit more video memory on the video card, and at
the same time, to switch to 64-bit operation so you don't run out
of address space while doing so.
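
For contrast, this is roughly how a DirectX 9 game claims the
screen. Setting Windowed = FALSE in the present parameters requests
exclusive fullscreen mode. It's only a fragment (you'd still need
to create the window and run a message loop, and cleanup and error
handling are trimmed), and the resolution and format picked here
are arbitrary:

#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

IDirect3DDevice9* create_exclusive_device(HWND hwnd) {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed         = FALSE;   // exclusive mode: the game "owns the display channel"
    pp.BackBufferWidth  = 1024;
    pp.BackBufferHeight = 768;
    pp.BackBufferFormat = D3DFMT_X8R8G8B8;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;

    IDirect3DDevice9* dev = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &dev);
    // Under WDDM/DWM, even a "fullscreen" D3D10+ app shares the GPU
    // with the compositor instead of getting it exclusively.
    return dev;
}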

I'm not convinced they had to do it that way. They could have
retrofitted the new display ideas onto the old architecture if
they had wanted to. The video card is flexible enough from a
hardware standpoint that you could have done things either way.

Paul
 
From: "Doc said:
Is there a genuine technical reason or is it just Micro$oft pushing you to their next
offering?

The latter. It is their way of forcing an OS EoL.
 