Ziff-Davis Benchmarking Operation


Ted Jones

Anyone aware of any decent, free video benchmarking utilities
available since the demise of the Ziff-Davis Benchmarking Operation
[ZDBop] ?





======

======




Quick clicks about 3D WinBench 2000
3D WinBench 2000 measures a PC's 3D subsystem performance using the
Direct3D interface. The purpose of 3D WinBench 2000 is to give you a
way to test hardware graphics adapters, drivers, and the value of such
enhancing technologies as MMX(TM), 3DNow!(TM), Streaming SIMD
Extensions (SSE), and hardware accelerated transformation and
lighting.
Here's the least you need to know to get the most out of 3D WinBench
2000:
• 3D WinBench 2000 lets you test hardware graphics adapters, drivers,
and the value of such enhancing technologies as SSE and 3DNow!. It
focuses on the 3D operations that you're likely to find in present and
future games.
• You must have DirectX(TM) 7 (or later) for 3D WinBench 2000 to
operate. DirectX 7 operates on Windows 98/98 Second Edition or Windows
2000.
• 3D WinBench 2000 provides two main suites of tests as well as other
specialty suites you can use when checking specific 3D performance
issues. In addition, 3D WinBench 2000 provides a number of individual
tests you can run. (For a complete list of the 3D WinBench 2000 tests,
see the Select Tests dialog box.)
• Different tests return different results to help you learn about
your system's performance. The 3D WinMark 2000 suite returns an
overall result that summarizes the computer's performance in frames
per second. The bigger the score, the better the performance.
• You can also run individual 3D WinMark tests or additional scene
tests not included in the suite to return a frames per second score.
In addition, 3D WinBench 2000 Version 1.1 includes some new features
over Version 1.0.
Related topics
Latest news as of 8/8/2000
Requirements
Running a test
The main 3D WinBench 2000 tests
License agreement
Trademarks and copyrights
Acknowledgments

LICENSE AGREEMENT FOR ETESTING LABS' 3D WINBENCH™ VERSION 1.1
READ THIS AGREEMENT CAREFULLY BEFORE USING THE SOFTWARE EMBODIED IN
THE 3D WINBENCH™ 2000 VERSION 1.1 CD-ROM OR, IF PRELOADED ON YOUR HARD
DISK, DOWNLOADED OR IF PROVIDED AS PART OF A COLLECTION, THE
PRELOADED, DOWNLOADED OR COLLECTED FILE(S) (the "Media"). Embodied in
the 3D WinBench™ 2000 Media is the 3D WinBench™ 2000 computer program
and related documentation (the "Software"). eTesting Labs Inc., having
a place of business at 1001 Aviation Parkway, Morrisville NC 27560, is
the Licensor under this Agreement and you are the licensee. By using
the Software, in whole or in part, you agree to be bound by the terms
of this Agreement. If you do not agree to the terms of this Agreement,
promptly return the Software to eTesting Labs at the above address
(or, if downloaded or preloaded on your hard disk, delete the
Software, or if provided as part of a collection, cease use of the
Software). Title to the Software and all copyrights, trade secrets and
other proprietary rights therein are owned by eTesting Labs Inc. All
rights therein, except those expressly granted to you in this
Agreement, are reserved by eTesting Labs Inc.
1. Limited License
This Agreement grants you only limited rights to use the Software.
eTesting Labs Inc. grants you a non-exclusive, non-transferable, non-
assignable license to use the Software on a single dedicated computer
or on a file server networked with multiple PC computers for the sole
purpose of conducting benchmark tests to measure the performance of
computer hardware and operating system configurations. The Software is
protected by copyright laws and international treaty provisions.
Therefore, you must treat the Software like any other copyrighted
material and, other than installing the Software pursuant to the
license granted above, you are prohibited from copying the Software.
If you install the Software, you shall keep any original CD-ROM solely
for backup or archival purposes.
eTesting Labs Inc. hereby grants you the right to publish, except in
any country or state where a third party claims during the term of
this license that such publication infringes that party's proprietary
rights, benchmark test results obtained by you from your use of the
Software, provided that with the publication of each such result you:
A. Identify eTesting Labs Inc., the name and version number of the
benchmark Software used (i.e., eTesting Labs Inc.'s 3D WinBench™ 2000
Version 1.1);
B. State that the test was performed without independent verification
by eTesting Labs Inc. and that eTesting Labs Inc. makes no
representations or warranties as to the result of the test;
C. Follow proper trademark usage and acknowledge the trademark rights
of eTesting Labs Inc. and its affiliates (e.g., "[ ] achieved a 3D
WinMark™ 2000 score of X on 3D WinBench™ 2000 Version 1.1. 3D
WinBench™ and WinMark™ are trademarks of Ziff Davis Publishing
Holdings Inc., an affiliate of eTesting Labs Inc., in the U.S. and
other countries.");
D. Identify the specific 3D WinBench™ 2000 score(s) being reported
(i.e., eTesting Labs Inc.'s 3D WinBench™ 2000 Version 1.1 3D WinMark™
2000) and in all cases include the 3D WinMark 2000 overall score;
E. Identify the exact name, processor speed and type, number of
processors, amount of RAM, and amount of secondary RAM cache, if any,
of the PC used for the test (e.g., WXY Corp. Model 400 with dual 400
MHz Intel Pentium® II CPU, 128MB of RAM, and 512KB L2 RAM cache on
each processor);
F. Identify the exact graphics adapter (or graphics acceleration
technology, if no adapter) name, amount and type of RAM on it,
graphics driver name and version, desktop resolution and color depth,
and DirectX™ version that produced the result as well as any other
graphics acceleration subsystem or driver settings that could affect
the test results (e.g., XYZ Corp. XYZ Graphics adapter with 32MB SDRAM
with XYZ.DRV version 1.23 driver, 1024 by 768 pixels with 16-bit
color, and DirectX™ version 7.0);
G. Identify the settings in the 3D and Mode tabs of the 3D WinBench
2000 Test Settings dialog box used for the test (e.g., Direct3D™ HAL,
double buffer, flip, 1024 by 768 pixels, 32-bit color, 32-bit Z
buffer, refresh rate 75Hz, no anti-aliasing, full screen, v-sync off,
stereo off);
H. Identify the operating system version (e.g., Microsoft® Windows® 98
Second Edition) and the DirectX™ version that produced the result
(e.g., DirectX™ version 7.0);
I. State that all products used in the test were shipping versions
available to the general public; and
J. Supply the results of 3D WinBench™ 2000's 3D Quality suite.
Notwithstanding the foregoing, if, and only if, you wish to publish
the benchmark test results obtained by using the Software in
advertisements, you may do so, provided that you:
1. Identify eTesting Labs Inc., the name and version number of the
benchmark Software used (i.e., eTesting Labs Inc.'s 3D WinBench™ 2000
Version 1.1);
2. State that the test was performed without independent verification
by eTesting Labs Inc. and that eTesting Labs Inc. makes no
representations or warranties as to the result of the test;
3. Follow proper trademark usage and acknowledge the trademark rights
of eTesting Labs Inc. and its affiliates (e.g., "[ ] achieved a 3D
WinMark™ 2000 score of X on 3D WinBench™ 2000 Version 1.1. 3D
WinBench™ and WinMark™ are trademarks of Ziff Davis Publishing
Holdings Inc., an affiliate of eTesting Labs Inc., in the U.S. and
other countries.");
4. Identify the specific 3D WinBenchâ„¢ 2000 score(s) being reported
(i.e., eTesting Labs Inc.'s 3D WinBench™ 2000 Version 1.1 3D WinMark™
2000) and in all cases include the 3D WinMark 2000 overall score; and
5. Include a statement in the advertisement that a description of the
environment under which the test was performed is available upon
request and you shall provide a fax number, telephone number, e-mail
address, or URL on the World Wide Web where such information may be
obtained. Upon such request, you shall provide the information
required under paragraphs E through J above.
This Agreement and your rights hereunder shall automatically terminate
if you fail to comply with any provision of this Agreement. Upon such
termination, you shall cease all use of the Software, cease the
transfer of any copies of the Software, and cease the publication of
benchmark test results obtained by you from use of the Software.
Further, you shall delete the Software and destroy all tangible copies
of the Software and other materials related to the Software in your
possession or under your control; or, if downloaded or preloaded on
your hard disk or if provided as part of a collection, you shall cease
use of and destroy any and all copies of the Software in your
possession or under your control.
2. Additional Restrictions
A. You shall not (and shall not permit other persons or entities to)
rent, lease, sell, sublicense, assign, or otherwise transfer the
Software or this Agreement. Any attempt to do so shall be void and of
no effect.
B. You shall not (and shall not permit other persons or entities to)
reverse engineer, decompile, disassemble, merge, modify, include in
other software or translate the Software, or use the Software for any
commercial purposes, except for the publication of test results, as
provided above.
C. You shall not (and shall not permit other persons or entities to)
remove or obscure the copyright, trademark or other proprietary
notices or legends of eTesting Labs Inc. or its affiliates or
licensors, from any of the materials contained in this package or
downloaded.
D. You acknowledge that the Software contains eTesting Labs Inc.'s
trade secret information and you shall not disclose or disseminate
such information other than as provided herein.
3. Disclaimer of Warranty; Limitation of Liability
THE SOFTWARE AND THE MEDIA ARE PROVIDED "AS IS" WITHOUT WARRANTY OF
ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING, WITHOUT LIMITATION,
ANY WARRANTY OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE.
THE ENTIRE RISK AS TO THE RESULTS AND PERFORMANCE OF THE SOFTWARE AND
THE MEDIA IS ASSUMED BY YOU. ETESTING LABS INC. AND ITS AUTHORIZED
DISTRIBUTORS ASSUME NO RESPONSIBILITY FOR THE ACCURACY OR APPLICATION
OF OR ERRORS OR OMISSIONS IN THE SOFTWARE OR THE MEDIA. IN NO EVENT
SHALL ETESTING LABS INC. OR ITS AUTHORIZED DISTRIBUTORS BE LIABLE FOR
ANY DIRECT, INDIRECT, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES
ARISING OUT OF THE USE OR INABILITY TO USE THE SOFTWARE OR THE MEDIA,
EVEN IF ETESTING LABS INC. OR ITS AUTHORIZED DISTRIBUTORS HAVE BEEN
ADVISED OF THE LIKELIHOOD OF SUCH DAMAGES OCCURRING. ETESTING LABS
INC. AND ITS AUTHORIZED DISTRIBUTORS SHALL NOT BE LIABLE FOR ANY LOSS,
DAMAGES OR COSTS, ARISING OUT OF, BUT NOT LIMITED TO, LOST PROFITS OR
REVENUE, LOSS OF USE OF THE SOFTWARE OR THE MEDIA, LOSS OF DATA OR
EQUIPMENT, THE COSTS OF RECOVERING THE SOFTWARE, THE MEDIA, DATA OR
EQUIPMENT, THE COST OF SUBSTITUTE SOFTWARE, MEDIA, DATA OR EQUIPMENT
OR CLAIMS BY THIRD PARTIES, OR OTHER SIMILAR COSTS.
SOME STATES DO NOT ALLOW EXCLUSION OR LIMITATION OF IMPLIED WARRANTIES
OR LIMITATION OF LIABILITY FOR INCIDENTAL OR CONSEQUENTIAL DAMAGES; SO
THE ABOVE LIMITATIONS OR EXCLUSIONS MAY NOT APPLY TO YOU.
4. U.S. Government Restricted Rights
The Software is licensed subject to RESTRICTED RIGHTS. Use,
duplication or disclosure by the Government or any person or entity
acting on its behalf is subject to restrictions as set forth in
subdivision (c)(1)(ii) of the Rights in Technical Data and Computer
Software Clause at DFARS (48 CFR 252.227-7013) for DoD contracts, in
paragraphs (c)(1) and (2) of the Commercial Computer Software-
Restricted Rights clause in the FAR (48 CFR 52.227-19) for civilian
agencies, or in the case of NASA, in Clause 18-52.227-86(d) of the
NASA Supplement to the FAR, or in other comparable agency clauses.
5. General Provisions
Nothing in this Agreement constitutes a waiver of eTesting Labs Inc.'s
rights under U.S. copyright laws or any other Federal, state, local or
foreign law. You are responsible for installation, management, and
operation of the Software. However, if you have questions or problems
regarding the Software or Media, you can write to eTesting Labs, 1001
Aviation Parkway, Suite 400, Morrisville, NC 27560 Attn: Marcom
Assistant. This Agreement constitutes the entire agreement between the
parties with respect to the Software and the Media and supersedes all
prior or contemporaneous understandings or agreements, written or
oral, regarding such subject matter. This Agreement shall be governed
by and construed in accordance with the laws of the State of New York,
applicable to agreements made and performed in New York. If any
provision of this Agreement shall be held by a court of competent
jurisdiction to be illegal, invalid or unenforceable, the remaining
provisions shall remain in full force and effect and the unenforceable
provision shall be reformed without further action by the parties and
only to the extent necessary to make such provision valid and
enforceable and to achieve the like economic intent and effect of such
provision.
Trademarks
NetBench(R), ServerBench(R), WinBench(R), and Winstone(R) are
registered trademarks and 3D WinBench(TM), Audio WinBench(TM),
BatteryMark(TM), CD WinBench(TM), CPUmark(TM), JMark(TM),
WebBench(TM), WinMark(TM), ZDigit(TM), and ZDNet(TM) are trademarks of
Ziff Davis Media Inc.
3DNow!(TM) is a trademark of Advanced Micro Devices, Inc.
ARK8100(TM) and ARK8800(TM) are trademarks of ARK Logic, Inc.
Banshee(TM) and Voodoo Graphics(TM) are trademarks of 3dfx
Interactive, Inc.
Cirrus Logic(TM) and Laguna(TM) are trademarks of Cirrus Logic, Inc.
Intel(R) and Pentium(R) are registered trademarks and MMX(TM) is a
trademark of Intel Corporation.
Millennium G200(TM) and Millennium G400(TM) are trademarks of Matrox
Graphics, Inc.
Microsoft(R), Windows(R), and Windows NT(R) are registered trademarks
and DirectDraw(TM), Direct3D(TM), and DirectX(TM) are trademarks of
Microsoft Corporation.
PCX2(TM) and PowerVR(TM) are trademarks of NEC Electronics Inc.
Permedia(R)2 is a registered trademark of 3Dlabs Inc.
Rage 128(TM) is a trademark of ATI Technologies Inc.
RIVA TNT(TM), RIVA 128(TM), and RIVA 128ZX(TM) are trademarks of
NVIDIA Corporation.
Savage 4(TM) and ViRGE(TM) are trademarks of S3 Incorporated.
V2x00(TM) and Verite 2x00(TM) are trademarks of Rendition
Incorporated.
Copyrights
WinBench 99 Version 1.0 © 1993-1999, Winstone 99 Version 1.0 ©
1993-1998, 3D WinBench 99 Version 1.0 © 1997-1999, Audio WinBench
Version 1.0 © 1998, BatteryMark Version 2.0 © 1997, CD WinBench ©
1998-1999, Ziff Davis Media Inc. All rights reserved.
PSAPI.DLL © 1997-1996. Microsoft Corporation. All rights reserved.
Acknowledgments
Numerous people worked together to create 3D WinBench 2000 version
1.1. Members of the primary development team are:
Dave Morey, Chief Technologist
Jess Diard, Developer
Bo Wilson, Developer
Grady Ormond, Technical Specialist
L. Louise VanOsdol, Technical Writer
Thanks go to NVIDIA and Intel Corporation for providing 3D models
used in some of the 3D WinMark tests.
We appreciate the following vendors who generously loaned equipment
used during our benchmark testing period:
3dfx Interactive, Inc.
3Dlabs Inc.
Advanced Micro Devices, Inc.
Alliance Semiconductor
ARK Logic, Inc.
ATI Technologies Inc.
Chromatic Research, Inc.
Cirrus Logic, Inc.
Compaq Computer Corporation
Creative Labs Technology
Cyrix Corporation
Dell Computer Corporation
Diamond Multimedia Systems, Inc.
ELSA Inc.
Gateway 2000, Inc.
Hercules Computer Technology, Inc.
IBM Corporation
Intel Corporation
Leadtek Research Inc.
Matrox Graphics, Inc.
Metabyte, Inc.
Micro-Star International Co., Ltd.
NEC Electronics, Inc.
Number Nine
NVIDIA Corporation
OAK Technology
Orchid Technology
Premio Inc.
Quantex Microsystems Inc.
Quantum 3D
Real 3D, Inc
Rendition Incorporated
S3 Incorporated
Silicon Integrated Systems Corporation
Spider Graphics Inc.
STB Systems Inc.
Trident Microsystems, Inc.
VideoLogic Limited
Additional eTesting Labs staff who helped make 3D WinBench 2000
possible are:
Elizabeth Barnes
Joe Benardello
Bill Catchings
Lee Dorrier
Jeff Downey
Jo Drake
Jennie Faries
Tom Franz
Beth Geibig
Eric Hale
Laura Higgins
Peter Howard
Chadd Hudson
David Keim
Libby Keim
John Knaus
Scott Lane
James Lee
Kasey Lee
Chris Lemmons
Nadine Maloney
Gina Massel-Castater
Melissa Michael
Eric Ogburn
Barry Shelton
Lew Shiner
Gary Smith
Keith Turner
John Upchurch
Mark Van Name
Randi Vieberg
Allyn Vogel
James Ward
Laura H. White
Jennifer Wollin
Many people in different units of Ziff Davis Media contributed to the
design, testing, and production of 3D WinBench 2000. They are:
David Bardes
Lulit Bezuayehu
John Blackford
Steve Buehler
Lloyd Case
Michael Caton
Peter Coffee
Rich Fisco
Giorgio Gobbi
Laurence Grayson
Ibrahim Gul
Edward Henning
Bob Kane
Jim Louderback
Michael Miller
Mark Mitrani
Patrick Norton
Daniel Robinson
Jeff Sacilotto
Dave Salvator
Kai Schmerer
Christoph Scholze
Nick Stam
Kelvyn Taylor
Jae Yang
Chris M. Yates
Purpose of 3D WinBench 2000
3D WinBench 2000 is concerned mostly with the 3D that you might find
in games and less with the 3D you might find in CAD, VRML, or
presentation applications. 3D WinBench 2000 provides you with
objective performance scores and measures, such as how quickly an
adapter renders a scene. It also leaves space for subjective
impressions:
• Is the fogging effect smooth?
• Are objects rendered with correct perspective effects?
• Are textures mapped to objects accurately?
Game players find these considerations important. 3D WinBench 2000
helps you distinguish a superior product from its competitors.
3D applications and adapters are a rapidly emerging market, and very
few applications or adapters exploit all the possibilities and options
available within Microsoft's Direct3D specification. 3D WinBench 2000
aims to measure both the current and the future state of hardware 3D
accelerator performance.
The main 3D WinBench 2000 tests
3D WinBench 2000 contains numerous suites and tests that help you
evaluate your PC's 3D performance.
You can find tables that summarize many of the new features plus
various complexity measurements of the 3D WinMark tests via the 3D
WinBench Web site (http://www.3dwinbench.com).
NOTE: For a complete list of these tests, go to the Select Tests
dialog box.
Here are two key tests that you will want to run (in the order that
you need to run them):
1. 3D Quality suite contains 69 tests that measure different Direct3D
functions. You need to answer Yes or No after each test to indicate
whether the PC displayed the function correctly. (If you run these
tests in automated mode, 3D WinBench supplies the answers for you.)
These tests help both you and the benchmark determine the hardware's
capabilities.
The Quality tests fall into two groups: 3D Quality/WinMark, which
includes only those quality features required by the 3D WinMark suite,
and 3D Quality/Additional, which contains the remaining Quality tests.
You must run at least the 3D Quality/WinMark group before you can run
the 3D WinMark suite.
2. 3D WinMark suite contains scene tests that vary in both complexity--
the number of triangles they use to form their images--and the number
of quality-enhancing options (such as trilinear filtering and specular
highlights) they employ.
Each test requires a set of features. Use the Quality tests to see if
these features have been implemented correctly. If they're not, the
test won't run. Most of the time, if a test fails because a feature is
missing, 3D WinBench 2000 does not display an error message.
Each test flies through a scene using a predefined path and measures
the rendering speed in frames per second. This suite returns an
overall 3D WinMark result summarizing the computer's performance. The
3D WinMark result is the average of the individual test results. It is
reported as a frames/second value.
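As a rough illustration only (this is not the benchmark's actual code,
and it uses made-up numbers in place of the suite's nine scene tests),
the arithmetic behind a per-test frames/second score and the overall
average can be sketched in C++ like this:

    #include <cstdio>
    #include <numeric>
    #include <vector>

    int main() {
        // Hypothetical raw measurements: frames drawn and elapsed seconds
        // for each scene test (the values here are invented).
        struct SceneRun { const char* name; double frames; double seconds; };
        std::vector<SceneRun> runs = {
            {"Scene A", 2520.0, 60.0},
            {"Scene B", 3300.0, 60.0},
            {"Scene C", 2280.0, 60.0},
        };

        std::vector<double> fps;
        for (const SceneRun& r : runs) {
            double f = r.frames / r.seconds;   // frames per second for this test
            fps.push_back(f);
            std::printf("%s: %.1f fps\n", r.name, f);
        }

        // The overall result is the arithmetic mean of the per-test scores.
        double overall =
            std::accumulate(fps.begin(), fps.end(), 0.0) / fps.size();
        std::printf("Overall score: %.1f fps\n", overall);
        return 0;
    }
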
Running 3D WinBench 2000
Once you have installed 3D WinBench 2000, you can easily run its tests
and get results. The key test suite is the 3D WinMark. However, that
suite requires that you first run the 3D Quality suite.
This is an overview of a typical test scenario:
1. Start 3D WinBench 2000.
2. Run the 3D Quality suite first so 3D WinBench can determine the
hardware's Direct3D capabilities. You must run the 3D Quality suite
(or at least the 3D Quality/WinMark suite) before you can run the 3D
WinMark suite.
3. Run the 3D WinMark suite.
The detailed steps are:
• Step 1: Start 3D WinBench 2000
• Step 2: Select a test
• Step 3: Run the 3D Quality suite
• Step 4: Run the 3D WinMark
• Step 5: Save the results
• Step 6: Compare your results to other PCs'
• Step 7: Exit 3D WinBench 2000
To see all of the benchmark's test suites and individual tests, open
the Select Tests dialog box. In addition to running the suites that
come with 3D WinBench 2000, you can also create custom test suites
that run the tests you specify.
Related topics
Hardware and software requirements
Before you run a test
Test settings
Results

Hardware and software requirements
These are the requirements for 3D WinBench 2000:
• Microsoft Windows 98, Windows 98SE (Second Edition), Windows Me, or
Windows 2000 SP1 (or later).
3D WinBench 2000 will not run on Windows NT 4.0 because that version
of NT only supports DirectX 3.
To verify that you have Windows Me, Windows 98, or Windows 98 Second
Edition installed:
1. Right-click the My Computer icon on the desktop and select
Properties.
2. Click the General tab. The System information will state the
version of Windows 98 or Windows Me running on your computer. (For a
programmatic check, see the sketch after this requirements list.)
• A Pentium or higher processor.
• 128MB of RAM. (3D WinBench 2000 will run in less RAM, but it may
produce invalid results due to paging activity.)
NOTE: For more details on the memory requirements for 2D and 3D
applications, see Memory requirements for 3D applications.
• 81MB of free disk space for installation. An additional 10MB of free
disk space is required to run the Quality tests.
• A VGA graphics adapter supporting a resolution of at least 800 x 600
and at least 16-bit color. We recommend a resolution of 1024 x
768 at 75Hz refresh rate. We also recommend a frame buffer memory of
8MB or more, although 3D WinBench 2000 will run with 4MB.
NOTE: With certain adapters, you will need to perform some special
steps.
• DirectX 7 or greater (available from Microsoft at www.microsoft.com/directx).
NOTE: While you must install DirectX 7 for 3D WinBench 2000 to run,
it should work fine with most DirectX 6 display drivers.
You may want to take a look at some additional notes on settings.
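If you would rather script the version check described under the
operating system requirement above, a minimal Win32 C++ sketch using
GetVersionEx follows. The interpretation of the version numbers is the
standard Windows 9x/NT numbering and is our assumption; 3D WinBench
itself does not include such a tool.

    #include <windows.h>
    #include <cstdio>

    int main() {
        OSVERSIONINFO osvi;
        ZeroMemory(&osvi, sizeof(osvi));
        osvi.dwOSVersionInfoSize = sizeof(osvi);
        if (!GetVersionEx(&osvi)) {
            std::printf("GetVersionEx failed\n");
            return 1;
        }
        if (osvi.dwPlatformId == VER_PLATFORM_WIN32_NT) {
            // Windows 2000 reports itself as NT 5.0; NT 4.0 is not supported.
            if (osvi.dwMajorVersion >= 5)
                std::printf("Windows 2000 or later (NT %lu.%lu)\n",
                            osvi.dwMajorVersion, osvi.dwMinorVersion);
            else
                std::printf("Windows NT 4.0 or earlier - not supported\n");
        } else if (osvi.dwPlatformId == VER_PLATFORM_WIN32_WINDOWS) {
            // 4.10 = Windows 98/98 Second Edition, 4.90 = Windows Me.
            if (osvi.dwMinorVersion >= 90)
                std::printf("Windows Me\n");
            else if (osvi.dwMinorVersion >= 10)
                std::printf("Windows 98 / 98 Second Edition\n");
            else
                std::printf("Windows 95 - not supported\n");
        }
        return 0;
    }
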
Baseline features of display adapters
To run any of the 3D WinMark tests, the display adapters must have the
following baseline features:
• Gouraud shading
• Perspective correction of texture coordinates
• Z buffering
• Modulate texture blending
• Nearest mipmap linear texture filtering (bilinear mipmapping)
• Source alpha pixel blending
Some 3D WinMark 2000 tests require the following features:
• Specular highlights
• Add pixel blending
• Modulate (multiply) pixel blending
• Modulate2x (multiply and then double) pixel blending
• Linear mipmap linear texture filtering (trilinear mipmapping)
Better performance if adapters support these features
In addition, you might see better performance from the 3D WinMark 2000
tests if the adapters support the following features:
• V-sync off
• Palettized textures
• AGP Execute Mode
• DXT compressed textures
• Hardware accelerated transformation and lighting
• Single pass multitexture support
Press the Esc key to stop any running test. 3D WinBench then displays
the Abort/Retry/Ignore dialog box.
If the adapter is not performing properly during a 3D WinMark
performance test, press Ctrl + F to automatically fail the test. (To
fail a test, the test must be running.) 3D WinBench will display a
dialog box asking you to type an explanation of why you failed the
test. After you click OK at this dialog, the Abort/Retry/Ignore dialog
box is displayed.
The latest 3D WinBench 2000 news
(08/08/00) Changes in Version 1.1
We describe the changes between Version 1.0 and Version 1.1 below.
Performance measurements have not changed, so you can compare Version
1.0 scores with Version 1.1 scores.
Fewer files installed and less disk space required
In Version 1.1, we consolidated the thousands of image and scene files
installed in the \ZDBENCH\3DWB2000 directory into a small number of
compressed ZIP files. The benchmark decompresses the image and scene
files from the ZIP files on-the-fly, as needed.
User-interface changes
The user-interface will now remind the tester to reboot before running
any tests. Various logos and company names have been changed to
reflect recent reorganizations.
Sample results
The Sample 3D.ZTD and Sample 3D CPU.ZTD databases in the
\ZDBENCH\RESULTS directory contain results from Version 1.0.
Spurious queued frames warnings
3D WinBench 2000 Version 1.0 often reported spurious warnings about
queued frames. For example:
At the end of the named test, n frames seemed to be queued for
drawing. The adapter was t seconds behind the application.
For responsive game play, display devices should not queue more than
two frames of information.
Examine the notes attached to each result to determine if other tests
also queued too many frames.
Version 1.1 will only issue the warning when more than 10 frames
appear to be queued at the end of a test.
Status line garbled on Windows 2000
The status line at the bottom of the quality tests is displayed
correctly in Version 1.1 when using Windows 2000.
Sun is "displaced" in Stations scene
The sun in the Stations scene no longer has a dim "glow" around it in
Version 1.1.
Help file problem
The Table of Contents in the 3D WinBench 2000 Help section, which was
distributed with the product, no longer states "3D WinBench 3000." It
now reads "3D WinBench 2000."
The Compressed Textures quality test (number 59) may fail some
implementations of DXT texture compression.
In Version 1.0 we based the good image on the NVIDIA GeForce
implementation (which was the only one that worked at the time the
benchmark was released). The GeForce decompression is not completely
accurate. Implementations that are inaccurate in the opposite
direction from the GeForce failed to meet the automation accuracy
requirements when compared to the good image from the GeForce. In
Version 1.1 we use a good image from an accurate decompressor to solve
the problem.
D3DRENDERSTATE_COLORVERTEX is set to TRUE
In Version 1.0 the benchmark did not change the value of the
D3DRENDERSTATE_COLORVERTEX state, which defaults to TRUE. An
application can use the TRUE state to indicate it uses per-vertex
colors instead of material colors. However, the benchmark does not
supply per-vertex colors. The display driver should use the material
color in this case. The benchmark may confuse some drivers by leaving
this render state set to TRUE and supplying a vertex format that does
not have a diffuse color. Version 1.1 sets the state to FALSE to avoid
any confusion.
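For illustration, the fix amounts to a single render-state call. The
following Direct3D 7 C++ fragment is not taken from the benchmark's
source; it simply shows how an application holding an already-
initialized IDirect3DDevice7 pointer (here called d3dDevice, our own
name) can set the state the way Version 1.1 now does:

    #include <d3d.h>   // DirectX 7 Direct3D immediate-mode interfaces

    // Sketch only: turn off per-vertex color so drivers fall back to the
    // material color when the vertex format carries no diffuse color.
    HRESULT DisableColorVertex(IDirect3DDevice7* d3dDevice) {
        return d3dDevice->SetRenderState(D3DRENDERSTATE_COLORVERTEX, FALSE);
    }
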
Twelve lights are temporarily enabled in the Lighting quality test
Version 1.0 inadvertently enabled twelve lights when initializing the
Lighting quality test (number 56). The enabled lights may cause a
failure if the tester requests hardware accelerated transformation and
lighting and the implementation supports fewer than 12 lights. Version
1.1 disables all lights as they are created to avoid the problem. Some
of the performance tests do require up to 8 hardware lights at a time.
Transformation and Clipping quality test fails on Rage Mobility based
accelerators
In Version 1.1, the quality test automation for the Transformation and
Clipping test (number 55) no longer fails accelerators based on the
Rage Mobility chipset.
Transformation and Clipping quality test fails with a General
Protection Fault under Windows 2000
The Transformation and Clipping quality test (number 55) no longer
fails with a General Protection Fault under Windows 2000.
Large Textures quality test fails with "An exception occurred in
BEND3DIM.EXE"
The Large Textures quality test (number 58) no longer fails with the
following error:
An exception occurred in BEND3DIM.EXE.
DotProduct3 quality test fails on Permedia3 based accelerators
In Version 1.1, the quality test automation for the DotProduct3 test
(number 24) no longer fails accelerators based on the Permedia3
chipset when using 32 bit color. The chipset does not compute the dot
product very accurately in 32 bit color mode.
Z Comparison and Z Bias tests fail with a GPF when anti-aliasing is
forced on
The automation for the Z Comparison and Z Bias quality tests (numbers
43-47) no longer fail with a General Protection Fault when anti-
aliasing is forced on in the newer 3dfx Voodoo5 and NVIDIA GeForce
drivers. Note that you should not run any 3D WinBench tests with anti-
aliasing forced on as the automation for some quality tests will be
confused and return incorrect results.
Bump Mapping quality tests do not support 32 bit bump texture formats
In Version 1.1, the Bump Mapping quality tests (number 51) will use a
32 bit bump texture format if available. Version 1.0 only used 16 or
24 bit formats. The ATI Radeon only supports 16 and 32 bit formats and
would not work correctly in Version 1.0.
Front only render mode with V-sync off fails with a bad refresh rate
error message
In Version 1.0, when using the "Front only" render mode with v-sync
turned off, the benchmark would return an error indicating the refresh
rate was incorrect. The front buffer/v-sync off combination works in
Version 1.1."
(11/04/99) Important new features in this version of 3D WinBench are:
a choice of whether or not to use hardware accelerated transformation
and lighting, two new WinMark scenes (Hangar and Speedway), visual and
audible notifications on the completion of the rendering process, and
additional Quality tests.
Here's a summary of what we changed:
• 3D WinBench 2000 supports hardware accelerated transformation and
lighting. You can test devices that support this feature with the
acceleration enabled or disabled.
• The 3D WinMark is now the average of the results for nine tests. One
of the tests requires trilinear mipmap filtering while the other eight
require bilinear mipmap filtering. The nine tests cover a wider range
of triangle or pixel complexity than tests from last year.
• We've added new Quality tests to 3D WinBench 2000, bringing us to 69
Quality tests in all. The new Quality tests verify correct
implementation of the following features: stencil buffers,
transformation and clipping, lighting, anisotropic filtering,
environment mapped bump mapping, large textures, compressed textures,
dotproduct3 texture blending, and partial texture downloads.
• We've improved the dithering, mipmapping, perspective correction,
high triangle count, texture swapping, anti-aliasing and fog table
Quality tests.
• We've changed the standard out-of-the-box testing mode to 1024 x
768, 32 bpp color, 32 bit Z buffer, double buffered with flip and v-
sync off. The benchmark detects whether your display driver supports v-
sync off, and verifies that the option works correctly. If the display
driver does not support v-sync off, the benchmark uses v-sync on with
triple buffering instead.
• The 3D WinBench 2000 Processor Test suite uses the Null device to
run the entire 3D WinMark test. The tests use the guard band (which
Microsoft set to +/- 2048 in DirectX 7) in the Null device.
• 3D WinBench 2000 now supports DirectX 7 compatible stereo display
devices.
• We've improved the benchmark's texture management, thus eliminating
the need for a system memory copy of most of the textures. Animated
textures replace the forced reloading of textures that 3D WinBench 99
used.
• Finally, 3D WinBench detects whether the display driver is queuing
too many frames of data. In order to remain responsive in actual
games, the driver should not queue more than two or three frames of
data.
We've made the following improvements to the Test Settings menu:
• We split the 3D tab into two tabs (3D and Mode) to make room for
more controls.
• The new Capabilities (formerly the D3D HAL Problems) tab now lists
capabilities for the device you've currently selected, instead of
always displaying the capabilities for the HAL device.
• The 3D tab sports a new slider bar that makes it easy to select a
speed for the Quality Test Automation. In full automation, you'll have
only 1 second to accept or reject the automated verdict. The slider bar
allows you to vary the length of the pause. The range is from Full to
Manual.
• Additional controls determine whether the benchmark should use v-
sync, hardware accelerated transformation and lighting, and stereo
display modes. You also have the option to set an audible alarm when
the benchmark starts and stops its performance timers.

Previous versions of 3D WinBench:
3D WinBench 99 Version 1.0 includes two upgrades: Version 1.2 and
Version 1.1. These new versions include some enhancements and several
bug fixes. In both versions, the changes affect at least some of the
scores. As a result, we recommend you only compare scores from the
same version of 3D WinBench.
You can easily check out the changes in each version of 3D WinBench
99:
• What's changed in Version 1.2 of 3D WinBench 99 (Release date: May
1999)
• What's changed in Version 1.1 of 3D WinBench 99 (Release date:
January 1999)
• What's new in Version 1.0 of 3D WinBench 99 (Release date: November
1998)
If you're having a problem, you might check out troubleshooting
information.

What's changed in Version 1.2 of 3D WinBench 99
(5/3/99) The primary changes in Version 1.2 affect the 3D Processing
test scores and any test runs involving the Null device. (The 3D
Processing tests render the same Chapel scene used in the fourth 3D
WinMark test with the Null device.)
Because the changes in Version 1.2 have a dramatic effect on the
relative performance of various CPU architectures, you should not
compare your scores between Version 1.2 and Version 1.1. We believe
the Version 1.2 results are a better reflection of the performance
differences than Version 1.1.
These changes do not have a significant impact on the 3D WinMark 99
scores for a graphics accelerator when using recent fast CPUs and the
standard testing resolution of 1024 x 768. However, the 3D WinMark 99
scores for Version 1.2 are generally higher than the 1.1 scores when
using slower CPUs with faster graphics accelerators.
Here's a summary of what we changed in Version 1.2.
• Version 1.2 uses a different method to draw the status information
triangles (the alpha blended text and logos). This change eliminates
the need for Direct3D to check the triangles for clipping against the
screen edge. In Version 1.1, Direct3D had to check to see if any of
the triangles in the status information intersected the screen edges
multiple times for each frame.
• When running a test that uses the Null device, Version 1.2 disables
the large guard band of +/- 32768 that the Null device exposes.
Current hardware does not expose a guard band, except for the NVIDIA
RIVA TNT. It exposes a guard band of +/- 2048.
• Version 1.2 contains new object bounding box code that is more
efficient than the code in Version 1.1. Version 1.1 used the Direct3D
TransformVertices function to perform the bounding box check. It did
not take the guard band, if any, into account. Version 1.2 uses an
optimized matrix multiply with early detection of z and y axis
rejections. Version 1.2 also uses the guard band (except when using
the Null device). 3D WinBench determines object rejections using the
viewing frustum. Version 1.2 determines the need for object clipping
using the guard band frustum. Only objects that cross both the guard
band and viewing frustum extents are marked for potential clipping (see
the sketch after this list).
• We shrank the bounding boxes for some objects in the Chapel scene
(this scene is used by the 3D Processing tests). Some of the bounding
boxes in Version 1.1 were larger than they could have been.
• We also made a small change in the 3D Quality test automation. The
Source Alpha Pixel Blend and Alpha Transparency quality test
automation now accepts the ARK Logic ARK8100 or ARK8800 chipset.
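The bounding-box handling described in the third bullet above can be
sketched generically. The C++ code below is not 3D WinBench's
implementation; using our own simplified plane and box types, it only
illustrates rejecting an object whose box lies outside the viewing
frustum and marking it for clipping only when it straddles both the
viewing and guard-band frustums.

    #include <array>

    struct Plane { float nx, ny, nz, d; };  // nx*x + ny*y + nz*z + d >= 0 is "inside"
    struct Aabb  { float minX, minY, minZ, maxX, maxY, maxZ; };

    enum class Placement { Outside, Inside, Crossing };

    // Classify a box against a convex volume bounded by inward-facing planes.
    Placement Classify(const Aabb& box, const std::array<Plane, 6>& planes) {
        bool crossing = false;
        for (const Plane& p : planes) {
            // Corner farthest along the normal ("positive vertex") and the
            // corner farthest against it ("negative vertex").
            float pvx = p.nx >= 0 ? box.maxX : box.minX;
            float pvy = p.ny >= 0 ? box.maxY : box.minY;
            float pvz = p.nz >= 0 ? box.maxZ : box.minZ;
            float nvx = p.nx >= 0 ? box.minX : box.maxX;
            float nvy = p.ny >= 0 ? box.minY : box.maxY;
            float nvz = p.nz >= 0 ? box.minZ : box.maxZ;
            if (p.nx * pvx + p.ny * pvy + p.nz * pvz + p.d < 0)
                return Placement::Outside;     // box entirely behind this plane
            if (p.nx * nvx + p.ny * nvy + p.nz * nvz + p.d < 0)
                crossing = true;               // box straddles this plane
        }
        return crossing ? Placement::Crossing : Placement::Inside;
    }

    struct ObjectDecision { bool rejected; bool needsClipping; };

    // Reject objects outside the viewing frustum; mark for clipping only
    // those that cross both the viewing and the guard-band frustum extents.
    ObjectDecision DecideObject(const Aabb& box,
                                const std::array<Plane, 6>& viewFrustum,
                                const std::array<Plane, 6>& guardBandFrustum) {
        Placement view = Classify(box, viewFrustum);
        if (view == Placement::Outside)
            return {true, false};
        Placement guard = Classify(box, guardBandFrustum);
        bool clip = (view == Placement::Crossing) &&
                    (guard == Placement::Crossing);
        return {false, clip};
    }
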


What's changed in Version 1.1 of 3D WinBench 99
(1/29/99) Version 1.1 includes two changes that affect the scores of
3dfx Voodoo2-based adapters when you use 3dfx's new DirectX 6 drivers.
These changes do not affect the scores for other adapters. Here's a
summary of what changed in Version 1.1.
• The Voodoo2 adapter now creates the background for the StationsB2
and StationsB3 scenes in a single pass instead of two passes. The
fifth, sixth, and eighth 3D WinMark 99 tests use multitexture
operations to create composite backgrounds for these scenes. We
intended for these tests to take advantage of the Voodoo2's two
texture units. However, we coded the operation incorrectly and
inadvertently forced the Voodoo2 to make two passes to create the
composite background. Because 3dfx did not have multitexturing DirectX
6 drivers available for us to use at the time we created the test, we
didn't catch this bug. We have fixed this problem in Version 1.1.
(This change also affects adapters based on the soon-to-be-released
Voodoo3 chipset.)
• We've changed the locking procedure for the tests to support the
Voodoo2's locking rules and penalize the adapter by the amount of time
it takes to do one refresh if the first lock of the back buffer does
not work. While the Voodoo2 supports triple buffering with a Z buffer,
it does not support buffer locks on all three of its frame buffers. It
can't lock the third buffer that an application allocates, nor can it
lock the buffer that is not currently the front or the back buffer. 3D
WinBench Version 1.0 locked all three frame buffers at the end of each
test to ensure that the adapter had drawn all triangles before the
benchmark stopped timing the test. We changed Version 1.1 to lock only
the back buffer. If the adapter fails to lock the back buffer, the
test performs another flip and tries again. Applications need to lock
the frame buffer when they want to read it back. Typically this is
only done for screen captures.
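A rough sketch of that lock-and-retry idea follows. It is not the
benchmark's code: it uses the DirectX 7 DirectDraw interface names for
brevity (3D WinBench 99 actually targets DirectX 6) and omits the
bookkeeping that charges the adapter one refresh period when the first
lock fails.

    #include <windows.h>
    #include <ddraw.h>

    // Sketch only: make sure the adapter has finished drawing by locking
    // the back buffer; if the lock fails, flip once more and try again.
    HRESULT WaitForBackBuffer(IDirectDrawSurface7* primary,
                              IDirectDrawSurface7* backBuffer) {
        DDSURFACEDESC2 desc;
        ZeroMemory(&desc, sizeof(desc));
        desc.dwSize = sizeof(desc);

        HRESULT hr = backBuffer->Lock(NULL, &desc,
                                      DDLOCK_READONLY | DDLOCK_WAIT, NULL);
        if (FAILED(hr)) {
            primary->Flip(NULL, DDFLIP_WAIT);  // schedule another flip, then retry
            hr = backBuffer->Lock(NULL, &desc,
                                  DDLOCK_READONLY | DDLOCK_WAIT, NULL);
        }
        if (SUCCEEDED(hr))
            backBuffer->Unlock(NULL);
        return hr;
    }
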
Bug fixes in Version 1.1
• If the Source Alpha Pixel Blending quality test fails, or if your
adapter does not support source alpha pixel blending, you won't be
able to run any of the 3D WinMark 99 tests and the 3D WinMark test
score will be zero. Previously, up to three tests might have run.
• 3D WinBench 99 now displays an explanatory message if you run a 3D
Scene/User Defined test without first specifying the name of the .SDL
file containing the scene you want the test to use. You select this
file at the User Scene tab of the Test Settings dialog box.
• We've fixed some minor bugs in the 3D WinBench-specific user
interface, such as the tab order of the controls in the Quality test
dialog box.
• We corrected the spelling of "Nearest" in the name of two of the
tests in the Filter performance test suite (we had "Neareast" in
Version 1.0).
Quality test improvements in Version 1.1
• The Mipmap LOD Bias quality test automation now works correctly on
adapters that have lower mipmap levels than those found in the good
image.
• The Dithering test no longer fails an adapter running in true-color
mode (24 or 32 bits per pixel), as long as the adapter renders the
colors requested.
• The Z Accuracy Gauge quality test now works on chipsets that perform
highly accurate triangle rasterization.
• Version 1.1 includes a new Perspective Correct Fog quality test,
which verifies that the adapter interpolates per vertex fog values in
a perspective-correct manner across a triangle.
• The Version 1.1 mipmap quality test automation works better than the
Version 1.0 automation on cards that always dither colors, even when
the application has turned the render state off. In addition, we have
updated the checks for correct mipmap level changes to reflect
OpenGL's limits (Direct3D is silent on this matter). We also include
new good images for the mipmap tests that use OpenGL's preferred
mipmap level calculation.
• The Z Bias quality test automation now works no matter what triangle
rasterization rules the adapter uses. Remember, 3D WinBench 99 allows
quality test images to be off in their triangle rasterization by a
single pixel (except on the Triangle Rasterization quality test
itself). In Version 1.0, the Z Bias quality test failed images that
were off by a single pixel; in Version 1.1 this test passes those same
images.
• Version 1.1 has an additional bad image for the Texture Fidelity
quality test that illustrates the effect of S3 texture compression. We
designed this test to find out if an adapter compresses textures in a
lossy manner without a specific request to do so.
• The Quality Test wizard now includes a control that tells you both
the color (in R, G, and B values) of the image pixel currently under
your cursor and its location (in x and y values). This information can
be very useful when comparing good and bad images or difference
images. It lets you quantify exactly how much an image varies from the
correct instance.
General improvements in Version 1.1
• Version 1.1 includes an updated database of sample results (Sample
3D.ZTD). We've added results for a Voodoo2 SLI configuration with beta
DX6 drivers. These results are only reference points. The best source
of official results continues to be the Ziff Davis Media
publications.
• In the 3D Scene/User Defined test, 3D WinBench 99 now uses DirectX
6's three filtering modes: magnification, "minification," and mipmap
texture filtering. Each of these modes offers three value settings,
for a total of 27 different value combinations. The magnification and
minification modes support the anisotropic setting. Version 1.0 used
DirectX 5's single filtering mode (with its total of six values) in
the 3D Scene/User Defined test. You select the settings you want at
the User Scene tab in the Test Settings dialog box.
NOTE: We have not found any hardware that performs anisotropic
filtering correctly at the time of this release. The Reference
Rasterizer in DirectX 6, a software emulator that you can use in place
of HAL to run the benchmark, does support anisotropic filtering.
• Version 1.1 now identifies the correct display adapter when you have
a PowerVR PCX2-based adapter installed, but disabled. The PCX2 is an
add-on card that draws into the frame buffer of a host adapter. In
Version 1.1, if you disable the PCX2, you'll see the host adapter's
name in the 3D tab of the Test Settings dialog box; in Version 1.0 the
panel said "PCX2." In addition, 3D WinBench 99 now saves the quality
test results under the name of the host graphics adapter. In Version
1.0, it would save them under PCX2, even if you'd disabled that
adapter.
• If you run 3D WinMark 99 tests using the quality mode, the benchmark
now displays the number of triangles it sends to Direct3D (in
thousands of triangles per second, or KTPS). Keep in mind that this is
the number of triangles the test sends, not necessarily the number the
adapter displays.
• The Restore Defaults dialog box (which you get to from the 3D tab of
the Test Settings dialog box) now includes instructions for DirectX 5
and DirectX 6 Voodoo2 drivers as well as PowerVR PCX2 adapters.
Improvements common to all the PC benchmarks
All the PC benchmarks use a common user interface (UI), which includes
the PC Benchmarks Results Viewer. When you install a new version of a
benchmark, such as Version 1.1 of 3D WinBench 99, you're also
installing the common UI. This version of the UI replaces the existing
version. So you'll benefit from these UI updates even if you're also
running Version 1.0 of one of the other PC benchmarks, such as Audio
WinBench 99 or Winstone 99. Here are the common changes to the
benchmarks that take effect in Version 1.1:
• The PC Benchmark Results Viewer now opens read-only files.
• The PC Benchmark Results Viewer lets you open a database file with
spaces in its name by double-clicking on the name. Previously, you had
to open any file name containing spaces from within the viewer.
• Double quotes in the comment field of a results database file no
longer cause problems. Before, the viewer corrupted any file that
contained double quotes in that field.
• The graphics test screens now display on top of the main benchmark
window when you run 3D WinBench 99 on Windows 2000.

What's new in Version 1.0 of 3D WinBench 99
You'll find a number of changes in 3D WinBench 99 that will improve
your testing and save you time. Here are the highlights of 3D WinBench
99's new features.
• Automated quality tests. With these tests, you can specify that 3D
WinBench 99 automatically determine whether a machine passes or fails
a test. If you use partial automation (the default mode), 3D WinBench
passes the quality tests that perform correctly, but waits for you to
confirm any test that fails to meet its criteria. (You can also run
these tests in manual mode, if you like.)
• 3D WinBench 99 uses the new Vertex Buffer interfaces in DirectX 6.
(While the benchmark will still work with DirectX 5 device drivers,
you must have DirectX 6 installed.)
• 3D WinBench 99 uses triple (or double) buffered rendering instead of
the front buffer-only rendering 3D WinBench 98 used. This change
allows 3D WinBench 99 to include buffer flips in the performance
measurement.
• New scenes give the 3D WinMark(TM) 99 a more game-like appearance
and function. The scenes include more pixel blending, multitexture
rendering, and less Gouraud lighting than those in 3D WinBench 98. All
3D WinMark 99 tests require bilinear mipmapping and source alpha pixel
blending as baseline features. All new 3D chipsets can run all of the
3D WinMark 99 tests.
• The triangle complexity of the tests in 3D WinMark 99 is wider
ranging than in 3D WinMark 98. The new Chamber and Rust Valley scenes
extend the low end of the complexity curve and the new Canyon scene
extends the high-end.
• You can run the tests with either Z buffering or W buffering. Select
this feature on the 3D tab in the Test Settings dialog box. (The type
of buffer you use, if any, may affect results. Generally, W buffers
work better for outdoor scenes, while Z buffers work with indoor
scenes.)
• Software emulation is no longer a fall-back in cases where the HAL
doesn't support a feature. In general, a HAL is much more feature-
laden than a software emulator. In addition, the software emulator is
simply too slow to be useful in real-time games.
• You can now choose the software emulator, the reference rasterizer,
the Null device, or the HAL to be the display device. (The Null device
is useful for determining the maximum possible 3D WinMark 99 score.)
• 3D WinBench 99 adds palettized textures. The benchmark uses a mix of
16-bit RGB textures and palettized textures on display adapters that
support both. About half of all opaque textures are created as 16-bit
RGB textures; the other half are created using an 8-bit palettized
format.
• The 3D WinMark 99 tests now measure the performance of texture
loading, no matter how much texture memory is available. The test
reloads textures after they're used about 30 times. This simulates the
reloading of textures that occurs in games as the player moves around
the game world.
• 3D WinBench 99 treats anti-aliasing as a resolution setting. You can
now run the entire 3D WinMark 99 with or without anti-aliasing. As a
result, you can compare the performance cost of anti-aliasing. You can
also compare a higher resolution with anti-aliasing to a lower
resolution without it. The 3D WinMark tests no longer require sort-
independent anti-aliasing.
• 3D WinBench 99 replaces the old processor test with two new ones
that run the Chapel scene using a Null device included in DirectX 6.
The Lighting and Transformation Test measures the processor's ability
to transform and light vertices. The Transformation Test measures
transformation only. With these tests, you can measure the performance
of the CPU on the transformation and lighting portions of the graphics
pipeline.
• The quality tests now let you magnify the good, bad, and current
images up to 8 times. This way you can examine individual pixels of
the image. This feature is especially useful when you are examining
anti-aliasing and dithering tests.
• The quality tests let you display an image that shows the difference
between the good and current test images or the bad and current test
images. You can use the difference image to identify areas that aren't
correct. For example, the difference images will often include
silhouette edges around objects or regions of solid color, which is
frequently due to incorrect triangle rasterization rules.
• If you're running a performance test, you can press CTRL + F to fail
the test if you notice screen corruption or incorrect rendering. 3D
WinBench will then prompt you to enter a comment explaining why you
failed the test.
• When a performance test uses pixel blending, it now displays status
information in the upper left corner of the screen. This display tells
you:
o The test name.
o The resolution, color depth, Z or W buffer depth, buffering mode,
and anti-aliasing mode.
o The name of the DirectDraw device being used.
o The 3D device being used: RGB Emulator, Reference Rasterizer or
Hardware.
o The frame rate, averaged during the last 1/4 second.
o An instantaneous frame rate meter.
o A texture meter where:
a. The vertical line tells you how many texture bytes have been
used since the beginning of the test.
b. The upper bar indicates the number of texture bytes used in the
last frame.
c. The lower bar shows how many texture bytes were loaded in the
last frame.
• 3D WinBench 99 no longer contains the 3D Triangle Tests suite.
 
