Core clock/memory clock vs engine/memory clock


ftran999

I'm a bit confused about these two things. Are they the same thing with
just different terms, or are we comparing apples and oranges? I ask
because I have an ATI Radeon 9800 Pro (128 MB) card. The specs on the ATI
website list the memory clock at 680 MHz and the engine clock at 380 MHz.
However, in both ATI Tool and Rage3D the default setting is shown as
memory clock 337.50 and core clock 378. Is core clock the same as engine
clock? (I would assume so, since the values from the website and ATI
Tool/Rage3D are nearly identical.) Is there any reason that the default
setting of my memory clock would be so low, 337 vs. 680?
If it means anything, I'm using the Catalyst 4.7 drivers and my OS is XP
Home. Also, the BIOS on my motherboard shows the AGP/PCI frequency set to
Auto; the other choices are 66.66/33.33, 72.73/36.36, and 80.00/40.00.
Would changing to any of these settings help?
Thanks in advance for your help.
 

DaveW

The timings are right. The DDR (Double Data Rate) memory on that card
runs at a real clock speed of ~340 MHz which, because it is Double Data
Rate RAM, gives an effective rate of 680 MHz (double the real rate).
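
In other words (a quick sketch of the arithmetic in Python; the numbers
are just the ones reported in this thread):

# DDR transfers data on both the rising and falling edge of the clock,
# so the effective rate is double the real clock that ATI Tool reports.
real_clock_mhz = 337.5
effective_mhz = real_clock_mhz * 2
print(effective_mhz)  # 675.0, which the spec sheet rounds up to 680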
 

Inglo


Everything's fine; you're just wading in a bit of typical marketing hype.
The 337.5 MHz real clock, doubled as in Double Data Rate (DDR) memory,
comes out to 675, which gets rounded up to the advertised 680.
Core clock is the speed of the GPU, the video card's processor, which I
guess could also be called the engine clock, though I hadn't noticed
that term before.

Everywhere you go, speeds are made to sound as impressive as possible.
Intel chips with a quad-pumped front-side bus advertise it as 800 MHz;
pffft, the real clock is only 200 MHz.
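
The same arithmetic covers both cases (again just a sketch; the
multipliers are the standard ones for DDR and a quad-pumped bus):

def effective_rate(real_mhz, transfers_per_cycle):
    # Advertised "speed" = real clock x data transfers per clock cycle.
    return real_mhz * transfers_per_cycle

print(effective_rate(337.5, 2))  # DDR video memory: 675, sold as 680
print(effective_rate(200, 4))    # quad-pumped Intel FSB: 800 MHz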
 
