GPU upgrade?

tumppiw

I have:
MB Asus F1A75-M pro
CPU AMD A6-3650
RAM Kingston HyperX KHX1600C9D3/4GX (4*4GB)
HD1 Samsung HD103UJ 1TB
HD2 Samsung HD154UI 1.5TB
GPU Asus EAH5670
PSU Antec Basiq 430W
2*DVD-RW
2-port ASMedia USB3 card
Display Asus VS239H FullHD IPS panel

Everything is running at stock/default speeds, including the RAM at
685 MHz (I haven't tweaked the BIOS to use the XMP-1600 profile it
can take).


If I were to upgrade that Asus EAH5670 (AMD Radeon HD5670) to a Radeon
HD7770 card, could I expect some gain in WoW frame rates? Right now
I'm getting about 20-25 fps (I like the settings on high).
(Could I have both cards in use at the same time?)

And would my PSU survive (i.e. be enough)?

Money is a problem, so I can't upgrade both the GPU and the PSU.
This HD7770 is about the highest-spec'd GPU I can afford.

I also mine BTC and LTC (for the very small payout), and there
Nvidia's offerings are far from par.


--
-----------------------------------------------------
Thomas Wendell
Helsinki, Finland
Translation to/from FI/SWE not always accurate
-----------------------------------------------------

Paul

tumppiw said:
[massive snippage]

The cards are roughly in the same class.

http://www.gpureview.com/show_cards.php?card1=623&card2=675

A single point benchmark:

http://www.videocardbenchmark.net/gpu_list.php

Passmark G3D
Radeon HD 5670   1066
Radeon HD 7770   2147

Using a benchmark like that isn't a very good predictor, because
your game and the driver response might not match how the benchmark
works. So all we can really conclude from that benchmark is that the
cards are in the same general class; a 2:1 ratio on paper doesn't
guarantee a 2:1 gain in your game. If there were a ratio of 5
between them, I'd be more confident of a graphics improvement. One
lists "VLIW5" as the architecture and the other "RISC MIMD", and who
knows, maybe WoW works better with one than the other.

The opinion here is that WoW would be more CPU-bound.

http://answers.yahoo.com/question/index?qid=20111229122936AAc4HCx

Your processor is respectable; it benches a little above a Q6600,
which is still quite usable. You could get a factor-of-three
improvement in multithreaded applications for about $300 plus the
price of a new motherboard, which would exceed your available
budget.

http://www.cpubenchmark.net/cpu_list.php

Passmark CPU
Intel Core2 Quad Q6600 @ 2.40GHz   2963
AMD A6-3650 APU (Quad 2.6GHz)      3335
Intel Core i7-2600K @ 3.40GHz      8492
Intel Core i7-3770K @ 3.50GHz      9627  ($308 price)

Does your CPU support overclocking at all? I would try overclocking
it by 10% and see if the gameplay improves at all. The idea is to
see how CPU-bound the game is. I have one game here where just a
tiny change in CPU speed makes a difference to how smooth it is;
all I needed was a small overclock to simulate an upgrade.
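
A rough way to read the result of that experiment (a sketch, with
made-up frame rates, not measured data): if the frame rate scales
nearly 1:1 with the clock bump, the game is CPU-limited; if it
barely moves, the bottleneck is elsewhere.

# If fps scales nearly 1:1 with a CPU clock bump, the game is
# CPU-bound; if it barely moves, the bottleneck is elsewhere.

def cpu_bound_fraction(fps_before, fps_after, oc_fraction):
    """Fraction of the frame rate that scaled with the CPU clock.
    ~1.0 means fully CPU-bound, ~0.0 means not CPU-bound."""
    fps_gain = fps_after / fps_before - 1.0
    return fps_gain / oc_fraction

# Example: 22 fps before, 24 fps after a 10% overclock (made up)
print(cpu_bound_fraction(22.0, 24.0, 0.10))  # ~0.91, mostly CPU-bound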

While you can overclock video cards too, I've never tried that
myself, so I have no feeling for how useful it is.

And, for balance, I can say that not all my overclocking
experiments have been positive. On a single-core AthlonXP,
overclocking was pointless; it's possible in that case that the
system bus was the limiting factor. That's not a problem for your
processor: having an integrated memory controller, as yours does,
makes a big difference. You should be able to overclock and get
more from the processor.

In terms of power:

CPU:             100W / 12V * 1/0.9 (VRM efficiency) = 9.3A on 12V2
New video card:   80W / 12V                          = 6.7A on 12V1
Two hard drives:  2 * 0.6A                           = 1.2A on 12V1

Basiq 430W (there could be multiple generations of these...):

5V @ 20A, 3.3V @ 20A, 12V1 @ 17A, 12V2 @ 16A, -12V @ 0.8A, 5VSB @ 2.5A

3.3V and 5V combined power: less than 115W (estimated load = 60W).
12V combined power: 384W, i.e. 384/12 = 32 amps. Your load is about
17.2 amps total. Roughly speaking, you're using about half the
power supply.
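
If you want to redo the arithmetic with your own numbers, here is
the same 12V budget as a small Python sketch (the wattages are my
rough estimates above, not measured values):

# 12V rail budget; component wattages are rough estimates.
VRM_EFFICIENCY = 0.9   # CPU power is converted from 12V by the VRM

loads_12v = {
    "CPU (100W via VRM)": 100.0 / 12.0 / VRM_EFFICIENCY,  # ~9.3 A
    "HD7770 (~80W)":       80.0 / 12.0,                   # ~6.7 A
    "2 hard drives":        2 * 0.6,                      # ~1.2 A
}

total_amps = sum(loads_12v.values())
rail_amps = 384.0 / 12.0   # combined 12V capacity of the Basiq 430W

for name, amps in loads_12v.items():
    print(f"{name:22s} {amps:5.1f} A")
print(f"total: {total_amps:.1f} A of {rail_amps:.0f} A available")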

And when a game is CPU- or GPU-bound, one of the two pieces of
hardware may not be running at max power. A power analysis like
this assumes there is some way for the application to force
maximum power draw, which may not be the case.

You can also find websites which will do power estimates
for you. You don't have to use my numbers.

Note that if you read a video card advertisement, it'll tell you
that you need "26 amps" on 12V. I've worked out my own number, the
17.2 amp figure above. You really need to do the math to get a
better estimate (the result will be less than what the video card
advertisement page claims). While I haven't done the power
calculation in full detail, I don't see a reason to panic here.

What's important for an Antec is who actually made it :)
They contract out their manufacturing.

*******

If you're on a limited budget, try the overclocking experiment
first and see if a 10% CPU overclock makes a difference to the
frame rate. If it does, you are probably CPU-limited.

Paul
 
tumppiw

Paul said:
[massive snippage]

OK, thanks.
As I can't afford those better Intel Core i5/i7 models and a new
MB, I'm going for the GPU upgrade to an HD7770.
Maybe not so much improvement in WoW, but according to this

https://en.bitcoin.it/wiki/Mining_hardware_comparison

I should get roughly 2x on mining... (Maybe in a year or two I can
upgrade the other parts. :) )
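
For what it's worth, the expected pool payout scales linearly with
your hash rate, so doubling the hash rate should roughly double the
coins. A sketch, with every number a hypothetical placeholder (real
network hash rates and rewards change constantly):

# Expected coins/day from mining is proportional to your share of
# the network hash rate. All numbers below are made-up placeholders.

def coins_per_day(my_hashrate, network_hashrate, block_reward,
                  blocks_per_day=144):   # ~one BTC block per 10 min
    return my_hashrate / network_hashrate * blocks_per_day * block_reward

old = coins_per_day(150e6, 60e12, 25)   # HD5670-class rate (made up)
new = coins_per_day(300e6, 60e12, 25)   # ~2x with the HD7770 (made up)
print(new / old)                        # 2.0: double rate, double coins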


Paul

tumppiw said:
[massive snippage]

Some new "mining equipment" will be available soon for BitCoin.
It's possible such equipment could obsolete "home mining" for all
but the biggest operations. I don't understand the economics well
enough to say how prevalent the new "FPGA mining boxes" will be.
A full box costs around $30,000. But when the price of BitCoin is
high, that may provide an incentive to go crazy, even though there
are now fewer coins per period to fight over.

As each new hardware solution for mining comes along, it obsoletes
the old one. GPU wiped out CPU, and FPGA could wipe out GPU. It's
all a matter of capital costs and payback periods. And with the
high price of coins at the moment, electricity is no longer the
concern it once was. (At $3 a BitCoin, in some areas of the world,
the cost of electricity was too high for profitable mining.)
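
That tradeoff is simple arithmetic. A sketch, with every figure a
hypothetical placeholder:

# Payback period = capital cost / net revenue per day.
def payback_days(capital_cost, revenue_per_day, electricity_per_day):
    net = revenue_per_day - electricity_per_day
    return float("inf") if net <= 0 else capital_cost / net

# GPU miner: cheap to buy, modest hash rate, real power bill (made up)
print(payback_days(120, 1.00, 0.40))      # ~200 days
# FPGA box: huge up-front cost, far more hashes per watt (made up)
print(payback_days(30000, 180.0, 6.0))    # ~172 days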

Paul
 
tumppiw

Paul said:
[massive snippage]

True, but as my machine is on 24/7, it might as well do something
useful anyway...


Norm X

Paul said:
[massive snippage]

My two cents.

1) I went to
http://sourceforge.net/projects/bit...-0.8.1/bitcoin-0.8.1-win32-setup.exe/download
to download the Windows installer for Bitcoin 0.8.1, and Microsoft
Security Essentials flagged the software as malicious. No go.
2) I live in a cold climate and pay for expensive electricity to
heat my home with resistance coils in the baseboard. It is better
to fill my living quarters with valuable digital hardware that is
on 24/7. For me, computing is free (energy-wise).

X

PS The average Canadian could save up to $150 per month in Internet
connection fees by using the software in the Debian-based Kali
Linux. NASA has accepted Debian for use by astronauts and rejected
Windows XP on the networked laptops they use in orbit. Go Debian,
go!
 
Paul

Norm said:
[massive snippage]

Well, this sounds like fun :) Let me try my hand at it.

http://sourceforge.net/projects/bit...-0.8.1/bitcoin-0.8.1-win32-setup.exe/download

Well, first, we could check and see if the source is available,
and figure out why it might, or might not, trigger a security
warning.

Or I'll just upload that file to virustotal.com and see what they
think.

https://www.virustotal.com/en/file/...00c5634f45818d39cc12ce27ad964c905a6/analysis/

File name: bitcoin-0.8.1-win32-setup.exe
Detection ratio: 0 / 46
Analysis date: 2013-05-11 17:59:19 UTC ( 7 hours, 48 minutes ago )

So maybe the detection you're seeing is heuristic, and based on
the things that program needs to do to avoid upsetting normal
usage of the computer? (Like backing off when the user is active.)
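
One quick sanity check, before trusting any scanner either way, is
to hash the file you actually downloaded and compare it against the
hash on the VirusTotal analysis page. A minimal sketch (the
expected hash is a placeholder; use the one VirusTotal shows):

# Compute the SHA-256 of the downloaded installer and compare it to
# the hash shown on the VirusTotal page (placeholder value below).
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "<hash from the VirusTotal page>"   # placeholder
actual = sha256_of("bitcoin-0.8.1-win32-setup.exe")
print("match" if actual == expected else "MISMATCH: " + actual)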

The linux.tar.gz has a source directory, as does the win32.zip.
If you needed to, you could compile them from source.

http://sourceforge.net/projects/bitcoin/files/Bitcoin/bitcoin-0.8.1/

Paul
 
Norm X

[massive snippage]

This whole thing about bitcoin mining smacks of a failure of due
diligence and an ignorance of the principles of economy of scale:

AWS Free Usage Tier
http://aws.amazon.com/free/

Amazon will give you 750 hours per month of cloud computing time.
They do this as a loss leader for those guys who might actually be
doing some useful computing, like maybe computational fluid
dynamics for weapons design.

http://aws.amazon.com/ec2/

"Cluster GPU Quadruple Extra Large 22 GiB memory, 33.5 EC2 Compute Units, 2
x NVIDIA Tesla "Fermi" M2050 GPUs, 1690 GB of local instance storage, 64-bit
platform, 10 Gigabit Ethernet"

From:
http://www.nvidia.ca/object/tesla-servers.html ....

"
Peak single precision floating point performance .... Tesla M2090 .... 1331
Gigaflops.
"

However, bitcoin calculations are all about crypto and hashing:
massively iterative and vectorized AND, OR, NOT, XOR and 32-bit
adds. Those are simple, fast integer operations compared to even
single-precision floating point.
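
The inner loop of mining is just a double SHA-256 over an 80-byte
block header, bumping a nonce until the hash falls below the
difficulty target. A minimal sketch, with a dummy header and an
artificially easy target so it finishes quickly:

# Bitcoin mining inner loop: double-SHA-256 the 80-byte header,
# incrementing the nonce until the hash is under the target.
import hashlib, struct

def mine(header76, target, max_nonce=2**32):
    for nonce in range(max_nonce):
        header = header76 + struct.pack("<I", nonce)  # 4-byte LE nonce
        digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
        if int.from_bytes(digest, "little") < target:
            return nonce, digest[::-1].hex()
    return None

header76 = b"\x00" * 76       # version/prev/merkle/time/bits (dummy)
easy_target = 1 << 240        # real targets are vastly smaller
print(mine(header76, easy_target))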

The bitcoin crowd reminds me of the Greenhouse effect, global
warming and the IPCC. Groupthink! If they can't do simple computing
research, integral calculus, and show a mastery of economics, their
ideas are no more than fool's gold.

X
 