Reduce File Size, Speed Up Calculation

chg

Hey all,

I've got a workbook where I do some financial backtests using daily data
for the last 10 years for 10 different countries, with about 20 different
variables.

I've got 10 data sheets: one sheet for each country's data, containing 20
columns + 3000 rows;
1 sheet containing all the formulas to basically calculate my backtest
(array formulae, VBA functions, etc.);
and 5 other sheets containing charts etc.

That's why I've got a file size of close to 90 MB.

Can you think of any method to improve file size and speed? (Saving
takes up to 2 minutes.)
Should I move the country data into a separate workbook, splitting
it from the actual calculations? Should I read it into a VBA array?
Anything else?

Your help is very much appreciated.

Chris
 
Charles Williams

Hi Chris,

See my website
http://www.decisionmodels.com/optspeedb.htm

and my MSDN white paper
http://msdn.microsoft.com/en-us/library/aa730921.aspx

for guidance on how to find out where the bottlenecks are and how to speed
up calculation etc.
(The performance of VBA functions is very sensitive to the way they are
written, and array formulae can be very slow.)
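
As a rough illustration of the first point (a hypothetical UDF, assuming it is passed a multi-cell column of numbers): reading the whole range into a Variant array in one go is usually far faster than touching the cells one at a time:

Function SumPrices(rng As Range) As Double
    ' Hypothetical example UDF - one read of the range into a 2-D Variant array,
    ' then all the work happens in VBA memory instead of per-cell Range calls.
    Dim v As Variant, i As Long, total As Double
    v = rng.Value2
    For i = 1 To UBound(v, 1)
        If IsNumeric(v(i, 1)) Then total = total + v(i, 1)
    Next i
    SumPrices = total
End Function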

WRT file size, check the used range on each sheet (if you have not already
done this) using Ctrl+End.
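
If it's quicker than stepping through the sheets by hand, a throwaway macro can list the used ranges in the Immediate window (names here are just examples):

Sub ListUsedRanges()
    ' Prints each sheet name and its used-range address, e.g. "Country1  $A$1:$T$3000"
    Dim ws As Worksheet
    For Each ws In ActiveWorkbook.Worksheets
        Debug.Print ws.Name, ws.UsedRange.Address
    Next ws
End Sub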

Depending on which Excel version you are using, it might be worthwhile
splitting out the charts and reporting stuff from the calculation stuff.
Usually it's better to keep the data in the same workbook as the calculation
engine, unless the calculations only use a small subset of the data.

Charles
___________________________________
The Excel Calculation Site
http://www.decisionmodels.com
 
chg

Charles, thanks for the great links. Maybe I can improve things by optimizing my
OFFSET formulas; I hadn't thought of that.
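
(Just a sketch of what I mean, with made-up cell references, assuming the OFFSET builds a variable-height window over a data column: OFFSET is volatile, so it recalculates on every change, whereas an INDEX construction that returns the same range is not, e.g.

=SUM(OFFSET($A$1,0,0,$B$1,1))
=SUM($A$1:INDEX($A$1:$A$3000,$B$1))

where $B$1 holds the number of rows to include.)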

I'm calculating one country at a time, so I've tried splitting the data
from the calc for now: from 85 MB combined to 71 MB data and 14 MB calc.
But Excel doesn't handle that kind of data very well (at least not at
my experience level).

I have already optimized the ranges, but I think my array formulas
could be optimized as well (variance for columns with blanks, etc.).
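
Something along these lines might replace that array formula (a hypothetical function, assuming a multi-cell column is passed in; blanks and text are simply skipped):

Function VarSkipBlanks(rng As Range) As Double
    ' Sample variance of the numeric, non-blank cells in rng,
    ' with the column read into a Variant array first.
    Dim v As Variant, i As Long, n As Long
    Dim total As Double, totalSq As Double
    v = rng.Value2
    For i = 1 To UBound(v, 1)
        If Not IsEmpty(v(i, 1)) Then
            If IsNumeric(v(i, 1)) Then
                n = n + 1
                total = total + v(i, 1)
                totalSq = totalSq + v(i, 1) * v(i, 1)
            End If
        End If
    Next i
    If n > 1 Then VarSkipBlanks = (totalSq - total * total / n) / (n - 1)
End Function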

But maybe it's better to split the reports from the calculations... any further
comments on how to optimize my method are very much appreciated.

Thx
 
Charles Williams

Yes, links are a disaster area waiting to happen.

It's only worthwhile using an RDB such as Jet/Access if the spreadsheet only
needs a small subset of the data at any one time,
but if that's what happens in your backtesting model it could be worthwhile.

Of course if the calculations are simple enough to be handled easily by SQL
then you could migrate the whole thing.
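
A rough sketch of what that subset pull might look like from VBA via ADO (the database path, table, field and sheet names here are made up - adjust to your own layout):

Sub LoadCountryData()
    ' Late-bound ADO: pull one country's rows from a Jet/Access file
    ' and dump them onto the calc sheet.
    Dim cn As Object, rs As Object
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Backtest\Data.mdb"
    Set rs = cn.Execute("SELECT * FROM DailyData WHERE Country = 'Germany'")
    Worksheets("Calc").Range("A2").CopyFromRecordset rs
    rs.Close
    cn.Close
End Sub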

Charles
___________________________________
The Excel Calculation Site
http://www.decisionmodels.com
 
