Opening/Closing Large Files

Nigel

Hi Jim
They are Excel 2003 files and calculation is not switched to manual, so I will
do that, but in fact there is not much to calculate in these books; they contain
a lot of forms and formatting, hence the large size.
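For reference, the usual VBA wrapper for a batch run like this looks something like the sketch below (a sketch only, assuming the extraction is driven from a macro; the procedure name is illustrative):

```vba
Sub RunExtraction()
    ' Suspend recalculation and screen redraws while processing many
    ' workbooks, then restore the user's original settings afterwards.
    Dim oldCalc As XlCalculation
    oldCalc = Application.Calculation
    Application.Calculation = xlCalculationManual
    Application.ScreenUpdating = False

    ' ... open / extract / close loop goes here ...

    Application.ScreenUpdating = True
    Application.Calculation = oldCalc
End Sub
```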

I have not tried to move them locally so that certainly might be an option
if the network is the limiting factor.

I am working on a formula approach and will be testing it soon.

Thanks
 
Dave Peterson

Thanks for posting back.

I'm not sure I'd try the save, close, reopen, resume approach. That doesn't
sound much different (to me) from just doing the first ### of the 1350 files.
(But I've been wrong lots of times.)
 
Charles Williams

One other thing to check is the number of files in your temp directory:
large numbers can slow things down.
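A quick way to check that count from VBA (a sketch, using the Scripting runtime via late binding):

```vba
Sub CountTempFiles()
    ' Report how many files are sitting in the user's temp folder
    Dim fso As Object, tmp As Object
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set tmp = fso.GetFolder(Environ$("TEMP"))
    MsgBox tmp.Files.Count & " files in " & tmp.Path
End Sub
```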

Charles
__________________________________________________
Outlines for my Sessions at the Australia Excel Users Group
http://www.decisionmodels.com/OZEUC.htm

Nigel said:
Thanks, guys, for stimulating the debate. The string pool solution did not
speed it up; in fact it slowed it down: I loaded a large pool as per the MS
article and timed the open-extract-close steps, which increased, then removed
the temporary sheet and the process went back to normal speed. If the limit is
21,000 pool items, I guess I reach that at around 700 files, the point at which
my process was at a near standstill, so maybe I could save, reopen and resume?

However I am working on a formula version as proposed and will be testing
that soon.

I will let you know how it goes. Thanks for all your advice.

--

Regards,
Nigel
(e-mail address removed)
 
Nigel

The formula approach worked. It reduced processing time to under 20 minutes to
make all the connections and update the data. I chose to do a copy/replace with
values after fully loading the files, but it might have been better to replace
as I looped through each file.

Thanks for all your input, advice and suggestions. We all learnt something!




Charles Williams said:
One other thing to check is the number of files in your temp
directory: large numbers can slow things down.

Charles
 
Brouws

Nigel,

You may have to add an "erase dataWb" statement inside the for/next
loop. I believe objects may not be overwritten when you re-set them,
which may still cause a memory leak, even though you may not actually
be running out of memory.
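One note on the syntax: in VBA, `Erase` applies to arrays; for an object variable like `dataWb` the usual idiom is `Set dataWb = Nothing`. A sketch of the loop with the reference released on each pass (file path and count are hypothetical):

```vba
Sub ProcessAllFiles()
    Dim dataWb As Workbook, i As Long
    For i = 1 To 1350
        Set dataWb = Workbooks.Open( _
            Filename:="C:\Data\Book" & i & ".xls", UpdateLinks:=0)
        ' ... extract what you need from dataWb here ...
        dataWb.Close SaveChanges:=False
        Set dataWb = Nothing   ' release the reference before the next pass
    Next i
End Sub
```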

Good luck,
 
