What is the proper way to open, close, and release .xls files in VBA?


MikeZz

Hi,
I have a VBA project that opens many files (between 10 and 500, sequentially),
extracts info from each, then closes it.

I open the data file, get what I need, close it, then open the next data file.

The files themselves are all pretty small - each under 60k.

However, as the application cycles through all the files, the memory usage
in Task Manager keeps growing and the app slows down.

I've checked in the VBA Editor after it's done, and the Project Explorer still
shows all the files that I opened and closed.

Can someone show me an example of the proper way to handle files so that the
memory is actually released back to the system?
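For reference, a minimal sketch of the open/read/close cycle, assuming a `fileNames` array of full paths (the sub name, parameters, and the `ReadOnly`/`UpdateLinks` options are illustrative, not from the original project):

```vba
Sub ProcessFiles(fileNames As Variant)
    Dim i As Long
    Dim wb As Workbook

    Application.ScreenUpdating = False
    Application.EnableEvents = False     ' stop Workbook_Open handlers firing

    For i = LBound(fileNames) To UBound(fileNames)
        Set wb = Workbooks.Open(Filename:=fileNames(i), _
                                ReadOnly:=True, UpdateLinks:=0)
        ' ... extract data from wb here ...
        wb.Close SaveChanges:=False
        Set wb = Nothing                 ' release the object reference
    Next i

    Application.EnableEvents = True
    Application.ScreenUpdating = True
End Sub
```

Closing the workbook and setting the variable to Nothing releases VBA's reference; Excel itself decides when to hand memory back to the system, so Task Manager may not drop immediately even when everything is released correctly.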

Thanks!
MikeZz
 

MikeZz

I have another routine that puts the data file names in an array, arrFiles.
The routine below is called in a loop that goes through all the data files:
it's passed the file number, opens that file, reads it into an array
(arrMaster), then closes the data file.
arrMaster gets cleared and resized each time a new file is opened.
The array is then used, and the process repeats until all the files have
been processed.
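The calling pattern described above might look roughly like this; arrFiles and ReadNewContract are from the original project, while the loop bounds and cleanup are illustrative:

```vba
Dim fileNo As Long
Dim arrMaster As Variant

For fileNo = 1 To UBound(arrFiles, 1)
    ReadNewContract fileNo, arrMaster   ' opens, reads, and closes the file
    ' ... use arrMaster here ...
    arrMaster = Empty                   ' drop the data before the next file
Next fileNo
```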

Thanks again for any help,

Sub ReadNewContract(fileNo, arrMaster)

    ' Module-level variables assumed elsewhere in the project:
    ' arrFiles, arrErrors, errCount, alignLeft, and the col* constants.

    Dim MasterFile
    Dim c, r
    Dim Master As Workbook
    Dim masterSht As Worksheet
    Dim rowsMaster, colsMaster, lastCellMaster
    Dim rightCol
    Dim LeftIndent
    Dim FoundIndent

    '############################################################################################
    '########### READ IN MASTER FILE
    '############################################################################################

    Set Master = Workbooks.Open(arrFiles(fileNo, colFileName))
    Set masterSht = Master.ActiveSheet
    MasterFile = Master.Name

    lastCellMaster = LastCellIn(masterSht)
    rowsMaster = LastRowIn(masterSht)
    arrFiles(fileNo, colFileRows) = rowsMaster
    colsMaster = LastColIn(masterSht)

    If rowsMaster = Empty Or colsMaster = Empty Then
        ' File was not converted to Excel properly
        arrFiles(fileNo, colFileStat) = "Empty File"
        Master.Close SaveChanges:=False
        Set masterSht = Nothing
        Set Master = Nothing
        errCount = errCount + 1
        arrErrors(errCount, colKeyErrFile) = _
            Get_File_Info(arrFiles(fileNo, colFileName), "FileName")
        arrErrors(errCount, colKeyErrCode) = "Empty Excel File"
        arrErrors(errCount, colKeyErrDesc) = "Empty Excel File"
        arrErrors(errCount, colKeyErrStat) = "Empty Excel File"
        Exit Sub
    End If

    ReDim arrMaster(1 To rowsMaster, 0 To colsMaster)
    For r = 1 To rowsMaster
        LeftIndent = 0
        FoundIndent = False
        rightCol = 0
        For c = 1 To colsMaster
            If alignLeft = True And FoundIndent = False And _
               Len(masterSht.Cells(r, c).Value) = 0 Then
                LeftIndent = LeftIndent + 1
                GoTo nextMc
            End If
            FoundIndent = True
            arrMaster(r, c - LeftIndent) = masterSht.Cells(r, c).Value
            If Len(arrMaster(r, c - LeftIndent)) <> 0 Then rightCol = c - LeftIndent
nextMc:
        Next c
        arrMaster(r, 0) = rightCol      ' column 0 stores the last used column
    Next r

    Master.Close SaveChanges:=False
    Set masterSht = Nothing
    Set Master = Nothing

End Sub
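One thing worth trying: the inner loops touch the worksheet once per cell, and cross-boundary calls from VBA into Excel are expensive. A common alternative is to pull the whole data range into a Variant array in a single assignment and loop over that instead; a sketch, reusing the names from the sub above:

```vba
' Sketch: read the data range in one shot instead of cell by cell.
' masterSht, rowsMaster, and colsMaster are as in ReadNewContract.
Dim data As Variant
Dim r As Long, c As Long

' For a multi-cell range, .Value returns a 1-based 2-D Variant array.
data = masterSht.Range(masterSht.Cells(1, 1), _
                       masterSht.Cells(rowsMaster, colsMaster)).Value

For r = 1 To rowsMaster
    For c = 1 To colsMaster
        ' process data(r, c) without touching the worksheet again
    Next c
Next r
```

Note the single-cell edge case: if the range is exactly one cell, `.Value` returns a scalar rather than an array, so guard for rowsMaster = 1 and colsMaster = 1 if that can occur.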
 

MikeZz

Just some additional info:

I set the file variable to Nothing and retried; same issue.
I get through the first 150-200 data files with no problem, but it eventually
comes to a screaming halt once Excel's memory in Task Manager gets over
60-70 MB.
In the beginning it can scan a file in half a second; by file #225 it takes
about 5-10 seconds. It's on #254 now with memory at 94 MB, at about
22 seconds per file.
CPU usage is always pegged at 50% (dual-core processor).

If I remember right, at some point over 100 MB of memory it actually seems to
speed up...
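When memory keeps climbing even though the variable is set to Nothing, it is worth confirming that the workbooks really are gone rather than merely dereferenced. A small diagnostic sketch to run in the Immediate window or after the loop finishes:

```vba
' Diagnostic: list any workbooks still loaded after the run.
' Only the main workbook should remain if every Close succeeded.
Dim wb As Workbook
Debug.Print "Workbooks still open: " & Workbooks.Count
For Each wb In Workbooks
    Debug.Print "  " & wb.Name
Next wb
```

If the count is right and memory still grows, the usual suspects are event handlers firing on each open (disable with Application.EnableEvents = False), automatic recalculation (Application.Calculation = xlCalculationManual), and the undo/clipboard buffers, rather than the arrays themselves.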

Thanks again,
 
