Problem with recursive folder copy

usman

Hi

I have a Windows service that backs up a folder to another location on the
same computer. The service is written in C#. The source folder is large
(over 8 GB) and the folder structure is very deep, so there is a lot of
recursion. The backup process terminates after some time, having copied
around 2 GB. If I debug the same code I don't get any error; instead the
whole folder is copied successfully. Is there any chance that the function
terminates because the recursion is very deep, or that, because of the time
involved, the Directory and File objects referring to files or folders are
destroyed by automatic garbage collection?

Below is the code that I'm using to recursively copy the folder.

Consider the IsPauseSignalled, DoPause, IsFileInFilter, IsFolderInFilter and
FilesDiffer functions tested for all errors.


----------------------------------------------------------------------------------------------------------------------------------

private bool CopyFolder(string szSrcFolder, string szTargetFolder,
                        string szFileFilter, string szFolderFilter)
{
    bool bRet = false;

    if (IsPauseSignalled())
        DoPause();

    // The DirectoryInfo constructor never returns null, so only Exists
    // needs to be checked.
    DirectoryInfo dirInfo = new DirectoryInfo(szSrcFolder);
    if (dirInfo.Exists)
    {
        FileInfo[] filesInfo = dirInfo.GetFiles();
        foreach (FileInfo info in filesInfo)
        {
            if (!IsFileInFilter(info.Name, szFileFilter))
            {
                string szSrcPath = Path.Combine(szSrcFolder, info.Name);
                string szTrgPath = Path.Combine(szTargetFolder, info.Name);
                try
                {
                    if (FilesDiffer(info, szTrgPath))
                    {
                        File.Copy(szSrcPath, szTrgPath, true);
                    }
                }
                catch (Exception ex)
                {
                    // Don't swallow errors silently in a service -- record
                    // them somewhere visible (event log, trace file).
                    Trace.WriteLine("Copy failed for " + szSrcPath +
                                    ": " + ex.Message);
                }
            }
        }

        try
        {
            // Recursively copy subdirectories.
            DirectoryInfo[] dirsInfo = dirInfo.GetDirectories();
            foreach (DirectoryInfo dInfo in dirsInfo)
            {
                if (!IsFolderInFilter(dInfo.Name, szFolderFilter))
                {
                    string szSrcPath = Path.Combine(szSrcFolder, dInfo.Name);
                    string szTrgPath = Path.Combine(szTargetFolder, dInfo.Name);
                    Directory.CreateDirectory(szTrgPath);
                    CopyFolder(szSrcPath, szTrgPath,
                               szFileFilter, szFolderFilter);
                }
            }
        }
        catch (Exception ex)
        {
            // An empty catch block here hides the very failure being
            // investigated; at minimum log the exception.
            Trace.WriteLine("Enumerating " + szSrcFolder +
                            " failed: " + ex.Message);
        }
        bRet = true;
    }
    return bRet;
}
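
For what it's worth, the recursion itself can be avoided entirely by walking the tree with an explicit stack of pending (source, target) directory pairs, so the depth of the folder tree cannot exhaust the call stack. This is only a minimal sketch; the filter, pause and FilesDiffer helpers from the post above are omitted:

```csharp
// Non-recursive directory copy: an explicit Stack<T> of pending
// (source, target) pairs replaces the nested CopyFolder calls.
// Sketch only -- filtering and diff checks from the post are omitted.
using System;
using System.Collections.Generic;
using System.IO;

static class TreeCopier
{
    public static void CopyTree(string srcRoot, string dstRoot)
    {
        var pending = new Stack<KeyValuePair<string, string>>();
        pending.Push(new KeyValuePair<string, string>(srcRoot, dstRoot));

        while (pending.Count > 0)
        {
            KeyValuePair<string, string> pair = pending.Pop();
            Directory.CreateDirectory(pair.Value);

            // Copy the files at this level.
            foreach (string file in Directory.GetFiles(pair.Key))
            {
                File.Copy(file,
                          Path.Combine(pair.Value, Path.GetFileName(file)),
                          true);
            }

            // Queue the subdirectories instead of recursing into them.
            foreach (string dir in Directory.GetDirectories(pair.Key))
            {
                pending.Push(new KeyValuePair<string, string>(
                    dir,
                    Path.Combine(pair.Value, Path.GetFileName(dir))));
            }
        }
    }
}
```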
 
Marc Gravell

Not code, but have you tried "robocopy" with the /E switch? It is part
of the platform SDK (IIRC) and can be downloaded from Microsoft.

Frankly it is going to be more efficient and more reliable than
anything you or I can create, especially in C#.
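
For reference, an invocation along those lines might look like this (the paths are placeholders; /E copies subdirectories including empty ones, /R and /W limit retries on locked files, and /LOG makes failures visible, which the silent catch blocks above do not):

```shell
robocopy "C:\Data" "D:\Backup\Data" /E /R:2 /W:5 /LOG:C:\Logs\backup.log
```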

Marc
 
Andrew Morton

usman said:
Is there any chance that the function terminates because the recursion
is very deep, or that the Directory and File objects referring to files
or folders are destroyed by automatic garbage collection?

If robocopy isn't an option, then perhaps you could try a Thread.Sleep every
now and again to let the file system catch up.
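
One way to apply that suggestion to the copy loop from the original post is to pause briefly after each batch of files; the batch size and delay below are arbitrary illustration values, not recommendations:

```csharp
// Hypothetical throttled copy loop: sleep briefly after every batch of
// files so the service does not saturate the disk for long stretches.
using System;
using System.IO;
using System.Threading;

static class ThrottledCopier
{
    public static void CopyFilesThrottled(string srcDir, string dstDir)
    {
        int copied = 0;
        foreach (string file in Directory.GetFiles(srcDir))
        {
            File.Copy(file, Path.Combine(dstDir, Path.GetFileName(file)), true);
            if (++copied % 100 == 0)
                Thread.Sleep(250); // pause 250 ms every 100 files
        }
    }
}
```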

Is anything written to the event logs pertaining to the failure reason?

Andrew
 
