Ok, I would ***strongly*** recommend switching to FileStream and using byte
arrays. You can do async file access with FileStream (you set this up in the
constructor of the FileStream class), which could speed things up by as much
as 50%. According to MS, if you set your buffer too high you take a
performance hit, and if you set it too low you take an even worse one, so
you will have to play with the buffer size on different test machines until
you find a number you are comfortable with. I have appended a code snippet
that takes your code and changes it from StreamReader to FileStream.
HTH,
Kryil
<code>
private DataTable LoadFile(string sfileName)
{
    DataTable DT_Audit = new DataTable("AZMViewTable");
    byte[] byteData; // holder for the data we are going to read from the file.

    // Try playing around with the buffer size. I use 2048 as a default, but
    // that may not be the best for your application. You might also want to
    // look into using FileStream's async methods and multiple threads.
    using (FileStream fs = new FileStream(sfileName, FileMode.Open,
        FileAccess.Read, FileShare.None, 2048, true))
    {
        byteData = new byte[fs.Length]; // size the byte array to the file.
        int offset = 0;
        while (offset < byteData.Length) // Read may return fewer bytes than requested
        {
            int read = fs.Read(byteData, offset, byteData.Length - offset);
            if (read == 0) break; // unexpected end of file
            offset += read;
        }
    } // the using block closes the stream and the file for us.

    // Once you have the file read in as a byte array, it is very simple to
    // use things like StringReader or other Readers.
    string data = System.Text.Encoding.Default.GetString(byteData, 0,
        byteData.Length); // transform the bytes to a readable string.

    // StringReader is a good candidate for reading very large strings.
    using (StringReader sr = new StringReader(data))
    {
        string sAuditRecord = sr.ReadLine();
        while (sAuditRecord != null)
        {
            DataRow rowAudit = DT_Audit.NewRow();
            // Split string sAuditRecord and store it in appropriate fields in rowAudit
            DT_Audit.Rows.Add(rowAudit);
            sAuditRecord = sr.ReadLine();
        }
        return DT_Audit;
    }
}
</code>
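For what it's worth, here is a minimal sketch of the BeginRead/EndRead pattern that goes with that async constructor flag. The temp-file setup and the 2048 buffer size are just placeholders for illustration; a real app would do useful work between BeginRead and EndRead, or pass a callback instead of blocking.

```csharp
using System;
using System.IO;
using System.Text;

class AsyncReadExample
{
    // Hypothetical helper: reads a whole file via FileStream's
    // BeginRead/EndRead pair, which is what the final 'true'
    // (async) constructor argument enables.
    public static byte[] ReadAll(string fileName)
    {
        using (FileStream fs = new FileStream(fileName, FileMode.Open,
            FileAccess.Read, FileShare.None, 2048, true))
        {
            byte[] buffer = new byte[fs.Length];
            int offset = 0;
            while (offset < buffer.Length) // a read may return fewer bytes than asked
            {
                IAsyncResult ar = fs.BeginRead(buffer, offset,
                    buffer.Length - offset, null, null);
                int read = fs.EndRead(ar); // blocks until this read finishes
                if (read == 0) break; // unexpected end of file
                offset += read;
            }
            return buffer;
        }
    }

    static void Main()
    {
        // hypothetical test file -- substitute your own path
        string path = Path.GetTempFileName();
        File.WriteAllText(path, "line one\r\nline two");
        byte[] data = ReadAll(path);
        Console.WriteLine(Encoding.Default.GetString(data));
        File.Delete(path);
    }
}
```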
VM said:
Thanks for your reply. I'm using the StreamReader class.
This is how I am currently doing it:
private DataTable LoadFile(string sfileName)
{
    DataTable DT_Audit = new DataTable("AZMViewTable");
    StreamReader sr = new StreamReader(sfileName);
    string sAuditRecord = sr.ReadLine();
    while (sAuditRecord != null)
    {
        DataRow rowAudit = DT_Audit.NewRow();
        // Split string sAuditRecord and store it in appropriate fields in rowAudit
        DT_Audit.Rows.Add(rowAudit);
        sAuditRecord = sr.ReadLine();
    }
    sr.Close();
    return DT_Audit;
}
As far as I know, there is no way to manipulate Windows virtual memory from
within an application. The VM is handled by the operating system, and it
decides what is stored there. If your application is taking large amounts of
RAM, you may want to modify your app's minimum requirements to include more
RAM, or try to read the file in chunks rather than looping through all
400,000+ lines at once. One interesting thing to note is how you are
accessing the file. Which of the stream classes are you using to access it?
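To illustrate the "read in chunks" suggestion, something along these lines might work. The batch size and the flush step here are assumptions you would replace with whatever fits your app (e.g. bulk-inserting each batch into a database instead of keeping all 400,000 rows alive at once):

```csharp
using System;
using System.Collections;
using System.IO;

class ChunkedLoad
{
    // Reads the file in batches of 'batchSize' lines so only one batch
    // is held in memory at a time. The flush step below just discards
    // the batch; a real app would hand it off (bulk insert, write to
    // disk, etc.) before clearing it.
    public static int LoadInChunks(string fileName, int batchSize)
    {
        int totalLines = 0;
        ArrayList batch = new ArrayList();
        using (StreamReader sr = new StreamReader(fileName))
        {
            string line;
            while ((line = sr.ReadLine()) != null)
            {
                batch.Add(line);
                totalLines++;
                if (batch.Count >= batchSize)
                {
                    // flush: process the batch, then let the garbage
                    // collector reclaim it
                    batch.Clear();
                }
            }
        }
        return totalLines;
    }

    static void Main()
    {
        // hypothetical file -- substitute your 77 MB audit file
        string path = Path.GetTempFileName();
        File.WriteAllText(path, "a\r\nb\r\nc\r\n");
        Console.WriteLine(LoadInChunks(path, 2)); // prints 3
        File.Delete(path);
    }
}
```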
Kyril
How can I limit the use of the PC's virtual memory? I'm running a process
that basically takes a txt file and loads it into a DataTable. The problem
is that the file is over 400,000 lines long (77 MB), and after a while I
get the Windows message saying that virtual memory is getting really low.
Plus the machine gets really sluggish (with multi-threading). Is it
possible to use the virtual memory until it reaches a certain limit and
then use HDD space?
Thanks.