How to limit use of virtual memory in my Win application?

VM

How can I limit the use of the PC's virtual memory? I'm running a process
that basically takes a txt file and loads it into a DataTable. The problem is
that the file is over 400,000 lines long (77 MB), and after a while I get the
Windows message saying that virtual memory is getting really low. Plus
the machine gets really sluggish (even with multi-threading). Is it possible
to use virtual memory until it reaches a certain limit and then use HDD
space?

Thanks.
 
Kyril Magnos

As far as I know, there is no way to manipulate Windows virtual memory from
within an application. VM is handled by the operating system, which decides
what is stored there. If your application is taking large amounts of RAM, you
may want to raise your app's minimum RAM requirement, or try to read the
file in chunks rather than looping through all 400,000+ lines (see the sketch
below). One thing worth looking at is how you are accessing the file: which
of the stream classes are you using?
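
As a rough sketch of what I mean by chunks (the 8192 buffer size and the
ProcessChunk method are placeholders, and sFileName stands for your file
path; none of this is from your code):

<code>
// Read the file in fixed-size chunks instead of holding it all at once.
using (FileStream fs = new FileStream(sFileName, FileMode.Open, FileAccess.Read))
{
    byte[] buffer = new byte[8192]; // chunk size; tune this per machine
    int bytesRead;
    while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
    {
        ProcessChunk(buffer, bytesRead); // handle just this chunk, then move on
    }
}
</code>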

Kyril
 
Eric Johannsen

Note, too, that virtual memory equals your physical memory plus the hard
disk space allocated to swapping. It may be an option to have your users
allocate more disk space to the page file.

However, I think the previous poster's suggestion to restructure your code
to make better use of available resources is the better solution.

Eric
 
VM

Thanks for your reply. I'm using the StreamReader class.

This is how I am currently doing it:
private DataTable LoadFile(string sFileName)
{
    DataTable DT_Audit = new DataTable("AZMViewTable");
    StreamReader sr = new StreamReader(sFileName);
    string sAuditRecord = sr.ReadLine();

    while (sAuditRecord != null)
    {
        DataRow rowAudit = DT_Audit.NewRow();
        // Split sAuditRecord and store it in the appropriate fields of the row
        DT_Audit.Rows.Add(rowAudit);
        sAuditRecord = sr.ReadLine();
    }
    sr.Close();
    return DT_Audit;
}
 
Kyril Magnos

OK, I would *strongly* recommend switching to FileStream and using byte
arrays. You can do async file access with FileStream (you set this in the
constructor of the FileStream class), which could speed things up by as much
as 50%. According to MS, if you set your buffer too high you take a
performance hit, and if you set it too low you take an even worse one, so
you will have to play with the buffer size on different test machines until
you find a number you are comfortable with. I have appended a code snippet
that takes your code and changes it from StreamReader to FileStream.

HTH,

Kyril

<code>

private DataTable LoadFile(string sFileName)
{
    DataTable DT_Audit = new DataTable("AZMViewTable");
    byte[] byteData; // holds the raw bytes we read from the file

    // Try playing around with the buffer size. I use 2048 as a default,
    // but that may not be the best for your application. You might also
    // want to look into FileStream's async methods and multiple threads.
    using (FileStream fs = new FileStream(sFileName, FileMode.Open,
        FileAccess.Read, FileShare.None, 2048, true))
    {
        byteData = new byte[fs.Length]; // size the array to the file

        // Read may return fewer bytes than requested, so loop until full.
        int bytesRead = 0;
        while (bytesRead < byteData.Length)
        {
            int n = fs.Read(byteData, bytesRead, byteData.Length - bytesRead);
            if (n == 0) break; // end of file
            bytesRead += n;
        }
    } // the using block closes the stream and the file

    // Once you have the file read in as a byte array, it is very simple to
    // use StringReader or the other readers.
    string data = System.Text.Encoding.Default.GetString(byteData, 0, byteData.Length);
    using (StringReader sr = new StringReader(data)) // a good candidate for very large strings
    {
        string sAuditRecord = sr.ReadLine();
        while (sAuditRecord != null)
        {
            DataRow rowAudit = DT_Audit.NewRow();
            // Split sAuditRecord and store it in the appropriate fields of the row
            DT_Audit.Rows.Add(rowAudit);
            sAuditRecord = sr.ReadLine();
        }
    }
    return DT_Audit;
}
</code>
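
Since I mentioned the async option: here is a rough sketch of what that
might look like with BeginRead/EndRead. ReadState and OnReadComplete are
just names I made up, and a single BeginRead can return fewer bytes than
requested, so a real implementation would loop:

<code>
// Hedged sketch of async file access; ReadState/OnReadComplete are hypothetical.
private class ReadState
{
    public FileStream Stream;
    public byte[] Buffer;
}

private void BeginLoadFile(string sFileName)
{
    FileStream fs = new FileStream(sFileName, FileMode.Open,
        FileAccess.Read, FileShare.None, 2048, true); // true = async I/O

    ReadState state = new ReadState();
    state.Stream = fs;
    state.Buffer = new byte[fs.Length];

    // Kick off the read; the callback fires on a thread-pool thread when done.
    fs.BeginRead(state.Buffer, 0, state.Buffer.Length,
        new AsyncCallback(OnReadComplete), state);
}

private void OnReadComplete(IAsyncResult ar)
{
    ReadState state = (ReadState)ar.AsyncState;
    int bytesRead = state.Stream.EndRead(ar); // may be less than Buffer.Length
    state.Stream.Close();
    // Parse state.Buffer (up to bytesRead) into the DataTable here, then
    // Invoke back onto the UI thread before binding anything to the grid.
}
</code>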
 
VM

Thanks very much for the info.
Before making these changes, I wanted to know what you thought of this idea
regarding the table and the 400,000+ rows. What I'm trying to do is display
this txt file in a Windows DataGrid. Basically, I call a method (with a
fileName parameter) that creates and fills a table from the 400K-line file
and returns the table (now holding 400K rows) to the form; then I attach
the table to the grid. I wrote this without knowing that the program would
have to read such immense files. Since I don't have to display all 400,000
records in the grid (the most the user will see at a time is 40),
theoretically, how would you load the file into a table and attach it to
the grid in small chunks?

Thanks again.

 
VM

I tried your suggestion, but the application just freezes while reading the
file. Would it work even on a 78 MB (79,146,798-byte) file? It's a huge file.


 
Kyril Magnos

Hmmm, good question... lol

The thing is, with your app the biggest performance hit is not in the UI
display, it is in getting the data. Disk access is slow; even on GHz HT
Pentiums it is slow compared to RAM, and that is where you are going to take
your hardest hit. I would first read the file into a DataTable using the
FileStream. Then I would keep that DataTable in a MemoryStream or some other
persisted medium (if you were using ASP.NET, I would say stuff it into the
Cache and really speed things up). Then I would create a method that returns
only 40 records in a temp DataTable:

<pseudo-code>
public DataTable GetRecords(int startRecord, int numberOfRecords)
{
    // Clone copies the column schema (but not the rows) from the source table.
    DataTable tempTable = dataTableSource.Clone();
    tempTable.TableName = "tempDataTable";

    for (int i = startRecord; i < startRecord + numberOfRecords; i++)
    {
        tempTable.ImportRow(dataTableSource.Rows[i]);
    }
    return tempTable;
}
</pseudo-code>
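
On the form, the paging could then look something like this (currentPage,
pageSize, dataGrid1, and the button handler are hypothetical names, just to
show the wiring):

<code>
private int currentPage = 0;
private const int pageSize = 40; // the user only ever sees 40 rows at a time

private void ShowPage(int page)
{
    currentPage = page;
    // Rebind the grid to a 40-row window instead of all 400,000 rows.
    dataGrid1.DataSource = GetRecords(currentPage * pageSize, pageSize);
}

private void btnNext_Click(object sender, EventArgs e)
{
    ShowPage(currentPage + 1); // a real handler would also check the upper bound
}
</code>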

Not the most elegant solution, but you are dealing with the most primitive
database known: text files! ;) I would make extra, extra sure to dispose of
ANYTHING that you don't need. It is going to take the GC a while to recover
the RAM used to read the text file initially and the StringReader used to
parse it, so take extra care to dispose of anything you create while the GC
is handling other things in the background.

HTH,

Kyril

 
Kyril Magnos

It should work just fine. I will test it here with a large file and post the
results.

~Kyril

 
VM

For such a huge file, what would the best buffer size be?

Thanks.


 
Kyril Magnos

Hi VM,

Sorry about the delay in getting back to you. I am still trying a few things
to see how to better implement this. I am currently using a buffer size of
65536 (64K), and it reads in from the FileStream quickly enough. The trouble
is when I get the results into a DataTable: my memory usage was up to 400 MB
yesterday afternoon! So I am looking into some other things, such as a
string array or reading the file line by line. I will keep you updated.
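
For the line-by-line idea, something roughly like this is what I have in
mind (a rough sketch; LoadFileStreaming is just a name I made up, and 65536
is the buffer size I am testing with):

<code>
private DataTable LoadFileStreaming(string sFileName)
{
    DataTable DT_Audit = new DataTable("AZMViewTable");

    // 64K buffer on the FileStream; the StreamReader pulls lines out of it,
    // so only one line is held in memory at a time instead of all 78 MB.
    using (FileStream fs = new FileStream(sFileName, FileMode.Open,
        FileAccess.Read, FileShare.Read, 65536))
    using (StreamReader sr = new StreamReader(fs, System.Text.Encoding.Default))
    {
        string sAuditRecord = sr.ReadLine();
        while (sAuditRecord != null)
        {
            DataRow rowAudit = DT_Audit.NewRow();
            // Split sAuditRecord and store it in the appropriate fields of the row
            DT_Audit.Rows.Add(rowAudit);
            sAuditRecord = sr.ReadLine();
        }
    }
    return DT_Audit;
}
</code>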

Kyril
