best memory allocation function for dealing with large (possibly in the 100s of MBs) byte arrays?


Jonathan Wilson

I am working on some software that has to deal with data that could be as
large as 500 MB or so. Currently I am using new[] and delete[] to manage
this memory, but I find it is not ideal: I sometimes get out-of-memory
errors if I open one large data item, free it, and then open another large
data item, even though I have plenty of memory (2 GB of physical RAM and
100 GB of free disk space for the swap file). Are there any functions
(either in the CRT or in the Win32 API) that would be better for this job?
Specifically, I am looking for functions that are guaranteed to return the
memory to the OS (so Task Manager shows the memory is no longer in use)
and that have as little overhead as possible.

The app in question runs only on Windows XP or later, and only one thread
ever touches this memory.
 

Johannes Passing

Hi Jonathan,

Assuming that you are talking about a 32-bit process here, you only have
(on XP) 2 GB of virtual memory accessible in user mode. Even if you have
more physical RAM and plenty of paging file space, your process will not
be able to use more than 2 GB of VM.

Your modules, stacks and heaps are sprinkled throughout these 2 GB of
VM, so you have to assume that the 2 GB are already quite a bit
fragmented. Given these constraints, it is pretty obvious that
allocating 500 MB of contiguous VM is very likely to fail.

So first you should ask yourself whether you really need such large blocks
of contiguous virtual memory, or whether you can split them into chains of
multiple, much smaller blocks (of which you may not even need all in
memory at once).
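
Just as an illustration, something along these lines (a rough sketch, not
from any library; the 1 MB chunk size is an arbitrary choice) gives you a
large buffer without ever asking for hundreds of MB of contiguous address
space:

    #include <vector>
    #include <cstddef>

    // Sketch: the data lives in many independently allocated chunks instead
    // of one contiguous block, so no single allocation has to find hundreds
    // of megabytes of unfragmented address space.
    class ChunkedBuffer
    {
    public:
        explicit ChunkedBuffer(std::size_t size, std::size_t chunkSize = 1 << 20)
            : size_(size), chunkSize_(chunkSize)
        {
            std::size_t chunkCount = (size + chunkSize - 1) / chunkSize;
            chunks_.resize(chunkCount);
            for (std::size_t i = 0; i < chunkCount; ++i)
                chunks_[i].resize(chunkSize);   // each chunk is a small allocation
        }

        unsigned char& operator[](std::size_t i)
        {
            return chunks_[i / chunkSize_][i % chunkSize_];
        }

        std::size_t size() const { return size_; }

    private:
        std::size_t size_;
        std::size_t chunkSize_;
        std::vector< std::vector<unsigned char> > chunks_;
    };

Indexing costs a division per access, but every individual allocation stays
small enough that address space fragmentation stops mattering.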

Finally, if you cannot avoid using large contiguous blocks of VM,
consider reserving a fair amount of VM early in the lifecycle of your
process and committing parts of it as you need them.
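
A minimal sketch of that reserve-then-commit pattern (the 512 MB
reservation and the 1 MB commit granularity are just example figures, and
error handling is trimmed):

    #include <windows.h>

    int main()
    {
        // Sizes are examples only.
        const SIZE_T reserveSize = 512 * 1024 * 1024;  // reserve 512 MB of address space
        const SIZE_T commitChunk = 1024 * 1024;        // commit 1 MB at a time

        // Reserve address space only -- no physical pages or pagefile space
        // are consumed yet.
        BYTE* base = static_cast<BYTE*>(
            VirtualAlloc(NULL, reserveSize, MEM_RESERVE, PAGE_NOACCESS));
        if (base == NULL)
            return 1;

        // Commit the first chunk once it is actually needed; further chunks
        // can be committed the same way as the data grows.
        if (VirtualAlloc(base, commitChunk, MEM_COMMIT, PAGE_READWRITE) == NULL)
            return 1;
        base[0] = 42;  // the committed pages are now usable

        // Releasing the region hands the memory back to the OS, which is
        // also what Task Manager will reflect.
        VirtualFree(base, 0, MEM_RELEASE);
        return 0;
    }

VirtualFree with MEM_RELEASE gives the whole region back to the OS, which
also addresses the Task Manager concern from the original post.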

--Johannes

