Memory allocation after deleting big waves

When working with datasets that include big waves, e.g. 16-bit unsigned integer 3D waves with (512, 384, 2048) points (about 800 MB), it sometimes happens that after deleting all waves and data folders within root, either programmatically or via the Data Browser, memory does not appear to be de-allocated. This is under Mac OS 10.8.5 and Igor 6.34.

According to the Mac Activity Monitor, Igor then still occupies 800-900 MB of memory. When a new data set of this size is loaded into a 3D wave, this results in an "out of memory" error, since the loading procedure contains a Duplicate command. Restarting Igor is then the only option. This behaviour seems rather random: sometimes it works, sometimes it does not.

Is there any work-around for this? My goal is to do batch processing on such data sets, which requires killing and loading waves sequentially.

This is probably the result of memory fragmentation: no contiguous block of memory large enough is available, even though the total free memory would seem to be sufficient.
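Fragmentation can be illustrated with a toy heap model (a sketch for illustration only, not Igor's actual allocator): after freeing every other allocation, half the heap is free, yet the largest contiguous free run is only one allocation wide, so a bigger request fails anyway.

```python
# Toy simulation of heap fragmentation (illustrative only; this is
# not Igor's actual allocator). The "heap" is a list of 100 slots,
# each holding the id of the allocation that owns it, or None if free.
HEAP_SIZE = 100

def largest_free_run(heap):
    """Length of the longest contiguous run of free (None) slots."""
    best = run = 0
    for slot in heap:
        run = run + 1 if slot is None else 0
        best = max(best, run)
    return best

# Fill the heap completely with 20 allocations of 5 slots each.
heap = [i // 5 for i in range(HEAP_SIZE)]

# Free every other allocation (ids 0, 2, 4, ...).
heap = [None if slot % 2 == 0 else slot for slot in heap]

total_free = heap.count(None)
print(f"total free slots:   {total_free}")              # 50
print(f"largest free block: {largest_free_run(heap)}")  # only 5
# A request for 10 contiguous slots now fails, despite 50 free slots.
```

Half the memory is "available", but no single allocation larger than one original block can succeed; this is the situation the 800 MB Duplicate runs into.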

Note that on Mac, a 32-bit application like Igor has access to 4 GB of memory.

We do have a 64-bit Igor, but it (currently) runs only on Windows.

Larry Hutchinson
Thanks for replying!
Yes, true, the 4 GB limit is never exceeded. So far I have had no problems with batch processing up to 40 datasets, but those waves were smaller than 800 MB.
...years later, I wonder if the question above has ever been answered:

So, is there a work-around?
Is there a way to force de-allocation and re-allocation of a certain big memory block?

I am running Mac OS 10.10.2 and Igor 6.36.
I iteratively create/delete/create/... a 1D integer wave of the same maximal size (ca. 120 kB) using Concatenate.

Is there a way to avoid excessive memory consumption and re-use the same memory block instead?
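One common approach in Igor is to Make the wave once at its maximal size and overwrite its contents in place on each iteration, rather than killing and re-concatenating it; whether that fully avoids the allocator behaviour described above I cannot say for certain. The idea, sketched in Python with a preallocated buffer (the function names and sizes here are illustrative, not Igor API):

```python
# Sketch of buffer reuse (illustrative Python; the Igor analogue would
# be Make-ing the wave once at maximal size and overwriting it, instead
# of killing it and rebuilding it with Concatenate each iteration).
MAX_POINTS = 120_000  # ca. 120 kB of 8-bit integers

# Growing by concatenation: every step builds a brand-new buffer,
# churning through allocations of increasing size.
def grow_by_concat(chunks):
    buf = bytearray()
    for chunk in chunks:
        buf = buf + chunk        # new allocation on every iteration
    return buf

# Reuse: allocate the maximal buffer once, then write chunks in place.
def reuse_buffer(chunks, buf):
    pos = 0
    for chunk in chunks:
        buf[pos:pos + len(chunk)] = chunk
        pos += len(chunk)
    return pos                   # number of valid bytes in buf

chunks = [bytes([i % 256]) * 1000 for i in range(120)]
buf = bytearray(MAX_POINTS)      # allocated once, reused every run
n = reuse_buffer(chunks, buf)
assert bytes(buf[:n]) == bytes(grow_by_concat(chunks))
print(f"{n} bytes written into the preallocated buffer")
```

The reuse version touches only one allocation for the lifetime of the batch run, so the allocator never has to find a fresh contiguous block per iteration.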