Large Memory Use

Jun 15, 2012 at 3:00 PM

I have about 60 sds dataset files that I'm reading into separate sds DataSet variables, which I store in a dictionary (as shown below).  All together they are about 100 MB in file size (*.nc files).


    Public Class DataSetCollectionClass
        Inherits Dictionary(Of String, sds.DataSet)
    End Class

The first operation I do is go through all the files and add them to the dictionary, something like this (MyDataSets is the instance of my DataSetCollectionClass):

    For Each f As String In files
        MyDataSets.Add(f, sds.DataSet.Open(f))
    Next

At this point, the memory use is minimal.  Then I go through the files and call GetData for each of the 8 arrays; each file contains about 8 one-dimensional arrays of roughly 750 points.  This causes the memory use to increase dramatically (to about 4 GB in this case).  After I have copied the data from the datasets into my program, I would like to release that memory somehow.  The only way I have found to do this is to purge my dictionary object of the datasets and recreate them.  So my questions are:


1.  Is there a way to release memory from the dataset variable after using GetData?  This way I don't have to purge my dictionary and recreate it.

2.  Any reason why the dataset is using so much memory?  Any way to make the memory use more efficient?
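
For reference, the read step looks roughly like this (MyDataSets is my DataSetCollectionClass instance, and "temperature" just stands in for my actual variable names; I may not have the GetData call exactly right from memory):

    For Each kvp In MyDataSets
        Dim ds As sds.DataSet = kvp.Value
        ' Copy one 1-D variable out of the dataset into a plain array
        Dim values As Double() = ds.GetData(Of Double())("temperature")
        ' ... store values in my own structures ...
    Next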

Aug 28, 2012 at 4:09 PM
Edited Aug 28, 2012 at 4:10 PM

[Edit] Oops, didn't notice how old this post was...

Do you need the dictionary for some reason? Perhaps you can just iterate through a string array of filename paths, open each DataSet referenced by the path, and call GetData there?

For example:

    foreach (string f in files)
    {
        using (DataSet ds = DataSet.Open(f))
        {
            // call GetData here
        }
        // once the 'using' block exits, the dataset is
        // released from memory
        // ...not sure what the VB.NET equivalent is
    }
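
For completeness: VB.NET has the same construct, Using ... End Using, so the equivalent should be roughly this (untested sketch):

    For Each f As String In files
        Using ds As sds.DataSet = sds.DataSet.Open(f)
            ' call GetData here
        End Using ' ds.Dispose() runs here, releasing the dataset
    Next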