Issues with running a large model probabilistically

Answered

12 comments

  • Official comment
    Rick Kossik

Keep in mind that although you are using a 64-bit version of Windows, GoldSim itself is a 32-bit application. Hence, the memory issue still applies.

  • Jason

    Kevin,

Thanks for posting in our forum! Would you mind opening Windows Task Manager, starting a simulation of your model while it is open, and monitoring the memory usage? Tell me what you see up until the point at which it crashes. If that doesn't provide enough information, perhaps we could get a copy of your model or set up a screen-sharing call so I can see more.

    -Jason

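For a reproducible record of the memory curve, a small script along these lines can log the GoldSim process's working set once per second. This is a minimal sketch, assuming Python with the psutil package installed and that the executable is named GoldSim.exe (adjust if your install differs):

```python
# Minimal sketch: poll the working-set size of a running GoldSim process
# once per second until it exits. Assumes the psutil package is installed
# and the process name is "GoldSim.exe" (an assumption; adjust as needed).
import time

import psutil

def log_goldsim_memory(process_name: str = "GoldSim.exe") -> None:
    procs = [p for p in psutil.process_iter(["name"])
             if p.info["name"] == process_name]
    if not procs:
        print(f"No running process named {process_name}")
        return
    proc = procs[0]
    try:
        while proc.is_running():
            rss_mb = proc.memory_info().rss / (1024 * 1024)
            print(f"{time.strftime('%H:%M:%S')}  {rss_mb:,.0f} MB")
            time.sleep(1)
    except psutil.NoSuchProcess:
        print("Process exited (or crashed).")

if __name__ == "__main__":
    log_goldsim_memory()
```

Start the simulation, run the script, and the last few lines printed before the crash show how close the process got to the limit.
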
  • Rick Kossik

By the way, this is indeed a very large model. It has 25 million outputs (I assume some are large arrays), which requires a lot of memory.

  • Brown, Kevin George

    Jason -- The Task Manager indicates that the CPU stays below 3% (for the GoldSim application) and that the memory begins at about 2,100 MB and approaches 3,000 MB when the model crashes. Perhaps an address space issue? 

Rick -- I understand that GoldSim is a 32-bit application (as distinguished from the 64-bit Windows OS); however, I thought that the address space was not as limited as it would be under a 32-bit Windows OS. I have reduced the number of final values saved in the model, changed the timestep, etc. to reduce the model size.

    Thank you for your help.

  • Rick Kossik

Yes, this is a memory issue. As you approach 3,000 MB, things fall apart. A 32-bit process can address at most 4 GB (2^32 bytes), and in practice allocations start failing well before that limit. Unfortunately, it is the application (not the OS) that determines this limitation. We will eventually convert GoldSim to 64-bit (unfortunately, that is not a trivial task). So, were you able to get it to run?

  • Brown, Kevin George

    Understood. I can get the model to run a limited number (20 or so) of realizations.

    I guess I will see what I can do to reduce the model size. 

    Thank you.

  • Rick Kossik

I need to know more about the model to figure out if there is anything that can be done to reduce the memory footprint. I'm guessing there might be. Having the model use 2.1 GB of memory before it starts running is amazing. Do you have some massive arrays? 25 million model outputs is a BIG number... Can you send me the model to investigate?

  • Brown, Kevin George

Rick -- Apologies, I cannot currently share the model, though I could probably show it to you over Zoom. I have 74 species (including rad and non-rad) and a large number of sources / phases / types / measurement types / error models / etc. that combine and bifurcate into a large set of necessary inputs -- although I am not quite sure why I have such a large number of outputs. You are correct that I do have a large number of large arrays in the model. Because of the nature of the data source, the inputs were (necessarily) arranged into a large number of large, sparse spreadsheets from which I am extracting the necessary inputs. Thank you.

  • Rick Kossik

Well, that is interesting. The linking to large spreadsheets could be the source of this. I assume you are pulling that data into large arrays? As you know, each element has one or more outputs. If the output is an array (say of 1000 items), that requires the memory of 1000 outputs. So you must have some big arrays. If you want to set up a web meeting, let's do that.

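As a rough back-of-envelope check on why 25 million outputs is so costly (a sketch only; the 8-bytes-per-value figure assumes each stored value is a double, which is an assumption rather than a documented GoldSim internal):

```python
# Back-of-envelope estimate of result-storage memory. BYTES_PER_VALUE = 8
# assumes each saved output value is an 8-byte double (an assumption, not
# a documented GoldSim figure).
BYTES_PER_VALUE = 8

def storage_mb(n_outputs: int, n_save_points: int = 1) -> float:
    """MB needed to hold n_outputs values at n_save_points saved time points."""
    return n_outputs * n_save_points * BYTES_PER_VALUE / (1024 ** 2)

print(f"{storage_mb(25_000_000):,.0f} MB")      # ~191 MB for one snapshot
print(f"{storage_mb(25_000_000, 10):,.0f} MB")  # ~1,907 MB with 10 save points
```

Even a single snapshot of 25 million values is large, and every additional saved time point multiplies it, which helps explain why the process starts out so large.
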
  • Brown, Kevin George

    Rick -- Thank you again for your interest and suggestions. I will send you a follow-up email so we can discuss via a web meeting. 

  • David Esh

Hi Kevin - Hope you are doing well! (And it sounds like you are a crazy man.)

It may not be possible for your application, but I have run into resource issues when dealing with large models linking to and from Excel. Run times can be excessively long and increase non-linearly as additional inputs/outputs are added. I would suggest using a DLL and/or text files instead. If you have large sparse matrices, you could read in only the non-zero data and their locations, then use a script to build the full array in GoldSim (see the sketch after this comment). I think this would save memory. Let me know if you try it and whether it works.

    Dave

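A minimal sketch of the idea Dave describes: store only the non-zero entries as (row, column, value) triplets in a compact text file, then expand them to the full dense array at load time. The Python below is illustrative only; the file name triplets.csv and the 74 x 1000 shape are hypothetical stand-ins, and inside GoldSim the equivalent expansion would be done with a Script element or a DLL:

```python
# Sketch: rebuild a dense array from a compact text file holding only the
# non-zero entries of a sparse matrix as (row, col, value) triplets.
# "triplets.csv" and the 74 x 1000 shape are hypothetical placeholders.
import csv

def load_dense(path: str, n_rows: int, n_cols: int) -> list[list[float]]:
    dense = [[0.0] * n_cols for _ in range(n_rows)]
    with open(path, newline="") as f:
        for row, col, value in csv.reader(f):
            dense[int(row)][int(col)] = float(value)
    return dense

matrix = load_dense("triplets.csv", n_rows=74, n_cols=1000)
```

The saving is on the input side: the compact triplet file replaces the large, mostly empty spreadsheet ranges that would otherwise be linked into the model.
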
  • Brown, Kevin George

    Hi David -- I hope all is well. And not the first nor likely the last time the word "crazy" will be used in relation to me....

Based on a conversation with Rick K (who has been a big help), I rewrote the entire input scheme (although still using Excel) to reduce the sparseness of the resulting matrices in the model -- this slightly increased the number of inputs but decreased the number of outputs from ~25 million (as indicated above) to ~2.5 million. That revision helped a bit in terms of "fitting" the model into the 3 GB address space, but the model is still limited in how many realizations I can run.

The source of the input data does not lend itself easily to a text-based input structure (although I will look into it again; a sketch of that export step follows below). I will also take a look at a DLL option. Have you worked much with the database options? I have not yet ventured there for this particular model.

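A companion sketch for the text-based route Dave mentioned: flatten a sparse worksheet into the kind of triplet file shown above, so that only the non-zero cells ever leave Excel. This assumes the openpyxl package; inputs.xlsx and Sheet1 are placeholder names:

```python
# Sketch: write only the non-zero numeric cells of a sparse worksheet as
# (row, col, value) triplets. "inputs.xlsx" and "Sheet1" are placeholders.
import csv

from openpyxl import load_workbook

wb = load_workbook("inputs.xlsx", read_only=True, data_only=True)
ws = wb["Sheet1"]
with open("triplets.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for r, row in enumerate(ws.iter_rows(values_only=True)):
        for c, value in enumerate(row):
            if isinstance(value, (int, float)) and value != 0:
                writer.writerow([r, c, value])
```
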
