
large strings and leaky memory

PatrickChilders
Level 3
The process I'm currently working on has some memory issues. I'm building emails from markup. These emails contain dynamically sized tables (more data = bigger table to email), and the table cannot be an attachment. Unfortunately this means I have run into some System.OutOfMemoryException issues.

I have tracked it to a calculation stage that stitches various parts of the email together. It's basically a laundry list of [variable] & [variable2] & [variable3] etc. The problem is that as the process runs, and at this point more than any other, the memory consumed is not released after it's done being used. All of my data stages have the "Reset to Initial Value whenever this page runs" checkbox checked. The contents of the calculation stage's output end up being put into a queue.

The specific exception thrown is:

    Exception occurred - System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
       at System.String.Concat(String str0, String str1)
       at BluePrism.AutomateProcessCore.clsProcessOperators.DoOp_Concatenation(clsProcessValue objVal1, clsProcessValue objVal2, clsProcessValue& objRes, Boolean bValidate, String& sErr)
       at BluePrism.AutomateProcessCore.clsProcessOperators.DoOperation(String sOp, clsProcessValue val1, clsProcessValue val2, clsProcessValue& objRes, Boolean bValidate, String& sErr)

I run into this exception not just when running the stage, but when evaluating it as well. It would appear that evaluating the expression consumes memory that is not disposed of after evaluation (otherwise I imagine only running the stage would cause an exception). The stage's evaluation (clicking "Test") doesn't immediately cause the exception, but does so eventually: each run through takes up a little more memory until one of the many evaluations happens to hit the memory ceiling. The test case that first gave me that exception from evaluating the calculation stage had an output of ~1.9 million characters (markup adds up quickly).

I am on Blue Prism version 4.2.43.0. Am I doing something wrong that is causing this, or is this something I'll need to maneuver around?

P.S. The memory is not released when Process Studio is closed, only when all of Blue Prism is closed.
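For context on what that stack trace is doing: chained & operators reduce to pairwise String.Concat calls, so every step copies everything built so far into a brand-new string. The following is a minimal C# sketch of that same pattern, purely for illustration - it is not Blue Prism's internals, just the equivalent .NET behaviour, contrasted with a single growable buffer:

    using System;
    using System.Text;

    class ConcatSketch
    {
        static void Main()
        {
            // Stand-ins for [variable1], [variable2], ... in the expression.
            string[] parts = new string[5000];
            for (int i = 0; i < parts.Length; i++)
                parts[i] = "<tr><td>row " + i + "</td></tr>";

            // Pairwise concatenation: each step allocates a new string holding
            // everything accumulated so far, so total allocations grow roughly O(n^2).
            // Strings past ~85 KB also land on the Large Object Heap, which the
            // .NET Framework GC does not compact, so the working set can stay high
            // even after the intermediate strings become unreachable.
            string email = string.Empty;
            foreach (string p in parts)
                email = string.Concat(email, p);

            // A single growable buffer avoids the intermediate copies entirely.
            var sb = new StringBuilder();
            foreach (string p in parts)
                sb.Append(p);
            string email2 = sb.ToString();

            Console.WriteLine(email.Length == email2.Length);   // same result, far less garbage
        }
    }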

Denis__Dennehy
Level 15
The only time I have seen similar issues recently was when a client was trying to use an overly massive collection (many tens of thousands of rows). Does your solution contain anything similar to that? The solution for that client was to re-orchestrate the solution so that the massive dataset he was manipulating was placed in a database or work queue instead.

As for memory - Blue Prism is a .NET application. I've given up trying to understand how .NET handles its memory, but its garbage collection should eventually sort itself out when it feels like it.
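The practical upshot of "put it in a database or work queue" is that you only ever hold a slice of the data in memory at once. A rough C# sketch of that batching idea, assuming a hypothetical FetchBatch source standing in for a database query or a work-queue "Get Next Item" loop:

    using System;
    using System.Collections.Generic;

    class BatchSketch
    {
        // Hypothetical source: returns at most batchSize rows starting at offset,
        // standing in for a database query or work-queue retrieval.
        static List<string> FetchBatch(int offset, int batchSize)
        {
            var rows = new List<string>();
            for (int i = offset; i < offset + batchSize && i < 100_000; i++)
                rows.Add("<tr><td>row " + i + "</td></tr>");
            return rows;
        }

        static void Main()
        {
            const int batchSize = 1000;
            int offset = 0;
            while (true)
            {
                List<string> batch = FetchBatch(offset, batchSize);
                if (batch.Count == 0) break;

                // Work this slice, then let it go out of scope so the GC can
                // reclaim it -- only one batch is ever held in memory at a time.
                Console.WriteLine($"Processed rows {offset}..{offset + batch.Count - 1}");
                offset += batch.Count;
            }
        }
    }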

After a support ticket to Blue Prism, while they recognized the issue they recommended a workaround. I do use a massive collection, but the curious part was that this massive collection didn't consume much memory, and it was handled afterwards. It seems that using "Remove All Rows" on the internal collection object will flush the memory, but overwriting that collection will not (as in a "Get X as Collection" action). Exactly why is up to them to figure out.

My workaround included storing as much data in the work queue as possible and splitting up operations. I still run into out-of-memory exceptions, but I have built the workaround so that the process can be restarted in such an event and no data is lost: after I get the next item from the queue, I immediately unlock it, then I work the item, and only re-lock it (by getting the next item matching the original item's value) to mark it complete or exceptioned. This way, if at any point there is a fatal error and the process terminates, I am not left with locked items in my queue or excess exceptions, and I can restart the process as many times as needed to work the entire queue. A rough sketch of that pattern is below.
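Sketch of the restart-safe pattern in C#. Every type and method here (WorkQueue, GetNextItem, Unlock, GetNextItemMatching, MarkComplete, MarkException) is a hypothetical stand-in for the Blue Prism work-queue actions described above, not a real API:

    using System;

    class QueueWorkaroundSketch
    {
        static void Main()
        {
            var queue = new WorkQueue();
            WorkItem item;
            while ((item = queue.GetNextItem()) != null)
            {
                string key = item.Value;
                queue.Unlock(item);              // release the lock straight away

                try
                {
                    DoTheWork(key);              // the long-running, memory-hungry part

                    // Re-acquire the same item by its value, then finish it off.
                    WorkItem again = queue.GetNextItemMatching(key);
                    queue.MarkComplete(again);
                }
                catch (Exception ex)
                {
                    WorkItem again = queue.GetNextItemMatching(key);
                    queue.MarkException(again, ex.Message);
                }
                // If the process dies mid-work, the item was already unlocked,
                // so a restart simply picks it up again -- nothing stays locked.
            }
        }

        static void DoTheWork(string key) { /* build and send the email */ }
    }

    // Hypothetical stubs so the sketch compiles on its own.
    class WorkQueue
    {
        public WorkItem GetNextItem() { return null; }
        public WorkItem GetNextItemMatching(string value) { return null; }
        public void Unlock(WorkItem item) { }
        public void MarkComplete(WorkItem item) { }
        public void MarkException(WorkItem item, string reason) { }
    }

    class WorkItem { public string Value; }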