Yes, here's another OutOfMemory problem

PvD_SE
Level 12
Hi folks,

I have a large process that does the following:
  • Get data in CSV format from three external systems
  • Select relevant data and save the data as XL
  • Compare the different XLs and create a difference report
The first step picks up rather large CSV files (70k - 150k rows) and emails them to the process.
Step two opens the CSVs with OLEDB, splits the data, and saves it as XL in different files.
The last step opens whatever XLs have to be compared and does a lot of filtering and comparing of collections.

Note that I did run the process on a clean and freshly rebooted VDI, running nothing other than what the process requires (a mainframe terminal and Outlook). Checking the memory in Task Manager revealed BP to be the memory pirate, peaking at 940 MB just before the process became the proverbial dead parrot. The mainframe terminal traditionally uses almost no memory, and Outlook remained idle most of the time.

Having had problems with this process for a long time, we've managed to move all possible data selection to the mainframe server. So the data being emailed into step two is at a minimum today, but it can still be 70k - 150k rows.

Step one runs fine, as does (most of the time) step three. Step two usually ends in an OutOfMemory error. To tackle this, I did the following:
  • Use OLEDB wherever possible
  • Split filtering collection data in chunks of max 15k rows (see the sketch after this list)
  • Close (kill) XL after each use
  • Empty collections after each use
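
For illustration, here's a minimal VB.NET sketch of the sort of code stage this boils down to: reading the CSV through OleDB and working on it in 15k-row slices so the whole file never sits in one collection at once. The folder, file name and the filtering placeholder are assumptions, not the actual process:

Imports System.Data
Imports System.Data.OleDb
' (In a Blue Prism object these namespaces would go under Code Options.)

Dim folder As String = "C:\Data"        ' assumed input folder
Dim fileName As String = "input.csv"    ' assumed CSV name
Dim chunkSize As Integer = 15000        ' the 15k-row chunking mentioned above

Dim connStr As String = _
    "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" & folder & ";" & _
    "Extended Properties=""text;HDR=Yes;FMT=Delimited"""

Using conn As New OleDbConnection(connStr)
    conn.Open()
    Using cmd As New OleDbCommand("SELECT * FROM [" & fileName & "]", conn)
        Using reader As OleDbDataReader = cmd.ExecuteReader()
            Dim chunk As New DataTable()
            ' Build the column schema once from the reader.
            For i As Integer = 0 To reader.FieldCount - 1
                chunk.Columns.Add(reader.GetName(i), reader.GetFieldType(i))
            Next
            While reader.Read()
                Dim values(reader.FieldCount - 1) As Object
                reader.GetValues(values)
                chunk.Rows.Add(values)
                If chunk.Rows.Count >= chunkSize Then
                    ' ... filter this slice and append the result to the XL output here ...
                    chunk.Clear()   ' drop the rows before reading the next slice
                End If
            End While
            ' ... handle the final partial slice here, then let go of it ...
            chunk.Clear()
            chunk.Dispose()
        End Using
    End Using
End Using

Keeping only one slice of rows in the DataTable at a time is what keeps the working set down; everything else (filtering, saving as XL) happens per slice.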

With the problem in step two still present, I assume the next thing to tackle is releasing the memory occupied by the large collections. I already clear all data in these collections, but I assume this does not free up the memory.

Now to my questions:

How do I clean up the memory used by:
  • No longer needed collections?
  • CSV files previously read in the process?
  • XL files previously read in the process?
  • Is there a limit (e.g. 1 GB) on BP and its processes?

Happy coding!
Paul, Sweden
(By all means, do not mark this as the best answer!)
3 REPLIES

Hi Paul,

The memory held by the process can be reclaimed by forcefully invoking the Garbage Collector that is available with .NET. Garbage collection normally runs on a background thread at intervals decided by the runtime, but between runs the memory held by released objects can keep accumulating.

You can use the attached Garbage Collection Object and call it at the end of your process, or between the parts of your workflow where you no longer require the collections.
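
For reference, a minimal sketch of what such a garbage-collection code stage typically contains (the attached object may do more than this):

' Force a full collection, let finalizers run, then collect again so that
' objects which only became collectable after finalization are also reclaimed.
GC.Collect()
GC.WaitForPendingFinalizers()
GC.Collect()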

------------------------------
Hope it helps you out, and if my solution resolves your query, please mark it as the 'Best Answer' so that other members of the community with a similar problem can find the answer easily in future.

Regards,
Devneet Mohanty
Intelligent Process Automation Consultant | Sr. Consultant - Automation Developer,
Wonderbotz India Pvt. Ltd.
Blue Prism Community MVP | Blue Prism 7x Certified Professional
Website: https://devneet.github.io/
Email: devneetmohanty07@gmail.com

Regards,
Devneet Mohanty,
SS&C Blueprism Community MVP 2024,
Automation Architect,
Wonderbotz India Pvt. Ltd.

PvD_SE
Level 12
Hi Devneet,

I imported the GC object and built it into my process. Unfortunately, to no avail...

The GC is called after each clearing of the collections and after each closing of XL. Still, with a 70k-row CSV I get the OutOfMemory error at close to 900 MB of memory use by the Automate task, which Task Manager reports as 30% of overall memory use. There's a total of 16 GB of RAM in the PC, so how Task Manager arrives at that 30% I don't know.
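
In case it helps anyone else measuring this, here's a small sketch of the kind of code stage I could use to log the robot's own memory figures and compare them with Task Manager (the output name is just illustrative):

Imports System.Diagnostics

' Log the Automate task's own memory figures.
Dim proc As Process = Process.GetCurrentProcess()
Dim workingSetMb As Double = proc.WorkingSet64 / (1024.0 * 1024.0)
Dim privateMb As Double = proc.PrivateMemorySize64 / (1024.0 * 1024.0)
' "Memory_Report" is an illustrative output data item name.
Memory_Report = String.Format("Working set: {0:N0} MB, private bytes: {1:N0} MB", workingSetMb, privateMb)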

@ewilson:
Is there perhaps some sort of maximum memory limit for BP itself, like 900 MB? If not, why does my process run out of memory at 900 MB?
Happy coding!
Paul, Sweden
(By all means, do not mark this as the best answer!)

ewilson
Staff
@PvD_SE,

While parts of Blue Prism are 64-bit, the base Automate.exe process is a 32-bit application. I don't recall the exact reason for this, but I'm fairly certain it has to do with some of the underlying libraries that are being used. Since Automate.exe is 32-bit, that means it is limited to accessing no more than 2 GB of memory (there's a special case to this depending on certain flags, blah blah blah, but most 32-bit apps are unaware of this).
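
If you want to confirm it yourself, here's a quick sketch you could drop into a code stage to report the bitness of the running process:

' IntPtr.Size is 4 in a 32-bit process and 8 in a 64-bit one.
Dim bitness As String = If(Environment.Is64BitProcess, "64-bit", "32-bit")
Dim pointerBytes As Integer = IntPtr.Size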

Have you seen this post I made a while back regarding file chunking in the Utility - File Management VBO? It might help you with the large CSVs in step #2. I tested it with CSVs containing up to 5 million rows and 14 columns. It's also using OleDB, but perhaps there are some subtle differences between how I implemented it and how you did. 🤷‍♂️
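
Not the actual VBO code, but as a rough illustration of one way OleDB chunking can work, pulling a fixed number of records per pass with OleDbDataAdapter.Fill (file name and chunk size assumed):

Imports System.Data
Imports System.Data.OleDb
' (Namespaces would go under the VBO's Code Options.)

Dim connStr As String = _
    "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Data;" & _
    "Extended Properties=""text;HDR=Yes;FMT=Delimited"""
Dim chunkSize As Integer = 15000      ' illustrative chunk size
Dim startRecord As Integer = 0

Using conn As New OleDbConnection(connStr)
    conn.Open()
    Using adapter As New OleDbDataAdapter("SELECT * FROM [input.csv]", conn)
        Do
            Dim chunk As New DataTable()
            ' Fill(startRecord, maxRecords, ...) loads at most maxRecords rows,
            ' so only one chunk is ever held in the DataTable at a time.
            adapter.Fill(startRecord, chunkSize, chunk)
            If chunk.Rows.Count = 0 Then Exit Do
            ' ... filter / write out this chunk here ...
            startRecord += chunk.Rows.Count
            chunk.Clear()
            chunk.Dispose()
        Loop
    End Using
End Using

Note that the text driver still reads through the file to skip to startRecord on each pass, but only chunkSize rows are ever held in memory.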

Cheers,
Eric