Looking for some advice on how to use the garbage collector
30-10-20 08:51 AM
For one of my processes, the Blue Prism automate.exe memory consumption reaches more than 1 GB after a step that kills the Excel process.
Before that step, automate.exe stays within 100 MB. I am not sure how killing the Excel process impacts automate.exe memory. I know that the Excel process was running with high memory usage, which is why I kill the Excel process instead of closing the workbook.
I was thinking of trying the garbage collector as mentioned here - https://portal.blueprism.com/customer-support/support-center#/path/Operational-Support/Performance/Routine-Maintenance/1141009962/How-do-I-avoid-Out-Of-Memory-issues.htm
However, I want to know the right way of implementing it. Shall I create a new VBO with just one action containing a single code stage, and call this action from the process?
I believe the garbage collector works at system level, so it shouldn't matter that it is a separate VBO, as long as it is called within the same session as the process.
Looking for some inputs on whether there is a better way to design and use it.
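For reference, a minimal sketch of the code stage such a single-action VBO could contain, assuming a plain VB.NET code stage with no inputs or outputs (the stage itself is illustrative, not taken from the linked article):

    ' Force a full .NET garbage collection inside automate.exe.
    GC.Collect()
    ' Let finalizers of released objects run, then collect again so that
    ' memory held by finalizable objects is also reclaimed.
    GC.WaitForPendingFinalizers()
    GC.Collect()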
------------------------------
Mayank Goyal
------------------------------
30-10-20 09:39 AM
If you read a lot of data into a collection before killing Excel, it is expected that your memory consumption goes up.
The problem is that forcing GC won't help much here, because GC can only free memory that is no longer referenced in your process/VBOs, and that should happen naturally on its own (although sometimes forcing it still helps).
Another problem to consider: if a VBO reads data into a collection inside the VBO, the data gets copied up to your process level but is never freed inside the VBO.
You can only clean up the collection inside the default VBO by running the same action again on an empty sheet so it returns an empty or small collection, then running the GC action - that should free up some memory. Alternatively, you can refactor the VBO to store all output data in a shared global collection and add an action that empties it and runs GC.
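A minimal sketch of what that clean-up action's code stage could look like, assuming the VBO's shared global collection is mapped to an in/out code-stage parameter named Results (a hypothetical name; Blue Prism passes collections to code stages as System.Data.DataTable):

    ' Hypothetical "Clear Results And Collect" code stage (VB.NET).
    ' Replacing the table drops the reference so the GC can reclaim the rows.
    Results = New DataTable()
    GC.Collect()
    GC.WaitForPendingFinalizers()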
------------------------------
Andrey Kudinov
Project Manager
MobileTelesystems PJSC
Europe/Moscow
------------------------------
30-10-20 01:26 PM
Higher memory consumption is not generally bad, unless the automate.exe process stops working because it runs out of memory.
Why do you have to kill the Excel process instead of closing the application normally?
This sounds a little like there are generally low resources available.
How do you read data from Excel?
- Copy/paste of sheet content, cell by cell, ... sometimes Excel thinks more rows/columns are used than actually contain data, resulting in huge memory consumption.
Do you work with (several) collections?
- Whole collection structures and the memory they use can be multiplied even if only a subset is needed.
Do you use collections as input/output parameters?
- This can lead to duplication of consumed memory.
Garbage collection will only work if there is any memory to be released. BP keeps the memory of collections as long as the VBO is in use (please feel free to correct me), and the VBO stays in use as long as the parent process is active. So you might want to structure your automation into sub-processes for certain sets of actions. Garbage collection could then be enforced in the Clean Up steps of the VBO and the process.
If restructuring the processes/VBOs is not possible, it might be worth trying to "truncate" collections to reduce the claimed memory.
------------------------------
Walter Koller
Solution Manager
Erste Group IT International GmbH
Europe/Vienna
------------------------------
30-10-20 03:56 PM
I don't have any large data copied into a collection or any other variable.
------------------------------
Mayank Goyal
------------------------------
30-10-20 06:36 PM
Don't kill it, try Close instance.
------------------------------
Andrey Kudinov
Project Manager
MobileTelesystems PJSC
Europe/Moscow
------------------------------
30-10-20 09:18 PM
On a side note, I have a template with many predefined pivots, and I have to paste data into one sheet in this template and refresh the pivots. The data is huge and the pivots are complex; when they are refreshed, the Excel process memory goes above 2 GB, and that is where all these challenges come from. Do you know an alternate, effective library (other than COM interop or macros) that I can use to refresh these pivots?
------------------------------
Mayank Goyal
------------------------------
03-11-20 07:10 PM
Please check the link below, where you can download the Garbage Collector utility.
https://rpatools.com/2019/02/garbage-collection-memory-management-in-blue-prism/
Thanks
Nilesh
------------------------------
Nilesh Jadhav
Senior RPA Specialist
ADP
India
------------------------------
08-03-21 03:10 PM
I ran two scenarios.
1st scenario:
I have an Excel file with 2,300 rows and 10 columns. After writing it from Excel into a collection, I neither removed all rows nor wrote an empty sheet into the collection, so the collection still contains all 2,300 records.
After the data had been copied from Excel into the collection, I captured the results using GC.GetTotalMemory(False) before and after explicitly running GC.Collect().
The result before GC.Collect() was 91 MB, and the output after reclaiming memory showed 46 MB.
2nd scenario:
I used the same Excel file with 2,300 rows and 10 columns. After writing it from Excel into a collection, I removed all rows using Collection Manipulation, so the collection is now empty.
Again, after the data had been copied from Excel into the collection, I captured the results using GC.GetTotalMemory(False) before and after explicitly running GC.Collect().
The result before GC.Collect() was 96 MB, and the output after reclaiming memory showed 45 MB.
My question is: if it is true that "you can only clean up the collection inside the default VBO by running the same action again on some empty sheet that would return an empty or small collection, then run the GC action - that should free up some memory", why and how did GC.Collect() reclaim almost as much memory in the 1st scenario as in the 2nd?
Should I use different code to see the results, or is something wrong with GC.Collect()?
Thanks for the response.
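For comparison, a minimal sketch of the kind of measurement code stage described above, assuming number output parameters named Before_MB and After_MB (hypothetical names):

    ' Hypothetical measurement code stage (VB.NET).
    Dim before As Long = GC.GetTotalMemory(False)   ' managed heap size, no forced collection
    GC.Collect()
    GC.WaitForPendingFinalizers()
    GC.Collect()
    Dim after As Long = GC.GetTotalMemory(True)     ' heap size after a full collection has run
    Before_MB = before / 1048576                    ' bytes to MB
    After_MB = after / 1048576

Note that GC.GetTotalMemory reports only the managed heap of automate.exe, not the full working set shown in Task Manager, so its figures will not match the >1 GB reading from the original post.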
------------------------------
Pradeep Surendar
IT admin
TCS
Asia/Kolkata
------------------------------
09-03-21 10:38 PM
I could consistently reproduce a process running into OOM on a certain data set, and dereferencing the big collection (assigning a new DataTable() to it) within the VBO avoided OOM on the same data set. I believe this is still true in current versions, because a collection within a VBO is still referenced and consumes memory.
Using a shared collection for multiple actions and clearing it keeps memory in check, but you need to be careful with empty inputs, because a shared collection on another action page is not cleared on page start, and an empty input will not replace the existing data, which leads to some unexpected results.
Another issue is that the Blue Prism worker seems to leak memory. If you run a process that does big collection manipulations, the worker never goes back to the ~100 MB RAM usage it starts with. It can end up at 400 MB or more, and then the next process might get OOM unless it initialises the same VBO. Simply restarting the worker process fixes it. Clearing huge collections in VBOs before exiting seems to mostly make it work as expected.
This may have been improved in 6.5+, although worker memory consumption still seems to build up over time.
------------------------------
Andrey Kudinov
Project Manager
MobileTelesystems PJSC
Europe/Moscow
------------------------------
25-03-22 03:24 PM
Thanks
------------------------------
Janu RPA Developer
------------------------------
