29-05-25 11:42 AM
Context:
We are using MS Excel VBO and the Write Collection action to write data to Excel. Specifically, we are writing one row at a time, with each row containing approximately 20 columns, resulting in a total of around 3,000 rows. During each iteration, the memory accumulates without being released, leading to system crashes, RPA server errors, and loss of connections. This issue does not occur when Excel is visible.
Request for Assistance:
We need further investigation and correction to address the memory release issue for objects interacting with Excel when it is hidden. This is crucial to ensure the stability and performance of our RPA processes.
Any insights or solutions to resolve this issue would be greatly appreciated.
29-05-25 12:07 PM
Hi @MartaCSilva
Thank you for being an SS&C Customer and for using Blue Prism.
I am sorry you are having an issue with the Excel VBO. I will review the issue and try to replicate it, and if I need further information from you, I will follow up here.
It may take me a couple of days to investigate this fully, but rest assured I will report back progress accordingly.
Regards
29-05-25 04:18 PM
Hi,
Have a look at this: https://community.blueprism.com/t5/Product-Forum/Looking-for-some-advice-on-how-to-use-Garbage-collector/m-p/52226#M7144
If you're still facing the problem, try updating Excel using OLEDB (a whole-batch update instead of writing one row at a time). A rough sketch of what that looks like is below.
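For anyone curious about the OLEDB route, here is a minimal sketch of a Blue Prism code stage (VB.NET) that writes rows without automating the Excel client at all. This is a hypothetical illustration, not the VBO's own code: the file path, sheet name, and column names are placeholders, and it assumes the Microsoft ACE OLEDB provider is installed on the runtime resource.

```vb
' Hypothetical sketch: write rows to a workbook via OLEDB, with no Excel COM process.
Dim path As String = "C:\Data\Output.xlsx"   ' placeholder path
Dim connStr As String = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" & path & _
                        ";Extended Properties='Excel 12.0 Xml;HDR=YES'"
Using conn As New System.Data.OleDb.OleDbConnection(connStr)
    conn.Open()
    ' inputCollection is a code stage input; Blue Prism passes collections in as DataTables.
    For Each row As System.Data.DataRow In inputCollection.Rows
        Using cmd As New System.Data.OleDb.OleDbCommand( _
                "INSERT INTO [Sheet1$] ([ColA], [ColB]) VALUES (?, ?)", conn)
            ' OleDb parameters are positional; the names here are ignored.
            cmd.Parameters.AddWithValue("p1", row("ColA"))
            cmd.Parameters.AddWithValue("p2", row("ColB"))
            cmd.ExecuteNonQuery()
        End Using
    Next
End Using   ' closing the connection releases the file; memory is not tied to an Excel instance
```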
02-06-25 08:15 PM
Hi,
One solution is to use the garbage collector to free up memory. If I know I'm dealing with large amounts of data, I usually run garbage collection before I start writing and again after writing; I think you can also find a utility object for this on the Blue Prism exchange. Another thing that could help is to treat the collection as a CSV and use the Excel action to import the CSV, which is much faster than writing it row by row with the normal Excel action.
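If you don't want to pull the utility object from the exchange, the cleanup it performs boils down to something like this in a VB.NET code stage (a minimal sketch; the double Collect is the standard pattern for sweeping objects that only become collectable after their finalizers run, such as COM wrappers):

```vb
' Force a full collection, let finalizers run, then collect again to sweep
' anything that only became collectable after finalization.
GC.Collect()
GC.WaitForPendingFinalizers()
GC.Collect()
```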
Simone
05-06-25 03:22 PM
@MartaCSilva to properly troubleshoot this, we'll need more information. Is it possible to share a screenshot of the process workflow where you're writing the rows, so we can see all the actions you're performing? If not, please detail them in a response. We need to understand the entire interaction with Excel from the perspective of your process.
Cheers,
Eric
06-06-25 02:56 PM
Memory errors when interfacing with large Excel documents with lots and lots of columns are probably one of the oldest Blue Prism issues. Suggested areas to attack this:
09-06-25 06:40 AM
I'm not sure, but I've noticed that even when I use the garbage collector, memory issues still occur, especially when I run huge data sets or multiple collections. There are also cases where, if some row or column contains characters or special characters that exceed the maximum character limit, an error will occur.
Here are some of the solutions I tried to avoid system out-of-memory issues/errors:
(1) Before the end of the page, I create a new loop calling the used collection. In the first loop I connect it to an end stage.
(2) Using collection manipulation, create logic that copies each row without using the loop function.
- With this approach, you can validate and/or review the data or characters that would cause an error.
I hope it helps.
09-06-25 02:19 PM
There are various things that could be going on here, but the most important thing to understand is that while Blue Prism is based on the .NET Framework (a managed execution environment that automatically cleans up memory via the garbage collection process), Microsoft Excel is a traditional Windows (COM-based) application. We have to leverage COM interop to work with the local Excel client from Blue Prism, so there is always the potential for memory leaks even when you're executing commands from .NET. A sketch of what careful COM cleanup looks like follows below.
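To make that concrete, this is roughly what a leak-safe COM interaction looks like in a VB.NET code stage. It is a sketch only, with the workbook path as a placeholder; the key points are keeping a reference to every COM object you touch and releasing each one explicitly:

```vb
Dim app As Object = Nothing
Dim books As Object = Nothing
Dim book As Object = Nothing
Try
    app = CreateObject("Excel.Application")   ' late-bound COM instance of Excel
    app.Visible = False
    books = app.Workbooks                     ' keep a handle to every object we touch
    book = books.Open("C:\Data\Input.xlsx")   ' placeholder path
    ' ... read/write the workbook here ...
    book.Close(False)
    app.Quit()
Finally
    ' Release each COM wrapper explicitly, innermost first, then collect.
    If book IsNot Nothing Then System.Runtime.InteropServices.Marshal.ReleaseComObject(book)
    If books IsNot Nothing Then System.Runtime.InteropServices.Marshal.ReleaseComObject(books)
    If app IsNot Nothing Then System.Runtime.InteropServices.Marshal.ReleaseComObject(app)
    GC.Collect()
    GC.WaitForPendingFinalizers()
End Try
```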
@MartaCSilva one thing that's not clear: you mention that memory increases during each iteration. Between iterations, is the process stopping? Do you ever perform a shutdown/cleanup of Excel itself, or does this process continually use the same Excel instance?
Cheers,
Eric
10-06-25 11:09 AM
Hi @MartaCSilva
I had a similar issue once, and I made a few changes to the Excel interactions which improved things and prevented the memory issue. First, check the version of Excel you are on: older (32-bit) installations are limited to 2 GB of memory in the application, and no amount of additional RAM on the machine will change this, but if you are able to upgrade to a newer version of Excel, this can increase the memory available to the application. The biggest change I made was to turn off the auto-calculate function in the Excel files while updating them, because every time you add a new row of data, any formulas will automatically recalculate, which increases memory usage. Finally, if possible, limit the number of Excel files you have open at the same time; if you only need one file open and any others can be closed, do that, as the extra files also consume memory even when you aren't doing anything with them.
I had the same issue and I couldn't upgrade to the latest version of Excel, but the other changes made huge improvements. I was working with a file with a large number of formulas, and turning off auto-calculate when the bot opened the file was the biggest fix to the memory issue. Once I added the new data, I turned it back on to recalculate, which takes a bit of time to finish, but I don't see the memory issues anymore. A rough sketch of the toggle is below.
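As an illustration, toggling calculation from a code stage looks something like this. It's a sketch assuming a late-bound Excel Application object named app; -4135 and -4105 are the standard xlCalculationManual / xlCalculationAutomatic constants:

```vb
' Before the big write: stop formulas recalculating after every row.
app.Calculation = -4135   ' xlCalculationManual

' ... write all the rows here ...

' After the write: switching back to automatic triggers the one-off recalculation.
app.Calculation = -4105   ' xlCalculationAutomatic
```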
Last point: if you are writing from a collection to Excel, then OLEDB might work, but it can cause issues with data type mismatches. An alternative could be to convert the collection to text, save the text as a CSV file, and then import the CSV file into your working Excel file, or open the CSV and copy and paste. A sketch of the collection-to-CSV step follows.
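Here is a minimal sketch of that collection-to-CSV step in a VB.NET code stage. inputCollection is an assumed code stage input (Blue Prism collections arrive as DataTables), the output path is a placeholder, and the quoting is deliberately naive - real data may need proper CSV escaping:

```vb
' Write a Blue Prism collection (arriving in the code stage as a DataTable) to CSV.
Using writer As New System.IO.StreamWriter("C:\Data\Staging.csv")   ' placeholder path
    ' Header line built from the column names.
    Dim headers As New System.Collections.Generic.List(Of String)
    For Each col As System.Data.DataColumn In inputCollection.Columns
        headers.Add(col.ColumnName)
    Next
    writer.WriteLine(String.Join(",", headers))

    ' One line per data row. Naive quoting: assumes no embedded commas or quotes.
    For Each row As System.Data.DataRow In inputCollection.Rows
        writer.WriteLine(String.Join(",", row.ItemArray))
    Next
End Using
```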
Hope these tips help 🙂