MS Excel VBO

MartaCSilva
Level 2

Context:

We are using MS Excel VBO and the Write Collection action to write data to Excel. Specifically, we are writing one row at a time, with each row containing approximately 20 columns, resulting in a total of around 3,000 rows. During each iteration, the memory accumulates without being released, leading to system crashes, RPA server errors, and loss of connections. This issue does not occur when Excel is visible.

Detailed Problem:

  1. Memory Accumulation: Each iteration of writing a row to Excel results in memory accumulation.
  2. Memory Not Released: The memory is not being released correctly, causing the system to eventually crash.
  3. Visibility Issue: The problem only occurs when Excel is hidden. When Excel is visible, the memory is managed correctly, and no crashes occur.

Request for Assistance:

We need further investigation and correction to address the memory release issue for objects interacting with Excel when it is hidden. This is crucial to ensure the stability and performance of our RPA processes.

Any insights or solutions to resolve this issue would be appreciated.

8 REPLIES

GeoffHirst
Staff

Hi @MartaCSilva 

Thank you for being an SS&C Customer and for using Blue Prism.

I am sorry you are having an issue with the Excel VBO. I will review the issue and try to replicate it, and if I need further information from you, I will follow up here.

It may take me a couple of days to investigate this fully, but rest assured I will report back progress accordingly.

Regards,

Geoff Hirst Senior DX Engineer - Digital Exchange - EMEA SS&C Blue Prism

Hi,

Have a look at this: https://community.blueprism.com/t5/Product-Forum/Looking-for-some-advice-on-how-to-use-Garbage-collector/m-p/52226#M7144
If you are still facing the problem, try updating Excel using OLEDB (a whole-batch update instead of writing one row at a time).
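The batch-versus-row-by-row difference can be sketched outside Blue Prism. Below is a minimal Python illustration using the standard-library sqlite3 module as a stand-in for an OLEDB connection to a workbook; the table name and columns are hypothetical, chosen only to mirror the original post's scale.

```python
import sqlite3

# In-memory database standing in for the OLEDB connection to the workbook.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Sheet1 (col1 TEXT, col2 TEXT, col3 TEXT)")

# Simulated collection: ~3,000 rows (trimmed to 3 columns here for brevity;
# the original post mentions ~20 columns).
rows = [(f"a{i}", f"b{i}", f"c{i}") for i in range(3000)]

# One batched statement instead of 3,000 separate single-row inserts.
conn.executemany("INSERT INTO Sheet1 VALUES (?, ?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM Sheet1").fetchone()[0]
print(count)  # 3000
```

The point of the sketch is the shape of the interaction: one round trip carrying all rows, rather than one round trip per row, which is where the per-iteration memory growth comes from.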

Hi,

One solution is to use the garbage collector to free up memory. Usually, if I know I'm dealing with large amounts of data, I collect the garbage before starting to write and again after writing; I believe you can also find a utility object for this on the Blue Prism Digital Exchange. Another thing that could help is to treat the collection as a CSV and use Excel's Import CSV action, which is much faster than writing row by row with the normal Excel actions.
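The collection-to-CSV step can be sketched as follows (a minimal Python example; the file name and column set are placeholders, and Blue Prism's own Import CSV action would then pull the finished file into Excel):

```python
import csv
import os
import tempfile

# Simulated collection: a list of dicts, one per row.
collection = [{"Name": f"Item {i}", "Value": i} for i in range(5)]

path = os.path.join(tempfile.gettempdir(), "collection_export.csv")
with open(path, "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["Name", "Value"])
    writer.writeheader()
    writer.writerows(collection)  # one pass, no per-row Excel calls

with open(path, encoding="utf-8") as f:
    line_count = sum(1 for _ in f)
print(line_count)  # 6 (header + 5 rows)
```

Because the file is written in one pass and imported in one action, Excel is touched once per batch instead of once per row.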

Simone

ewilson
Staff

@MartaCSilva to properly troubleshoot this, we'll need more information. Is it possible to share a screenshot of the process workflow where you're writing the rows, so we can see all the actions you're performing? If not, please detail them in a response. We need to understand the entire interaction with Excel from the perspective of your process.

Cheers,
Eric

Denis__Dennehy
Level 16

Memory errors when interfacing with large Excel documents with many columns are probably one of the oldest Blue Prism issues.  Suggested areas to attack this:

  • The link provided in a previous comment about invoking the garbage collector occasionally during your Excel interactions.
  • Ensure you are using the base version of the Excel VBO as supplied by Blue Prism, not a version that has been modified over the years and may have had issues introduced.
  • Think about re-orchestrating your solution. Do you need all the rows/columns in this spreadsheet? Does it need to be a spreadsheet at all? In my last role, my company was proactively removing spreadsheets from business processes, replacing them with apps/forms where user interaction was needed and with databases/dashboards where data was for presentation.
  • For very big spreadsheets, using the ODBC VBO to interface with Excel is far more efficient and far quicker, so look at whether that might be a better option for you. It makes the most sense where your Excel file is basically a large data repository requiring little or no formatting.

floresj
Level 3

I'm not sure, but I notice that even when I use the garbage collector, memory issues still occur, especially when I run huge data sets or multiple collections. There are also cases where, if a row or column contains data that exceeds its maximum character limit (or certain special characters), an error will occur.

Here are some of the solutions I tried to avoid system out-of-memory errors.

(1) Before the end of the page, I create a new loop over the used collection; the first loop I connect to an end stage.

(2) Using collection manipulation, create logic that copies each row without using a loop stage.

- With this approach, you can validate and/or review the data or characters that would otherwise cause an error.

I hope it helps.
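The pre-validation idea in (2) can be sketched in Python. The helper name and the sample rows below are made up for illustration; the 32,767-character figure is Excel's documented per-cell limit.

```python
EXCEL_CELL_LIMIT = 32767  # maximum number of characters an Excel cell holds

def oversized_cells(rows):
    """Return (row_index, column_name) pairs whose text exceeds the cell limit."""
    bad = []
    for i, row in enumerate(rows):
        for col, value in row.items():
            if len(str(value)) > EXCEL_CELL_LIMIT:
                bad.append((i, col))
    return bad

# Hypothetical collection: the second row would fail when written to a cell.
rows = [
    {"Notes": "ok"},
    {"Notes": "x" * 40000},
]
print(oversized_cells(rows))  # [(1, 'Notes')]
```

Running a check like this over the collection before the write loop lets the process flag or truncate the offending rows instead of failing mid-write.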

ewilson
Staff

There are various things that could be going on here, but the most important thing to understand is that while Blue Prism is based on the .NET Framework (a managed execution environment that automatically cleans up memory via garbage collection), Microsoft Excel is a traditional Windows (COM-based) application. We have to leverage COM interop to work with the local Excel client from Blue Prism, so there is always the potential for memory leaks even when you're executing commands from .NET.
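The general cleanup discipline this implies, releasing the unmanaged resource deterministically on every iteration rather than waiting for the garbage collector, can be illustrated in plain Python. This is only an analogy: the class below is a stand-in, and in a real .NET/COM setting Marshal.ReleaseComObject would play the role of close().

```python
import gc

class FakeComHandle:
    """Stand-in for a COM object wrapper that holds unmanaged memory."""
    released = 0

    def close(self):
        # Deterministic release, analogous to Marshal.ReleaseComObject in .NET.
        FakeComHandle.released += 1

handle = FakeComHandle()
try:
    pass  # ... interact with the "Excel" handle here ...
finally:
    handle.close()  # release explicitly, on every iteration

del handle
gc.collect()  # belt-and-braces sweep, as the GC utility object does
print(FakeComHandle.released)  # 1
```

The try/finally shape matters: if the release only happens on the happy path, one exception mid-loop is enough to start leaking handles.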

@MartaCSilva one thing that's not clear: you mention memory increasing during each iteration. Between iterations, does the process stop? Do you ever perform a shutdown/cleanup of Excel itself, or does the process keep using the same Excel instance?

Cheers,
Eric

michaeloneil
Verified Partner

Hi @MartaCSilva 

I had a similar issue once, and I made a few changes to the Excel interactions that improved things and prevented the memory issue. First, check the version of Excel you are on: older versions are limited to 2 GB of memory in the application, and no amount of additional RAM on the machine will change this, but if you are able to upgrade to a newer version of Excel, this can increase the memory available to the application. The biggest change I made was to turn off the auto-calculate function in the Excel files while updating them, because every time you add a new row of data, any formulas automatically recalculate, which increases memory usage. Finally, if possible, limit the number of Excel files you have open at the same time; if you only need one file open and any others can be closed, then do that, as the extra files also consume memory even when you aren't doing anything with them.

I had the same issue and couldn't upgrade to the latest version of Excel, but the other changes made huge improvements. I was working with a file with a large number of formulas, and turning off auto-calculate when the bot opened the file was the biggest fix to the memory issue. Once I had added the new data, I turned it back on to recalculate, which takes a bit of time to finish, but I don't see the memory issues anymore.

A last point: if you are writing from a collection to Excel, OLEDB might work, but it can cause issues with data-type mismatches. An alternative is to convert the collection to text, convert the text to a CSV file, and then import the CSV file into your working Excel file. Or open the CSV, then copy and paste.

Hope these tips help 🙂

#MVP