Memory Management query on Internal Work Queues

ashutoshkulkarn
Level 2
I am trying to load an Excel sheet containing more than 100k records (it might go upwards of 300k in some cases) in Blue Prism Process Studio using the MS Excel VBO. When I load the resulting collection into a Blue Prism internal work queue, I get the error quoted below:

'Load Data Into Queue' ERROR: Internal : Exception of type 'System.OutOfMemoryException' was thrown.

What is the safe limit for the amount of data I can load into a work queue? Can I modify that limit, or free up memory beforehand to avoid the error?

My plan is to process records one by one from the queue and write them into new Excel sheets by category. Loading all of that data into a collection and looping over it may be memory-intensive, so I am trying to find a more efficient way. The alternatives I can think of are:

1. Loop over the data in the collection, processing records one by one.
2. Keep the Excel sheet open, pick one record at a time, process it, and send it to the appropriate collection by category.

I welcome suggestions on which of these could prove more efficient. Thanks for all the help/tips in advance!
2 REPLIES

ashutoshkulkarn
Level 2
Hi all, any suggestions? Sorry for bumping this post; I am really stuck for an alternative (this is my first automation) :D

John__Carter
Staff
The upper limit depends on the PC being used, and the amount of data in each row is also a factor. I would recommend reading/loading in stages, perhaps 1K rows at a time. Also consider the effect of logging large collections: where appropriate, switch off logging on stages where the whole collection would otherwise be recorded in the database logs. Working cases directly from the spreadsheet isn't recommended, because multiple robots feeding from the same file will disrupt each other, there will be no MI, and restarting in the right place after a termination will be difficult.
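To make the staged approach concrete, here is a minimal sketch in Python (not Blue Prism code) of reading roughly 1K rows at a time so the whole workbook is never held in memory at once. In Blue Prism the equivalent would be a loop that reads a batch of rows with the Excel VBO and passes each batch to the Internal - Work Queues 'Add To Queue' action. The file name, sheet name, batch size, and load_batch_into_queue helper below are placeholder assumptions for illustration, not part of any Blue Prism or Excel VBO API.

```python
# Illustrative sketch only: the same batching idea described above, shown with
# openpyxl's read-only mode, which streams rows instead of loading the file
# into memory in one go.
from openpyxl import load_workbook

BATCH_SIZE = 1000  # roughly the "1K rows at a time" suggested above


def load_batch_into_queue(rows):
    # Hypothetical placeholder: in Blue Prism this step would be the
    # Internal - Work Queues 'Add To Queue' action called once per batch.
    print(f"Loading {len(rows)} rows into the work queue")


wb = load_workbook("input.xlsx", read_only=True)  # assumed file name
ws = wb["Sheet1"]                                 # assumed sheet name

batch = []
for row in ws.iter_rows(min_row=2, values_only=True):  # skip the header row
    batch.append(row)
    if len(batch) == BATCH_SIZE:
        load_batch_into_queue(batch)
        batch = []  # clear the batch so memory use stays bounded

if batch:  # flush the final partial batch
    load_batch_into_queue(batch)

wb.close()
```

The key point is that the batch is cleared after each load, so memory use stays roughly constant regardless of whether the sheet holds 100k or 300k rows.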