Memory Management query on Internal Work Queues
21-11-16 07:42 PM
I am trying to load an Excel sheet containing more than 100k records (it might go upwards of 300k in some cases) into a Blue Prism internal work queue, using the Excel VBO in Process Studio. Loading the collection into the work queue fails with the error quoted below:
'Load Data Into Queue' ERROR: Internal : Exception of type 'System.OutOfMemoryException' was thrown.
What is the safe limit for loading data into Work Queues? Can I modify that limit, or free up memory beforehand to avoid the error?
I plan to process records one by one from the queue and write them into new Excel sheets by category. Loading all of that data into a collection and looping over it may be memory-intensive, so I am trying to find a more efficient way.
The alternatives that I already can think of:
1. Loop over the data in a collection, processing records one by one.
2. Keep the Excel sheet open, pick one record at a time, process it, and send it to the appropriate collection by category.
I welcome suggestions on which of these could prove more efficient.
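Blue Prism processes are built visually, so there is no direct code for this, but the streaming idea behind option 2 can be sketched in Python as a language-neutral illustration. This assumes the data were exported to CSV; the function name, file names, and the `category` column are all hypothetical:

```python
import csv

def split_by_category(src_path, category_column):
    """Stream rows from a CSV export and append each row to a per-category
    output file, keeping only one row in memory at a time (instead of
    loading the full 100k+ row collection at once)."""
    writers = {}  # category -> csv.DictWriter
    files = {}    # category -> open file handle
    with open(src_path, newline="") as src:
        reader = csv.DictReader(src)
        for row in reader:
            cat = row[category_column]
            if cat not in writers:
                # First time we see this category: open its output file
                # and write a header row.
                f = open(f"{cat}.csv", "w", newline="")
                files[cat] = f
                w = csv.DictWriter(f, fieldnames=reader.fieldnames)
                w.writeheader()
                writers[cat] = w
            writers[cat].writerow(row)
    for f in files.values():
        f.close()
```

The point of the sketch is only that memory usage stays constant regardless of row count, because nothing is ever held beyond the current row.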
Thanks for all the help/tips in advance!
2 REPLIES
24-11-16 03:22 PM
Hi all, any suggestions?
Sorry for bumping this post; I am really stuck for an alternative (this is my first automation) 😄
24-11-16 03:38 PM
The upper limit depends on the PC being used, and the amount of data in each row is also a factor. I would recommend reading/loading in stages, perhaps 1,000 rows at a time. Also consider the effect of logging large collections: where the whole collection is being recorded in the database logs, switch that logging off where appropriate.

Working cases directly from the spreadsheet isn't recommended: multiple robots feeding from the same file will disrupt each other, there will be no MI, and restarting in the right place after a termination will be difficult.
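To make the staged-loading suggestion concrete: the idea is simply to slice the source rows into fixed-size batches and do one "Add To Queue" per batch, rather than one giant load. Blue Prism stages aren't code, so this is just an illustrative Python sketch of the batching logic (the function name and batch size are assumptions, not anything from the product):

```python
def batches(rows, size=1000):
    """Yield successive slices of at most `size` rows, so each queue-load
    step handles a small chunk instead of the whole dataset at once."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]

# Usage: each chunk would correspond to one "Add To Queue" call,
# after which the chunk's collection can be discarded to free memory.
# for chunk in batches(all_rows, 1000):
#     load_chunk_into_queue(chunk)   # hypothetical stage, not a real API
```

With 300k rows and a batch size of 1,000 that is 300 small loads, each of which fits comfortably in memory.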
