<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Memory Management query on Internal Work Queues in Product Forum</title>
    <link>https://community.blueprism.com/t5/Product-Forum/Memory-Management-query-on-Internal-Work-Queues/m-p/45243#M1519</link>
    <description>I am trying to load an Excel sheet containing more than 100k records (it might go upwards of 300k in some cases) into Blue Prism Process Studio using the Excel VBO. I am loading the resulting collection into an internal Blue Prism work queue, but I get the error quoted below:
'Load Data Into Queue' ERROR: Internal : Exception of type 'System.OutOfMemoryException' was thrown.

What is the safe limit for the amount of data I can load into a work queue? Can I modify that limit (or free up memory beforehand to avoid the error)?

I plan to process records one by one from the queue and write them into new Excel sheets by category. Loading all of that data into a collection and looping over it may consume a lot of memory, so I am trying to find a more efficient approach.
The alternatives I can think of so far:
1. Loop over the data in the collection, processing records one by one.
2. Keep the Excel sheet open, pick one record at a time, process it, and send it to the appropriate collection by category.
I welcome suggestions on which of these would be more efficient.

Thanks in advance for any help/tips!</description>
    <pubDate>Mon, 21 Nov 2016 19:42:00 GMT</pubDate>
    <dc:creator>ashutoshkulkarn</dc:creator>
    <dc:date>2016-11-21T19:42:00Z</dc:date>
    <item>
      <title>Memory Management query on Internal Work Queues</title>
      <link>https://community.blueprism.com/t5/Product-Forum/Memory-Management-query-on-Internal-Work-Queues/m-p/45243#M1519</link>
      <description>I am trying to load an Excel sheet containing more than 100k records (it might go upwards of 300k in some cases) into Blue Prism Process Studio using the Excel VBO. I am loading the resulting collection into an internal Blue Prism work queue, but I get the error quoted below:
'Load Data Into Queue' ERROR: Internal : Exception of type 'System.OutOfMemoryException' was thrown.

What is the safe limit for the amount of data I can load into a work queue? Can I modify that limit (or free up memory beforehand to avoid the error)?

I plan to process records one by one from the queue and write them into new Excel sheets by category. Loading all of that data into a collection and looping over it may consume a lot of memory, so I am trying to find a more efficient approach.
The alternatives I can think of so far:
1. Loop over the data in the collection, processing records one by one.
2. Keep the Excel sheet open, pick one record at a time, process it, and send it to the appropriate collection by category.
I welcome suggestions on which of these would be more efficient.

Thanks in advance for any help/tips!</description>
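The one-record-at-a-time routing described in alternative 2 can be sketched outside Blue Prism as plain Python; this is only an illustration of the grouping idea, not Blue Prism code, and the "Category" field and sample rows are hypothetical.

```python
# Route rows into per-category buckets one at a time, so each row is
# touched only once before being sent to its output collection.
from collections import defaultdict

def categorise(rows, key="Category"):
    """Group rows into per-category lists, one row at a time."""
    buckets = defaultdict(list)
    for row in rows:
        buckets[row[key]].append(row)
    return dict(buckets)

rows = [
    {"ID": 1, "Category": "A"},
    {"ID": 2, "Category": "B"},
    {"ID": 3, "Category": "A"},
]
grouped = categorise(rows)
# grouped["A"] holds rows 1 and 3; grouped["B"] holds row 2
```

In Blue Prism terms, each bucket would correspond to one output collection (or one output sheet) written as it fills, rather than holding the entire input in a single collection.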
      <pubDate>Mon, 21 Nov 2016 19:42:00 GMT</pubDate>
      <guid>https://community.blueprism.com/t5/Product-Forum/Memory-Management-query-on-Internal-Work-Queues/m-p/45243#M1519</guid>
      <dc:creator>ashutoshkulkarn</dc:creator>
      <dc:date>2016-11-21T19:42:00Z</dc:date>
    </item>
    <item>
      <title>Hi all, any suggestions?</title>
      <link>https://community.blueprism.com/t5/Product-Forum/Memory-Management-query-on-Internal-Work-Queues/m-p/45244#M1520</link>
      <description>Hi all, any suggestions?
Sorry for bumping this post, but I am really stuck for an alternative (this is my first automation) 😄</description>
      <pubDate>Thu, 24 Nov 2016 15:22:00 GMT</pubDate>
      <guid>https://community.blueprism.com/t5/Product-Forum/Memory-Management-query-on-Internal-Work-Queues/m-p/45244#M1520</guid>
      <dc:creator>ashutoshkulkarn</dc:creator>
      <dc:date>2016-11-24T15:22:00Z</dc:date>
    </item>
    <item>
      <title>The upper limit depends on</title>
      <link>https://community.blueprism.com/t5/Product-Forum/Memory-Management-query-on-Internal-Work-Queues/m-p/45245#M1521</link>
      <description>The upper limit depends on the PC being used, and the amount of data in each row is also a factor. I would recommend reading/loading in stages, perhaps 1,000 rows at a time. Also consider the effect of logging large collections - where appropriate, switch off logging on stages where the whole collection would otherwise be recorded in the database logs. Working cases directly from the spreadsheet isn't recommended: multiple robots feeding from the same file will disrupt each other, there will be no MI, and restarting in the right place after a termination will be difficult.</description>
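The staged loading suggested above can be sketched in plain Python; this is an illustration of the batching idea only, not Blue Prism code, and in the real process each slice would be passed to a separate 'Add To Queue' call instead of being counted.

```python
# Split a large collection into fixed-size batches so that only one
# batch at a time is held in memory and loaded into the work queue.
def batches(rows, size=1000):
    """Yield successive fixed-size slices of rows."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]

rows = list(range(2500))  # stand-in for 2,500 spreadsheet rows
sizes = [len(b) for b in batches(rows)]
# sizes == [1000, 1000, 500]
```

The batch size is a tuning knob: smaller batches keep peak memory low at the cost of more queue-loading calls.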
      <pubDate>Thu, 24 Nov 2016 15:38:00 GMT</pubDate>
      <guid>https://community.blueprism.com/t5/Product-Forum/Memory-Management-query-on-Internal-Work-Queues/m-p/45245#M1521</guid>
      <dc:creator>John__Carter</dc:creator>
      <dc:date>2016-11-24T15:38:00Z</dc:date>
    </item>
  </channel>
</rss>

