Hi John,
I will let others add their feedback on potential mitigations, API workarounds, and other best practices to prevent OOM events and SQL timeouts when working with large collections, but I wanted to make sure you have access to our best-practice guidance in the Knowledge Base article "How do I avoid Out Of Memory issues?", specifically the section (and sub-sections) under 'Process or Object design considerations'. The article "How do I fix a SQL Server 'Timeout expired' error?" also contains some steps you could implement for this scenario.
To address your question directly: there is no specific maximum file size for data items/collections beyond which these scenarios occur, because numerous factors (environment, collection type, size of the Processes/Objects involved, etc.) contribute to them. Our general guidance is to break the data into more manageable "chunks" where possible, implement regular garbage collection and wait stages in your Process design, and ensure the database is healthy enough to handle large data sets by following the guidance in our Maintaining a Blue Prism Database Server documentation.
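To illustrate the "chunking" idea in general terms, here is a minimal sketch in Python (not Blue Prism code; the function names and batch size are hypothetical) showing the pattern of processing a large data set in fixed-size batches, with a garbage-collection pass between batches analogous to the periodic collection stages recommended in Process design:

```python
import gc

def chunk(items, size):
    """Yield successive fixed-size slices of a sequence."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

def process_in_batches(rows, batch_size=1000):
    """Process rows one batch at a time so the full set is never held at once."""
    processed = 0
    for batch in chunk(rows, batch_size):
        # ... do the real per-row work on this small batch here ...
        processed += len(batch)
        # Release per-batch objects before the next iteration, analogous to
        # the regular garbage-collection stages suggested above.
        gc.collect()
    return processed
```

The same batching idea applies on the database side: retrieving or updating rows in pages rather than in a single large query reduces both memory pressure and the chance of a SQL timeout.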
------------------------------
Steve Boggs
Senior Software Support Engineer
Blue Prism
Austin, TX
------------------------------