RishankKumar
Level 2
Status: New
For huge collections, or data items holding large data, when Process Studio invokes an object and passes the items to Object Studio, a copy of that data is made in a separate memory space. This slows down processes that need to edit the original passed object (collection or data item) after applying business logic in Object Studio.

There should be an option (in the start/end stage) to pass only an address reference for large data items/collections, so that the object is not copied again into a different memory space.

Pass by value vs. pass by reference in .NET

Use case: processing huge datasets, such as a collection with a million rows, or editing large images.
Eg: two copies of the collection are made, which consumes a lot of memory and is slow.
Process Studio
START > Action stage (calls Object 1, input: collection1, output: collection1) > END
Object Studio
START > CODE (takes input from collection1, processes data, outputs to collection2) > END (returns collection2)
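The difference between the two flows above can be sketched outside of Blue Prism. Blue Prism itself is .NET-based; this is only a minimal Python analogy (the function names and the list-of-dicts "collection" are illustrative assumptions, not Blue Prism APIs) contrasting a deep copy at the call boundary with passing a reference to the same object:

```python
import copy

def process_by_value(rows):
    # Analogy for current behavior: the collection is duplicated
    # when it crosses the process/object boundary.
    local = copy.deepcopy(rows)  # second copy held in memory
    for r in local:
        r["processed"] = True
    return local  # caller receives the processed copy, not the original

def process_by_reference(rows):
    # Analogy for the proposed option: only the reference is passed,
    # so the original collection is edited in place with no duplicate.
    for r in rows:
        r["processed"] = True

data = [{"id": i} for i in range(3)]

result = process_by_value(data)
print(data[0])    # {'id': 0} - original untouched, but memory was doubled

process_by_reference(data)
print(data[0])    # {'id': 0, 'processed': True} - edited in place, no copy
```

With a million-row collection, the by-value path holds (at least) two full copies alive at once, which is the OOM risk described above; the by-reference path holds one.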
1 Comment
AndreyKudinov
Level 10
This! I had to de-VBO one of my processes into one big piece of garbage with code stages to actually make it work without running OOM on big data sets.

Another option that could help: optionally clean up VBO collections after a page is run and its output has been passed to the caller. That would at least free the memory still being used to hold that collection inside the VBO. Currently, to achieve this you need to make the collections shared and add an extra stage to clean them up and run GC.
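The cleanup workaround described above could be sketched roughly as follows. Again a hedged Python analogy, not Blue Prism code: `run_page` stands in for a VBO page, and the explicit `clear()` plus `gc.collect()` mirrors the extra cleanup-and-GC stage the commenter currently has to add by hand:

```python
import gc

def run_page(big_input):
    # Large working collection built inside the "VBO" page.
    working = [row * 2 for row in big_input]
    output = sum(working)  # only a small result goes back to the caller
    # Proposed option: release the internal collection once the output
    # has been handed over, instead of keeping it alive in the VBO.
    working.clear()
    gc.collect()  # analog of the manual GC stage in the workaround
    return output

total = run_page(list(range(1000)))
print(total)  # 999000
```

If the platform did this automatically after each page run, the extra shared-collection plumbing and manual GC stage would no longer be needed.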