
BEST PRACTICE - use 'Component' objects instead of Sub-Processes

Denis__Dennehy
Level 15
Hi all,

There have been a couple of instances recently where I have seen client/partner processes using sub-processes everywhere to store reusable parts of their process, or to store logic for parsing text, etc. Best practice is not to use sub-processes in this way; objects should be used instead. The reason is memory efficiency in the use of Blue Prism.

I don't think this recommendation is actually written down anywhere anymore, especially since we stopped confusing new users with the concept of 'Components', but we spent a bit of time at some enterprise clients a few years back re-architecting their solutions to use objects instead of sub-processes because of this issue.

Every time a sub-process is called, it is created anew in memory and destroyed (hopefully fully) when it ends. If your sub-process is being called for every case, then this is very inefficient in both memory and database use (the entire process has to be pulled from the database every time). Alternatively, an object is just created once and remains in memory for the next use.

So the lesson is simple: use objects instead of sub-processes where possible.
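To make the cost difference concrete, here is a rough Python analogy (illustrative only, not Blue Prism internals; the `Parser` class and function names are invented for the sketch):

```python
class Parser:
    """Stands in for a reusable text-parsing component."""
    def __init__(self):
        # Simulate the cost of loading the full definition on creation,
        # as happens when a sub-process is fetched and instantiated.
        self.definition = [0] * 100_000

    def parse(self, text):
        return text.strip().upper()

def run_as_subprocess(cases):
    # Sub-process style: a fresh instance is built and torn down per case.
    results = []
    for case in cases:
        parser = Parser()          # created anew for every call...
        results.append(parser.parse(case))
    return results                 # ...then garbage-collected each time

def run_as_object(cases):
    # Object style: one instance is created and reused for every case.
    parser = Parser()              # created once
    return [parser.parse(case) for case in cases]
```

Both produce the same results, but the sub-process style pays the creation/teardown cost once per case rather than once per run.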
7 REPLIES

TomBlackburn1
Level 7
Hi Denis, I know there was some discussion around this post, but can't find the comments.
Every time a sub-process is called it is created anew in memory and destroyed (hopefully fully) when it ends. If your sub-process is being called for every case, then this is very inefficient in memory and database use (having to pull the entire process every time). Alternatively, an object is just created once and remains in memory for the next use.
I have a task for a customer to create a master process that will govern each of their core business processes. Understandably, they have created processes, not objects, to do this. Can you please clear up memory usage when calling sub-processes? Is there a datasheet in your library that you could point me towards? - Tom

Anonymous
Not applicable
Hi Denis,

A little off subject, but I currently have a process that has just started to cause the VM to run out of memory. The process has been running fine for the past year, but in the last few weeks a couple of large batches of work have pushed it over the memory limit. The process page loops through a collection of data (say 200+ rows) and, for each row, filters a collection of just under 4,000 rows; it then loops through those filtered results to identify matches (so a series of loops, filters, collection-building, etc., depending on the matches each time). I've tried the clean memory utility, but that really slows the processing time when called upon, and the process still crashes out if the batch is big enough.

The question your post has brought to mind is: if I were to put those stages inside an object and send in the collections (and output the matched collections), would the object be less memory hungry than running the stages straight off the process page (perhaps if I looped rows into the object one at a time alongside the large collection)? I'm probably clutching at straws, as this is not a sub-process being used (which is what your post refers to).

The other coincidence is that I've recently reduced the stage logging settings, which probably speeds it up. I don't know if that has added to the VM stress (it seemed unlikely, but I've not retested with stages enabled again yet).
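Independent of the object-vs-process question above, the loop-then-filter pattern described here can be sketched in a more memory-friendly shape: index the large collection once, then look up matches per work item instead of re-filtering all ~4,000 rows on every pass and keeping every filtered copy alive. This is a hypothetical Python sketch of the idea, not Blue Prism stages; all names are invented:

```python
from collections import defaultdict

def match_rows(work_items, reference_rows, key):
    """Index the large collection once, then yield matches per work item,
    so only one small filtered set is alive in memory at a time."""
    index = defaultdict(list)
    for row in reference_rows:
        index[row[key]].append(row)   # one pass over the big collection

    for item in work_items:
        # Generator: each (item, matches) pair is handed out one at a
        # time instead of accumulating 200+ filtered collections.
        yield item, index.get(item[key], [])
```

The same idea applies in Blue Prism terms: build a keyed lookup once, and process one case's matches at a time rather than materialising every intermediate collection.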

BuzzaT
Level 4
Every time a sub-process is called it is created anew in memory and destroyed (hopefully fully) when it ends. If your sub-process is being called for every case, then this is very inefficient in memory and database use (having to pull the entire process every time). Alternatively, an object is just created once and remains in memory for the next use.
How does this affect scope, out of curiosity? For instance, an object invoked from [Process]->[Object_A] compared to [Process]->[Some_Object]->[Object_A]. Is the base object still loaded but with different data scopes, or is it an entirely new instance of Object_A? I'm curious because we actually have a few internal-reference and external-reference recursive object calls that use the idea of object scope to have different active attachments or data sets.

John__Carter
Staff
By default it would be a new instance of Object_A. However, BP v5 now has the concept of a 'shareable' application model, and you could make Some_Object share Object_A's model; then it would be the same instance of Object_A. If you look in the BP help, there are some diagrams to illustrate the concept.
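The default-versus-shared distinction can be pictured with a loose Python analogy (illustrative only; Blue Prism's sharing applies to the application model specifically, and the `attachment` attribute here is invented):

```python
class ObjectA:
    def __init__(self):
        self.attachment = None   # stands in for the attached application/state

# Default behaviour: each caller gets its own instance of Object_A.
process_copy = ObjectA()
some_object_copy = ObjectA()
process_copy.attachment = "session-1"
# some_object_copy.attachment is still None: separate instances,
# separate state, as in the default [Process]->[Some_Object]->[Object_A] case.

# 'Shareable' behaviour: both callers hold a reference to one instance.
shared = ObjectA()
process_ref = shared
some_object_ref = shared
process_ref.attachment = "session-1"
# some_object_ref.attachment is now "session-1" too: same instance.
```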

John, I have a follow-up question: do shareable objects maintain static variable settings even if they're called at different levels of scope? Or is it just the object structure that is maintained, with a new instance of the variables being used? And just so I know, you're from the internal Blue Prism support group, yes?

BuzzaT
Level 4
Throwing this on here in the hope of getting a 'New' tag so John sees this topic. On that note, it would be nice to have the ability to message, or even notify, other forum users.

Hi Denis, it is actually written in the Solution Designer Overview document, and there are a couple more references on this topic. These days we have some issues with a client who insists on creating a lot of sub-processes; this has caused the main process we develop to become slower and slower over time.

------------------------------
Julio César Gallarzo Gutiérrez
Junior Systems Engineer
Minsait by Indra
America/Mexico_City
------------------------------