a month ago
Hi,
I have been looking for practical advice on how to call nested objects more effectively in Blue Prism workflows.
Example.
Process -> ObjectFunction -> ObjectApp
Do you have any concern/risk/suggestion?
Thanks in advance.
a month ago
Hi Prachaya,
Referencing App Objects within other Objects is not recommended, for the reasons below:
The amount of memory used when the process loads the object is increased.
This is because when an object references another object and is used by a process, both objects are loaded into memory by that process.
The example above shows a short process containing three Objects:
As can be seen in the diagram, when launching the third Object (which contains all the contents of the Notepad Object and an Action that calls the Swingset Object), both Objects are loaded into memory, resulting in higher memory usage.
Please also note that the larger the objects are, the greater the difference becomes, potentially even causing “System.OutOfMemoryException” errors.
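Just to illustrate the memory point with a rough analogy outside Blue Prism (a minimal Python sketch, not how Blue Prism actually loads objects): referencing one object from another means that any process using the outer object also drags the inner one into memory.

```python
# Rough analogy only: each "business object" is a class that loads its
# full contents when it is created.
class NotepadObject:
    def __init__(self):
        self.contents = ["element"] * 10_000  # everything in the Notepad object

class WrapperObject:
    def __init__(self):
        # Because the wrapper references the Notepad object, any process
        # using the wrapper also holds the Notepad object in memory.
        self.notepad = NotepadObject()
        self.own_contents = ["element"] * 1_000

wrapper = WrapperObject()
# Memory held is the sum of both objects, not just the wrapper's own part.
print(len(wrapper.own_contents) + len(wrapper.notepad.contents))  # 11000
```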
a month ago
Prachaya,
Memory usage, as noted by Asilarow, is the most common argument. However, if you are truly running these applications concurrently, I think the argument is probably moot, as the memory usage will be about the same either way.
But I would add in two points, only from my own experience.
1. If you absolutely must run the applications concurrently, then it is what it is, but I would strongly lean towards running my applications consecutively and using the BP work queues to manage the handoffs. This does help to address the memory issue.
2. Nested Objects create headaches when troubleshooting, and even when creating Releases/Packages. If you check the Dependencies at the 'ObjectFunction-Group' level (from your example), it will not show the dependencies of the nested Objects.
We have about six 'Wrapper Objects' (there are a few threads on this topic) intended to address very specific, oft-repeated steps. As an example, the steps to extract values from an Excel Worksheet to a Collection. Once every few months or so, I forget to check the 'lower' dependencies, only to be reminded when I am importing a Release to the next environment. Not the end of the world, but it gets a bit annoying.
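For what it is worth, the kind of step such a wrapper covers is roughly the following, sketched here in plain Python with openpyxl purely as an analogy, not as the actual implementation of our Object:

```python
# Analogy only: read a worksheet into a collection-like list of dicts,
# similar in spirit to an "Excel Worksheet to Collection" wrapper action.
from openpyxl import load_workbook

def worksheet_to_collection(path, sheet_name):
    wb = load_workbook(path, read_only=True, data_only=True)
    ws = wb[sheet_name]
    rows = list(ws.iter_rows(values_only=True))
    header, data = rows[0], rows[1:]
    # Each row becomes a dict keyed by the header row, much like a
    # Collection with named columns.
    return [dict(zip(header, row)) for row in data]
```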
Take care,
Red
4 weeks ago
Hi Asilarow,
Thank you for your great information.
Let me explain the issue. I have a large process (over 70 pages) and am trying to come up with ideas to reduce the number of pages.
What do you think about using nested processes? Do you have any suggestions?
2 weeks ago - last edited 2 weeks ago
Hi Prachaya,
I recommend segmenting your design into sub-processes.
See example below:
In this case the design has been split into 4 parts:
- The main process that runs the sub-processes sequentially
- The "Initial Checks" sub-process which validates input data, loads the queue if required (if not loaded already), and prepares the environment for work (i.e. mapped drives, temp files, etc.)
- The "Execute Transactions" process, which gets the work items from the queue and works them to completion
- The "Create Reports" process, which creates and sends out MI reports for the work carried out during this run.
This example assumes that all of this should be done sequentially.
If your scenario requires running multiple concurrent sessions and splitting the workload between them, then the work-loading part would need to be done outside of the main process, via a separate "work loader" process scheduled to run at the required intervals.
Using sub-processes not only reduces the number of pages per process, but also reduces the memory usage at any given time.
This is because when a Process starts, it loads all of its contents, as well as the Objects marked as its dependencies, into memory for execution, and that memory is released only once the process ends (i.e. steps through the End stage on the main page).
So the more stages and objects used, the higher the memory usage will be.
Dividing the design into sub-processes reduces this, because each sub-process releases the memory it was using as soon as it finishes.
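If it helps to see the shape of it in code, here is a minimal sketch (my own analogy in Python, not Blue Prism internals) of the sequential layout above: each sub-process loads only what it needs, runs to completion, and its working memory can be released before the next one starts.

```python
# Analogy only: each sub-process is a function whose working data lives
# only while that function runs, mirroring how a sub-process releases its
# memory once it steps through its End stage.

def initial_checks():
    # Validate input data, load the queue if required, prepare the environment.
    return ["item-%d" % i for i in range(100)]

def execute_transactions(work_items):
    # Get the work items from the queue and work them to completion.
    return [item.upper() for item in work_items]

def create_reports(results):
    # Create and send MI reports for the work carried out during this run.
    print("Processed %d items" % len(results))

def main_process():
    # The main process runs the sub-processes sequentially; each one's
    # working data becomes eligible for release as soon as it returns.
    work_items = initial_checks()
    results = execute_transactions(work_items)
    create_reports(results)

main_process()
```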