Third Party Control/Scheduling

DylanPenman
Level 2
Hello all,

Looking to get some input from teams who have tried out third-party monitoring and have some experience with it. We are looking to automate with other tools such as Python/PowerShell in addition to Blue Prism. Ultimately we would like to be able to monitor all automation from one web application. I have seen some posts where developers mention using SQL queries on the Blue Prism database to get status information. What does that look like in terms of setup? Is there any Blue Prism documentation? What have been your experiences and lessons learned?

Thanks in advance.

------------------------------
Dylan Penman
Associate Software Engineer
------------------------------

BenKirimlidis
Level 7

Hi Dylan,

Strongly recommend using the BP SQL DB for reporting.  It will make your life a lot easier.  Other solutions exist as well, but I am not very familiar with them.

We initially used Blue Prism's built-in 'Work Queue' object to build reports in Excel, but that is unwieldy and imprecise.
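To give a sense of what the SQL route looks like, here is a minimal sketch of a queue snapshot query. I'm assuming the standard BPAWorkQueue / BPAWorkQueueItem tables; exact column names vary a little between BP versions (older schemas join on queueid rather than queueident), so check your own DB first.

    SELECT  q.name AS QueueName,
            SUM(CASE WHEN i.completed IS NOT NULL THEN 1 ELSE 0 END) AS Completed,
            SUM(CASE WHEN i.exception IS NOT NULL THEN 1 ELSE 0 END) AS Exceptioned,
            SUM(CASE WHEN i.completed IS NULL AND i.exception IS NULL THEN 1 ELSE 0 END) AS Pending,
            AVG(CAST(i.worktime AS FLOAT)) AS AvgWorkTimeSecs
    FROM    dbo.BPAWorkQueueItem AS i
    JOIN    dbo.BPAWorkQueue     AS q ON q.ident = i.queueident   -- older schemas: q.id = i.queueid
    GROUP BY q.name
    ORDER BY q.name;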

In our PROD deployment we have a BP app server and an MS SQL DB, but we also have a mirror DB created using a SQL Availability Group, which replicates any change in PROD to a secondary database with virtually no CPU/overhead cost on the primary.

From there, we have a series of custom SQL queries that pull out specific table data and store the results in a third MS SQL database we call Reporting.

This ensures the PROD DB is never touched and the mirror DB stays an exact match for PROD, while we create custom tables, views and SPROCs to interact with that data on a separate server, using refreshable data connections in the Excel reports.  Users frantically hitting refresh will never impact operations.
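As a rough illustration, one of those "copy" procedures looks something like the sketch below. Every name here is a placeholder for our setup (MIRRORSRV is a linked server pointing at the mirror, BluePrism the mirrored DB, rpt the schema in the reporting DB), and CREATE OR ALTER needs SQL Server 2016 SP1 or later.

    CREATE TABLE rpt.QueueItemHistory (
        ItemIdent        BIGINT           NOT NULL PRIMARY KEY,  -- BPAWorkQueueItem.ident
        ItemId           UNIQUEIDENTIFIER NOT NULL,
        QueueName        NVARCHAR(255)    NOT NULL,
        KeyValue         NVARCHAR(255)    NULL,
        Loaded           DATETIME         NULL,
        Completed        DATETIME         NULL,
        ExceptionDate    DATETIME         NULL,
        ExceptionReason  NVARCHAR(MAX)    NULL,
        WorkTimeSeconds  INT              NULL
    );
    GO
    CREATE OR ALTER PROCEDURE rpt.usp_CopyFinishedQueueItems
    AS
    BEGIN
        SET NOCOUNT ON;

        INSERT INTO rpt.QueueItemHistory
                (ItemIdent, ItemId, QueueName, KeyValue, Loaded, Completed,
                 ExceptionDate, ExceptionReason, WorkTimeSeconds)
        SELECT  i.ident, i.id, q.name, i.keyvalue, i.loaded, i.completed,
                i.exception, i.exceptionreason, i.worktime
        FROM    MIRRORSRV.BluePrism.dbo.BPAWorkQueueItem AS i      -- four-part name via linked server
        JOIN    MIRRORSRV.BluePrism.dbo.BPAWorkQueue     AS q ON q.ident = i.queueident
        WHERE   (i.completed IS NOT NULL OR i.exception IS NOT NULL)   -- finished items only
          AND   NOT EXISTS (SELECT 1 FROM rpt.QueueItemHistory AS h
                            WHERE h.ItemIdent = i.ident);              -- skip rows already copied
    END
    GO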

Doing this has allowed us to aggregate data for all operations at a process-by-process level, for each customer within our business, going back to when each process went live.  Additionally, we can show much more granular data for all activity in the previous 10 or so days, depending on user preference.
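The granular view is just a finer grouping of the same item data, something along these lines (same schema assumptions as above):

    SELECT  q.name                 AS QueueName,
            CAST(i.loaded AS DATE) AS LoadedDate,
            COUNT(*)               AS ItemsLoaded,
            SUM(CASE WHEN i.completed IS NOT NULL THEN 1 ELSE 0 END) AS Completed,
            SUM(CASE WHEN i.exception IS NOT NULL THEN 1 ELSE 0 END) AS Exceptions,
            SUM(CAST(i.worktime AS BIGINT)) / 3600.0 AS RobotHours
    FROM    dbo.BPAWorkQueueItem AS i
    JOIN    dbo.BPAWorkQueue     AS q ON q.ident = i.queueident
    WHERE   i.loaded >= DATEADD(DAY, -10, GETUTCDATE())   -- last 10 days or so
    GROUP BY q.name, CAST(i.loaded AS DATE)
    ORDER BY LoadedDate DESC, QueueName;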

Further, the BPAWorkQueueItem table has a field called 'data' which holds the Data collection of the work queue item.  Collections in BP are stored as XML strings, so we can extract item-level data, flags, notes, data read from systems etc., in addition to the normal work item data (Loaded, Completed, Exception etc.).  However, if the Data collection has several nested collections, you can run into some pain trying to extract the data cleanly.
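For what it's worth, here is roughly how that extraction looks. The field names ('Account Number', 'Region') and the queue name are made up for the example; the <collection><row><field name/value/></row></collection> layout is what BP writes, but look at a real item of yours first. The data column isn't stored as the XML type, hence the double cast.

    ;WITH items AS (
        SELECT  i.id,
                i.keyvalue,
                i.completed,
                CAST(CAST(i.data AS NVARCHAR(MAX)) AS XML) AS dataxml   -- data is stored as text, not XML
        FROM    dbo.BPAWorkQueueItem AS i
        JOIN    dbo.BPAWorkQueue     AS q ON q.ident = i.queueident
        WHERE   q.name = 'My Process Queue'   -- placeholder queue name
          AND   i.data IS NOT NULL
    )
    SELECT  id,
            keyvalue,
            completed,
            dataxml.value('(//field[@name="Account Number"]/@value)[1]', 'NVARCHAR(50)') AS AccountNumber,
            dataxml.value('(//field[@name="Region"]/@value)[1]', 'NVARCHAR(50)')         AS Region
    FROM    items;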

The best part is that, apart from the process-specific fields inside the work item's Data collection, the structure is identical for every process.  All you need to do is point your query at a different process, adjust the names of a few fields, and bam, you've got yourself a report with graphs, tables, and separate sheets for exceptions and items that need to be reviewed by the business, for each new process you roll out.

Massive time saver, and it improves consistency between deployed projects, so each developer is not just doing their own thing.

Also, you can create an RPA control room report listing all the items completed, sessions run, time saved per process etc.
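A session-level summary per process is a simple query over BPASession and BPAProcess, something like the sketch below. "Time saved" needs your own estimate of human handling time per item, so it's left out here.

    SELECT  p.name               AS ProcessName,
            COUNT(*)             AS SessionsRun,
            MIN(s.startdatetime) AS FirstRun,
            MAX(s.enddatetime)   AS LastRun,
            SUM(CAST(DATEDIFF(MINUTE, s.startdatetime, s.enddatetime) AS BIGINT)) / 60.0 AS RobotHours
    FROM    dbo.BPASession AS s
    JOIN    dbo.BPAProcess AS p ON p.processid = s.processid
    WHERE   s.enddatetime IS NOT NULL          -- finished sessions only
    GROUP BY p.name
    ORDER BY SessionsRun DESC;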

Requirements:
1. Get a second DB server to mirror your PROD environment.
2. Get a third DB to serve as a reporting DB that your customers query directly.
3. Play around with the DB and figure out the important parts you'll need.  Literally everything in the BP application, Control Room and Studio is in the DB somewhere.
4. Build a generic shell of an Excel report with pretty graphs etc., point it at a process and forget about it afterwards (a rough sketch of the generic view behind it is below).  The business can have almost real-time reporting depending on how frequently the SPROCs in the reporting DB copy data over from the mirror.
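To illustrate point 4: the Excel workbooks just point at a generic view in the reporting DB (names are placeholders again, built on the rpt.QueueItemHistory table sketched earlier), so rolling the report out to a new process is only a matter of changing the queue name in the connection's query, and a SQL Agent job runs the copy SPROC on whatever schedule the business wants for "near real time".

    CREATE OR ALTER VIEW rpt.vw_QueueItemReport
    AS
    SELECT  QueueName,
            KeyValue,
            Loaded,
            Completed,
            ExceptionDate,
            ExceptionReason,
            WorkTimeSeconds,
            CASE WHEN ExceptionDate IS NOT NULL THEN 'Exception'
                 WHEN Completed     IS NOT NULL THEN 'Completed'
                 ELSE 'Pending' END AS Outcome
    FROM    rpt.QueueItemHistory;
    GO

    -- Each workbook's refreshable data connection is then just:
    -- SELECT * FROM rpt.vw_QueueItemReport WHERE QueueName = 'New Process Queue';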

STRONGLY RECOMMEND



------------------------------
Ben Kirimlidis
RPA Developer and Data Analyst
PTSB
Europe/London
------------------------------