Historical data from Blueprism workqueue

MythiliRaju
Level 2

Hi,

Can anyone help me with a better solution for getting historical data from the Blue Prism work queue? We clear the work queue every three months, but if at any point we need data more than three months old, how do we get it in real time?

Please note: if we prepare a monthly, quarterly, or consolidated Excel report on a daily basis, how do we avoid memory issues when the volume is high?

Thanks in advance !



------------------------------
Mythili Raju
------------------------------
6 REPLIES

Hi Mythili Raju,

1) Memory issues: one of the best approaches I used in the past was to dump the daily report data into a SQL database instead of Excel. Quarterly and yearly data can eventually grow very large, so it pays to plan for this while building the automation in its early stages, though it doesn't mean you can't retrofit it: you can certainly add the database dump to an existing automation later.

That database was eventually integrated with Tableau for reporting purposes.

2) Once you delete the data, it is no longer available. I totally understand the reason for deleting it; if you want to keep the historical data, the best way is to mirror the current production database and keep it in sync with a new database, so the historical data remains available in the mirror even after the queues are cleared.

As mentioned, you can integrate the mirrored database with any reporting tool.
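To make the first point concrete, here is a minimal sketch of appending each day's report rows to a SQL table instead of growing an Excel file. It uses SQLite for illustration; the table and column names are hypothetical, and a real setup would point at your reporting SQL Server database instead.

```python
import sqlite3

# Illustrative daily "SQL dump": append today's work queue extract to a table.
# An in-memory SQLite database stands in for the real reporting database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE IF NOT EXISTS daily_report (
        item_key  TEXT,
        status    TEXT,
        completed TEXT
    )
""")

# These rows would normally come from the day's work queue report.
todays_rows = [
    ("INV-001", "Completed", "2024-01-15"),
    ("INV-002", "Exception", "2024-01-15"),
]
conn.executemany(
    "INSERT INTO daily_report (item_key, status, completed) VALUES (?, ?, ?)",
    todays_rows,
)
conn.commit()

# A reporting tool (Tableau, Power BI) would then query this table directly,
# so no single Excel file ever has to hold the full history in memory.
count = conn.execute("SELECT COUNT(*) FROM daily_report").fetchone()[0]
```

Because rows are appended incrementally, each daily run only handles that day's volume regardless of how large the accumulated history becomes.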



------------------------------
-----------------------
If I answered your query, please mark it as the "Best Answer".

Harish Mogulluri
Lead developer
America/New_York TX
------------------------------

Hi Mythili Raju,

1) Memory issues with high-volume reporting: to avoid these memory issues, instead of Excel it is more appropriate to use a reporting tool such as Power BI or Tableau on top of a SQL database.

If you generate quarterly or yearly reports, the data will eventually be huge, so it is better to plan ahead while building the automation: dump all the data into a SQL database and use that for reporting.

2) Historical data: if you intend to clear the work queue every three months for performance reasons, mirror the current production database and sync the latest data to the mirrored copy. Even after you delete the work queues, the data will still be present in the mirrored database and can be used for reporting.
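The mirroring idea can be sketched as an incremental sync: copy only the rows the mirror has not seen yet, keyed on a monotonically increasing id. The schema below is hypothetical, and two in-memory SQLite databases stand in for the production and mirror databases.

```python
import sqlite3

# Stand-ins for the production queue database and the mirror database.
prod = sqlite3.connect(":memory:")
mirror = sqlite3.connect(":memory:")
for db in (prod, mirror):
    db.execute("CREATE TABLE work_queue_item (id INTEGER PRIMARY KEY, keyvalue TEXT)")

prod.executemany(
    "INSERT INTO work_queue_item VALUES (?, ?)",
    [(1, "A"), (2, "B"), (3, "C")],
)

# Highest id already present in the mirror (0 when it is empty).
last = mirror.execute(
    "SELECT COALESCE(MAX(id), 0) FROM work_queue_item"
).fetchone()[0]

# Copy only the rows newer than the last synced id.
new_rows = prod.execute(
    "SELECT id, keyvalue FROM work_queue_item WHERE id > ?", (last,)
).fetchall()
mirror.executemany("INSERT INTO work_queue_item VALUES (?, ?)", new_rows)
mirror.commit()

total = mirror.execute("SELECT COUNT(*) FROM work_queue_item").fetchone()[0]
```

Run this sync on a schedule before each queue clean-up and the mirror retains the full history even after the production queues are deleted.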



------------------------------
-----------------------
If I answered your query, please mark it as the "Best Answer".

Harish Mogulluri
Lead developer
America/New_York TX
------------------------------

Hi Harish,

Thanks for writing an answer. 

Could you please provide more details on how to do the below?

". One of the best way I did in the past was dump daily report data in to sql database using sql dump and eventually at some point based on data quarterly and yearly data might be huge.. so I implemented dumping the data in sql instead of excel"



------------------------------
Mythili Raju
------------------------------

Hi Mythili,

The best way to handle this situation is first to identify the most important information each process will need for future reference. Back up the last three months of work queue data into a database before cleaning up the queue. This will make it easier to create different types of dashboards to analyze the data in the future.
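A minimal sketch of this back-up-then-clean step, keeping only the fields identified as important. Table and column names are illustrative, with an in-memory SQLite database standing in for the real store.

```python
import sqlite3

# Live queue table plus an archive table holding only the important columns.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE queue (item TEXT, status TEXT, finished TEXT, payload TEXT)")
db.execute("CREATE TABLE archive (item TEXT, status TEXT, finished TEXT)")
db.executemany(
    "INSERT INTO queue VALUES (?, ?, ?, ?)",
    [
        ("A", "Completed", "2024-01-01", "big-xml-blob"),
        ("B", "Exception", "2024-02-01", "big-xml-blob"),
    ],
)

# Back up first (dropping the bulky payload column), then clean up the queue.
db.execute("INSERT INTO archive SELECT item, status, finished FROM queue")
db.execute("DELETE FROM queue")
db.commit()

archived = db.execute("SELECT COUNT(*) FROM archive").fetchone()[0]
remaining = db.execute("SELECT COUNT(*) FROM queue").fetchone()[0]
```

Selecting only the reporting columns into the archive keeps it small while still supporting future dashboards.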



------------------------------
Athiban Mahamathi - https://www.linkedin.com/in/athiban-mahamathi-544a008b/
Technical Consultant,
SimplifyNext,
Singapore
------------------------------

Mukeshh_k
MVP

Hi @MythiliRaju: You might find this thread helpful: https://community.blueprism.com/discussion/blue-prism-archiver-setup#bm5ca3c31a-a787-41fb-8c2a-01893f5fb709



------------------------------
Kindly upvote this as "Best Answer" if it adds value or resolves your query in any way. Happy to help.

Regards,

Mukesh Kumar - Senior Automation Developer

NHS, England, United Kingdom, GB
------------------------------

As everyone has correctly recommended, using a database would be the optimal choice. Additionally, to limit memory pressure you can consider archiving more frequently, say weekly or biweekly. Each run will then archive far less data than a run every three months.

Additionally, you can think of implementing a smart logic that could decide when to run the archive based on the count of items.
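Such a count-based trigger might look like the following. The threshold value and function name are hypothetical; the point is simply to gate the archive step on queue size rather than a fixed calendar schedule.

```python
# Hypothetical "smart" trigger: archive only when the queue holds more items
# than a chosen threshold, instead of on a fixed three-month schedule.
ARCHIVE_THRESHOLD = 50_000

def should_archive(item_count: int, threshold: int = ARCHIVE_THRESHOLD) -> bool:
    """Decide whether the archive step should run on this pass."""
    return item_count >= threshold

# The bot checks the current queue item count on each scheduled pass:
print(should_archive(75_000))  # True: queue is large, archive now
print(should_archive(10_000))  # False: skip this pass
```

The item count itself would come from the queue's reporting query; the check makes the bot self-regulating as volumes grow.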

As we move towards Intelligent Automation it becomes important to start making our bots more and more intelligent 🙂



------------------------------
If I was of assistance, please vote for it to be the "Best Answer".

Thanks & Regards,
Tejaskumar Darji - https://www.linkedin.com/in/tejaskumardarji/
Technical Lead
------------------------------