27-07-23 10:21 AM
Hi,
Can anyone help me with a better approach to retrieving historical data from a Blue Prism work queue? We clear the work queue every 3 months, but if at some point we need data older than 3 months, how can we get it in real time?
Also, if we prepare monthly/quarterly or consolidated Excel reports on a daily basis, how do we avoid memory issues when the volume is high?
Thanks in advance !
27-07-23 10:59 AM
Hi Mythili Raju,
1) To avoid memory issues, one of the best approaches I have used in the past is to dump the daily report data into a SQL database instead of Excel. Quarterly and yearly data will eventually grow very large, so it is worth planning for this while building the automation in the early stages. That said, it doesn't mean you can't add it at a later stage of the automation; you certainly can.
These databases were eventually integrated with Tableau for reporting purposes.
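A minimal sketch of that approach, using Python's built-in `sqlite3` as the database for illustration (the table name, columns, and sample rows are assumptions, not Blue Prism's actual work-queue schema):

```python
import sqlite3
from datetime import date

# Illustrative schema: "daily_report" and its columns are placeholders,
# not Blue Prism's real tables.
conn = sqlite3.connect("daily_reports.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS daily_report (
           report_date TEXT,
           item_key    TEXT,
           status      TEXT,
           completed   INTEGER
       )"""
)

def dump_daily_rows(rows):
    """Append one day's report rows to the database instead of writing a new Excel file."""
    conn.executemany("INSERT INTO daily_report VALUES (?, ?, ?, ?)", rows)
    conn.commit()

dump_daily_rows([
    (date.today().isoformat(), "INV-001", "Completed", 1),
    (date.today().isoformat(), "INV-002", "Exception", 0),
])

# Quarterly or yearly reporting then becomes a query over the accumulated
# rows rather than a huge workbook held in memory.
total = conn.execute("SELECT COUNT(*) FROM daily_report").fetchone()[0]
```

In practice you would point this at SQL Server (or whatever database your reporting tool connects to) rather than SQLite; the pattern of appending each day's rows and querying at report time is the same.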
2) Once you delete the data, it will no longer be available. I completely understand the reason for deleting it. If you need the historical data, the best approach is to mirror the current production database into a new database and keep the two in sync; that way the latest data stays in sync and the historical data remains available in the mirror even after the queue is cleared.
As mentioned, you can then integrate the mirror database with any reporting tool.
27-07-23 11:24 AM
Hi Mythili Raju,
1) Memory issues with high data volumes (reporting): to avoid this kind of memory issue, instead of Excel it is more appropriate to use a reporting tool such as Power BI or Tableau on top of a SQL database.
If you generate quarterly or yearly reports, the data will eventually be huge, so it is better to plan ahead while building the automation: dump all the data into a SQL database and use that for reporting.
2) Historical data: if you intend to delete the work queue every 3 months due to performance issues, mirror the current production database and sync the latest data to the mirrored copy. Even after you delete the work queues, the data will still be present in the mirrored database, and you can use it for reporting.
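The mirror-and-sync idea can be sketched as an incremental copy driven by a high-water mark. This is only an illustration under assumed names (`work_queue_item`, an integer `id` key); real mirroring would typically use SQL Server replication or a scheduled ETL job rather than hand-rolled code:

```python
import sqlite3

# Two in-memory databases stand in for "production" and the mirror.
prod = sqlite3.connect(":memory:")
mirror = sqlite3.connect(":memory:")
for db in (prod, mirror):
    db.execute("CREATE TABLE work_queue_item (id INTEGER PRIMARY KEY, data TEXT)")

prod.executemany("INSERT INTO work_queue_item VALUES (?, ?)",
                 [(1, "a"), (2, "b"), (3, "c")])

def sync(src, dst):
    """Copy only the rows the mirror has not seen yet (id high-water mark)."""
    last = dst.execute(
        "SELECT COALESCE(MAX(id), 0) FROM work_queue_item").fetchone()[0]
    rows = src.execute(
        "SELECT id, data FROM work_queue_item WHERE id > ?", (last,)).fetchall()
    dst.executemany("INSERT INTO work_queue_item VALUES (?, ?)", rows)
    dst.commit()
    return len(rows)

sync(prod, mirror)                           # mirror now holds all 3 rows
prod.execute("DELETE FROM work_queue_item")  # quarterly queue cleanup on "prod"
mirrored = mirror.execute(
    "SELECT COUNT(*) FROM work_queue_item").fetchone()[0]
```

The key point is that the cleanup only touches production; the mirror keeps the full history for reporting.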
27-07-23 11:58 AM
Hi Harish,
Thanks for writing an answer.
Could you please provide more details on how to do the below?
". One of the best way I did in the past was dump daily report data in to sql database using sql dump and eventually at some point based on data quarterly and yearly data might be huge.. so I implemented dumping the data in sql instead of excel"
27-07-23 02:13 PM
Hi Mythili,
The best way to handle this situation is to first identify the most important information needed for future reference in each process. Back up the work queue data from the past three months into a database before cleaning up the queue. This will make it easier to create different types of dashboards to analyze the data in the future.
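That archive-before-cleanup step can be sketched as a date-windowed copy. The table and column names (`work_queue_item`, `loaded`) are placeholders for illustration, not the real schema:

```python
import sqlite3
from datetime import datetime, timedelta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE work_queue_item (id INTEGER, loaded TEXT, payload TEXT)")
conn.execute("CREATE TABLE archive_item   (id INTEGER, loaded TEXT, payload TEXT)")

now = datetime(2023, 7, 27)
conn.executemany("INSERT INTO work_queue_item VALUES (?, ?, ?)", [
    (1, (now - timedelta(days=10)).isoformat(), "recent"),
    (2, (now - timedelta(days=200)).isoformat(), "old"),
])

# 1) Copy the last three months of items into the archive table.
#    ISO-8601 timestamps compare correctly as strings.
cutoff = (now - timedelta(days=90)).isoformat()
conn.execute(
    "INSERT INTO archive_item SELECT * FROM work_queue_item WHERE loaded >= ?",
    (cutoff,))

# 2) Only after the backup succeeds, clear the live queue.
conn.execute("DELETE FROM work_queue_item")
archived = conn.execute("SELECT COUNT(*) FROM archive_item").fetchone()[0]
```

Ordering matters: the delete runs only after the copy, so a failed backup never loses data.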
27-07-23 02:30 PM
Hi @MythiliRaju: You might find this thread helpful: https://community.blueprism.com/discussion/blue-prism-archiver-setup#bm5ca3c31a-a787-41fb-8c2a-01893f5fb709
28-07-23 08:05 AM
As everyone has correctly recommended, using a database would be the optimal choice. However, to prevent memory issues, you can also consider archiving more frequently, say weekly or biweekly; each run then handles far less data than a run every 3 months.
Additionally, you could implement smart logic that decides when to run the archive based on the count of items.
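A minimal sketch of that "smart" trigger, with an assumed threshold value and function name for illustration:

```python
# Run the archive when the queue has grown past a threshold, instead of
# on a fixed calendar schedule. The threshold is an illustrative value;
# tune it to your environment's actual queue growth.
ARCHIVE_THRESHOLD = 100_000

def should_archive(completed_item_count: int) -> bool:
    """Decide whether an archive run is worthwhile right now."""
    return completed_item_count >= ARCHIVE_THRESHOLD

should_archive(42_000)   # False: queue still small, skip this run
should_archive(150_000)  # True: time to archive
```

The item count itself could come from a scheduled query against the queue table or from Blue Prism's queue statistics; the scheduler then only kicks off the archive when the check returns true.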
As we move towards Intelligent Automation it becomes important to start making our bots more and more intelligent 🙂