Extract large volume of data from SQL table to Excel via Blue Prism
27-10-22 11:20 AM
Hi,
I have a SQL table with 50 columns and 60,000 rows that needs to be pulled and saved to Excel using Blue Prism. I am able to achieve this with the bcp command, but the bcp utility then has to be installed on every bot machine where this Blue Prism process runs. Could anyone suggest another way to pull the data from the table without running into timeout issues?
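For reference, the bcp call I run is roughly along these lines (server, database, table and output path are placeholders), and it produces a delimited text file rather than a native Excel workbook:

bcp "SELECT * FROM MyDatabase.dbo.MyTable" queryout "C:\Export\MyTable.csv" -c -t"," -S MyServer -T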
5 REPLIES
27-10-22 01:29 PM
I'd seriously question whether Excel is the best solution for such a large dataset. Is there a reason Excel needs to be used, or could this process be redesigned to use tools built for migrating and transforming large datasets?
28-10-22 03:50 PM
Hi Denis, thank you for your response. We use the generated Excel file either to upload into another platform that only accepts Excel files, or for some kind of analysis.
28-10-22 04:10 PM
In that case I would suggest getting to know the OLEDB object (I think it is on the BP Digital Exchange). It lets you treat Excel as though it were a database, with writes and queries, and it would probably be quicker and more stable with a large dataset than the Excel VBO. It might still be safer and wiser to break the data down into multiple Excel files and load them in multiple uploads, if the target system allows that.
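To illustrate the idea (this is not the internals of the DX object, just a C# sketch of the OLE DB approach, as it might appear in a code stage; the file path, sheet name and columns are placeholders, and it assumes the Microsoft ACE OLE DB provider is installed on the runtime machine):

using System.Data.OleDb;

// Connection string for an .xlsx workbook via the ACE OLE DB provider.
// "HDR=YES" treats the first row of the sheet as column headers.
string connStr =
    @"Provider=Microsoft.ACE.OLEDB.12.0;" +
    @"Data Source=C:\Export\MyTable.xlsx;" +
    "Extended Properties='Excel 12.0 Xml;HDR=YES'";

using (var conn = new OleDbConnection(connStr))
{
    conn.Open();

    // Creating a "table" creates a worksheet with the given column headers
    // (with ACE this will also create the workbook if it does not exist yet).
    using (var create = new OleDbCommand(
        "CREATE TABLE [Data] ([Id] INT, [Name] VARCHAR(255))", conn))
    {
        create.ExecuteNonQuery();
    }

    // Rows are then written with ordinary parameterised INSERT statements.
    using (var insert = new OleDbCommand(
        "INSERT INTO [Data] ([Id], [Name]) VALUES (?, ?)", conn))
    {
        insert.Parameters.AddWithValue("@Id", 1);
        insert.Parameters.AddWithValue("@Name", "Example");
        insert.ExecuteNonQuery();
    }
}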
31-10-22 08:01 AM
Hi,
I would suggest stepping into the code stage of whichever database VBO you are using and increasing the command timeout, or perhaps removing the timeout altogether, so that you can pull all of the data from SQL without the object throwing a timeout error.
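As a rough illustration of that change (a C# sketch only, with a placeholder connection string and query, not the exact code inside any particular VBO):

using System.Data;
using System.Data.SqlClient;

var table = new DataTable();

using (var conn = new SqlConnection("Server=MyServer;Database=MyDb;Integrated Security=True"))
using (var cmd = new SqlCommand("SELECT * FROM dbo.MyTable", conn))
{
    // The default CommandTimeout is 30 seconds; raise it for large pulls,
    // or set it to 0 to wait indefinitely (use with care).
    cmd.CommandTimeout = 600;

    conn.Open();
    using (var adapter = new SqlDataAdapter(cmd))
    {
        adapter.Fill(table);
    }
}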
Thanks
Sasikanth
31-10-22 01:51 PM
@Padmaja Nadendla,
You mentioned you have a SQL table, but what type of database is it? Are we talking SQL Server or something else (e.g. Oracle)? There are various VBOs on the Digital Exchange that can be used for querying different databases, including dedicated VBOs for SQL Server, Oracle, SQLite, MySQL, PostgreSQL, etc. It might be beneficial to have a look at one of those dedicated VBOs.
As for the size of your table, you could always break your query up so that you pull 10K records at a time instead of all 60K at once. That would give the process a chance to handle other events in between and should keep you clear of any timeouts.
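A sketch of that chunked approach (C#, assuming SQL Server 2012+ for OFFSET/FETCH; table, key column and connection string are placeholders, and an ORDER BY on a stable key is needed for the paging to be deterministic):

using System.Data;
using System.Data.SqlClient;

const int batchSize = 10000;
var allRows = new DataTable();

using (var conn = new SqlConnection("Server=MyServer;Database=MyDb;Integrated Security=True"))
{
    conn.Open();

    for (int offset = 0; ; offset += batchSize)
    {
        using (var cmd = new SqlCommand(
            @"SELECT * FROM dbo.MyTable
              ORDER BY Id
              OFFSET @Offset ROWS FETCH NEXT @Batch ROWS ONLY", conn))
        {
            cmd.Parameters.AddWithValue("@Offset", offset);
            cmd.Parameters.AddWithValue("@Batch", batchSize);

            var chunk = new DataTable();
            new SqlDataAdapter(cmd).Fill(chunk);

            if (chunk.Rows.Count == 0) break;   // no more data
            allRows.Merge(chunk);               // or write each chunk straight to its own Excel file
        }
    }
}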
Cheers,
Eric