Extract large volume of data from SQL table to Excel via Blue Prism

PadmajaNadendla
Level 2
Hi,

I have a SQL table with 50 columns and 60,000 rows that needs to be pulled and saved to Excel using Blue Prism. I am able to achieve this via the bcp command, but the bcp utility would have to be installed on every bot machine where this Blue Prism process runs. Could anyone suggest another way to pull the data from the table without causing a timeout issue, please?
5 REPLIES

Denis__Dennehy
Level 15
I'd seriously question whether Excel is the best solution for such a large dataset. Is there a reason Excel needs to be used, or can this process be re-designed to use tools designed for migrating and transforming large datasets?

PadmajaNadendla
Level 2
Hi Denis, thank you for your response. We use the generated Excel file to upload into another platform that only accepts Excel files, or for some kind of analysis.

Denis__Dennehy
Level 15
In that case I would suggest getting to know the OLEDB object (I think it is on the BP Digital Exchange). It allows you to treat Excel as though it is a database, with writes and queries. That would probably be quicker and more stable with a large dataset than using the Excel VBO. It might still be safer/wiser to break the data down into multiple Excel files and load them in multiple uploads, if the input system allows that.
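For reference, talking to an Excel file through OLE DB typically means pointing the ACE OLE DB provider at the workbook and querying sheets as if they were tables. A minimal sketch of the connection string and queries is below; the file path, sheet name, and column names are placeholders, and this assumes the Microsoft ACE provider is installed on the bot machine:

```
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Reports\output.xlsx;Extended Properties="Excel 12.0 Xml;HDR=YES"

-- each worksheet is addressed as [SheetName$]
SELECT * FROM [Sheet1$]
INSERT INTO [Sheet1$] (Col1, Col2) VALUES ('a', 'b')
```

`HDR=YES` tells the provider to treat the first row of the sheet as column headers.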

SasikanthMachav
Level 3
Hi,

I would suggest stepping into whichever DB object you are using and, in its code stage, increasing the timeout (or perhaps removing the timeout property altogether) so that you can pull all the data from SQL without the object throwing a timeout error.


Thanks 
Sasikanth

ewilson
Staff
@Padmaja Nadendla,

You mentioned you have a SQL table, but what type of database is it? Are we talking SQL Server or something else (e.g. Oracle)? There are various VBOs on the Digital Exchange that can be used for querying different databases, including dedicated VBOs for SQL Server, Oracle, SQLite, MySQL, PostgreSQL, etc. It might be beneficial to have a look at one of the dedicated VBOs.

As for the size of your table, you could always break your query up so that you pull 10K records at a time instead of all 60K at once. That would give the process the ability to handle other work between batches and should keep you away from any timeouts.
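The batching idea above can be sketched as keyset pagination: each query resumes after the last key seen, so no single query has to return the full table. This is a minimal runnable illustration using Python's built-in sqlite3 as a stand-in for the real database (the table name, `id` key column, and batch size are assumptions; in Blue Prism the equivalent query loop would live in a DB object's code stage or in the process itself):

```python
import sqlite3

# Stand-in database: sqlite3 here is only a placeholder for SQL Server/Oracle/etc.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany(
    "INSERT INTO records (id, payload) VALUES (?, ?)",
    [(i, f"row-{i}") for i in range(1, 60001)],  # 60K sample rows
)

BATCH_SIZE = 10_000  # pull 10K records per query instead of all 60K

def fetch_in_batches(conn, batch_size):
    """Keyset pagination: each query picks up after the last id seen,
    so every individual query stays small and fast."""
    last_id = 0
    while True:
        rows = conn.execute(
            "SELECT id, payload FROM records WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, batch_size),
        ).fetchall()
        if not rows:
            break
        yield rows
        last_id = rows[-1][0]  # resume after the last key in this batch

batches = list(fetch_in_batches(conn, BATCH_SIZE))
print(len(batches), sum(len(b) for b in batches))  # → 6 60000
```

Filtering on an indexed key (`WHERE id > ?`) rather than using a growing `OFFSET` keeps each batch query cheap even late in the table.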

Cheers,
Eric