
Collection Maximum Rows

JordanGreen
Level 3
I'm trying to move data from one spreadsheet to another. The first spreadsheet has roughly 1400 columns. I can pull these into a collection via an OLEDB connection and a SQL query; however, when I then try to write this collection back out, it only goes up to column 255. Is this a hard limit? Does anyone have any suggestions to get around this? I know I could pull the columns in chunks, but this would mean using a series of very large SQL queries rather than just SELECT *. Cheers guys.
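For anyone hitting the same wall: the "pull the columns in chunks" idea doesn't have to mean hand-writing huge queries. If you can get the column names first, the chunked SELECT statements can be generated. A minimal Python sketch of that idea (the column names, chunk size of 255, and the `[Sheet1$]` table name are all assumptions for illustration; this is not Blue Prism code):

```python
# Hypothetical sketch: split a wide column list into chunks of at most 255
# columns, so each generated SELECT (and each resulting collection) stays
# under the ~255-column limit discussed in this thread.
def chunk_columns(columns, max_cols=255):
    """Yield successive slices of the column list, each at most max_cols wide."""
    for start in range(0, len(columns), max_cols):
        yield columns[start:start + max_cols]

# Example: 1400 columns -> 6 queries (5 x 255 columns + 1 x 125 columns).
cols = [f"Col{i}" for i in range(1, 1401)]
queries = [
    "SELECT [" + "], [".join(chunk) + "] FROM [Sheet1$]"
    for chunk in chunk_columns(cols)
]
```

Each generated query can then be run through the same OLEDB connection, writing each chunk out before reading the next.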
9 REPLIES

John__Carter
Staff
I think that might be an Excel limitation? In theory BP has no max rows/columns for collections, but realistically a mega collection will eat memory and grind the PC to a halt.

John__Carter
Staff
Correction - Google suggests it's an OLEDB limitation.

JordanGreen
Level 3
Hi John, yes, I have seen the same. There seem to be some workarounds, such as connecting to a .csv rather than a .xlsx. However, I submitted a ticket to the help desk as well, and they have advised that there is currently a limit of around 255 on the number of columns in a collection. They are looking to fix it in a later release, but nothing is coming imminently.

MariaSerrano
Level 2
Hey John Carter,   We are working with a 50,000-row collection of 40 columns. All of them contain plain text or numbers. We are getting the error: "Internal: System.OutOfMemoryException". Is this because of the high number of rows?   Additionally, these rows are obtained from a CSV file. Once we have the collection filled in BP, we try to filter it using the internal collection manipulation VBO (filter action).   Thanks,

John__Carter
Staff
Yes, probably. The workaround is to consume less data in one go, e.g. do you need to read all 40 columns, or can you read the 50K rows in chunks? The big advantage of OLEDB is that with a query you can target the data you want instead of taking the whole lot.

ArnauPerramon_B
Level 2
Hello, we have a TXT file filled with data delimited by pipes (|). Treated as a CSV, this data can't be worked with in BP because we get a system out of memory error. We tried to use OLEDB, but it doesn't work with this type of file. We have tried and researched importing this CSV into Access; what is the best way to do it? Arnau
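One pre-processing option (outside Blue Prism) is to convert the pipe-delimited TXT to a standard comma-separated file first, since most tools, including the OLEDB text driver's defaults, expect commas. A minimal Python sketch, using made-up sample data:

```python
import csv
import io

# Hypothetical sample of a pipe-delimited file; in practice you would open
# the real .txt file instead of these in-memory buffers.
pipe_text = "name|qty|price\nwidget|3|9.99\ngadget|7|4.50\n"

src = io.StringIO(pipe_text)
dst = io.StringIO()

# csv.reader/writer handle quoting correctly, so values containing commas
# or quotes survive the conversion intact.
writer = csv.writer(dst)
for row in csv.reader(src, delimiter="|"):
    writer.writerow(row)

converted = dst.getvalue()
```

The converted file can then be consumed via OLEDB or imported into Access like any ordinary CSV.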

John__Carter
Staff
I'm sure there will be a way to do it somehow. Try https://www.connectionstrings.com/textfile/
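Following that link: the Microsoft text driver can be told about a non-comma delimiter via a schema.ini file placed in the same folder as the data file. A hedged sketch, assuming the file is called data.txt and the folder is C:\Data (adjust both to your setup):

```ini
; schema.ini, placed alongside the data file, describes its layout
; to the text driver. [data.txt] is the assumed file name.
[data.txt]
Format=Delimited(|)
ColNameHeader=True
```

with a connection string along the lines of:

```
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Data;Extended Properties="text;HDR=Yes;FMT=Delimited";
```

The Data Source points at the folder, not the file; the file name is then used as the table name in the query (e.g. SELECT * FROM [data.txt]).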

NicholasBell1
Level 3
Arnau, perhaps write VBA code in an Access macro (see the first link below), then call that macro from Blue Prism via the command line (see the second link): http://www.accessmvp.com/kdsnell/EXCEL_Import.htm http://www.fmsinc.com/microsoftaccess/macro/index.htm   This avoids the use of Blue Prism collections, which are inefficient for large data files.

Hello @ArnauPerramon_B, I know this is an old thread.

However, I have a few suggestions:
1. Can you open the CSV file using Blue Prism's Excel (or Excel Extended) VBO?
2. If step 1 above is successful, you could use the various actions in Excel VBO for the work you want to perform.

Thanks



------------------------------
Kingsley David
------------------------------