Hi Felipe,
Typically, you'd filter a collection with the appropriate action on the BP Collection object. As you have found out, this works nicely for all but larger collections, which tend to trigger the infamous OutOfMemory error. There are two straightforward ways to avoid this:
- Filter the large collection in smaller chunks
- Filter the source of the collection
1. Smaller chunks: First filter the first 10k rows of your collection using the BP Collection object, adding the results to a new collection. Then do the same with the next 10k rows of the large collection, again appending to the same new collection, and repeat until everything has been filtered (a sketch of the chunk loop follows after this list). The 10k chunk size is only an indication; depending on the number of columns in your large collection, you may use larger or smaller chunks. From what I have experienced, even this method can sometimes still lead to the OutOfMemory exception.
2. Filter the source: If the collection data comes from the process downloading a CSV or Excel file, you may want to filter while reading the CSV or Excel data, rather than later in the process. The best way, imho, is to use OLEDB; a number of posts on this community describe in great detail how to do that (see the second sketch after this list). Notably, this method is very fast and light on memory allocation, and it would be my preferred solution to your problem.
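To make option 1 concrete, here is a minimal C# sketch of the chunk loop, roughly as it could look in the body of a Blue Prism code stage (a BP collection is a DataTable under the hood). It assumes "source" is the large collection passed in as an input and "result" is the output collection; the column name "Status" and the filter condition are made up for the example, so substitute your own:

    using System;
    using System.Data;

    // result gets the same columns as the source, but starts empty
    DataTable result = source.Clone();
    const int chunkSize = 10000; // tune up or down to fit memory

    for (int start = 0; start < source.Rows.Count; start += chunkSize)
    {
        int end = Math.Min(start + chunkSize, source.Rows.Count);
        for (int i = start; i < end; i++)
        {
            DataRow row = source.Rows[i];
            // hypothetical filter condition - replace with your own
            if ((string)row["Status"] == "Open")
                result.ImportRow(row);
        }
    }

In a pure process (no code stage) the same loop is built with stages: a Filter Collection action per chunk and an Append to Collection action into the result.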
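And a sketch of option 2, filtering at the source with OLEDB from a C# code stage. The folder, file name and WHERE clause are placeholders for the example, and the ACE OLEDB provider must be installed on the machine running the query:

    using System.Data;
    using System.Data.OleDb;

    // Query the CSV directly so only matching rows ever reach memory.
    // For the text driver, Data Source is the folder and the file
    // name acts as the "table" in the SQL statement.
    string connStr =
        "Provider=Microsoft.ACE.OLEDB.12.0;" +
        "Data Source=C:\\Data\\;" +
        "Extended Properties=\"text;HDR=Yes;FMT=Delimited\";";

    DataTable result = new DataTable();
    using (var conn = new OleDbConnection(connStr))
    using (var adapter = new OleDbDataAdapter(
        "SELECT * FROM [input.csv] WHERE [Status] = 'Open'", conn))
    {
        adapter.Fill(result); // maps straight onto a BP collection
    }

For an Excel file, Data Source becomes the workbook path, Extended Properties becomes "Excel 12.0 Xml;HDR=YES", and the table is the sheet name, e.g. [Sheet1$].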
Happy coding!
---------------
Paul
Sweden