<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Comparing collections and removing duplicate rows in Product Forum</title>
    <link>https://community.blueprism.com/t5/Product-Forum/Complaring-collectons-and-removing-duplicate-Rows/m-p/57290#M11318</link>
    <description>Hi,
I have a process that runs at multiple different times of the day.
The process runs at 12pm, 2pm and 3pm daily.
The input files are cumulative: each run's file includes all earlier transactions. E.g. the 2pm file contains the 12pm and 2pm wire transactions, and the 3pm file contains the 12pm, 2pm and 3pm wire transactions.
I need to remove transactions that were previously processed.
So I need to compare collections and remove duplicate rows.

Suggestions welcome.</description>
    <pubDate>Fri, 28 Oct 2016 15:25:00 GMT</pubDate>
    <dc:creator>john_shiels</dc:creator>
    <dc:date>2016-10-28T15:25:00Z</dc:date>
    <item>
      <title>Comparing collections and removing duplicate rows</title>
      <link>https://community.blueprism.com/t5/Product-Forum/Complaring-collectons-and-removing-duplicate-Rows/m-p/57290#M11318</link>
      <description>Hi,
I have a process that runs at multiple different times of the day.
The process runs at 12pm, 2pm and 3pm daily.
The input files are cumulative: each run's file includes all earlier transactions. E.g. the 2pm file contains the 12pm and 2pm wire transactions, and the 3pm file contains the 12pm, 2pm and 3pm wire transactions.
I need to remove transactions that were previously processed.
So I need to compare collections and remove duplicate rows.

Suggestions welcome.</description>
      <pubDate>Fri, 28 Oct 2016 15:25:00 GMT</pubDate>
      <guid>https://community.blueprism.com/t5/Product-Forum/Complaring-collectons-and-removing-duplicate-Rows/m-p/57290#M11318</guid>
      <dc:creator>john_shiels</dc:creator>
      <dc:date>2016-10-28T15:25:00Z</dc:date>
    </item>
    <item>
      <title>Input file is Excel.</title>
      <link>https://community.blueprism.com/t5/Product-Forum/Complaring-collectons-and-removing-duplicate-Rows/m-p/57291#M11319</link>
      <description>The input file is Excel.
The robot loops through the initial input file and stores it in a collection.
It then needs to loop through the next incremented input file and store it in a collection. Then the robot needs to remove the rows that were in the initial collection.</description>
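A minimal sketch of that removal step, assuming each row can be represented as a tuple of its cell values (the Excel reading and Blue Prism collection details are omitted; all names and values here are illustrative):

```python
# Illustrative: each "collection" is a list of row tuples,
# as if read from the 12pm and 2pm Excel files.
initial_rows = [
    ("TX001", "12pm", 100.00),
    ("TX002", "12pm", 250.00),
]
incremented_rows = [
    ("TX001", "12pm", 100.00),  # already processed at 12pm
    ("TX002", "12pm", 250.00),  # already processed at 12pm
    ("TX003", "2pm", 75.00),    # new at 2pm
]

# Keep only rows that were not in the initial collection.
seen = set(initial_rows)
new_rows = [row for row in incremented_rows if row not in seen]
print(new_rows)  # [('TX003', '2pm', 75.0)]
```

Using a set for the initial collection makes each membership check cheap, so this stays fast even for large files.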
      <pubDate>Fri, 28 Oct 2016 17:52:00 GMT</pubDate>
      <guid>https://community.blueprism.com/t5/Product-Forum/Complaring-collectons-and-removing-duplicate-Rows/m-p/57291#M11319</guid>
      <dc:creator>john_shiels</dc:creator>
      <dc:date>2016-10-28T17:52:00Z</dc:date>
    </item>
    <item>
      <title>If the collections are sorted</title>
      <link>https://community.blueprism.com/t5/Product-Forum/Complaring-collectons-and-removing-duplicate-Rows/m-p/57292#M11320</link>
      <description>If the collections are sorted, this is pretty easy: you loop through both simultaneously, advancing whichever is further behind, and discard the rows that overlap. Done this way, it can even be wrapped up in the single loop that processes the new collection; essentially you loop through all records from the newer file and skip them if they're also in the older one.
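A sketch of that two-pointer walk, assuming both collections are sorted by the same key (the row values are illustrative):

```python
def remove_overlap(old_rows, new_rows):
    """Walk two sorted lists together, advancing whichever is behind,
    and keep only the rows from new_rows that are not in old_rows."""
    result = []
    i = 0  # index into old_rows
    for row in new_rows:
        # Advance the old pointer while it is behind the current new row.
        while i < len(old_rows) and old_rows[i] < row:
            i += 1
        # Skip the new row if it matches an old one (an overlap).
        if i < len(old_rows) and old_rows[i] == row:
            continue
        result.append(row)
    return result

old = [("TX001",), ("TX002",)]
new = [("TX001",), ("TX002",), ("TX003",)]
print(remove_overlap(old, new))  # [('TX003',)]
```

Because each pointer only moves forward, both lists are traversed once, which is the single-pass behaviour described above.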
-Robin Toll</description>
      <pubDate>Fri, 28 Oct 2016 18:58:00 GMT</pubDate>
      <guid>https://community.blueprism.com/t5/Product-Forum/Complaring-collectons-and-removing-duplicate-Rows/m-p/57292#M11320</guid>
      <dc:creator>RobinToll</dc:creator>
      <dc:date>2016-10-28T18:58:00Z</dc:date>
    </item>
    <item>
      <title>As with all Blue Prism</title>
      <link>https://community.blueprism.com/t5/Product-Forum/Complaring-collectons-and-removing-duplicate-Rows/m-p/57293#M11321</link>
      <description>As with all Blue Prism processes, the expectation is that you are using a Blue Prism work queue as the basis for your work item processing.
As you load work into the Blue Prism work queue, you can simply check whether a duplicate item has already been loaded into the queue to be worked - and if it has, do not load it into the work queue again. There are examples of loading work into the work queue in one of the Process Examples (distributed on the same portal page as the process templates).</description>
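The check-before-load idea can be sketched like this (a plain Python stand-in for the work queue; in Blue Prism itself you would typically give each queue item a unique key and query the queue for that key before adding - the class and names here are illustrative, not a Blue Prism API):

```python
class WorkQueue:
    """Toy stand-in for a work queue that rejects duplicate keys."""

    def __init__(self):
        self.items = []
        self._keys = set()

    def add_if_new(self, key, data):
        # Only load the item if its key has not been queued before.
        if key in self._keys:
            return False
        self._keys.add(key)
        self.items.append((key, data))
        return True

queue = WorkQueue()
queue.add_if_new("TX001", {"time": "12pm"})  # loaded
queue.add_if_new("TX001", {"time": "12pm"})  # duplicate, skipped
queue.add_if_new("TX003", {"time": "2pm"})   # loaded
print(len(queue.items))  # 2
```

Keying each transaction by something unique (e.g. a transaction ID) means the duplicate check happens once at load time, and downstream processing never sees the same item twice.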
      <pubDate>Mon, 31 Oct 2016 19:07:00 GMT</pubDate>
      <guid>https://community.blueprism.com/t5/Product-Forum/Complaring-collectons-and-removing-duplicate-Rows/m-p/57293#M11321</guid>
      <dc:creator>Denis__Dennehy</dc:creator>
      <dc:date>2016-10-31T19:07:00Z</dc:date>
    </item>
    <item>
      <title>yes but which actions to be</title>
      <link>https://community.blueprism.com/t5/Product-Forum/Complaring-collectons-and-removing-duplicate-Rows/m-p/57294#M11322</link>
      <description>Yes, but which VBO actions should be used, or do we have to use a code stage to achieve this?</description>
      <pubDate>Thu, 29 Jun 2017 17:29:00 GMT</pubDate>
      <guid>https://community.blueprism.com/t5/Product-Forum/Complaring-collectons-and-removing-duplicate-Rows/m-p/57294#M11322</guid>
      <dc:creator>SumitSingh</dc:creator>
      <dc:date>2017-06-29T17:29:00Z</dc:date>
    </item>
  </channel>
</rss>

