When creating manage connections with JSON files for external connections, it is difficult to troubleshoot exactly why the manage connection does not work. Some sort of tool or utility to troubleshoot, view logs, or consult reference information would be welcome, as it would reduce the support hours spent building good REST API 3.0 JSON files. It would also help to have some base connections for Chorus integration, to ensure integration endpoints are configured correctly in Hosted SS&C Cloud.
Raising this on behalf of one of our customers after a discussion this morning. The issue at hand was that two users appear to have inadvertently updated an Environment Variable at, or around, the same time, resulting in the old value remaining set. The idea is to have some form of lock, or a way to cache/track an Environment Variable being set, to reduce or eliminate this possibility in the future.
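Not part of the product today, but a minimal sketch of how such protection could work, using optimistic locking (a version number checked at write time); all class and method names here are hypothetical:

```python
import threading

class EnvironmentVariableStore:
    """Hypothetical store illustrating optimistic locking for variable updates."""

    def __init__(self):
        self._lock = threading.Lock()
        self._values = {}  # name -> (value, version)

    def read(self, name):
        """Return the current value and its version; callers keep the version."""
        with self._lock:
            return self._values.get(name, (None, 0))

    def update(self, name, new_value, expected_version):
        """Apply the update only if nobody changed the variable since it was read."""
        with self._lock:
            _, current_version = self._values.get(name, (None, 0))
            if current_version != expected_version:
                return False  # stale write rejected; caller must re-read and retry
            self._values[name] = (new_value, current_version + 1)
            return True
```

With this scheme, the second of two near-simultaneous writers presents a stale version number and is rejected instead of silently losing one of the updates.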
Currently, shared Outlook mailboxes are not supported by the Capture product. We would like the business model updated to support shared mailboxes. This would help the client by eliminating or reducing licensing requirements on the physical mailboxes.
I would suggest that Chorus add an email notification preference that can be turned on or off according to individual preference, allowing users to opt in to an email notification every time a new item is assigned to them in their worklist. The email should have a link to the new item that automatically opens the work in the Chorus Portal client, similar to the email notifications from Workday or ServiceNow.
Currently, when editing a stage's properties, or writing an expression, there is a data items filter on the right-hand side of the window. It would be really handy if this filter could have another group added: Exposure. This would help in finding all environment variables, session variables, and statistic data items, and remove the need for data item naming conventions as a workaround for finding them easily.
It would be preferable to have an additional setting for logging and sensitive data. While logging can be disabled or set not to log parameters, if an instance occurs where full logging has to be enabled at resource level to diagnose an issue, it would be ideal to be able to mark certain data so that it is still never captured in the logs. We have many processes that deal with sensitive data and have to build numerous workarounds to handle the data in a way that cannot accidentally be exposed in the logs. If there were a setting similar to "don't log parameters", it would allow us to build "normally", knowing that if full logging needed to be enabled, this data would be safe from the logs.
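This is not how Blue Prism implements logging internally, but the idea of redaction that survives any log level can be sketched with Python's standard logging module; the field names being scrubbed are hypothetical examples:

```python
import logging
import re

# Hypothetical sensitive field names; in a real setup these would come from configuration.
SENSITIVE_PATTERN = re.compile(r"(password|ssn|account_number)=\S+", re.IGNORECASE)

class RedactingFilter(logging.Filter):
    """Scrubs sensitive key=value pairs before a record is emitted,
    regardless of the logger's level."""

    def filter(self, record):
        record.msg = SENSITIVE_PATTERN.sub(
            lambda m: m.group(0).split("=")[0] + "=***", str(record.msg)
        )
        return True  # always emit the (now redacted) record

logger = logging.getLogger("process")
handler = logging.StreamHandler()
handler.addFilter(RedactingFilter())
logger.addHandler(handler)
logger.setLevel(logging.DEBUG)  # even at full logging, marked fields stay masked

logger.debug("calling API with account_number=12345678 and ref=ok")
# the emitted line contains "account_number=***" instead of the real value
```

The point of the sketch is that the redaction sits below the level setting: turning full logging on for diagnosis does not re-expose the marked fields.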
With Blue Prism Hub there is a need to store and distribute the API key. It would help if this could be automated, so that users could request the key and RabbitMQ could distribute it. It would probably need an approval or HITL (human-in-the-loop) stage gate.
Raised on behalf of Antonio Mathias Kühle: We wish it were possible to have a separate log that shows all API request transactions, with full details, so you can track exactly what your request contained and what the response was. This log could be a page visible in the BP interface, but the easy fix is simply to create a DB table or log file for it. The important thing is that it can be toggled on or off from the API configuration settings, separately for each API. It is also important, for the DB table, to be able to set a number of days to keep the logs before they are automatically deleted.
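To make the request concrete, here is one possible shape for such a table with a retention purge, sketched with SQLite; the schema, column names, and helper functions are all hypothetical, not part of any Blue Prism product:

```python
import sqlite3
from datetime import datetime, timedelta

# Hypothetical schema for a per-API request/response audit log with retention.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE api_request_log (
        id          INTEGER PRIMARY KEY,
        api_name    TEXT NOT NULL,
        request     TEXT NOT NULL,
        response    TEXT,
        status_code INTEGER,
        logged_at   TEXT NOT NULL   -- ISO-8601 UTC timestamp
    )
""")

def log_request(api_name, request, response, status_code, when=None):
    """Record one request/response pair; called only when logging is toggled on."""
    when = when or datetime.utcnow()
    conn.execute(
        "INSERT INTO api_request_log (api_name, request, response, status_code, logged_at) "
        "VALUES (?, ?, ?, ?, ?)",
        (api_name, request, response, status_code, when.isoformat()),
    )

def purge_older_than(days):
    """Delete rows past the configured retention window ('number of days to keep')."""
    cutoff = (datetime.utcnow() - timedelta(days=days)).isoformat()
    conn.execute("DELETE FROM api_request_log WHERE logged_at < ?", (cutoff,))
```

The per-API on/off toggle would simply decide whether `log_request` is called, and a scheduled job would run `purge_older_than` with the configured retention.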
When a user tries to open an email (.msg) file uploaded to Chorus, the Content Viewer (CV) window opens but the email file is not displayed in it; instead, the file is downloaded and the user is prompted by the browser to open it. This is fine unless the email file in question is large (e.g. contains a large attachment). When the user tries to open such a file, the CV window opens in chromeless mode, so there is no visible cue that the download is running; after a period of time with no file opened and no sign that anything is happening, there is a risk the user will close the window and report that the file cannot be opened. Is it possible to show some sort of status bar in the CV window to indicate that the download is in progress? Either a custom bar or the built-in browser status bar (I believe this would require the CV window to be launched with a flag that forces download progress to be shown in the status bar).
The application passes sensitive parameters in the URL line of many of the requests.

Business impact: Sensitive data could be disclosed unintentionally through transmission in the URL.

Description: The application uses the URL to pass sensitive data from the client to the server. Data passed in the URL can be exposed because it ends up in unintended locations, including server logs, local browser history, and proxy logs.

Reproduction steps: Using Burp Suite, navigate through the application. The application passes sensitive parameters in the URL line of many of the requests.

Affected locations: Hub v4.7, Interact v4.7, and Decipher v2.3.

Recommendation: When sensitive data is sent, ensure that POST requests are used instead of GET requests. POST data is not treated the same way as URL data when requests pass through intermediate systems, and does not typically get cached or logged. If technical constraints require data to be sent in the URL, strong encryption should be used to encrypt the values, and the encryption scheme should include protection against replay so that captured cryptographic values cannot be replayed to the server.

References: "Information exposure through query strings in URL", OWASP Foundation; "A02 Cryptographic Failures", OWASP Top 10:2021.
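The GET-versus-POST part of the recommendation can be illustrated with Python's standard library; the endpoint and parameter names below are hypothetical, not taken from Hub or Interact:

```python
# Sketch contrasting a GET that leaks a token in the URL with a POST that
# keeps it in the request body. Endpoint and parameter names are made up.
import urllib.parse
import urllib.request

token = "s3cr3t-token"

# Bad: the token becomes part of the URL, so it can end up in server logs,
# proxy logs, and browser history.
leaky_url = "https://example.com/api/lookup?session_token=" + urllib.parse.quote(token)

# Better: the token travels in the POST body, which intermediaries do not
# normally cache or log.
body = urllib.parse.urlencode({"session_token": token}).encode()
safe_request = urllib.request.Request(
    "https://example.com/api/lookup",
    data=body,        # the body carries the sensitive value
    method="POST",
)
```

No request is actually sent here; the point is simply where the sensitive value lives in each case.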
Feedback from our processors. Request: using the quick search lookup we are able to save lookups; however, if one of the 'Fields' needs updating, we cannot edit it, we have to delete the field from the query and re-enter it. Could this be made editable instead? It would make day-to-day searches easier, particularly those where the date needs to be altered every day, streamlining our processing times.
Allow users to assign multiple categories to a single event in the timeline, enabling richer classification without the need to overwrite or remove existing tags. For example, categorize an event as both "FTE event" and "legacy tag applied". Without this feature, users must repeatedly create or edit categories for different analyses, leading to inefficiencies, reduced flexibility, and potential loss of critical context for multi-faceted events. Goal: enhance event categorization flexibility, allowing users to perform diverse and overlapping analyses seamlessly while preserving existing tags for future use.
Enable the creation of metrics that count the number of times a specific pattern (e.g., X → Y → Z) occurs across all timelines, rather than limiting the count to the number of timelines where the pattern exists. Without this feature, users cannot accurately quantify the frequency of specific patterns within timelines, leading to incomplete analysis and missed insights into recurring behaviour. Goal: provide detailed insights into pattern frequency across timelines, improving the depth and accuracy of process analysis in the Predecessor Analysis module.
Add the ability to use logical operators (AND, OR, NOT) in derive field conditions, allowing users to define more complex and precise criteria for deriving new fields in the dataset. Without this feature, users are limited to basic conditions, requiring workarounds or multiple iterations to achieve the desired results, leading to inefficiencies and reduced flexibility in field derivation. Goal: enable more advanced and flexible field derivations by incorporating logical operators, enhancing analytical capabilities and reducing manual effort.
Introduce a percentage format option for derived metrics when shown in the Side by Side module, allowing users to display results as percentages (e.g. 92%) instead of decimals (e.g. 0.92). Currently, when derived metrics or KPIs are used in Side by Side, we cannot use number formatting to achieve percentage values. Goal: improve data presentation and user experience.
Include a breakdown table in the Predecessor Analysis module to provide detailed tabular insights, such as the frequency of each predecessor pattern, associated attributes, and their distribution across timelines. Currently, the Breakdown feature used in the Predecessor module does not have a table that gives values in tabular format, which would allow users to copy those values easily (as is possible in the Breakdown module). Goal: enhance analysis depth, enable better comparisons, and make it easy to copy data from a tabular format.
Enable users to select events and attributes across all modules where they are used, similar to how dimensions are filtered, allowing for consistent and efficient selection of data throughout the platform. Currently, the options for accessing attributes and events differ from module to module. Without this feature, users must manually search for and select events/attributes in each module, leading to inefficiencies, potential errors, and a fragmented experience when working across multiple modules. Goal: streamline data selection by providing a unified, efficient way to select events and attributes across modules, improving workflow consistency and reducing manual effort.
Allow users to open and apply saved queries from the Query Module directly within the Predecessor and Classifications modules, enabling easier access to previously defined queries for streamlined analysis. Without this feature, users must manually recreate queries for different modules, leading to inefficiencies, redundancy, and potential errors in replicating complex query conditions across modules. Goal: improve workflow efficiency and consistency by enabling users to quickly access and use saved queries across different modules, reducing repetitive tasks and enhancing analysis speed.