If you have customer and transaction data in Salv, you can use the Scenario testing feature to test a future monitoring scenario, check how adjusting an existing scenario changes the number of matches, or double-check the hits that were skipped due to your alert suppression interval.
How to set up scenario testing
Go to Analyst toolbox → Monitoring scenarios.
Click on the scenario you want to test, or add a new one by clicking "+ New Scenario".
Select the version of the scenario you want to test, click on the three dots under "Actions", and choose "Test".
In case of Real-time or Post-event scenarios, select:
The period of the sample data that will be used for the test. The timeframe is based on the timestamp field for Transactions and the updated_time field for Persons.
The Person and Transaction Segments which should trigger the scenario.
In case of a Periodic scenario, select:
Testing time range - select the date range for the test. The "From" date marks the start of the Periodic scenario.
Periodic scenario interval, i.e. define how often the periodic scenario should run. The bottom of the screen will show the number of planned runs within the selected time range.
You can select up to 1 year of historical data to test.
Define the parameter values you want to test. By default, they will match the version's parameters, but you can adjust them for the test if needed. The version's parameters will not be changed.
If needed, add Alerting suppression intervals to the test.
Add a test note to clarify what is being tested or what changes you've made (e.g., "Threshold increased"). This note will appear in the Test Runs window, helping you differentiate test runs more easily.
If needed, toggle "Run on alerted entities only" to limit the test to entities previously alerted by this scenario within the selected time range for Real-time and Post-event scenarios. This option significantly reduces test duration and is ideal for comparing new versions to historically active ones.
Click Start test run.
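To make the relationship between the testing time range, the interval, and the number of planned runs concrete, here is a minimal sketch. The function name and the day-based arithmetic are illustrative assumptions, not part of Salv; they only mirror the idea that the first run happens on the "From" date and repeats every interval.

```python
from datetime import date

def planned_runs(start: date, end: date, interval_days: int) -> int:
    """Illustrative only: count how many periodic runs fit in [start, end]
    when the first run happens on the 'From' date and repeats every
    interval_days days."""
    if end < start or interval_days <= 0:
        return 0
    # Runs occur on start, start + interval, start + 2*interval, ...
    return (end - start).days // interval_days + 1

# A 90-day testing range with a 30-day interval gives 4 planned runs
# (days 0, 30, 60 and 90).
print(planned_runs(date(2024, 1, 1), date(2024, 3, 31), 30))  # 4
```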
Real-time and Post-event scenarios must contain a $transactionId parameter when Transaction is the alerting entity and a $personId parameter when it is Person; otherwise, the testing option will not appear for the version.
Some important notes:
You can start as many test runs as you wish; however, only one test runs at a time. The others remain scheduled and start only once the previous ones have finished.
If the scenario uses dynamically changing query templates, risk scores, or custom lists, the scenario test uses their state at the current time, not their state at the test time.
Persons with active Scenario snoozing will be included in the scenario test.
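To make the effect of an alert suppression interval easier to reason about, here is a rough sketch of the idea (an illustrative model, not Salv's actual implementation): once an entity alerts, further hits for the same entity within the interval are suppressed.

```python
from datetime import date

def apply_suppression(hits, interval_days):
    """Illustrative model of an alert suppression interval.
    hits: list of (entity_id, hit_date) sorted by date.
    Returns only the hits that would become alerts; a hit for an entity
    within interval_days of its previous alert is suppressed."""
    last_alert = {}
    alerts = []
    for entity, day in hits:
        prev = last_alert.get(entity)
        if prev is None or (day - prev).days > interval_days:
            alerts.append((entity, day))
            last_alert[entity] = day
    return alerts

hits = [
    ("person-1", date(2024, 8, 1)),
    ("person-1", date(2024, 8, 3)),   # suppressed by a 7-day interval
    ("person-1", date(2024, 8, 12)),  # outside the interval, alerts again
    ("person-2", date(2024, 8, 2)),
]
print(apply_suppression(hits, 7))  # 3 alerts; the 3 Aug hit is dropped
```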
Viewing test results
You can find all tests of a scenario under the Test runs tab on the scenario page.
Here you will see all runs that are completed, running, or scheduled. Click on a specific test run to view its results and access the detailed results dashboard.
Test run results page
The Test Run Results page provides a summary of test execution, allowing you to review key details, analyse results and take further actions.
At the top of the page, you can see the status of the test run (e.g., Finished). You also have the option to Go to Dashboard for more detailed data analysis (read more about the dashboard below).
Test run details - this section includes essential test execution information. The Sample Data field shows the time range of the data used, while Execution Time provides the start and end timestamps along with the elapsed time. The Scenario Run Speed indicates how many runs were performed, and the Run on Alerted Entities Only field confirms whether the test was executed exclusively on previously alerted entities.
Testing attributes - this section outlines the attributes (parameters and segments) applied during the test. From here, you can Create a New Version with Applied Attributes, allowing you to save the tested parameters as a new scenario version if needed.
Scenario query - this section displays the original query of the scenario version. If you need to review the full version details, including all original attributes and parameters, you can click View Original Version to compare them against the tested settings.
Results - this section lists each test run with its execution date and shows how many transactions or persons matched the test criteria. If no entity met the conditions, this is clearly indicated.
Rerun test - this option allows you to quickly adjust attributes and parameters and execute a new test without needing to reconfigure everything manually. This enables an efficient iteration process, helping you refine the scenario based on test outcomes.
The results page of a Periodic scenario provides the results of each run, so you will see multiple test runs with the run date indicated.
Please be aware that test results are limited to 50 000 alerts: if you get more than 50 000 hits, with or without added alert suppression intervals, 50 000 will be shown as the number of hits.
Test results dashboard
To see detailed information about a scenario test run, click "Go to dashboard" on the test run results page.
General analysis
The general analysis tab of the dashboard provides detailed information about the run and persons or transactions that have been hit by the scenario. You can find detailed explanations of some metrics by hovering over the information tooltips in the dashboard.
Alert suppression interval and periodic run analysis
To check detailed results of how each alert suppression interval would affect the scenario results, navigate to the Alert suppression interval and periodic run analysis tab.
In case of Real-time and Post-event scenario runs, you can check the detailed results of each alert suppression interval by clicking on the days parameter; for example, clicking on 1, 3 or 7 will show you the results for the selected alert suppression interval (including a visual graph of alerts per calendar week and month with the alert suppression interval applied).
In case of a Periodic scenario run, in addition to checking the results of each alert suppression interval, you can see the results of each periodic run. For example, in the picture below, you'd be viewing the results of a periodic scenario with a 30-day alert suppression interval when a run took place on the 5th of August: