Auto-test
User Intents
Teneo Studio Desktop allows running automatic tests to verify that a given input triggers the right Flow triggers and transitions. The Auto-test uses the positive and negative examples of User Intent to test the Flows, so examples of User Intent should be added to the Flows in the solution before running an Auto-test.
Run an Auto-test
To open and run an Auto-test, follow the steps below:
- Go to the backstage of Teneo Studio Desktop (Solution tab)
- Select Auto-test in the left-side menu
- At the top of the Test page, click Run Test to run an Auto-test with the default settings.
Defining Auto-test settings
At the top of the Test page, it is possible to define the settings of the Auto-test to run.
The first three icons allow the user to specify which elements to include in the test: Triggers, Transitions, and Links. By default, all three elements are selected; to deselect any of them, simply click the icon.

Auto-test levels
Auto-tests can be done at three levels:
- Flow (note that this test level is performed within the Flow)
- Folder
- Solution
The Solution button at the top of the Test page allows the user to browse for a specific folder and define the level of the test.
The Run Test button allows the user to choose between two different test options:
- Run Test and
- Run Test Using Flow Scope

When selecting Run Test, the triggers and transitions in the scope (Flow, Folder, or Solution) are tested for the following:
- whether the positive examples of User Intent match the Matches of the trigger/transition they belong to,
- whether the negative examples of User Intent do not match the Matches of the trigger/transition they belong to, and
- for the examples that match the Matches, whether the trigger/transition is actually triggered, or whether the example is "stolen" by another trigger/transition with a higher rank in Ordering.
When selecting Run Test Using Flow Scope, the test checks how each trigger performs stand-alone; all other triggers in the solution are ignored. This option is particularly useful when working with Matches, to ensure that these cover the examples of User Intent.
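The difference between the two options can be sketched conceptually as follows. This is a hypothetical Python illustration only; the trigger structure, the `matches` predicate, and the `rank` field are assumptions made for the sketch, not Teneo's actual API.

```python
# Conceptual sketch of the two Auto-test modes (not Teneo's actual API).
# A trigger is modeled as a name, a match predicate, and an Ordering rank
# (a lower rank means a higher priority in Ordering).

def run_test(example, owner, triggers):
    """Full scope: every trigger in the Flow/Folder/Solution competes."""
    if not owner["matches"](example):
        return "failed: example did not match this trigger"
    # The example may be "stolen" by a trigger ranked higher in Ordering.
    for t in sorted(triggers, key=lambda t: t["rank"]):
        if t["matches"](example):
            return "succeeded" if t is owner else f"failed: stolen by {t['name']}"
    return "failed"

def run_test_flow_scope(example, owner):
    """Flow scope: the trigger is tested stand-alone; all others are ignored."""
    return "succeeded" if owner["matches"](example) else "failed"
```

In the stand-alone mode, a trigger whose Matches cover the example always succeeds, even if another trigger would steal the example in a full-scope test; this isolates Matches problems from Ordering problems.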
Auto-test at Flow level
Follow these steps to set up and run an Auto-test at Flow level:
- Open the Flow to be tested
- Select the items to be included in the test: Triggers - Transitions - URLs
- Click Run Test or Run Test Using Flow Scope at the top of the view.
When the Auto-test has finished, the results can be filtered by Succeeded, Failed, Succeeded with warnings, or Non-testable items.
Auto-test at Folder level
Follow these steps to set up and run an Auto-test at Folder level:
- Open the Auto-test page, either by right-clicking the folder to test in the Solution Explorer and selecting Auto-test in the context menu, or by going to the backstage of Teneo Studio (Solution tab > Auto-test)
- Select the items to be included in the test: Triggers - Transitions - URLs
- Click the Solution button to open the drop-down menu and select Browse for folder
- In the Folder Browser window, select the folder and click OK
- Lastly, click Run Test or Run Test Using Flow Scope at the top of the Auto-test page.
Auto-test at Solution level
To test all Triggers, Transitions and URLs set up in the solution, follow these steps:
- Go to the Auto-test page in the backstage of Teneo Studio (Solution tab > Auto-test)
- Select the items to be included in the test: Intent Triggers - Transitions - URLs
- Lastly, click Run Test.
Use Stable Documents
The Use Stable Documents/Use Latest Documents setting refers to the Version flag and allows choosing between running the Auto-test on all documents in the solution marked as stable, or on the latest version of all documents in the solution.
Auto-tests run on stable documents are marked with a Tested using stable versions flag at the top of the test results.
Cancel an Auto-test
As soon as an Auto-test has started, the Cancel test button appears on the page; clicking it cancels the run of the Auto-test. Note that the cancellation can take a moment, as it needs to be communicated to the server before the Auto-test is cancelled.

Intent Model
In case the Learn model isn't ready when starting an Auto-test, for example, because the model cache has expired or because a model training is in progress, the execution of the Auto-test will launch a new model training (or wait for the running training to finish) before starting the actual test of the User Intents in the solution. While the model isn't ready, a Training model... message is displayed below the Cancel test button.
When a CLU intent model is assigned to the solution, Auto-test uses the CLU model for the generation of the test results.
Learn more about Intent classification in Teneo.
Review former Auto-test results
The History panel, at the bottom of the Auto-test page, displays a summary of former Auto-tests. It is possible to Open any of the previous Auto-tests and Reload the list by using the available buttons.

Export test report
The user can export the Auto-test results in Excel format. To do so, simply click the Get Report button available at the top of the Auto-test page.
For further information on how to review the test results, please see the Auto-test results panel section.
Auto-test results panel
Test results panel
Results from Auto-tests are displayed in the Auto-test page in the backstage of Teneo Studio Desktop.
If a test was performed earlier, the results from the last Auto-test are automatically displayed when opening the page. If no test was executed previously, the test results populate the window only when the user starts the first Auto-test; learn how to run an Auto-test here.

The top area of the test results view includes a visual summary of the currently displayed test results (below Solution). The summary includes details about the number of tested Triggers, Transitions and Links as well as a percentage for the pass rate.
Action panel
Selecting any of the examples of User Intent in the results view will populate the right-side panel with information related to the selected User Intent example and the Trigger it belongs to.
In case of a failed test, the Trigger which stole the example (blocking the expected result) has a red line on its left side and is displayed at the top under Triggered. The test example is displayed next, in the Expected Result - Should trigger section.
For both failed tests and tests with warning, an All Matches section is available displaying information about all the Triggers which matched the User Intent example.
In the Action panel, the user can double-click any of the Triggers to open the Flow it belongs to. Furthermore, three icons are available above the All Matches section, which allow the user to open the Flow or the Trigger Ordering, or to obtain further information by clicking the Info button.

Filter options
The filters are available below the test result summary.

Test results can be filtered by using the following options:
- Text filter (applies to Flow names, examples of User Intent, and the test result message)
- Filter buttons to show/hide the following test items:
- Succeeded (green checkmark)
- Passed with warning (yellow triangle)
- Failed (red warning)
- Non-testable items (clipboard)
When many test results are available, the list of successful results is automatically hidden (the Succeeded button is automatically deselected). This happens in the following two scenarios:
- There are more than 100 results and at least one of them failed; in this case, only the failed results are displayed.
- There are more than 500 results, but none of them failed.
In both cases, the succeeded results can always be displayed by clicking the respective button.
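The auto-hide rule above can be summarized in a short sketch. The helper function is hypothetical and written for illustration only; the 100 and 500 thresholds are the ones documented above.

```python
# Hypothetical sketch of when succeeded results are shown by default
# (the 100/500 thresholds follow the documented Auto-test behavior).

def succeeded_shown_by_default(total_results, failed_count):
    """Return True if succeeded results are displayed without user action."""
    if total_results > 100 and failed_count >= 1:
        return False  # only the failed results are displayed
    if total_results > 500 and failed_count == 0:
        return False  # long list of successes; hidden by default
    return True       # in every other case, succeeded results are shown
```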
Test passed (with warning)
When a positive example of User Intent passes the Auto-test with a warning, it means that, although the example matched the correct trigger and triggered it, one or more triggers in the same Order group also match the positive example and could potentially trigger it.
In these cases, it is recommended to create order relations between the Triggers for which the warning is displayed, because even though these currently do not cause the Auto-test to fail, an order relation will help avoid conflicts or Auto-test failures in the future.
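The distinction between a failure and a warning can be illustrated conceptually. This is a hypothetical Python sketch; modeling the Order group as a `group` number (lower value means higher-ranked group) is an assumption for illustration, not Teneo's actual API.

```python
# Conceptual sketch: a positive example passes with a warning when other
# triggers in the SAME Order group also match it, because no order relation
# decides which one wins. The trigger modeling here is an assumption.

def classify_result(example, owner, triggers):
    matching = [t for t in triggers if t["matches"](example)]
    if owner not in matching:
        return "failed"               # Matches do not cover the example
    rivals = [t for t in matching if t is not owner]
    if any(t["group"] < owner["group"] for t in rivals):
        return "failed"               # stolen by a higher-ranked Order group
    if any(t["group"] == owner["group"] for t in rivals):
        return "passed with warning"  # ambiguous within the same Order group
    return "succeeded"
```

Creating an order relation between the overlapping triggers removes the ambiguity, which is why the documentation recommends it even though the warning does not fail the test today.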
Failed tests
The most common reasons for a failed test are:
- The Matches are not covering the positive examples of User Intent
- Ordering problem (the User Intent example is stolen by another Trigger with a higher rank in Ordering).
Recognize issues related to Matches
When the text The example did not match this trigger is displayed next to the failed example of User Intent, it indicates that the example is not matched by the Matches defined in the trigger and therefore was not triggered.

Another trigger, ranked below the current one, could still trigger the example (if covered by its Matches), but as this was most likely not the intention, the user is encouraged to open the failed Trigger and either edit the example of User Intent or update the defined Matches.
Recognize issues related to Ordering
When one or more examples of User Intent are triggered by one or more different Flows, in most cases this is due to Ordering and can be solved by either changing the Order group of one of the triggers or by creating a relation between the two triggers.
