A service to enable functional testing of a Microsoft Bot Framework bot. A call to this service programmatically simulates a user’s back-and-forth conversation with a bot, to test whether the bot behaves as expected.
When calling the service, a Test is given as input. A Test is basically a “recording” of a user’s conversation with a bot. The Test is run against a given bot to check whether the conversation occurs as expected.
The service exposes a RESTful API for running Tests. The HTTP response code of an API call indicates whether the conversation occurred as expected. If it did not, the response body contains information about the Test failure.
In order to create Tests, you should work with the Bot Framework Emulator.
Note: After going through installation basics, make sure you configure ngrok as detailed here.
The simplest way to create a Test is to have a conversation with your bot within the emulator, then save the Transcript (.transcript file) as explained here. The Transcript can be used by itself as a Test for the service. The service will relate to the relevant information in the Transcript (ignoring conversation-specific details like timestamps, for example) and attempt to conduct a similar conversation, sending the user utterances to the bot and expecting the same bot replies.
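For orientation, a Transcript is a JSON array of Bot Framework activities. An abbreviated, illustrative fragment might look like the following (real .transcript files include additional fields such as ids, channel data, and timestamps, which the service ignores):

```json
[
  { "type": "message", "from": { "role": "user" }, "text": "hello" },
  { "type": "message", "from": { "role": "bot" }, "text": "Hi! How can I help?" }
]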
The service is a Node.js application. It can be installed using npm (`npm install`).
It can be easily deployed to Azure as an App Service.
The service communicates with a bot, therefore it needs to know the bot's Web Chat secret key.
The service may communicate with multiple bots, so each bot should be identified by a logical name. All bots' secrets need to be defined in a single environment variable named `SECRETS`, which should contain a string representing a JSON object. In this JSON object, each key (a bot's name) is mapped to a value (that bot's secret).
For example, let's assume we give the logical name 'samplebot' to the bot we would like to test, and that its Web Chat secret is '123'. Then we should have an environment variable named `SECRETS` set to the following string:
{"samplebot" : "123"}
In case you would like to test a single bot most of the time, you can define an environment variable called `DefaultBot` to specify the logical name of your default bot.
Note: If you are deploying the code sample using the "Deploy to Azure" option, you should set the variables in the Application Settings of your App Service.
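To make the configuration concrete, here is a minimal sketch of how a secret lookup over these two environment variables might work. The helper name `resolveBotSecret` is hypothetical (it is not part of the service's actual code), and the values shown are the example ones from above:

```javascript
// Sketch (hypothetical helper, not the service's actual code): resolve a
// bot's Web Chat secret from the SECRETS and DefaultBot environment variables.
function resolveBotSecret(env, botName) {
  // SECRETS holds a JSON object mapping each bot's logical name to its secret.
  const secrets = JSON.parse(env.SECRETS || '{}');
  // Fall back to the default bot when no name was passed as a query parameter.
  const name = botName || env.DefaultBot;
  if (!name || !(name in secrets)) {
    throw new Error(`No secret configured for bot '${name}'`);
  }
  return secrets[name];
}
```

With `SECRETS={"samplebot" : "123"}` and `DefaultBot=samplebot`, both an explicit 'samplebot' and an omitted bot name would resolve to the secret '123'.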
There are several options for calling the service to run a Test, using HTTP `GET` or `POST`. There are also several ways to pass Test parameters to the service.
In all cases, the service needs to be aware of the bot to test. The target bot is identified by a logical name. This name can be passed as a 'bot' HTTP query parameter, e.g. '…?bot=bot-logical-name'. If no bot name is specified as a query parameter, the service uses the `DefaultBot` environment variable.
The simplest way to run a Test is to `POST` an HTTP request to the `/test` route of the service. The request body should contain the contents of a Transcript in JSON (application/json) format.
Assuming our target bot is named 'samplebot' and our service was deployed to Azure as 'testing123', the request query may look like:
https://testing123.azurewebsites.net/test?bot=samplebot
In case you have `DefaultBot` set to 'samplebot', the request may look like:
https://testing123.azurewebsites.net/test
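The `POST` call above can be sketched as a small Node.js helper. The function name `buildTestRequest` and the service URL are illustrative (taken from the hypothetical 'testing123' example), not part of the service itself:

```javascript
// Sketch: assemble the POST request that sends a Transcript's JSON contents
// to the service's /test route. Names and URLs are illustrative examples.
function buildTestRequest(serviceUrl, botName, transcriptActivities) {
  const url = new URL('/test', serviceUrl);
  // Omit botName to let the service fall back to the DefaultBot variable.
  if (botName) url.searchParams.set('bot', botName);
  return {
    url: url.toString(),
    options: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(transcriptActivities),
    },
  };
}

// Usage (not executed here): a success response code means the Test passed.
// const { url, options } = buildTestRequest(
//   'https://testing123.azurewebsites.net', 'samplebot', transcript);
// const res = await fetch(url, options);
```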
Instead of `POST`-ing the Transcript as the request body, you can store it somewhere and give its URL to the service as a 'url' HTTP query parameter in an HTTP `GET` request.
Let's assume that we have a Blob Storage account on Azure called 'samplestorageaccount', and we uploaded a Transcript file called 'Sample.transcript' to a container called 'tests'. The corresponding request query may look like:
https://testing123.azurewebsites.net/test?bot=samplebot&url=https://samplestorageaccount.blob.core.windows.net/tests/Sample.transcript
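Assembling that `GET` request URL can be sketched in Node.js as follows. The helper name is hypothetical and the URLs are the example values from above; note that a real query string would percent-encode the Transcript URL:

```javascript
// Sketch: build the GET request URL that points the service at a Transcript
// stored elsewhere (e.g. Azure Blob Storage). Names are illustrative.
function buildTranscriptTestUrl(serviceUrl, botName, transcriptUrl) {
  const url = new URL('/test', serviceUrl);
  url.searchParams.set('bot', botName);
  url.searchParams.set('url', transcriptUrl); // percent-encoded automatically
  return url.toString();
}
```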
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.
When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.