Azure Function - Integration tests automation

In this third post of the Azure Function blog post series, we discuss a topic that is often forgotten or set aside: integration testing.

"Integration testing (sometimes called integration and testing, abbreviated I&T) is the phase in software testing in which individual software modules are combined and tested as a group." (Wikipedia)

In the first post of this blog series, azure-functions-prepare-for-continuous-delivery, we addressed Azure Function tests and how we run them from our environment with Postman.

The goal of this post is to describe how to automate test execution, which will then be integrated into our DevOps CI / CD pipeline.

Our integration tests workflow

Before going into details about Postman, here is a small reminder of the APIs we need to test and their workflow in our integration tests.

First, we have our 2 Azure Functions:

  • https://azureaccount/GetHashKey
  • https://azureaccount/UpdateUserFlag

These two APIs take the user's VSTS token as an input parameter, so we must start our integration tests by first calling the VSTS Rest API:

https://<account>.visualstudio.com/_apis/WebPlatformAuth/SessionToken

Using Postman

To briefly describe Postman: it is a free tool (a Pro license is also available) that allows you to test calls to any HTTP endpoint: APIs, web services, URLs, and so on.
It is a very practical tool for API integration tests, both for basic and advanced use.

We will not describe all of its features in this article. See the Postman documentation for details.

Through a list of steps, we’ll show you how we used it for integration test automation.

Step 1: Create collections

In order to automate a series of tests, we created a Collection (which is similar to a folder) and arranged our tests in this Collection in their execution order.

clip_image001[4]

We named the Collection with the name of the project (1) and the name of the tests with the name of the tested API (2).

For more on Collections, see Creating collections.

Step 2: Using variables

To be able to test our APIs in different environments (Early Adopters / Production), and for security reasons, we replaced all environment-specific data with dynamic content through the use of variables.

clip_image003[4]

These variables can be used in any element of a query:

  • The Uri
  • The Request body
  • The header

For more on variables, see Variables.
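For example, assuming a funcUrl variable that holds the base address of our Azure Functions (the variable and parameter names here are illustrative), the Uri of a query can be written as:

```
{{funcUrl}}/GetHashKey?code={{functionKey}}
```

Postman substitutes the {{...}} placeholders with the values of the selected environment before sending the request.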

Step 3: The environments

Once our variables are defined, we create an environment and fill in the corresponding values.

clip_image005[4]

Each key corresponds to a variable of our tests.

You can obviously create as many environments as necessary; for example, in our case, we created 2 environments:

  • Local environment - for testing our Azure Functions in our local environment
  • Dev environment - our Azure development environment.

For more on environments, see Manage environments.

Step 4: Write script tests in Postman

Integration testing is not just checking that an API responds with a 200 success status code. The goal is to verify that the result corresponds to what was expected.

One interesting Postman feature is the ability to write tests (in JavaScript) in each query, which enable us to:

  • Manipulate environment keys [1]
  • Test the query's code status [2]
  • Test the content of the query's result [2]
  • Display trace information for debugging purposes [3]

Here are some uses of test scripts on our API queries:

clip_image006[4]

clip_image008[4]

When Postman runs the API call, these test scripts are executed after the response is received, and the results are displayed in the Postman Tests tab:

clip_image010[4]
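Put together, such a test script might look like the following sketch (it runs in Postman's sandbox and uses the pm API; the hashKey response field is an assumption for illustration):

```javascript
// [2] Check the status code of the query
pm.test("Status code is 200", function () {
    pm.response.to.have.status(200);
});

// [2] Check the content of the query's result
// (hashKey is an illustrative field name)
pm.test("Response contains a hash key", function () {
    var body = pm.response.json();
    pm.expect(body.hashKey).to.not.be.empty;
});

// [1] Store a value in the environment for the next query of the Collection
pm.environment.set("hashKey", pm.response.json().hashKey);

// [3] Trace information for debugging in the Postman console
console.log("GetHashKey returned: " + pm.response.text());
```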

For more on test scripts, see Test scripts.

Step 5: Run and automate integration tests in Postman

Once your queries are created and your tests written, Postman can run the tests in your Collections.

For this, we will use the Collection Runner:

clip_image012[4]

clip_image014[4]

To run all the tests in our collection automatically, go to the Runner and select:

  • The collection that contains the tests to run [1]
  • The environment that you want to run the tests against. It will contain the desired values for the variables [2]
  • The number of iterations, i.e. how many times the tests are run [3], which can be useful for obtaining performance metrics.

Then we click the "Run ..." button [4]; the tests are executed and the result of each one appears in the right panel [5]. The tests that run are the ones scripted in the Tests part of each query.

For more on the collection Runner, see Starting a collection run.

With all tests successfully completed, we can now create our project that will run these tests in our DevOps pipeline.

Preparing automation with Newman

Postman is a GUI tool, so to run our Postman tests in our CI / CD pipeline we use Newman, an npm package that runs Postman collections from the command line.

Before using Newman, we have to export our tests as well as the environment variables.

Export the collection

clip_image016[4]

Clicking the Export button generates a local JSON file (named Azure_Func_LD.postman_collection.json).

Export the environment

clip_image018[4]

The generated file (named postman_environment.json) is also a local JSON file; it contains the definition of the selected environment.
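As a sketch, such an exported environment file has the following shape (the variable names and values are illustrative):

```json
{
  "name": "Dev environment",
  "values": [
    { "key": "funcUrl", "value": "https://myfunctions.azurewebsites.net", "enabled": true },
    { "key": "vstsAccount", "value": "myaccount", "enabled": true }
  ]
}
```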

At this stage we have our 2 local JSON files, which we place in the same directory:

  • Azure_Func_LD.postman_collection.json, which contains all the API tests from our collection.
  • postman_environment.json, which contains all environment variables with their dev values.

We now need to add a package.json file, stored in the same directory. It will allow us:

  • To download and install Newman
  • To run Newman with the collection file and the environment file as a parameter

clip_image020[4]
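As a sketch, such a package.json might look like this (the package name and newman version are illustrative):

```json
{
  "name": "integration-tests",
  "version": "1.0.0",
  "scripts": {
    "testapi": "newman run Azure_Func_LD.postman_collection.json -e postman_environment.json"
  },
  "devDependencies": {
    "newman": "^3.9.0"
  }
}
```

Running npm install then downloads Newman locally, and npm run testapi executes the collection against the exported environment.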

Our preparation is not yet complete: we still need to replace the hardcoded values of the environment variables. These values must be dynamic, depending on the environment the tests will be executed against in our deployment pipeline.

To solve this problem, we added a data-tests.json file to our directory; it contains all the dynamic variables and token values.

clip_image022[4]

All these tokens will be replaced during our pipeline, as we will see in the next post of this series.
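To illustrate the idea, a token substitution step can be sketched like this (the __name__ token delimiters and the variable names are assumptions for illustration):

```javascript
// Sketch: substitute __name__ tokens in a data-tests.json-like payload.
function replaceTokens(content, values) {
  // Replace every __key__ occurrence with its value from the map;
  // unknown tokens are left untouched.
  return content.replace(/__(\w+)__/g, (match, key) =>
    key in values ? values[key] : match);
}

// Example with a data-tests.json-like template:
const template = '[{ "vsts_token": "__VstsToken__", "user": "__UserName__" }]';
console.log(replaceTokens(template, { VstsToken: "abc123", UserName: "devuser" }));
// → [{ "vsts_token": "abc123", "user": "devuser" }]
```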

This file will be used to override the environment variables already defined. To do this, we must complete the Newman command line in the script of our package.json:

"testapi": "newman run Azure_Func_LD.postman_collection.json -e postman_environment.json -d data-tests.json"

The last step is to configure the Newman reporters to show the test execution results in the cmd window, but also to generate the results in JUnit and HTML format.

In the end, we get the following Newman command:

"testapi": "newman run Azure_Func_LD.postman_collection.json -e postman_environment.json -d data-tests.json -r junit,cli,html --reporter-html-export TestResult/result-tests.html --reporter-junit-export TestResult/result-tests.xml"

Newman's complete documentation with all its parameters is available here.

Finally, we run this npm run testapi script locally as a test, and we get:

clip_image024[4]

With the above command, we display all the tests, their results, and their execution times in the cmd window, and we obtain our reporting files in the TestResult directory.

In the output of this command-line execution we can see, for each tested API:

  • Its Postman name [1]
  • Its console log traces written in the test scripts [2]
  • The results of all its test scripts: passed or failed [3]

At the end, the command displays a summary grid of passed and failed tests, along with test execution time metrics.

To integrate this into our pipeline, we need to copy this directory into our already existing Visual Studio solution.

clip_image026[4]

The sample source code used on this series of posts can be found in GitHub (github.com/ALM-Rangers/azurefunction-vsts-feature-flags).

You can also see more Postman tips on the Postman blog.

What's Next….

With this 3rd part of our Azure Function series, we have completed all the artifacts necessary for our CI / CD DevOps pipeline.

The next article will walk you through the construction of our DevOps pipeline in VSTS.

Then, in the last part of this series, we will explain how we integrated the monitoring of our Azure Function in Application Insights.

THANK YOU REVIEWERS: Charles Sterling, Rui Melo, Tiago Pascoal, Edward Fry