mu is a tool that makes it simple and cost-efficient for developers to use AWS as the platform for running their microservices. In this second post of the blog series focused on the mu tool, we will use mu to incorporate automated testing in the microservice pipeline we built in the first post.
Why should I care about testing?
Most people, when asked why they want to adopt continuous delivery, will reply that they want to “go faster”. Although continuous delivery will enable teams to get to production quicker, people often overlook the fact that it will also improve the quality of the software…at the same time.
Martin Fowler, in his post titled ContinuousDelivery, says you’re doing continuous delivery when:
- Your software is deployable throughout its lifecycle
- Your team prioritizes keeping the software deployable over working on new features
- Anybody can get fast, automated feedback on the production readiness of their systems any time somebody makes a change to them
- You can perform push-button deployments of any version of the software to any environment on demand
It’s important to recognize that the first three points are all about quality. Only when a team focuses on injecting quality throughout the delivery pipeline can they safely “go faster”. Fowler’s list of continuous delivery characteristics is helpful in assessing when a team is doing it right. In contrast, here is a list of indicators that show when a team is doing it wrong:
- Testing is done late in a sprint or after multiple sprints
- Developers don’t care about quality…that is left to the QA team
- A limited number of people are able to execute tests and assess production readiness
- The majority of tests require manual execution
This problem is only compounded with microservices. By increasing the number of deployable artifacts by 10x or 100x, you are increasing the complexity of the system and therefore the volume of testing required. In short, if you are trying to do microservices and continuous delivery without considering test automation, you are doing it wrong.
Let mu help!
The continuous delivery pipeline that mu creates for your microservice will run automated tests that you define on every execution of the pipeline. This provides quick feedback to all team members as to the production readiness of your microservice.
mu accomplishes this by adding a step to the pipeline that runs a CodeBuild project to execute your tests. Any tool that you can run from within CodeBuild can be used to test your microservice.
Let’s demonstrate this by adding automated tests to the microservice pipeline we created in the first post for the banana service.
Define tests with Postman
First, we’ll use Postman to define a test collection for our microservice. Details on how to use Postman are beyond the scope of this post, but here are a few good videos to learn more:
- Intro to Postman & Collections
- How to Add Tests to a Postman Collection
- How to Manage Environments in Postman
Below, you will find an example of one of the requests in my collection:
Once we have our collection created and we confirm that our tests pass locally, we can export the collection as a JSON file and save it in our microservices repository. For this example, I’ve exported the collection to “src/test/postman/collection.json”.
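Before committing the export, it can be worth a quick sanity check that the file is a well-formed collection. The sketch below assumes Postman's Collection v2 export format, which carries an `info` block and a list of `item`s; the path matches the one used in this post:

```python
import json
from pathlib import Path

# Sanity-check an exported Postman collection before committing it.
# A Collection v2 export has an "info" block and a list of "item"s;
# the path below matches where this post stores the export.
COLLECTION_PATH = "src/test/postman/collection.json"

def looks_like_collection(path):
    """Return True when the file parses as JSON with the v2 top-level keys."""
    doc = json.loads(Path(path).read_text())
    return "info" in doc and isinstance(doc.get("item"), list)

if Path(COLLECTION_PATH).exists():
    print("valid" if looks_like_collection(COLLECTION_PATH) else "invalid")
```

This catches the common mistake of exporting an environment (or a single request) instead of the full collection.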
Run tests with CodeBuild
Now that we have our end-to-end tests defined in a Postman collection, we can use Newman to run these tests from CodeBuild. The pipeline that mu creates will check for the existence of a file named buildspec-test.yml and, if it exists, will use it to run the tests.
There are three important aspects of the buildspec:
- Install the Newman tool via NPM
- Run our test collection with Newman
- Keep the results as a pipeline artifact
Here’s the buildspec-test.yml file that was created:
```
version: 0.1

## Use newman to run a postman collection.
## The env.json file is created by the pipeline with BASE_URL defined

phases:
  install:
    commands:
      - npm install newman --global
  build:
    commands:
      - newman run -e env.json -r html,json,junit,cli src/test/postman/collection.json

artifacts:
  files:
    - newman/*
```
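The buildspec references an env.json file that the pipeline generates with BASE_URL pointing at the deployed environment. A minimal sketch of that file in Postman's environment-export format (the id, name, and URL values here are placeholders, not what the pipeline actually writes):

```json
{
  "id": "acceptance-env",
  "name": "acceptance",
  "values": [
    {
      "key": "BASE_URL",
      "value": "http://my-load-balancer.us-west-2.elb.amazonaws.com",
      "enabled": true
    }
  ]
}
```

Requests in the collection can then reference the deployed endpoint as `{{BASE_URL}}/bananas`.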
The final change we need to make for mu to run our tests in the pipeline is to specify the image CodeBuild should use for the test stage. Since Newman requires Node.js, we choose an image that has it preinstalled. Our updated mu.yml file now looks like:
```
environments:
  - name: acceptance
  - name: production
service:
  name: banana-service
  port: 8080
  pathPatterns:
    - /bananas
  pipeline:
    source:
      provider: GitHub
      repo: myuser/banana-service
    build:
      image: aws/codebuild/java:openjdk-8
    acceptance:
      image: aws/codebuild/eb-nodejs-4.4.6-amazonlinux-64:2.1.3
```
Apply these updates to our pipeline by running mu:
```
$ mu pipeline up
Upserting Bucket for CodePipeline
Upserting Pipeline for service 'banana-service'
…
```
Commit and push our changes to cause a new run of the pipeline to occur:
```
$ git add --all && git commit -m "add test automation" && git push
```
We can see the results by monitoring the build logs:
```
$ mu pipeline logs -f
2017/04/19 16:39:33 Running command newman run -e env.json -r html,json,junit,cli src/test/postman/collection.json
2017/04/19 16:39:35 newman
2017/04/19 16:39:35
2017/04/19 16:39:35 Bananas
2017/04/19 16:39:35
2017/04/19 16:39:35 New Banana
2017/04/19 16:39:35   POST http://mu-cl-EcsEl-1K74542METR82-1781937931.us-west-2.elb.amazonaws.com/bananas [200 OK, 354B, 210ms]
2017/04/19 16:39:35   Has picked date
2017/04/19 16:39:35   Not peeled
2017/04/19 16:39:35
2017/04/19 16:39:35 All Bananas
2017/04/19 16:39:35   GET http://mu-cl-EcsEl-1K74542METR82-1781937931.us-west-2.elb.amazonaws.com/bananas [200 OK, 361B, 104ms]
2017/04/19 16:39:35   Status code is 200
2017/04/19 16:39:35   Has bananas
2017/04/19 16:39:35
2017/04/19 16:39:35                     executed  failed
2017/04/19 16:39:35 iterations                 1       0
2017/04/19 16:39:35 requests                   2       0
2017/04/19 16:39:35 test-scripts               2       0
2017/04/19 16:39:35 prerequest-scripts         0       0
2017/04/19 16:39:35 assertions                 5       0
2017/04/19 16:39:35 total run duration: 441ms
2017/04/19 16:39:35 total data received: 331B (approx)
2017/04/19 16:39:35 average response time: 157ms
```
Adopting continuous delivery for microservices demands the injection of test automation into the pipeline. As demonstrated in this post, mu gives you the freedom to choose whatever test framework you desire and executes those tests for you on every pipeline execution. Only once your pipeline is doing the work of assessing your microservice's readiness for production can you achieve the goal of delivering faster while also increasing quality.
In the upcoming posts in this blog series, we will look into:
- Custom Resources – create custom resources like DynamoDB with mu during our microservice deployment
- Service Discovery – use mu to enable service discovery via `Consul` to allow for inter-service communication
- Additional Use Cases – deploy applications other than microservices via mu, like a WordPress stack
Until then, head over to stelligent/mu on GitHub and get started!
Additional Resources
- Introducing mu: a tool for managing your microservices in AWS – Introducing the motivation for mu and demonstrating the deployment of a microservice with it.
- Continuous Delivery Assembly Line – Video introducing the idea of continuous delivery as an assembly line for software.
- Stelligent AWS Continuous Delivery Screencast – See a live demonstration of a system that uses continuous delivery.
- An Introduction to AWS CodeBuild – Blog post introducing the CodeBuild service.
- Deploy to Production using AWS CodeBuild and the AWS Developer Tools Suite – In-depth post demonstrating the capabilities of AWS CodeBuild.
- cplee/banana-service – The final state of my repo that was used for this post, forked from stelligent/banana-service.
Did you find this post interesting? Are you passionate about working with the latest AWS technologies? If so, Stelligent is hiring and we would love to hear from you!