TECH.insight

Automated testing on the MEAN stack

Monday 16 March 2015

You can automate a complete system-testing strategy for your project from front-end to back-end

These days, the MEAN stack – MongoDB, Express, AngularJS, and Node.js – has become a relatively well-established approach to building specific types of web projects.

The advantages of MEAN are that you can use JavaScript as a uniform language throughout the front- and back-end, and enjoy an increase of productivity as client-side developers can understand the server-side code and database objects with little extra effort.

When we first started working on AngularJS projects at AKQA, we realised how easily we could rely on well-written tests to give developers the confidence they were delivering high-quality software. The solution we used was Karma, a test runner created by the AngularJS team to bring a productive testing environment to large web applications.

Karma

Karma is framework-agnostic, which means tests can be written using the framework of your choice, e.g. Jasmine or Mocha. It also supports preprocessors – a very handy feature if an application has an intermediate build step (e.g. if you want to use CommonJS modules in your client-side JavaScript).

So far, this is satisfying to developers who want to ensure they're writing good quality code. But ultimately, we want a mechanism to automate running those tests as part of a Continuous Integration (CI) setup, and extend the test execution to the server-side (running Node.js). We also want to generate code coverage reports for both client- and server-side code, so that the health of a project can be measured in real time, not only by developers but also by a wider audience of project stakeholders.

Devising a testing strategy like this across the full stack can be time-consuming. There are many articles offering recommendations, but it is not easy to find advice that covers a unified solution from end to end.

In this article, I'm going to cover how to set up a testing and code coverage solution for the MEAN stack using Gulp, the streaming build system, to automate application workflow.

Automation with Gulp

Before running any tests, a linting task such as JSHint will help to identify suspicious issues in your code, presenting them neatly with a JSHint reporter.

This task can also be run as part of the CI setup for continuous checking. The following shows a simple Gulp setup for running JSHint against all the JavaScript files in a local src/scripts/ folder.

var gulp = require('gulp'),
    jshint = require('gulp-jshint'), // JSHint
    stylish = require('jshint-stylish'), // JSHint reporter
    JS_PATH = "src/scripts/";

gulp.task('lint', function() {
    return gulp.src([JS_PATH + '**/*.js'])
        .pipe(jshint())
        .pipe(jshint.reporter(stylish));
});
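
With this task in place, developers can run the same check locally that the CI server will run, assuming the Gulp command-line tool is installed:

$ gulp lint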

When it comes to testing AngularJS projects, Karma is the right choice, but developers can choose their own testing framework. I have opted for Jasmine to test my client-side JavaScript.
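
As a point of reference, a minimal Jasmine spec for an AngularJS controller – using the module and inject helpers provided by angular-mocks – might look like the following. The app module, the TopTenCtrl controller and its items property are hypothetical names used purely for illustration:

describe('TopTenCtrl', function() {
    var scope;

    // Load the application module before each spec
    beforeEach(angular.mock.module('app'));

    // Create a fresh scope and instantiate the controller under test
    beforeEach(angular.mock.inject(function($rootScope, $controller) {
        scope = $rootScope.$new();
        $controller('TopTenCtrl', { $scope: scope });
    }));

    it('starts with an empty list of items', function() {
        // Assumes the controller initialises $scope.items to an empty array
        expect(scope.items).toEqual([]);
    });
});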

Generating code coverage reports with Karma is not difficult. First, create a configuration file that lists, in its preprocessors block, the JavaScript files you wish to generate coverage reports for. Most preprocessors need to be loaded as external plugins, although CoffeeScript and coverage are available built-in.

Nowadays, it is not unusual to use a preprocessor such as Browserify to write modular JavaScript in the browser and then bundle up all of your dependencies. In the following example, I handle this scenario in the configuration file by loading karma-commonjs as a framework to be used in the preprocessors block later.

You may wonder why I would choose karma-commonjs and not karma-browserify if I am using Browserify to compile the project.

The reason is simply that coverage and karma-commonjs work seamlessly together – something that karma-browserify does not support at the time of writing (although this may change in future). Test this out by creating a new gulpfile.js with the following configuration:

var gulp = require('gulp'),
    browserify = require('gulp-browserify'),
    karma = require('gulp-karma'),
    uglify = require('gulp-uglify'),
    JS_PATH = "src/scripts/",
    DIST_PATH = "dist/",
    TEST_PATH = "src/tests/";

// Bundle up with Browserify, Minify and copy JavaScript
gulp.task('scripts', function() {
    return gulp.src(JS_PATH + 'app.js')
        .pipe(browserify({
            insertGlobals: true
        }))
        .pipe(uglify())
        .pipe(gulp.dest(DIST_PATH  + 'scripts/'));
});

// Generic error handler so a failing task does not bring down the Gulp process
function handleError(err) {
    console.log(err.toString());
    this.emit('end');
}

gulp.task('test-ui', function() {
    return gulp.src('./idontexist') // See https://github.com/lazd/gulp-karma/issues/9
        .pipe(karma({
            configFile: TEST_PATH + 'karma.conf.js'
        }))
        .on('error', handleError);
});
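
Before either task can run, the Gulp plugins referenced above need to be installed as development dependencies:

$ npm install --save-dev gulp gulp-browserify gulp-karma gulp-uglify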

Now create the Karma configuration file named karma.conf.js:

module.exports = function(config) {
    var JS_PATH = "src/scripts/",
        TEST_PATH = "src/tests/";

  config.set({
    basePath : '.',
    frameworks: ['jasmine', 'commonjs'],
    files: [
        JS_PATH + 'libs/angular/angular.js',
        JS_PATH + 'libs/angular-mocks/angular-mocks.js',
        JS_PATH + 'services/*.js',
        JS_PATH + 'controllers/*.js',
        JS_PATH + 'directives/*.js',
        JS_PATH + 'app.js',
        TEST_PATH + '/**/*-spec.js'
    ],
    preprocessors: {
        JS_PATH + 'app.js': ['commonjs', 'coverage'],
        JS_PATH + 'libs/angular/angular.js': ['commonjs'],
        JS_PATH + 'libs/angular-mocks/angular-mocks.js': ['commonjs'],
        JS_PATH + 'services/*.js': ['commonjs', 'coverage'],
        JS_PATH + 'controllers/*.js': ['commonjs', 'coverage'],
        JS_PATH + 'directives/*.js': ['commonjs', 'coverage'],
        TEST_PATH + '/**/*-spec.js': ['commonjs']
    },
    reporters: ['coverage', 'progress'],
    coverageReporter: {
        type: 'lcov',
        dir: 'reports',
        file: 'lcov.info'
    },
    port: 9876,
    runnerPort: 9100,
    colors: true,
    logLevel: config.LOG_INFO,
    autoWatch: true,
    browsers: ['PhantomJS'],
    captureTimeout: 60000,
    singleRun: true
  });
};

Only include those source files you want to generate coverage reports for in the preprocessors block – do not include test scripts or third-party libraries. Additionally, the coverageReporter block configures the format of the report that Istanbul (used internally by the coverage preprocessor) will produce, along with where it is written.
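
Bear in mind that each framework, reporter and browser launcher named in the configuration is provided by a separate Karma plugin, so – depending on your Karma version – you may also need to install them alongside Karma itself:

$ npm install --save-dev karma karma-jasmine karma-commonjs karma-coverage karma-phantomjs-launcher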

SonarQube

If the report type selected is LCOV, you can then import the resulting coverage reports into SonarQube for Continuous Inspection, raising code quality visibility for all stakeholders and making it an integral part of the software development lifecycle.

Assuming you have a SonarQube server installed and running, the SonarQube JavaScript plugin can be used to analyse the client-side source and import the LCOV coverage report generated by the Karma task.

Run analysis

Create a configuration file in the root directory of the project named sonar-project.properties:

# First configure general information about the environment

# Project properties
sonar.projectKey=your-unique-project-key
sonar.projectName=Your Unique Project Name
sonar.projectVersion=0.0.1

sonar.sources=src/scripts/
sonar.language=js
sonar.sourceEncoding=UTF-8

# To import the LCOV report
sonar.javascript.lcov.reportPath=reports/lcov.info
sonar.exclusions=file:src/scripts/third-party/**/*.js

To push the code to SonarQube for analysis, execute the following on the command line from within the project root directory:

$ sonar-runner

Integration testing in Node.js with Mocha

On the server, I use Mocha to test the Node.js RESTful API as it gives developers a lot of flexibility: it runs both synchronous and asynchronous tests, works with the assertion library of your choice and offers a range of reporters. On the other hand, Mocha may take more time to configure while you find the right combination of options for your application.

As an example, to test a Node.js RESTful API, add the following in your gulpfile.js after installing your dependencies:

var gulp = require('gulp'),
    mocha = require('gulp-mocha'),
    istanbul = require('gulp-istanbul'),
    JS_PATH_SERVER = "app/",
    TEST_PATH_SERVER = "tests/";

// Run Node.js tests and create LCOV-format reports with Istanbul
gulp.task('test-server', function () {
  return gulp.src([JS_PATH_SERVER + '**/*.js'])
      .pipe(istanbul()) // Node.js source coverage
      .on('end', function () {
          gulp.src(TEST_PATH_SERVER + '**/*.js')
              .pipe(mocha({
                  reporter: 'spec'
              }))
              .on('error', handleError)
              .pipe(istanbul.writeReports('reports')); // Creating reports
      });
});
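
Running the task executes the Mocha specs and writes the Istanbul coverage output into the reports/ directory:

$ gulp test-server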

Your Node.js integration test scripts may then look like the following:

var should = require('should'),
    request = require('supertest'),
    app = require('../app'); // Adjust the path to wherever your Express app is exported

it('should respond with a successful top ten array', function(done) {
    request(app)
        .get('/api/topten')
        .set('Accept', 'application/json')
        .expect('Content-Type', 'application/json; charset=utf-8')
        .expect(200)
        .end(function(err, res) {
            should.not.exist(err);
            res.should.have.status(200);
            res.body.should.be.instanceof(Array);
            done();
        });
});
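
The spec above assumes an Express application object is exported from a module the test can require. As a rough sketch – the /api/topten route here simply returns the shape the test expects and is not a real implementation:

// app.js – a minimal Express app exposing the endpoint under test
var express = require('express');
var app = express();

app.get('/api/topten', function(req, res) {
    // In a real project this data would come from MongoDB
    res.json([]);
});

module.exports = app;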

Once the test-server task has been configured correctly, add the Node.js source and report paths to sonar-project.properties and run a new SonarQube analysis against the server-side JavaScript to complete the end-to-end system testing strategy.

Using Protractor for real-world tests

If you want to go the extra mile, Protractor will enable end-to-end testing for AngularJS applications – running tests against your application in a real browser and interacting with it as a user would.

Setting up a testing strategy with Protractor goes beyond the scope of this article, but in summary, Protractor runs your Jasmine specs in a real browser via WebDriver, understands Angular-specific locators such as by.model and by.binding, and automatically waits for Angular to finish its work before each interaction.

An example spec file might look like the following, given in the standard Jasmine format:

describe('Protractor demo', function() {
  it('should add one and two', function() {

    // Load the URL in a browser
    browser.get('http://juliemr.github.io/protractor-demo/');

    // Simulate keyboard activity
    element(by.model('first')).sendKeys(1);
    element(by.model('second')).sendKeys(2);

    // Simulate a click of a form submit button
    element(by.id('gobutton')).click();

    // Test the visual result is what is expected
    expect(element(by.binding('latest')).getText()).toEqual('3');
  });
});

Here browser is a global variable created by Protractor, and used for browser-level commands such as navigation and interaction with the page.
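
For completeness, a minimal Protractor configuration file – just enough to point the runner at a folder of spec files – might look like this. The spec path is an assumption, and a Selenium server (or browser driver) still needs to be available:

// protractor.conf.js
exports.config = {
    framework: 'jasmine',
    seleniumAddress: 'http://localhost:4444/wd/hub',
    specs: ['src/tests/e2e/**/*-spec.js']
};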

With these techniques in conjunction with a CI setup, you can automate a complete system-testing strategy for your project from front-end to back-end, while increasing code quality visibility through a Continuous Inspection process.

About The Author

Ignacio is a Senior Web Developer at AKQA in London.