Unit testing Express route handlers in isolation from everything, including Express

When we write unit tests, it's good practice to "mock" out everything extraneous to the code being tested. It's a bit like the scientific method: in the unit testing paradigm, testing means exercising each small portion of your code in isolation, precisely to eliminate unwanted variables. There are other testing paradigms, but unit testing has its own value. A big question for Node.js web application programmers is: how do you mock out HTTP requests for unit testing? In other words, how do you test a route handler function in isolation from the Node.js HTTPServer object, from Express, or from whatever framework your application uses?


Your unit tests need to focus on testing your application. You don't want unit tests of your application to be tripped up by bugs in Express; that wouldn't be unit testing.

Let's start with a basic assumption: your route definitions won't contain code themselves, but will instead invoke route handler functions located in another module. The way I structure Express applications is to have a script, app.js, containing the glue that binds together all the modules used in the application. Beyond that, app.js has very little code of its own other than calls to functions in other modules, whether in Express or in my own modules.

var express = require('express');
var http = require('http');

var model = require('./path/to/data-model-module');
var routes = require('./path/to/application-route-handlers-module');
var users = require('./path/to/route-handlers-module-for-user-authentication');
routes.configure({ model: model });
users.configure({ model: model });

var app = express();
app.set('port', process.env.PORT || 3000); // do other configuration

app.get('/login',      users.doLogin);
app.get('/logout',     users.doLogout);

app.get('/unprivileged-stuff', routes.doAppropriateFunction);
app.get('/some-privileged-stuff', users.ensureAuthenticated, routes.doPrivilegedOperation);

var server = http.createServer(app);

server.listen(app.get('port'), function(){
  console.log("Express server listening on port " + app.get('port'));
});

So that's roughly what I would do for an Express application. To test it as a whole Express app means some kind of browser automation, or sending HTTP requests and inspecting the responses. Neither approach is unit testing, because you're not testing the route handlers in isolation. That methodology is instead functional testing or integration testing.

To unit test the route functions - the contents of the routes module shown above - consider what that module actually is: it simply exposes a number of functions which receive request and response objects. The route handler functions don't care whether those objects resulted from real HTTP requests handled by a real HTTP server, or whether they're fake objects provided by a test framework, right?

Let's see how to do this.

The first task is to create fake data model(s), because the unit testing paradigm says to mock out anything extraneous to the code being tested. Before going further, note that I've just made an assumption: that your application is structured along model-view-controller lines. That is, the data model module(s) handle marshaling data in and out of a data store, providing a high-level, application-specific API to the data. The view code is essentially the templates with which Express renders the pages of the application. And the controller code is the route handler functions.

It should be quite straightforward to unit test model code. Simply call the API of the model modules, using test data of known characteristics.

Because route handlers (the controller) use data model code, the question is whether to unit test the route handlers on top of the "real" data model. Doing so wouldn't be unit testing, because the route handler functions aren't being tested in isolation if they're using real data models. Being purist about unit testing means concocting some kind of fake data model to use while testing the route handler functions. That in turn means the route handling module(s) must have an API for configuring the data model(s), so you can inject fake or real data models depending on the situation.

Here's approximately how I'd do that:

var model = require('./path/to/fake-mock-data-model-module');
var routes = require('./path/to/application-route-handlers-module');
routes.configure({ model: model });

The fake-mock-data-model-module of course must have the same API as the real data model, but for the purpose of unit testing it should supply known data against which you can write tests.
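Here's a sketch of how the two sides might fit together. In a real application the routes module and the fake model would each live in their own file; everything is inlined here for brevity, and the model API shown (findByName) is a hypothetical example:

```javascript
// A fake model: same API surface as the real one, but it returns
// known, deterministic data for the tests to assert against.
var fakeModel = {
    findByName: function(name, done) {
        done(null, { name: name, body: 'known test data' });
    }
};

// The route-handlers module keeps a module-local reference to
// whatever model it was configured with - real or fake.
var routes = (function() {
    var model;
    return {
        configure: function(params) { model = params.model; },
        doSomething: function(req, res, next) {
            model.findByName(req.query.name, function(err, note) {
                if (err) next(err);
                else res.render('noteview', note);
            });
        }
    };
})();

routes.configure({ model: fakeModel });

// Drive the handler with a fake request and response, capturing
// what it does so we can inspect it.
var captured;
routes.doSomething(
    { query: { name: 'n1' } },
    { render: function(view, data) { captured = [view, data.body]; console.log(view, data.body); } },
    function(err) { console.log('next', err); }
);
```

Because doSomething only ever calls the configured model's API, it has no idea whether real data is behind it.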

All route handler functions in an Express application have the same function signature:

doSomething = function(request, response, next)

The general idea is that each test scenario calls this function with faked-up objects, then inspects the response/behavior exhibited by the handler function. Let's look at implementing that.

routes.doSomething(
    { ... fake request object ... },
    { ... fake response object ... },
    function() { ... fake next function ... }
);
... test the data and behavior

I use the Vows test framework, and with a fake response object and a fake next function we can use Vows to easily write test cases:

var mockRes = function(vows) {
    return {
        render: function() { vows.callback('render', arguments); },
        redirect: function() { vows.callback('redirect', arguments); },
    };
};
var mockNext = function(vows) {
    return function() { vows.callback('next', arguments); };
};

The idea is that mockRes creates a fake response object. Express provides a Response object with more functionality than the plain Node.js HTTPServer response, but my application uses a bare minimum subset of that functionality, so the mock only needs render and redirect. Similarly, mockNext is a fake of the "next" function Express passes to route handler functions.
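To see the mechanics outside of Vows, here's the same pattern with a plain callback in place of Vows' this.callback. The doSomething handler here is a stand-in written for this sketch, not from the article:

```javascript
// Same mocks as above, but parameterized on an ordinary callback.
function mockRes(callback) {
    return {
        render: function() { callback('render', arguments); },
        redirect: function() { callback('redirect', arguments); }
    };
}
function mockNext(callback) {
    return function() { callback('next', arguments); };
}

// A trivial stand-in handler: redirect unless a user is on the request.
function doSomething(req, res, next) {
    if (!req.user) res.redirect('/login');
    else res.render('home', { user: req.user });
}

// Call the handler with a request lacking a user; the mock records
// which response method fired and with what arguments.
var result;
doSomething({}, mockRes(function(command, args) {
    result = { command: command, arg: args[0] };
}), mockNext(function() {}));

console.log(result.command, result.arg);
```

Whichever method the handler calls, the callback receives its name as a string plus the arguments, which is exactly what the Vows test cases below inspect.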

Now let's see how to write a unit test (again, I'm using the Vows framework):

vows.describe("test batch 1")
    .addBatch({
        "test scenario 1": {
            topic: function() {
                routes.doSomething({
                    query: { query: "parameters" },
                    ... other fields of the Request object
                }, mockRes(this), mockNext(this));
            },
            "should do something or other": function(command, args) {
                assert.match(command, /render/);
                assert.match(args[0], /showerror/);
                ... etc ...
            }
        }
    })
    .run();

Each route handler function has the signature doSomething(request, response, next). In this example we've mocked up a dummy object containing the request fields with which we want to test the route function. The mockRes and mockNext functions capture the route handler's response in a way that lets us inspect it in the test case.

The mockRes and mockNext functions give us a string, called "command" here, along with whatever data was passed in the function arguments. We inspect that data in the "should" clauses, verifying whether the route function responds correctly to each test scenario.

For this technique to work, your route handler functions must be written to allow plugging in fake data models. I know it's tempting to put SQL queries and the like directly in the route functions, but being serious about unit testing means structuring your code to allow injecting fake code while testing. A side benefit is that your application is segmented into semi-autonomous sections, which may help you evolve the application over the long term without having to rewrite everything.

For example, switching database engines (swapping out MySQL for MongoDB) is simplified when the database code is carefully isolated in a data model. Not only can it be tested in isolation, but if you did a good job with the model code you can swap one model for another without changing the rest of the application.
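A minimal illustration of that interchangeability, with hypothetical API names (saveNote, getNote) standing in for a real model's interface:

```javascript
// Two model implementations sharing one API surface. Application
// code written against the API works unchanged with either.
var memoryModel = {
    notes: {},
    saveNote: function(name, body, done) { this.notes[name] = body; done(null); },
    getNote: function(name, done) { done(null, this.notes[name]); }
};

// A stand-in for a different engine - in reality this would issue
// SQL or MongoDB queries behind identical function signatures. Here
// it just transforms the data so the two are distinguishable.
var otherModel = {
    notes: {},
    saveNote: function(name, body, done) { this.notes[name] = body.toUpperCase(); done(null); },
    getNote: function(name, done) { done(null, this.notes[name]); }
};

// The same application-level code, parameterized on the model.
function roundTrip(model, done) {
    model.saveNote('n1', 'hello', function() {
        model.getNote('n1', done);
    });
}

var memResult, otherResult;
roundTrip(memoryModel, function(err, body) { memResult = body; });
roundTrip(otherModel, function(err, body) { otherResult = body; });
console.log('memory:', memResult, '| other:', otherResult);
```

Swapping engines is then a one-line change wherever the model is configured, rather than a rewrite of every route handler.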

The moment you mix data model code into the route handlers - mixing controller and model - you've created a little mess for yourself. You no longer have the freedom to test route functions in isolation, or to swap database technologies, and you have to rewrite parts of your application whenever the schema changes. And so on.