Easily deploy files or directory hierarchies to a server using Grunt

Something we geeks need to do all the time is deploy files between machines, such as deploying a directory hierarchy over to a server for staging or production use. There are a ton of ways to do this. The old-school way is a shell script with carefully crafted rsync commands. In my case I build websites using AkashaCMS and need to deploy them to the destination webserver. Until now I'd added a command to AkashaCMS solely for deployment, but I'm experimenting with Grunt to see how much can be done using the Grunt ecosystem rather than having to maintain that code in AkashaCMS.

In AkashaCMS I simply used the Node.js spawn function to run an rsync command with selected command-line options, some of which came from the config file:

$ rsync --archive --delete --verbose --compress localDir/ user@remotehost.com:remoteDir/

One thing I'd like to avoid is the dependency this creates on Unix/Linux/MacOSX systems that have rsync. Those poor people who have to suffer through using Windows don't have rsync. Let's all feel sorry for them.

Unfortunately the solution I've ended up with doesn't solve the rsync-on-windows problem.

These solutions of course are built around the Node.js platform and the Grunt tool. My page on learning about Node.js has several books and resources to help you learn both.


The first tool I tried could possibly solve that problem, but I wasn't able to work out how to get it to run at a decent speed. Here goes anyway. The plugin is grunt-sftp-deploy, which gets loaded in your Gruntfile.js.
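Loading follows the standard Grunt pattern, inside the Gruntfile's module.exports function:

```javascript
// Register the tasks provided by grunt-sftp-deploy
grunt.loadNpmTasks('grunt-sftp-deploy');
```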


And make sure it's installed

$ npm install grunt-sftp-deploy

And/or add it to the devDependencies in your package.json (running npm install --save-dev grunt-sftp-deploy does both steps at once) to make sure the plugin is always installed.

The next step of course is to configure the plugin.

module.exports = function(grunt) {
    grunt.initConfig({
        'sftp-deploy': {
            deploy: {
                auth: {
                    host: 'example.com',
                    port: 22,
                    authKey: process.env.HOME +"/.sftp-deploy-example.txt"
                },
                cache: 'sftpCache.json',
                src: 'source-directory-name',
                dest: 'destination-directory-name',  // this is on the remote host
                exclusions: [],
                serverSep: '/',
                concurrency: 4,
                progress: true
            }
        }
    });
};
This sets up copying from a local directory to one on a remote host. The remote host name is in the host parameter, the remote directory name in the dest parameter, and so on. That part is pretty straightforward.

The tricky part is the authentication. Since the Gruntfile gets checked into source control, it's a really bad idea to put user names or passwords in it. Fortunately Grunt makes it easy to read data from external files, which you hopefully don't put under source control.

In this case the grunt-sftp-deploy plugin uses the authKey parameter in several ways, one of which is as the name of a file containing a JSON object. It should be like so:

{
    "username": "user-name-on-server"
}

Then grunt-sftp-deploy also automatically looks for local SSH keys to use for passwordless authentication.

This works pretty slick and you can then type this command:

$ grunt sftp-deploy:deploy

The only problem is that it takes a long time because it appears to deploy every last file. Supposedly the cache parameter names a file used to store data that helps avoid uploading files which haven't changed, but that didn't work for me.

grunt-ssh, with .zip file

My next thought was to build a .zip file of the directory hierarchy I wanted to deploy, then use the grunt-ssh plugin to upload it, then ssh over a command to unpack the .zip file. This didn't work out because of the time involved in copying the .zip file. But let's take a look at this anyway.

Creating the .zip file goes something like this (using the archiver module):

var fs = require('fs');
var archiver = require('archiver');

// logger is assumed to be set up elsewhere in this module
module.exports.zipRenderedSite = function(config, done) {

    var archive = archiver('zip');
    var output = fs.createWriteStream(config.root_out +'.zip');
    output.on('close', function() {
        logger.info(archive.pointer() + ' total bytes');
        logger.info('archiver has been finalized and the output file descriptor has closed.');
        done();
    });
    archive.on('error', function(err) {
        done(err);
    });
    archive.pipe(output);
    archive.directory(config.root_out, ".");
    archive.finalize();
};


That was easy.

Then in the Gruntfile.js, load the grunt-ssh plugin tasks.
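Assuming the standard plugin-loading pattern, that's:

```javascript
// Register the sftp and sshexec tasks provided by grunt-ssh
grunt.loadNpmTasks('grunt-ssh');
```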

deployData: grunt.file.readJSON(process.env.HOME +'/.sftp-deploy-example.json'),
sftp: {
    deployZip: {
        files: { "./": config.root_out +'.zip' },
        options: {
            host: '<%= deployData.host %>',
            path: '<%= deployData.path %>',
            username: '<%= deployData.username %>',
            privateKey: grunt.file.read(process.env.HOME + "/.ssh/id_rsa"),
            showProgress: true
        }
    }
}

This again uses a JSON file in the home directory to store info that shouldn't be checked into source code control. The SSH private key is read directly.

This works without any external dependencies. I was going to take the next step of unpacking the .zip file on the remote host, but uploading the file (over 200MB) took so long that this too was a non-starter.
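For completeness, the unpack step would have used the sshexec task that grunt-ssh also provides. A sketch, in which the remote command and archive name are my assumptions:

```javascript
// Hypothetical sshexec configuration (a grunt-ssh task); the unzip
// command and archive name are assumptions, and the credentials
// mirror the sftp task above.
sshexec: {
    unpackZip: {
        command: 'cd <%= deployData.path %> && unzip -o site.zip',
        options: {
            host: '<%= deployData.host %>',
            username: '<%= deployData.username %>',
            privateKey: grunt.file.read(process.env.HOME + "/.ssh/id_rsa")
        }
    }
}
```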


That left using an rsync wrapper, since I was running out of options that use a pure JavaScript SSH2 implementation. As before, the plugin has to be installed and then loaded in the Gruntfile.
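Judging by the task configuration below, the wrapper in question is the grunt-rsync plugin; assuming so, installing and loading it looks like:

```javascript
// Assuming the grunt-rsync plugin, installed with:
//   npm install grunt-rsync --save-dev
// then registered in the Gruntfile:
grunt.loadNpmTasks('grunt-rsync');
```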

rsync: {
    deploySite: {
        options: {
            args: [ '--verbose', '--archive', '--delete', '--compress' ],
            src: config.root_out +'/',
            dest: "example.com/",
            host: "remote-user-name@example.com"
        }
    }
}

This is a simple wrapper around the rsync command. The args array holds exactly the same arguments you'd pass on the command line, and the src and dest directories are exactly as you'd write them on the command line. You have to take the same precise care about when and where to put a trailing '/' on the directory names as you would with rsync itself.

With this, deployment is now this easy:

$ grunt rsync:deploySite