16 Feb 2016

Broccoli and Angular.js


This post was initially written on 2014-05-28 and not published. Things might have changed.

Broccoli is a relatively new asset builder. It is based on doing operations on trees of files.

Here is how I used it to concatenate front end dependencies installed via bower and an angular.js app.


First, let me show you the two files you need for bower.


Here is an example bower.json that lists the frontend dependencies. Note the resolutions property.

{
  "dependencies": {
    "angular-ui-router": "0.2.11"
  },
  "resolutions": {
    "angular": "~1.3.0"
  }
}


The second file is .bowerrc. It is relative to your project and tells bower where to put the dependencies.

{
  "directory": "public/vendor"
}


Now we need to install broccoli. I’ve installed broccoli locally and broccoli-cli globally, as per the installation guide.

npm install --save-dev broccoli
npm install --global broccoli-cli

We also need to install plugins for broccoli;

npm install --save-dev broccoli-concat


Like all task runners, broccoli has its own file format to define its operations, though it’s not really a task runner but rather a build tool.

Here is the Brocfile.js to concatenate all of the above bower dependencies;

var concat = require('broccoli-concat');

var concatenated = concat('public/', {
  // order matters: jQuery, then angular, then ui-router, then our app code
  inputFiles: [
    'vendor/jquery/dist/jquery.js',
    'vendor/angular/angular.js',
    'vendor/angular-ui-router/release/angular-ui-router.js',
    'js/**/*.js'
  ],
  outputFile: '/assets/app.js',
  separator: '\n', // (optional, defaults to \n)
  wrapInEval: false // (optional, defaults to false)
});

module.exports = concatenated;

We explicitly define the order of concatenation in the inputFiles passed to concat. This way we have jQuery loading before angular, and angular loading before ui-router and our app code (which is assumed to exist in public/js).

Now running broccoli serve will start an HTTP server on port 4200, and the concatenated JavaScript will be available at http://localhost:4200/assets/app.js.

Hope that helps.

11 Nov 2014

Using Bluebird With Angular Protractor

Async control flow

There are a few places where you would want to use a promise. Protractor supports returning a Promise from the onPrepare function, but the example in the docs uses Q.

That example onPrepare written using Bluebird looks like this;

var Promise = require('bluebird');

onPrepare: function(){
  return Promise.delay(2000).then(function(){
    browser.params.password = '12345';
  });
}

A better example: the onPrepare function can be used to perform some async setup task, like creating a fake User in your database to be able to log in.

var User = require('./models/User');

onPrepare: function() {
  // User.create returns a Promise, which protractor will wait for
  return User.create({
    username: 'bulkan',
    password: 'igotdis'
  });
}

Test structure

Protractor uses Jasmine 1.3, patched so that expectations automatically resolve Promises.

describe('Home page', function(){
  it('should have username input', function(){
    var username = element(by.css('#username'));
    expect(username.isPresent()).toBe(true);
  });
});

expect automatically resolves the Promise, so there is no need to call .then on the element promise and assert inside the callback yourself.


Here is another example test that will verify that the home page is rendering Post titles. This time we have to chain onto the .then of the Promises.

var Promise = require('bluebird'),
    Posts = require('./models/Posts');

describe('Home Page', function(){
  it('should have a list of posts', function(done){
    var posts = element.all(by.repeater('post in posts').column('post.title'));

    Promise.cast(posts.map(function(elm){
      return elm.getInnerHtml();
    }))
    .then(function(titles){
      return titles.sort();
    })
    .then(function(sortedTitles){
      return Posts.findAll({attributes: 'title', order: 'title'}).then(function(rows){
        expect(sortedTitles).toEqual(rows.map(function(r){ return r.title; }));
      });
    })
    .nodeify(done);
  });
});

We need to Promise.cast the value posts.map returns so that we can call .nodeify, which is a bluebird function. nodeify helps simplify tests by removing the need to explicitly call done in both the last .then and a .catch.

Jasmine supports asynchronous tests by passing in a callback function to an it, just like in Mocha. In the test above we find elements by the repeater. The template used might look like;

<div ng-repeat="post in posts">
    <h1> {{::post.title}} </h1>
</div>

There might be an easier/simpler way to do this so please do let me know by commenting below.

09 Jun 2014

Using Express Router instead of express-namespace

express 4.0 has been out for a while and it seems people are still using express-namespace. According to npm it had 183 downloads on the 8th of June.

express-namespace hasn’t been updated in nearly two years, and it can now be replaced with the Router that comes with express 4.

Also, I’ve found that middleware mounted on namespace roots would be mounted at the application level. This is something else the Router solves, as it allows you to separate routes out into different modules, each with its own middleware.

Here is the example from express-namespace written using the Router in express 4.0.

var express = require('express'),
    forumRouter = express.Router(),
    threadRouter = express.Router(),
    app = express();

forumRouter.get('/:id/((view)?)', function(req, res){
  res.send('GET forum ' + req.params.id);
});

forumRouter.get('/:id/edit', function(req, res){
  res.send('GET forum ' + req.params.id + ' edit page');
});

forumRouter.delete('/:id', function(req, res){
  res.send('DELETE forum ' + req.params.id);
});

app.use('/forum', forumRouter);

threadRouter.get('/:id/thread/:tid', function(req, res){
  res.send('GET forum ' + req.params.id + ' thread ' + req.params.tid);
});

forumRouter.use('/', threadRouter);


A little bit more typing, but easier to explain to others, and no monkey-patching weirdness of express-namespace.

The routes are a little more explicitly defined.

Hope this helps.

28 Apr 2014

Mocking a function that returns a (bluebird) Promise

With Sinon.JS, mocking functions is quite easy. Here is how to stub a function that returns a Promise.

Demonstrated with a potato-quality example. Imagine the following code is in a file named db.js;

var Promise = require('bluebird');

module.exports.query = function query(q) {
  return Promise.resolve([
    {
      username: 'bulkan',
      verified: true
    }
  ]);
};

Using bluebird we simulate a database query which returns a Promise that is resolved with an Array of Objects.

Imagine the following code located in users.js;

var db = require('./db');

module.exports.getVerified = function getVerified(){
  return db.query('select * from users where verified=true');
};

Here is the mocha unit test for the above, which stubs out the db.query called in users.js;

var Promise = require('bluebird')
  , db = require('./db')
  , should = require('chai').should()
  , sinon = require('sinon')
  , users;

describe('Users', function(){
  var sandbox, queryStub;

  beforeEach(function(){
    sandbox = sinon.sandbox.create();
    queryStub = sandbox.stub(db, 'query');
    users = require('./users');
  });

  afterEach(function(){
    sandbox.restore();
  });

  it('getVerified should return a resolved Promise', function(){
    queryStub.returns(Promise.resolve([{ username: 'bulkan', verified: true }]));
    var p = users.getVerified();
    return p;
  });
});

In the beforeEach and afterEach functions of the test we create and restore a sinon sandbox. This is slightly overkill for this example, but it lets you stub out a few methods without worrying about manually restoring each stub later on; restoring the whole sandbox in afterEach takes care of it.

There is one test case that tells queryStub to return a resolved Promise. The test then returns the promise that users.getVerified gives back; Mocha will wait for Promises returned from its tests to resolve, and fail the test if the promise rejects.

Sorry about the potato-quality example; I’ve been trying to think of a better one. Any suggestions?

Hope this helps.

24 Apr 2014

Using mockery to mock modules for Node.js testing

In a previous article I wrote about mocking methods on the request module.

request also supports another workflow in which you directly call the imported module;

var request = require('request');

request({
  method: 'GET',
  url: 'https://api.github.com/users/bulkan'
}, function(err, response, body){
  if (err) {
    return console.error(err);
  }
  console.log(body);
});


You pass in an options object specifying properties like the HTTP method to use and others such as url, body & json.

Here is the example from the previous article updated to use request(options);

var request = require('request');

function getProfile(username, cb){
  request({
    method: 'GET',
    url: 'https://api.github.com/users/' + username
  }, function(err, response, body){
    if (err) {
      return cb(err);
    }
    cb(null, body);
  });
}

module.exports = getProfile;

It’s not that big of a change. To unit test the getProfile function, we will need to mock out the request module imported by the module getProfile is defined in. This is where mockery comes in. It allows us to change what gets returned when a module is imported.

Here is a mocha test case using mockery. This assumes that the above code is in a file named gh.js.

var sinon = require('sinon')
  , mockery = require('mockery')
  , should = require('chai').should();

describe('User Profile', function(){
  var requestStub, getProfile;

  before(function(){
    mockery.enable({
      warnOnReplace: false,
      warnOnUnregistered: false,
      useCleanCache: true
    });

    requestStub = sinon.stub();

    // replace the module `request` with a stub object
    mockery.registerMock('request', requestStub);

    getProfile = require('./gh');
  });

  after(function(){
    mockery.disable();
  });

  it('can get user profile', function(done){
    requestStub.yields(null, {statusCode: 200}, {login: "bulkan"});

    getProfile('bulkan', function(err, result){
      if(err) {
        return done(err);
      }
      result.login.should.equal('bulkan');
      done();
    });
  });
});

mockery hijacks the require function and replaces modules with our mocks. In the above code we register a sinon stub to be returned when require('request') is called. Then in the test we configure the stub with .yields to call the callback passed to request with null for the error, an object for the response and another object for the body.

From here you can write more tests by changing what the stub yields.

Hope this helps.

14 Apr 2014

AngularJS & Popup Windows

Popup windows are extremely annoying, hence most modern browsers block them, agreeably so. That being said, one use of popup windows is when doing OAuth: showing the OAuth authorization dialog in a popup window so as not to confuse the user.

If there is a better or different way please comment below.

All the code can be found at angular-popup.

Here is how I solved it using a simple express 4 application and the accompanying AngularJS.

The express code is very simple: it just creates two routes. The root/index route renders the view to bootstrap the angular application.

The angular app has one default route, /, with its controller set to PopupCtrl. In the template popup.html we use ng-click to call showPopup, a function bound on the $scope. This is the code for PopupCtrl;

Read the inline comments;

popupApp.controller('PopupCtrl', ['$scope', '$window', '$interval', function PopupCtrl($scope, $window, $interval) {
  'use strict';

  // assign the current $scope to $window so that the popup window can access it
  $window.$scope = $scope;

  $scope.showPopup = function showPopup(){
    // center the popup window
    var left = screen.width/2 - 200
        , top = screen.height/2 - 250
        , popup = $window.open('/popup', '', "top=" + top + ",left=" + left + ",width=400,height=500")
        , interval = 1000;

    // poll for a value that the popup assigns on its own window object
    var i = $interval(function(){
      try {
        // value is assigned by the popup once it has finished its work
        if (popup.value){
          $interval.cancel(i);
          popup.close();
        }
      } catch(e){
        // ignore access errors while the popup is on another origin
      }
    }, interval);
  };
}]);


We tell the popup to load the /popup URL, which our express app renders with a server-side jade template.

extends layout

block content
  h1 I'm a popup
  script.
    setTimeout(function(){
      window.opener.$scope.says = 'teapot';
      window.value = true;
    }, 2000);

The template above is simple enough. All it does is, after two seconds, assign to window.value to indicate to the $interval that the popup has done something important. The popup also assigns a value to window.opener.$scope, which is the $scope that was assigned in PopupCtrl.
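The polling pattern itself is independent of Angular; with plain timers it looks something like this (the delays are made up);

```javascript
// simulated "popup" object; in the browser this would be the
// window returned by window.open
var popup = {};

// simulate the popup finishing its work after a short delay
setTimeout(function(){ popup.value = true; }, 50);

function poll(onDone){
  setTimeout(function(){
    if (popup.value) {
      return onDone(popup.value);
    }
    poll(onDone); // not done yet, check again
  }, 10);
}

poll(function(value){
  console.log('popup finished:', value);
});
```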

As we have used ng-model in the default route's template, we will see the text teapot appear in the text input.

Hope this makes sense.

20 Jan 2014

Using Sequelize Migrations With An Existing Database


I’m sure you know how to install packages, but here is the command for the sake of completeness;

npm install sequelize async

The first migration

First initialize the migrations structure

sequelize --init

Then create the initial migration. Don’t edit this file, as we will only use it to create the SequelizeMeta table.

sequelize -c initial

Create another migration

sequelize -c create-tables

Dump the database

Now dump your database without the data, using mysqldump;

mysqldump -d --compact --compatible=mysql323 ${dbname} | egrep -v "(^SET|^/\*\!)"

We need to remove the lines beginning with SET, which the egrep in the command above takes care of.

Save this dump to the migrations folder and name it initial.sql

Edit the last migration that was created to look like;

var async = require('async')
  , fs = require('fs');

module.exports = {
  up: function(migration, DataTypes, done) {
    var db = migration.migrator.sequelize;

    async.waterfall([
      function(cb){
        fs.readFile(__dirname + '/initial.sql', function(err, data){
          if (err) throw err;
          cb(null, data.toString());
        });
      },

      function(initialSchema, cb){
        // need to split on ';' to get the individual CREATE TABLE sql
        // as db.query can execute one query at a time
        var tables = initialSchema.split(';').filter(function(sql){
          return sql.trim().length > 0; // drop the empty trailing entry
        });

        function createTable(tableSql, doneCreate){
          db.query(tableSql).done(doneCreate);
        }

        async.each(tables, createTable, cb);
      }
    ], done);
  },

  down: function(migration, DataTypes, done) {
    migration.showAllTables().success(function(tableNames){
      // Don't drop the SequelizeMeta table
      var tables = tableNames.filter(function(name){
        return name.toLowerCase() !== 'sequelizemeta';
      });

      function dropTable(tableName, cb){
        migration.dropTable(tableName).done(cb);
      }

      async.each(tables, dropTable, done);
    });
  }
};

Please explain

On the migration's up function we use async.waterfall to orchestrate the async calls;

  • read in the initial.sql file
  • split initial.sql to retrieve each CREATE TABLE query, as db.query can execute one query at a time
  • run each of these queries using async.each
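The splitting step can be seen in isolation; for example;

```javascript
// a schema-only dump, as produced by mysqldump -d, holding two statements
var initialSchema = 'CREATE TABLE `users` (`id` int);\nCREATE TABLE `posts` (`id` int);';

// split on ';' and drop empty leftovers, one CREATE TABLE per entry
var tables = initialSchema.split(';').filter(function(sql){
  return sql.trim().length > 0;
});

console.log(tables.length); // 2
```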

On the migration's down function we just remove all tables that are not the SequelizeMeta table. For some reason migration.dropAllTables() removes this table too and messes up the migrations. Not sure if this is the correct behavior.

Hope this helps