Notes of Maks Nemisj

Experiments with JavaScript

When preparing an application for deployment to a production environment I want to be sure that everything is properly logged, especially the things which are unexpected in the code. That's why I think proper error handling is crucial for any application.

This is a short article on how to do error handling when going live with an isomorphic web app using React.js, Express.js and Fluxible. After seeing the word Fluxible you might think: "I don't use Fluxible, this is irrelevant to me." Still, hold on, since most of the points apply to any isomorphic SPA based on Express and React.

Step 1: Rendering

The first step is to prevent the initial rendering from breaking. This is the place in the code where render() is called on react-dom and renderToStaticMarkup() on react-dom/server.

An example for the browser:

import ReactDOM from 'react-dom';
try {
  ReactDOM.render(/* your root component and mount node */);
} catch (e) {
  console.error(e);
}

and one for the server:

import ReactServer from 'react-dom/server';
try {
    ReactServer.renderToStaticMarkup(/* your root component */);
} catch (e) {
  console.error(e);
}

In case you use promises in your code base, there is no need to put try/catch statements around these methods. Instead use the catch() function. The code below clarifies it:

import ReactServer from 'react-dom/server';

  // ...some code before
  .then(() => {
    ReactServer.renderToStaticMarkup();
  })
  .catch((e) => {
    console.error(e);
  });

Step 2: Express

After the rendering of React is fixed, there are other things which might go wrong on the server. For example, things might break before the rendering even starts. If you use Express.js you can catch them using a special error-handling middleware: http://expressjs.com/en/guide/error-handling.html

This middleware should be placed after all the other middleware:

import express from 'express';
const server = express();
server.use((req, res) => {
  // some rendering code
});

server.use((req, res) => {
  // some other handler
});

//// error middleware is HERE:
//

server.use((err, req, res, next) => {
  console.error(err);
  // respond, so the request doesn't hang
  res.status(500).send('Internal Server Error');
});

//
////

As you can see, this middleware expects to receive 4 arguments: the first one is the error object and the others are the same as in a normal middleware. Express recognizes an error handler by this arity, so all four arguments have to be declared even if next is unused.
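One thing worth noting: errors thrown inside asynchronous callbacks in a route don't reach this middleware by themselves; they have to be forwarded explicitly with next(err). A minimal sketch (somethingAsync is just a placeholder for any error-first callback API):

server.use((req, res, next) => {
  somethingAsync((err, data) => {
    if (err) {
      // forward the error to the error middleware shown above
      return next(err);
    }
    res.send(data);
  });
});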

Step 3: Global error handler

Besides the specific error handlers, there are also two global places for intercepting errors which can be used:

Step 3.1: Node.js

Node.js has global error hooks to catch errors inside asynchronous flows. Such errors might occur when you come back from I/O inside a callback and the code is not wrapped in try {} catch (e) {}. Example:

import superagent from 'superagent';

export default (url) => {
  return new Promise((resolve, reject) => {
    superagent
      .get(url)
      .end((err, result) => {
        // if an error is thrown here, only the global hook can catch it
        return resolve(result);
      });
  });
}

To set up the global error hook, use the uncaughtException event:

process.on('uncaughtException', (err) => {
  console.log(err);
});

A lot of people advise against using this hook, but I propose using it if you do some logging other than console.error. At least you can log the error with your own logger and then terminate the process:

process.on('uncaughtException', (err) => {
  logger.error(err);
  process.exit(1);
});

If you use Promises in your code base, or some of your dependencies use them, there is another event available: unhandledRejection. This one will catch rejected promises which have no .catch() handler attached. Example:

  // ...some code
  .then(() => {
    // at this place if error occurs, unhandledRejection might help
  });

Here is the hook to use:

process.on('unhandledRejection', (reason, p) => {
  console.error(reason);
});

A small note for those who use the source-map-support npm package: in order to make uncaughtException work you have to disable its own uncaught-exception handling in the module configuration:

require("source-map-support").install({
  handleUncaughtExceptions: false
});

Step 3.2: Browser

When code is running inside the browser, there is another way to catch unhandled errors. Such errors might occur not only while fetching data, but also inside browser event handlers: mouse clicks, key presses, scrolls. To set up the error handler, use window.onerror:

window.onerror = function onError(message, source, line, col, err) {
  console.error(err || new Error(message));
};

Be careful with the non-production build of React. It appears that React intercepts unhandled errors with ReactErrorUtils, and you will get a generic "Script error" instead of a meaningful one. When you build React for production everything works fine.

Step 4: Fluxible

Fluxible has its own way of handling errors. Whenever an error happens in an action you run via executeAction, it will be caught by Fluxible itself, which means it won't appear in any of the places above. In case you want to get hold of the error and do something with it, use componentActionErrorHandler when constructing the Fluxible instance:

new Fluxible({
  componentActionErrorHandler(context, fluxibleError) {
    // fluxibleError.err contains the original native Error
    console.error(fluxibleError.err);
  }
})

Step 4.1: Services

It's not a separate hook, but a friendly reminder: do something with your errors inside your services when fetching data. I have noticed that this is one of the places where people forget to do error handling.
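As a hedged illustration (fetchItems and url are just placeholder names), a service call that forwards its errors instead of silently dropping them could look like this:

import superagent from 'superagent';

export default function fetchItems(url, callback) {
  superagent
    .get(url)
    .end((err, result) => {
      if (err) {
        // log it with whatever logger you configured in step 3.1
        console.error(err);
        return callback(err); // and let the caller decide what to do
      }
      return callback(null, result.body);
    });
}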

Step 5

Whenever you use some framework or library, don't hesitate to look into its documentation: maybe it has its own way of handling errors. And please, do not leave your errors unattended, like luggage at the airport.


If you've decided to move your React components to ES6/ES2015 syntax you might have found out that defining propTypes and contextTypes is not as seamless as it was. Babel@6.7.7 doesn't yet support static properties on classes, and the most obvious way to use propTypes is to append them to the class at the end:

class SomeComponent extends React.Component {
  render() {
  }
}

SomeComponent.propTypes = {
  text: React.PropTypes.string
};

Though there is a little trick to do it inline in the class. Thankfully babel@6.7.7 supports static getters and setters, which we can use for that:

class SomeComponent extends React.Component {

  static get propTypes() {
    return {
      text: React.PropTypes.string
    }
  }

  render() {
  }

}

The same applies to contextTypes.
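For completeness, a contextTypes version using the same static getter trick might look like this (the router key is just an illustrative example):

class SomeComponent extends React.Component {

  static get contextTypes() {
    return {
      router: React.PropTypes.object
    };
  }

  render() {
    return null;
  }

}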

Now you can choose which method to use 🙂


If you have moved to Ubuntu 16.04 you may find that your old Vim setup is not working – some plugins are broken. This is due to the change of the Python interpreter for Vim ( https://wiki.ubuntu.com/XenialXerus/ReleaseNotes#VIM_defaults_to_python3 )

To fix this you have to use a different Vim package, like vim-gnome-py2. If you're like me and use the ncurses version of Vim, you're better off with the vim-nox-py2 package.

sudo apt install vim-nox-py2
sudo update-alternatives --set vim /usr/bin/vim.nox-py2
sudo update-alternatives --set vi /usr/bin/vim.nox-py2

That should fix broken plugins.


I still have to get used to these new arrow functions and the implicit return statement. If you're unfamiliar with them, here is the doc – https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Functions/Arrow_functions

Look at these two 'almost' identical pieces of code and think: what is the difference between them?

code-one.js

function run(context) {
  return methodOne()
    .then(() => doIt(context) );
}

and this one

code-two.js

function run(context) {
  return methodOne()
    .then(() => { doIt(context); });
}

They look the same, except that one will work correctly and the other one won't. Whenever an arrow function has curly braces, it expects statements in the body; when there are no curly braces, it treats the body as an expression and applies an implicit return to it. So code-one.js returns the result of doIt(context) from the .then() callback, while code-two.js returns undefined.

It's quite easy to overlook this when doing a code review or writing code at two o'clock in the morning. That's why I suggest not writing implicit code like this. Stick to the explicit return statement and you're safe. I know it's longer to write, but remember: "Time saved by less typing does not compare to the time spent on debugging this code."
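For reference, this is the explicit variant I would stick to; it behaves the same as code-one.js:

function run(context) {
  return methodOne()
    .then(() => {
      return doIt(context);
    });
}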

You’ve been warned.


As you know, getters and setters have been a part of JavaScript for some time. They're widely supported in all major browsers, even starting at IE8.

I don't think this concept is wrong in general, but I do think it's not very well suited for JavaScript. It might look like getters and setters are a time saver and a simplification of your code, but actually they bring hidden errors which are not obvious at first glance.

How do getters and setters work

First, a small recap of what these things are:

Sometimes it is desirable to allow access to a property that returns a dynamically computed value, or you may want to reflect the status of an internal variable without requiring the use of explicit method calls.

To illustrate how they work, let’s look at a person object which has two properties: firstName and lastName, and one computed value fullName.

var person = {
  firstName: "Maks",
  lastName: "Nemisj"
};

The computed value fullName would return a concatenation of both firstName and lastName.

Object.defineProperty(person, 'fullName', {
  get: function () {
    return this.firstName + ' ' + this.lastName;
  }
});

To get the computed value of fullName there is no more need for awful parentheses like person.fullName(); a simple var fullName = person.fullName can be used.

The same applies to setters: you can set a value through a function:

Object.defineProperty(person, 'fullName', {
  set: function (value) {
    var names = value.split(' ');
    this.firstName = names[0];
    this.lastName = names[1];
  }
});

Usage is just as simple as with the getter: person.fullName = 'Boris Gorbachev'. This will call the function defined above and split 'Boris Gorbachev' into firstName and lastName.

Where is the problem

You may think: "Hey, I like setters and getters, they feel more natural, just like JSON." You're right, they do, but let's step back for a moment and look at how fullName would have worked before getters and setters.

For getting a value we would use something like getFullName() and for setting a value person.setFullName('Maks Nemisj') would be used.

And what would happen if the name of the function is misspelled and person.getFullName() is written as person.getFulName()?

JavaScript would give an error:

person.getFulName();
       ^
TypeError: undefined is not a function

This error is triggered at the right place and at the right moment. Accessing a non-existing function of an object triggers an error – that's good.

Now let's see what happens when the setter is used with a wrong name:

  person.fulName = 'Boris Gorbachev';

Nothing. Objects are extensible and can have keys and values assigned dynamically, so no error will be thrown at runtime.
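You can see the effect directly in the console, continuing the person example from above:

person.fulName = 'Boris Gorbachev';

console.log(person.fulName);  // 'Boris Gorbachev' – a brand new key, silently created
console.log(person.fullName); // still 'Maks Nemisj' – the real setter was never called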

Such behavior means that the error might become visible somewhere in the user interface, or when some operation is performed on the wrong value, but not at the moment when the actual typo occurred.

Tracing errors which happened in the past but only show up in the future of the code flow is "so much fun".

Seal to the rescue

This problem could be partially solved by the Object.seal() API. Whenever an object is sealed, no new keys can be added to it, which means that the fulName assignment would try to add a new key to the person object and would fail.

For some reason, when I was testing this in node.js v4.0, it didn't work the way I was expecting, so I doubt this solution.
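For what it's worth, the silent failure is most likely because assigning a new key to a sealed object only throws in strict mode; in sloppy mode it fails silently. A minimal sketch of the strict-mode behaviour:

'use strict';

var person = {
  firstName: 'Maks',
  lastName: 'Nemisj'
};

Object.seal(person);

person.fulName = 'Boris Gorbachev';
// TypeError in strict mode: can't add a new property to a sealed object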

What is even more frustrating is that there is no such solution for getters at all. As I already mentioned, objects are extensible and fail-safe, which means accessing a non-existing key will not result in any error at all.

I wouldn't have bothered writing this article if this situation only applied to object literals, but after the rise of ECMAScript 2015 (ES6) and the ability to define getters and setters within classes, I've decided to blog about the possible pitfalls.

Classes to the masses

I know that classes are currently not very welcome in some JavaScript communities. People are arguing about the need for them in a functional/prototype-based language like JavaScript. However, the fact is that classes are in the ECMAScript 2015 (ES6) spec and are going to stay there for a while.

For me, classes are a way to specify a well-defined API between the outside world (the consumers of the class) and the internals of the application. It is an abstraction which puts the rules down in black and white and assumes that these rules are not going to change any time soon.

Time to improve the person object and make a real class of it (as real as a class can be in JavaScript). Person defines the interface for getting and setting fullName.

class Person {
  constructor(firstName, lastName) {
    this.firstName = firstName;
    this.lastName = lastName;
  }

  getFullName() {
    return this.firstName + ' ' + this.lastName;
  }

  setFullName(value) {
    var names = value.split(' ');
    this.firstName = names[0];
    this.lastName = names[1];
  }
}

Classes define a strict interface description, but getters and setters make it less strict than it should be. We're already used to errors being swallowed when typos occur in keys while working with object literals and JSON. I was at least hoping that classes would be more strict and give developers better feedback in that sense.

Unfortunately the situation is no different when defining getters and setters on a class. It will not stop others from making typos without any feedback.

class Person {
  constructor(firstName, lastName) {
    this.firstName = firstName;
    this.lastName = lastName;
  }

  get fullName() {
    return this.firstName + ' ' + this.lastName;
  }

  set fullName(value) {
    var names = value.split(' ');
    this.firstName = names[0];
    this.lastName = names[1];
  }
}

Executing it with a typo won't give any error:

var person = new Person('Maks', 'Nemisj');
console.log(person.fulName);

The same non-strict, non-verbose, non-traceable behavior leading to possible errors.

After I discovered this, my question was: is there anything that can be done to make classes more strict when using getters and setters? I found out: sure there is, but is it worth it? Adding an extra layer of complexity into the code just to use fewer parentheses? It's also possible simply not to use getters and setters for the API definition, which would solve the issue. But if you're a hardcore developer and willing to proceed, there is another solution, described below.

Proxy to the rescue?

Besides setters and getters, ECMAScript 2015 (ES6) also comes with the Proxy object. Proxies let you define delegating methods (traps) which are invoked before the actual access to a key is performed. In effect, they act like dynamic getters/setters.

A Proxy object can be used to trap any access to an instance of the class and throw an error if a pre-defined getter or setter is not found in that class.

In order to do this, two actions must be performed:

  • Create list of getters and setters based on the Person prototype.
  • Create Proxy object which will test against these lists.

Let’s implement it.

First, to find out what kind of getters and setters are available on the class Person, it’s
possible to use getOwnPropertyNames and getOwnPropertyDescriptor:

var names = Object.getOwnPropertyNames(Person.prototype);

var getters = names.filter((name) => {
  var result =  Object.getOwnPropertyDescriptor(Person.prototype, name);
  return !!result.get;
});

var setters = names.filter((name) => {
  var result =  Object.getOwnPropertyDescriptor(Person.prototype, name);
  return !!result.set;
});

After that, create a Proxy handler which checks access against these lists:

var handler = {
  get(target, name) {
    if (getters.indexOf(name) != -1) {
      return target[name];
    }
    throw new Error('Getter "' + name + '" not found in "Person"');
  },

  set(target, name, value) {
    if (setters.indexOf(name) != -1) {
      target[name] = value;
      return true;
    }
    throw new Error('Setter "' + name + '" not found in "Person"');
  }
};

person = new Proxy(person, handler);

Now, whenever you try to access person.fulName, the message Error: Getter "fulName" not found in "Person" will be shown.
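Putting the pieces together, a quick usage sketch based on the snippets above:

var person = new Person('Maks', 'Nemisj');
person = new Proxy(person, handler);

console.log(person.fullName); // 'Maks Nemisj'

person.fulName = 'Boris Gorbachev';
// Error: Setter "fulName" not found in "Person"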

I hope this article helped you to see the whole picture about getters and setters, and the dangers they can bring into your code.


Currently I'm working on a project which uses GitHub. We work with feature branches, which means every feature gets its own branch and every branch goes through a pull request before being merged back to the main line. Whenever a branch is merged back, it gets deleted on GitHub with the "Delete branch" button. GitHub allows restoring deleted branches, so we try to keep the branch list as short as possible.

I work from the command line, and after a while of this workflow the output of git branch becomes bigger and bigger while git branch -r remains small. Sure, it's possible to do git branch -d {branch_name}, but after a while I end up with the full list again.

To get rid of this manual routine I've created a script which cleans up all the branches that are available locally but have already been removed from the remote. It is a Python script and works only with Python 2.7: https://raw.githubusercontent.com/nemisj/git-removed-branches/master/git-removed-branches.py

By default this script will not remove any local branches, but only list the branches to be removed. To actually perform the deletion, the --prune flag must be specified.

One more trick: if you place this script inside a directory listed in your $PATH variable, you will be able to run it as a git command:

git removed-branches

I found this trick at http://thediscoblog.com/blog/2014/03/29/custom-git-commands-in-3-steps/

NPM

If you prefer node.js and npm, you can install the git-removed-branches version of the script via npm:

$ npm install -g git-removed-branches

JS source is available at github: https://github.com/nemisj/git-removed-branches


Have you ever needed to repeat a string or character multiple times? Sometimes I have this need (don't ask why) and it has always annoyed me. For such a simple operation you have to write a for loop and concatenate strings. I know there is now a repeat() method available in JavaScript, but only starting with ES6 (ES2015), which is not available everywhere.

var repeat = function (times, str) {
  var result = '';
  for (var i = 0; i < times; i++) {
    result += str;
  }
  return result;
};

But today, after thinking for a while, I've found a much cleaner and faster way to do it. Bit-shifting operations come to the rescue. Look at that beauty:

var repeat = function (times, str) {
    return (1 << (times - 1)).toString(2).replace(/./g, str);
}

Doesn't it look cute? Let's see how this works.

There are two ways to repeat a string: it can be built up using a for loop, or it can be created using a replacement method.

The first one we all know, but the second one is what forms the foundation of this trick.

We can take any string, for example 'string', and replace every character of it with any other character or string:

'string'.replace(/./g, 'new string');

This will repeat our 'new string' 6 times, because 'string' has 6 characters. In order to make the repetition adjustable, we have to generate an input string with exactly as many characters as the number of repetitions we want.

To create that input string, I've decided to use a bit-shifting operation and represent the bits as characters. I won't describe the idea behind bit shifting in detail, but in short it looks like this.

Since any number consists of bits, it is possible to do bit manipulation on it. For example, the number 2 is represented in bits as 00000010. If we apply the bit-shift operator << to this number, all the bits are shifted 'n' positions to the left. So shifting 2 << 5 gives 01000000. In order to see these bits, JavaScript provides the toString method with a radix argument. If you pass 2 to it, it will represent the number in binary format.

(2).toString(2)      // will be 10
(2 << 5).toString(2) // will be 1000000

Now that we know how bit shifting works and how to represent the result as a string, we can easily create a string with 'n' characters.

(1 << (n - 1)).toString(2);

That said, I have to note that bit shifting in JavaScript only works up to 31 bits; after that the bits overflow. That's the reason why it's only possible to repeat a string 31 times using this approach:

(1 << 32).toString(2); // will be 1 again

All that is left is to replace every character of this string with the string we need.

var repeat = function (times, str) {
    return (1 << (times - 1)).toString(2).replace(/./g, str);
}
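A quick check in the console:

repeat(3, 'na');  // 'nanana'
repeat(5, '-');   // '-----'
repeat(31, 'x');  // 31 times 'x' – the maximum this trick can handle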


Embellishment of a story

Recently I had to set up a new mini webserver. The functionality of this server was not very complex:

1. Fetch JSON data from a Jenkins (http://jenkins-cli.com) server
2. Transform JSON into another format
3. Read files from the file system
4. Return the result to the user

I took Express.js and the node-jenkins package as the server stack. Express was handling the http/https requests and the node-jenkins package was responsible for making requests to the Jenkins server.

To structure the asynchronous code I used the async.js library, because the jenkins package and the fs module of node.js are both based on error-first callbacks.

When everything was done, I suddenly realized that the world is almost at the next version of the JavaScript spec – ECMAScript 2015 – and that generators are available to us. Generators allow halting the execution flow inside a function and resuming it later. This makes it possible to use generators as an alternative way of handling asynchronous code in a synchronous manner.

var result = yield asyncMethodOne();
var result2 = yield asyncMethodTwo();

Support for generators has already been added to node.js behind the --harmony flag, and I could make use of that to restructure the code of the webserver.

I opened up Google and looked for available libraries to use as generator runners. I found a nice overview, http://blog.vullum.io/nodejs-javascript-flow-fibers-generators/, which shows different approaches to the same problem.

After looking at all these packages, a couple of things still bothered me.

First of all, it appears that current node code is not really suitable for all this generator stuff unless it's written with Promises. This means that to make my current code work I would have to wrap all of it in promises, e.g. with promisify:

var processedData1 = yield Promise.promisify(fs.readFile)(inputFile);

There was also another approach for using generators with "old-school" node.js functions. In this case I had to bind every function and pass it to the yield statement, so that the generator runner could pass itself as a callback to the yielded function.

var data = yield fs.readFile.bind(fs, inputFile);

I didn't like either solution. The first one does not look very clean and adds an extra abstraction which I don't like. The second one gives the wrong expectation to the reader of the code, because executing the function, fs.readFile(inputFile), means something different than binding it, fs.readFile.bind(fs, inputFile).

Yielding the yield

From the beginning of time node.js was implemented using error-first callbacks. They are so fundamental to node.js that a major part of the libraries in the npm registry works this way, and every developer who uses node.js knows how to apply the error-first callback technique.

When using yield constructs within asynchronous code you have to choose whether you use promises or wrap your calls. My idea was to create a library which would work with error-first callback functions in a natural way, without wrappers and promises. I called it yield-yield.

To show how it works, let's walk through sample code which reads a file from the file system.

var fs = require('fs');

var inputFile = '/etc/hosts';
var callback = function (err, content) {
  if (err) {
    console.error('Error when opening file: ' + err.message);
  }
  /* code */
};

fs.readFile(inputFile, 'utf8', callback);

In order to use the yield-yield library, a yield statement has to be passed to the fs.readFile function instead of the usual callback.

fs.readFile(inputFile, 'utf8', yield);

After that, to get the content from fs.readFile, a second yield statement must be used at the beginning of the call:

var fs = require('fs');

var inputFile = '/etc/hosts';
var content = yield fs.readFile(inputFile, 'utf8', yield);

The content which is normally passed by fs.readFile as the second argument to the callback is returned instead. If for some reason fs.readFile passes a first argument – in case of an error – then this yield will throw an Error. This way it can be caught using the standard try {} catch (e) {} syntax.

var fs = require('fs');

var inputFile = '/etc/hosts';
var content;

try {
  content = yield fs.readFile(inputFile, 'utf8', yield);
} catch (e) {
  console.error('Error when opening file: ' + e.message);
}

Because the yield statement is used, this code must be placed inside a generator function (one with an asterisk).

var fs = require('fs');

var readHostsFile = function *() {
  var inputFile = '/etc/hosts';
  var content;

  try {
    content = yield fs.readFile(inputFile, 'utf8', yield);
  } catch (e) {
    console.error('Error when opening file: ' + e.message);
  }
};

Still, it’s not enough to put an asterisk in a function definition.

Generators are supposed to be driven differently, using the generator's .next() method. In order to use this generator, yield-yield must be used as a runner for it.

To keep yield-yield compatible with existing code, the yield-yield runner transforms any given generator into an error-first callback function, so that it can be executed as if it were not a generator:

var sync = require("yield-yield");

var readMyFile = sync(function* () {
    try {
        var content = yield fs.readFile(inputFile, options, yield);
    } catch (e) {
        console.log('error detected');
    }
});

readMyFile(function () {
   console.log('Function is done');
});

In this way, yield-yield is compatible with existing code and can be used as an enrichment rather than a full replacement.

All the possible execution flows of the yield-yield can be found in the documentation.

Using yield-yield in real-world examples

Even if you don't plan to convert your code to a generator-based solution, you can start applying yield-yield to different parts of your operational code, for example inside tests, build tools or anything else.

Mocha

Mocha is one of the libraries which can run tests using the error-first callback mechanism. Let's see how an asynchronous test can be translated into yield-yield code.

First of all, code before transformation:

describe('file', function() {

  test('should do something async', function(done) {

    methodOne(function() {
      methodTwo(function() {
        return done();
      });
    });

  });

});

Now we can wrap the testing code in the yield-yield runner and use yield statements to do the async execution:

var sync = require('yield-yield');

describe('file', function() {

  test('should do something async', sync(function* () {

    yield methodOne(yield);
    yield methodTwo(yield);

  }));

});

If methodOne or methodTwo throws an exception or passes an error as the first parameter, the test will fail.

x file should do something async

You can also see that there is no done() call at the end. Because mocha passes done as an argument to the test function, yield-yield will execute it automatically at the end of the code flow.

PostCSS-cli

I took another library, postcss-cli, and looked at what could be changed. Postcss-cli is a command line tool for postcss. Inside index.js there is quite a bit of processing done with async.js.

I took the processCSS method and translated it into the yield-yield construction without changing any of the other code, and it still works.

GitHub

Code is available on GitHub: https://github.com/nemisj/yield-yield


In this article I will explain why you should test React.js components, how you can test them and what problems you might come across. The testing solution I use doesn't rely on React's TestUtils or on a DOM implementation (so you can run the tests in node.js).

Why/How to test React components?

The main question you might have with React components is: why test them at all, and what exactly should be tested?

Setup

Before I continue with the explanation I would like to show you the sample setup we will use. There are two components, ParentComponent.jsx and ChildComponent.jsx. In the render() of ParentComponent.jsx, ChildComponent is rendered based on a 'standalone' property.

Definition of ParentComponent.jsx:

var React = require('react');

var Child = require('./ChildComponent.jsx');

module.exports = React.createClass({
  displayName: 'ParentComponent',

  renderChildren: function() {

    if (this.props.standalone === true) {
      return null;
    } else {
      return <Child />;
    }

  },

  render: function() {
    return (
      <div className="parent-component">
        {this.renderChildren()}
      </div>
    );
  }
});

ParentComponent.jsx

Here is the definition of ChildComponent.jsx:

var React = require('react');

module.exports = React.createClass({
  displayName: 'ChildComponent',

  render: function() {
    return <div className="child-component"></div>;
  }
});

ChildComponent.jsx

The code can be found at https://github.com/nemisj/react-mock-testing with a commit history that follows this article.

Unit testing

In my opinion, whenever you have some logic in the code that depends on certain conditions, it's important to test it, in order to make sure that what you expect always holds. While this is obvious for 'usual' code, it's not always clear what to test in React components.

If you look at the example above you will see that the render() method of ParentComponent.jsx will return the following HTML:

<div class="parent-component">
  <div class="child-component"></div>
</div>

And renderChildren() will return in its turn:

  <div class="child-component"></div>

When seeing this, you might start asking.

Do I have to test HTML…?

That's a valid question and I was asking it myself too. The answer is: YES, you DO have to test it, and NO, you DON'T have to test it.

Yes, you DO have to test HTML

The reason for that is simple. Think about the HTML as if it were a return type. Imagine that renderChildren() returned a real instance of ChildComponent. You wouldn't even ask whether to test it or not: because it's an instance, you would just do some kind of instanceof check and that's it. But because React returns markup (HTML), it feels like a different story. Nevertheless, HTML is the only medium there is, so we have to deal with it as it is.

No, you DON’T have to test HTML

When testing a component we are not interested in the content of the HTML itself. It does not matter what kind of node a component returns, whether it's a <span> with a CSS class or a <div> with an attribute. What is important is to test what a certain element means to us inside our application/code.

Take for example the markup of ChildComponent: <div class="child-component"></div>.

Whenever it is returned by ParentComponent, this markup means to us that an instance of ChildComponent was returned. Not a <div> element with the CSS class "child-component", but an instance of ChildComponent. That's the reason why we DON'T test HTML as a browser language, but we DO test HTML as an instance definition.

Implicit instance checking

In its simplest form, in order to test our ParentComponent logic, we have to test that its render() method returns HTML which contains <div class="child-component"></div>. This way we can verify that ChildComponent was actually instantiated inside ParentComponent and not something else.

It feels like implicit instance checking, since we don't deal directly with the instance, but with a 'representation' of it.

Representation                               Type
<div class="child-component"></div>          ChildComponent.jsx
<div class="parent-component"></div>         ParentComponent.jsx

Writing tests

Let's look at possible tests for ParentComponent. I've used React.renderToStaticMarkup (https://facebook.github.io/react/docs/top-level-api.html#react.rendertostaticmarkup) to do assertions on strings. This method returns the string value of a rendered component.

In addition, this approach allows the tests to run in a node.js environment without having any DOM implementation available.

Below is the test test/ParentComponent.test.js, written using the mocha (http://mochajs.org/) testing framework.

var React = require('react');
var expect = require('chai').expect;

var ParentComponent = require('../ParentComponent.jsx');

describe('ParentComponent', function() {

  var childType = '<div class="child-component"></div>';

  it('should render with child', function() {
    var markup = React.renderToStaticMarkup(<ParentComponent />);

    expect(markup).to.contains(childType);

  });

  it('should render without child', function() {
    var markup = React.renderToStaticMarkup(<ParentComponent standalone={true} />);

    expect(markup).to.not.contains(childType);
  });

});

test/ParentComponent.test.js

As you can see, there is one test, 'should render with child', for testing the existence of the ChildComponent in the HTML, and another test, 'should render without child', for testing that the child component is not returned.

While this solution is working, it has one big disadvantage.

To see this, let's imagine that the definition of ChildComponent.jsx changes to the following form:

var React = require('react');

module.exports = React.createClass({
  displayName: 'ChildComponent',

  render: function() {
    return (
      <div className="child-component">
        Inner Text
      </div>
    );
  }
});

ChildComponent.jsx

Because the content of ChildComponent.jsx has changed to <div class="child-component">Inner Text</div>, our test will fail.

AssertionError: expected '<div class="parent-component"><div class="child-component">Inner Text</div></div>' to include '<div class="child-component"></div>'

This is the reason why testing HTML feels so wrong at first: the test of ParentComponent now depends on the implementation of ChildComponent, and deeper nesting means bigger changes in the returned HTML. But bear with me a little bit more.

Mock

As I said, we are not interested in the HTML itself, but only in the fact that this HTML represents a certain type. If we mock ChildComponent with our own definition, we can abstract the implementation of the child away from the parent.

For mocking I've used the rewire library, but you can use whichever one fits your architecture and needs. It's also possible that you use a dependency injection library in your architecture and need another way of mocking.

The rewire library allows patching private variables in a module. Just require a module using the rewire method and then use __set__ on it. Let's look at an example:

var rewire = require('rewire');
var ParentComponent = rewire('./ParentComponent.jsx');

var ChildComponentMock = {};

ParentComponent.__set__('Child', ChildComponentMock);

rewire-example.js

In this example the Child variable is replaced with an empty object (ChildComponentMock).

This leads my story to the next point.

We can create a mock component and replace the real one with it. By doing so, the mock's representation will be used whenever ParentComponent renders. For the comparison we can render the mock separately and use it in the assertion.

Below is an implementation of the test case together with the mock:

var React = require('react');
var expect = require('chai').expect;
var rewire = require('rewire');

var ParentComponent = rewire('../ParentComponent.jsx');

var ChildMock = React.createClass({
  render: function () {
    return <div className="child-mock-component" />;
  }
});

ParentComponent.__set__('Child', ChildMock);

describe('ParentComponent', function() {

  var childType = React.renderToStaticMarkup(<ChildMock />);

  it('should render with child', function() {
    var markup = React.renderToStaticMarkup(<ParentComponent />);

    expect(markup).to.contains(childType);

  });

  it('should render without child', function() {
    var markup = React.renderToStaticMarkup(<ParentComponent standalone={true} />);

    expect(markup).to.not.contains(childType);
  });

});

test/ParentComponent.test.js

Let’s walk through this code.

  • First of all the rewire module is required.
  • After that the ChildMock component is created. It will represent our ChildComponent type.
  • Using the __set__ method of rewire, the real component is replaced with the mock.
  • Finally, we check whether the markup of ParentComponent contains the mock's representation.

As you can see, by using a mock for ChildComponent we can test whether ParentComponent uses the correct component.

Small optimization

We can abstract the creation of the mock into a separate function and make the component distinguishable by a custom tag instead of a CSS class. Using React.createElement we can create custom tags.

function getMock(componentName) {
  return React.createClass({
    render: function () {
      return React.createElement(componentName);
    }
  });
}

var ChildMock = getMock('ChildComponent');

test/ParentComponent.test.js

The ChildMock representation will look like <ChildComponent></ChildComponent>.

Testing this.props

Components can be parameterized via props. Now that it's possible to express the type of a component via its HTML representation, let's think about how we can test the props as well. Since they are the input of our component, it's vital to test them too. Imagine that ChildComponent uses the property "childName" to render text inside its node. If ParentComponent passes a wrong value to it, we will get an incorrect screen.

In the code below I have simplified ParentComponent.jsx, removed the if statement and added a childName property when rendering ChildComponent:

var React = require('react');

var Child = require('./ChildComponent.jsx');

module.exports = React.createClass({
  displayName: 'ParentComponent',

  renderChildren: function() {
    return <Child childName="Name of the child"/>;
  },

  render: function() {
    return (
      <div className="parent-component">
        {this.renderChildren()}
      </div>
    );
  }
});

ParentComponent.jsx

If we use the current implementation of the mock, we will never find out which properties have been passed to ChildComponent, because they are dropped in the representation. By slightly modifying our mock component, we can serialize the properties into the HTML and make them comparable. React.createElement can help us here, because its second argument is converted into attributes of the node. This way we can pass the received properties on to it.

function getMock(componentName) {
  return React.createClass({
    render: function() {
      return React.createElement(componentName, this.props);
    }
  });
}

var ChildMock = getMock('ChildComponent');

The only problem with this solution is that React will skip attributes which don't belong to HTML unless they are prefixed with "data-". This means that we have to iterate over all the properties and add the "data-" prefix to all custom attributes. I say custom attributes, because we don't want to prefix attributes which are native to React, like className, disabled, etc. We can use the DOMProperty.isStandardName object from "react/lib/DOMProperty.js" to find out which properties are native. Attribute names must also be lowercased, otherwise React will give us an error saying that attributes must be all lowercase.

var DOMProperty = require('react/lib/DOMProperty.js');

var createAttributes = function (props) {
  var attrs = {};

  Object.keys(props).forEach(function (key) {
    var attrName = DOMProperty.isStandardName[key] ? key : ('data-' + key.toLowerCase());
    attrs[attrName] = props[key];
  });

  return attrs;
};

function getMock(componentName) {
  return React.createClass({
    render: function() {
      var attrs = createAttributes(this.props);
      return React.createElement(componentName, attrs);
    }
  });
}

test/ParentComponent.js

Now, if we instantiate ChildMock with the attribute childName, it will have the childName property serialized into the HTML, like this:

<ChildComponent data-childname="Name of the child"></ChildComponent>

In this way we can check both the type of the returned component and the properties which are passed to it.
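With that in place, the passed props can be asserted the same way as the type. A possible test, reusing the setup from above:

  it('should pass childName to the child', function() {
    var markup = React.renderToStaticMarkup(<ParentComponent />);

    // the mock serialized the prop into a data-* attribute
    expect(markup).to.contains('data-childname="Name of the child"');
  });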

Shallow rendering

Instead of writing our own mock, the shallow rendering feature of React's TestUtils could be used.

It allows type checking of the returned markup just like we do: shallow rendering returns only the first level of components. Unfortunately, that is exactly where the problem lies – it only gives back the first-level elements of the given element.

Nowadays, when React is moving away from mixins because of ES6 classes, people have found another way of getting mixins into play: wrapping components with virtual components. Virtual components are components which are not visible and have no representation of their own, but they bring some mixin-like functionality into the wrapped component. Fluxible, for example, is one of the libraries doing this.

To make it clear, look at the following code:

var ParentComponent = React.createClass({
  render: function() {
    return <div/>;
  }
});

function handleMixin(Component) {
  return React.createClass({
    render: function() {
      return React.createElement(Component, objectAssign({}, this.props, this.state));
    }
  });
}

ParentComponent = handleMixin(ParentComponent);

Because components are wrapped in other components, the first-level element for the shallow renderer is not the nested child, but the wrapping component itself. And this really is a problem, since we would like to test the children of the parent, and they're not available there.

That’s it for today. You can find source code of this article at https://github.com/nemisj/react-mock-testing.


Sometimes easy things turn out to be more complicated than initially thought. For example conditional IE comments in HTML, which I had to add today to the code I'm writing.

At my work we have to support Internet Explorer version 9 and higher. In order to use media queries we decided to use the https://github.com/weblinc/media-match polyfill library.

What can be easier than that? Just add a conditional comment <!--[if lte IE 9]> and that's it.

 <!--[if lte IE 9]>
    <script src="/public/media.match.js"></script>
 <![endif]-->

But things turned out to be a bit more complicated due to React.js and the isomorphic SPA we are building.

Unfortunately React.js doesn't render HTML comments if you put them inside a JSX file. In our architecture we have a main HTML.jsx component which renders the whole HTML page on the server, so the first solution I tried just didn't work out:

renderHead: function() {
  return (
    <head>
      <!--[if lte IE 9]>
      <script src="/public/media.match.js"/>
      <![endif]-->
    </head>
  );
}

index.jsx

The only possible way to render HTML comments within JSX was to use React's dangerouslySetInnerHTML attribute and put the comment in there:

renderHead: function() {
  return (
    <head dangerouslySetInnerHTML={{__html: '<!--[if lte IE 9]><script src="/public/media.match.js"></script><![endif]-->'}}>
    </head>
  );
}

index.jsx

Also note that the <script> tag must be closed explicitly, since the self-closing version will not work correctly in React and you will receive JS errors.

The last but not least problem is that this will not work if you have more than one item inside the <head> tag. Of course you could put the whole HTML string inside dangerouslySetInnerHTML, but IMHO that looks lame.

That's why I've abused the forgiving nature of the browser's HTML parser and placed the conditional comment inside the <meta> tag, which works perfectly now.
