Node.js, the open-source runtime environment for building server-side applications in JavaScript, provides a non-blocking, event-driven architecture that is well suited to I/O-heavy and API-driven applications. Here I will walk through several examples of how to adapt to the hurdles this paradigm introduces for traditional design patterns.
At the core of JavaScript are callbacks and closures. Because functions are first-class objects in JavaScript, we can pass a callback function as an argument to another function and execute it later. Closures are functions that retain access to variables defined in their parent scope, even after the parent function has returned. To give context, the following example outputs 9.
function add(num1) {
  return function(num2) { return num1 + num2; };
}

var myAdd = add(5);
console.log(myAdd(4)); // outputs 9
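The callback half of that picture can be sketched just as briefly. In this sketch, fetchGreeting is a hypothetical function (not from the text above) that simulates deferred work with setImmediate and follows Node's conventional (err, result) callback signature:

```javascript
// Hypothetical async-style function following Node's (err, result) convention.
function fetchGreeting(name, callback) {
  // setImmediate defers the work to a later turn of the event loop.
  setImmediate(function () {
    callback(null, 'Hello, ' + name + '!');
  });
}

fetchGreeting('Node', function (err, greeting) {
  if (err) throw err;
  console.log(greeting); // prints "Hello, Node!"
});
```

The caller hands control to fetchGreeting, which decides when the callback runs; that hand-off is exactly what becomes unwieldy once several such calls must run in order.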
With that basic knowledge in hand, you will quickly find, when developing a Node.js application, the need to run code in a fixed sequence. The initial reaction is to pass the second code snippet to the first as a callback. This works in isolation, but after doing it a number of times, say for a five-step list of instructions, you fall into the scenario commonly known as the "pyramid of doom" or "callback hell". Such an increasingly nested set of callbacks sacrifices control flow, exception handling, and the semantics we are familiar with from synchronous code, leading us to our solution: promises.
What are promises? Promises are objects that represent the return value or the thrown exception that a function may eventually produce, giving developers the freedom to model asynchronous functions in a synchronous fashion. In particular, promises let developers use return values, rather than calling another function, and handle errors through traditional throw/try/catch semantics.
Callbacks are an "inversion of control": the called function, not the caller, decides when your code runs. Popular JavaScript libraries such as Bluebird and Q reverse this inversion, separating input arguments from control flow (callback) arguments. Using Q, the following pyramid of callbacks is turned into a clean, flat list of instructions.
step1(function (value1) {
  step2(value1, function(value2) {
    step3(value2, function(value3) {
      step4(value3, function(value4) {
        step5(value4, function(value5) {
          // Do something with value5
        });
      });
    });
  });
});
Q.fcall(promisedStep1)
  .then(promisedStep2)
  .then(promisedStep3)
  .then(promisedStep4)
  .then(promisedStep5)
  .then(function (value5) {
    // Do something with value5
  })
  .catch(function (error) {
    // Handle any error from all above steps
  })
  .done();
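That single catch at the end is what restores throw/try/catch semantics: an exception thrown inside any then handler rejects the chain and lands in the nearest catch, skipping the handlers in between. A minimal sketch, using native Promises (which behave the same way as Q in this respect):

```javascript
// A throw inside a then handler is caught by the chain, not the caller.
Promise.resolve(1)
  .then(function (value) {
    throw new Error('step failed'); // equivalent to returning a rejected promise
  })
  .then(function () {
    // Skipped entirely: the chain is already rejected.
  })
  .catch(function (error) {
    console.log(error.message); // prints "step failed"
  });
```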
Promises become slightly more intricate once you want to apply a promise-returning function across a list of arguments and run code once all of the resulting promises have resolved. Here I will show you two code snippets, using the Q library, that I have come to find especially handy. The first executes a list of promises in parallel; this is great when, for example, you need to fire off a number of API calls at once. The second executes a list of promises sequentially; I used this recently when I needed to upload a large data set and chunking of the data was required.
Asynchronous, Parallel Promise Execution
Q.all(args.map(function (arg) {
  return promisedFunc(arg);
}));
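For a self-contained illustration, here is the same pattern with native Promise.all (Q.all behaves the same way) and a stand-in promisedFunc, a hypothetical name for your real API call. All calls start immediately, and the combined promise resolves with an array of results once every call has finished:

```javascript
// Stand-in for a real API call: resolves with the doubled argument.
function promisedFunc(arg) {
  return Promise.resolve(arg * 2);
}

var args = [1, 2, 3];

// All three calls are started at once and run in parallel.
Promise.all(args.map(function (arg) {
  return promisedFunc(arg);
})).then(function (results) {
  console.log(results); // [ 2, 4, 6 ], results keep the input order
});
```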
Synchronous, Sequential Promise Execution
args.reduce(function (promise, arg) {
  return promise.then(function () {
    return promisedFunc(arg);
  });
}, Q.resolve());
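To see that the reduce chain really runs one call at a time, here is a runnable sketch with native Promises (Q.resolve has the native equivalent Promise.resolve) and a hypothetical promisedFunc that records the order in which calls complete:

```javascript
var order = [];

// Stand-in for uploading one chunk: resolves on a later event-loop turn.
function promisedFunc(arg) {
  return new Promise(function (resolve) {
    setImmediate(function () {
      order.push(arg); // record completion order
      resolve(arg);
    });
  });
}

var args = [1, 2, 3];

// Each call is chained onto the previous promise, so they never overlap.
args.reduce(function (promise, arg) {
  return promise.then(function () {
    return promisedFunc(arg);
  });
}, Promise.resolve()).then(function () {
  console.log(order); // [ 1, 2, 3 ]
});
```

The seed value Promise.resolve() gives the reduce an already-resolved promise to chain the first call onto.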
For more information and material on Node.js, JavaScript, and software development in general, continue visiting my blog, AustinCorso.com. Feedback, critique, and questions are welcome by email.