What do we want to achieve?
I’m a big fan of talking about code using live examples, not just theory, so this article is based on a few snippets of code; they do the same job, but use different approaches to writing asynchronous JavaScript code.
Code to get data from external APIs is simple, and I’m going to use a fun little project for getting quotes from Ron Swanson. It prints a message two seconds after receiving the data, then another message indicating the previous one was displayed.
All the implementations share two functions:
```javascript
const fetch = require('node-fetch');
const _ = require('lodash');

// Fetches a random quote; resolves the returned promise and, if given, calls the callback.
function getQuote(cb) {
  return new Promise((resolve, reject) => {
    fetch('http://ron-swanson-quotes.herokuapp.com/v2/quotes')
      .then(response => response.json())
      .then(quote => {
        if (_.isFunction(cb)) {
          cb(quote);
        }
        resolve(quote);
      })
      .catch(reject);
  });
}

// Prints a message after two seconds; resolves the returned promise and, if given, calls the callback.
function printMessage(msg, cb) {
  return new Promise(resolve => {
    setTimeout(() => {
      console.log(msg);
      if (_.isFunction(cb)) {
        cb();
      }
      resolve();
    }, 2000);
  });
}
```
Notice they both expose two interfaces: a promise and an optional callback, so you can interact with them either way. It’s similar to how most libraries that switched to a promise-based approach handle backwards compatibility for callbacks.
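For example, given the helpers above, the same function can be consumed in either style:

```javascript
// Callback style: pass a function and it will be called with the quote.
getQuote(quote => console.log('via callback:', quote));

// Promise style: ignore the callback argument and chain on the returned promise.
getQuote().then(quote => console.log('via promise:', quote));
```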
Yesterday: Callbacks
If you’re reading this article, you’re probably trying to avoid this approach. The problem with callbacks is the descent into ‘callback hell’. For anyone not familiar with ‘callback hell’, here’s a simple example:
```javascript
fs.readFile(fileName, function (err, contents) {
  if (condition) {
    fs.writeFile(fileName, contents, function (err) {
      fs.readFile(fileName, function (err, contents) {
        // ...etc.
      });
    });
  }
});
```

And here’s our quote example implemented with callbacks:

```javascript
function main() {
  console.log('Gonna get a quote');

  getQuote(quote => {
    console.log(quote);

    printMessage('test msg', () => {
      console.log('A message was printed');
    });
  });
}

main();
```
Today: Promises
The last example already shows callback nesting, and we know where that leads. The remedy is Promises, which in most cases give much cleaner code that is more descriptive and easier to read.
The most important thing about Promises is they allow asynchronous code to be written as ‘flat’ as possible, so you instantly know what comes next, and the sequence the code is executed in.
Node.js and most modern browsers (except IE and Opera Mini) implement native Promises, and those which don’t can still use other implementations, like Bluebird or Q.
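If you need to support an environment without native Promises, a minimal sketch of the usual fallback looks like this (assuming the bluebird package is installed; Q exposes a similar constructor):

```javascript
// Use the native implementation when it exists, otherwise fall back to Bluebird.
const Promise = typeof global.Promise === 'function'
  ? global.Promise
  : require('bluebird');

// From here on, Promise behaves like the native constructor.
new Promise(resolve => resolve('polyfilled or native')).then(console.log);
```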
Here’s the same example, but using Promises:
```javascript
function main() {
  console.log('Gonna get a quote');

  getQuote()
    .then(console.log)
    .then(() => {
      return printMessage('test msg');
    })
    .then(() => {
      console.log('A message was printed');
    })
    .catch(console.error);
}

main();
```
Tomorrow: Async functions
This approach is based on a proposal originally targeted at ES7 (ES2016). Only two features were eventually accepted as part of that release (the exponentiation operator and Array.prototype.includes()), following the decision to ship smaller changes on a shorter release cycle.
You can read more about that – and how new features are being integrated into JavaScript – here.
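For the record, the two features that did land in ES2016 look like this:

```javascript
// The exponentiation operator and Array.prototype.includes(), the only two ES2016 additions.
console.log(2 ** 10);                // 1024
console.log([1, 2, 3].includes(2));  // true
console.log([1, 2, 3].includes(5));  // false
```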
Async and await are new keywords that let you write asynchronous code that looks synchronous.
Here’s an example:
```javascript
async function main() {
  try {
    console.log('Gonna get a quote');

    const quote = await getQuote();
    console.log(quote);

    await printMessage('test msg');
    console.log('a message was printed after 2s');
  } catch (error) {
    console.error(error);
  }
}

main();
```
The code is written the way we’re used to when writing sync functions – with two main differences.
The function is preceded with the async keyword, which tells the interpreter the function contains asynchronous code and makes it return a promise.
Then await pauses execution of that async function until the promise in the awaited expression settles; in our case, the promises responsible for API communication and for printing a message.
In English, what it does is:
- Print ‘Gonna get a quote’
- Wait for the quote and when it arrives assign it to ‘quote’
- Print the quote
- Wait for the ‘test msg’ message to be printed
- Print ‘a message was printed …’
This feature is still just a proposal, so we need to use a transpiler, like Babel or Traceur, to convert the code to ES5 so it can be used.
In the meantime: Co
Co is a very clever and simple library that uses generator functions to achieve effects similar to async.
If you’re unfamiliar with generators, the simple examples on MDN should tell you everything you need to know.
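As a one-minute refresher, here’s a minimal generator driven by hand (the names are made up for illustration):

```javascript
// Each yield pauses the function; next() resumes it and can pass a value back in.
function *greeter() {
  const name = yield 'What is your name?';
  return `Hello, ${name}!`;
}

const it = greeter();
console.log(it.next());      // { value: 'What is your name?', done: false }
console.log(it.next('Ron')); // { value: 'Hello, Ron!', done: true }
```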
First, here’s some code:
```javascript
const co = require('co');

function *main() {
  console.log('Gonna get a quote');

  const quote = yield getQuote();
  console.log(quote);

  yield printMessage('test msg');
  console.log('A message was printed after 2s');
}

co(main)
  .catch(console.error);
```
Co wraps a generator function and automatically does two things:
- Executes next() until there’s nothing left to do in the generator function
- Waits at each yield for the async object to resolve.
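To demystify that a little, here’s a greatly simplified, promise-only sketch of the loop co runs internally; the real library also handles the other yieldable types listed below, plus error propagation back into the generator:

```javascript
// A toy version of the co driver: keeps calling next(), waiting for each yielded promise.
function run(generatorFn) {
  const it = generatorFn();

  return new Promise((resolve, reject) => {
    function step(previousValue) {
      const { value, done } = it.next(previousValue);
      if (done) {
        return resolve(value);
      }
      // Assumes every yielded value is a promise; the real co accepts more than that.
      Promise.resolve(value).then(step).catch(reject);
    }

    step();
  });
}

// Usage with the generator above:
// run(main).catch(console.error);
```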
Co has a concept of ‘yieldables’ – async objects that can be used with yield – which it’ll wait for before continuing code execution. Yieldables are (according to https://github.com/tj/co#yieldables):
- Promises
- Thunks (functions)
- Array (parallel execution)
- Objects (parallel execution)
- Generators (delegation)
- Generator functions (delegation)
Two are worth mentioning here: ‘objects’ and ‘arrays’. These provide an easy way to execute async code in parallel, similar to Promise.all. Let’s say we want to get the quote and print the message as fast as possible, and we don’t care which finishes first:
```javascript
function *main() {
  console.log('Gonna get a quote');

  const results = yield [
    getQuote(),
    printMessage('test msg')
  ];

  console.log(results[0]);
}

co(main)
  .catch(console.error);
```
With Promise.all it would look like this:
```javascript
function main() {
  Promise
    .all([
      getQuote(),
      printMessage('test msg')
    ])
    .then(results => {
      console.log(results[0]);
    })
    .catch(console.error);
}

main();
```
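The object yieldable from the list above gives you the same parallel execution, but with results keyed by name instead of array index:

```javascript
// Both promises run in parallel; co resolves the object with the same keys.
function *main() {
  console.log('Gonna get a quote');

  const results = yield {
    quote: getQuote(),
    message: printMessage('test msg')
  };

  console.log(results.quote);
}

co(main)
  .catch(console.error);
```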
Conclusions
As always in JavaScript, there are many ways to achieve the same result. Most are actually based on Promises, so we can’t stop using polyfills yet, but this is definitely the way forward.
If you live on the bleeding edge, use transpilers and jump on the async/await train before native async functions are implemented. However, it’s still a stage 3 candidate, so it may change.
If you want something safer, co is a nice middle ground: it gives you similarly flat, easy-to-read code without relying on experimental features. Its biggest advantage is that it’s usable straight away, without changing your existing code too much. Since you’re probably already using Promises, co can be a way of orchestrating them clearly. And when the day comes and async functions are implemented, migration will mostly involve replacing yield with await and switching wrapped generators to async functions.
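To make that concrete, here’s what the mechanical change looks like on a trimmed-down version of the quote example (the two variants are given different names here only so they can sit side by side):

```javascript
// Today: a generator function wrapped with co.
function *getAndLogQuote() {
  const quote = yield getQuote();
  console.log(quote);
}
co(getAndLogQuote).catch(console.error);

// Tomorrow: the same logic as an async function; yield becomes await and the co wrapper disappears.
async function getAndLogQuoteAsync() {
  const quote = await getQuote();
  console.log(quote);
}
getAndLogQuoteAsync().catch(console.error);
```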
In any case, use what makes the most sense for you and your project, and consider the implications of using a particular solution.
Next time I’ll write more about the performance of those solutions and the overhead in your code from using them.