Diogo Almeida

Posted on • Edited on
The Power of Reduce: Optimizing JavaScript Code for Speed and Efficiency

Hey everybody! This is my first-ever post!

In this article, I will be talking about the reduce method of JavaScript arrays, which I feel is sometimes forgotten. I will give a brief introduction to this method, and then I will proceed to compare it with other iteration methods.

TLDR

  • Review your array iterations;
  • Instead of map and filter, use reduce;
  • Spreading (as in [...list]) large lists is a bad idea; push to the list instead;
  • reduce is faster than map and filter, and than for in, for large lists.

Personal Notes

I'm a Frontend Developer with less than 2 years of experience and this is my first article.

I'm saying this to let you know that I'm still very new, so all feedback is welcome, whether it's about technical content, writing, or personality (please keep it constructive).

Motivation: the path to a single loop

While reviewing some people's code (and sometimes my own 😬), I realized that it's really easy to employ unnecessary iterations of arrays and that sometimes reduce could improve the code.

For example, if I have a list of people and I want to get the names of all adults, it is easy to think: "First I need to filter the list and get all the adults (age ≥ 18 years in my country), and then I need to map it and return their names".
I believe that arrow functions make this type of thinking even easier, because they eliminate the need to write the function keyword.

However, what keeps you from iterating over the list of people and adding a person's name to an initially empty list if they are an adult? That is reduce in a nutshell.
This can also be done with ease using a for loop, although people occasionally overlook that approach as well.
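To make that single-loop idea concrete, here is a minimal sketch with a hypothetical peopleList of { name, age } objects (the data is made up for the demo):

```javascript
// Hypothetical data shape: an array of { name, age } objects.
const peopleList = [
  { name: "Ana", age: 25 },
  { name: "Rui", age: 12 },
  { name: "Inês", age: 40 },
];

// One pass: collect the name of each adult as we go.
const adultNames = [];
for (const person of peopleList) {
  if (person.age >= 18) adultNames.push(person.name);
}

console.log(adultNames); // ["Ana", "Inês"]
```

One loop, one output list: this is exactly the shape that reduce packages up for you.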


Reduce method introduction

I will now give a short introduction to how the reduce method works, so if you are already familiar with it, be sure to jump to the benchmark section.

Theory

In JavaScript, we have more than one way to iterate an array, like map, for, or reduce. The last one is perhaps the least 'friendly' and least common of the three, but it is instrumental as well, as we shall see.

The method receives two arguments:

  • a callback function, which will be used to return the accumulated value on each iteration, and
  • an initial value, which will be used to start the accumulation.

The callback function itself receives up to four arguments:

  • The accumulator, sometimes referred to as total or abbreviated to acc, is the value that is "added" to on each iteration;
  • The currentValue, i.e., the current element of the array being iterated;
  • Optionally, the currentIndex, the position of currentValue in the array. Commonly used;
  • Optionally, the array being iterated. It's rarely used, since you are typically iterating a variable you declared previously. It is useful nonetheless, because if you chain methods (e.g. you call initialArray.sort(...).reduce(...)), the array parameter will be the sorted array, not the initial one.

You might notice that I'm employing terms related to accumulation. This is because that is what reduce does: it starts with one value, and on each iteration it "adds" onto that value (quotation marks because algebraic addition is not necessarily involved).
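As a minimal illustration of all four callback parameters at once (the letters array is just an assumption for the demo):

```javascript
const letters = ["a", "b", "c"];

// Prefix each letter with its position; the fourth argument (array)
// is the array being iterated, shown here only for completeness.
const labelled = letters.reduce((acc, currentValue, currentIndex, array) => {
  acc.push(`${currentIndex}/${array.length}: ${currentValue}`);
  return acc;
}, []);

console.log(labelled); // ["0/3: a", "1/3: b", "2/3: c"]
```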

Let's look at some code:

Practical Examples

I'm just going to give you some basic examples of how to use reduce, because my goal isn't to turn you into a reduce expert (nor do I claim to be one) but to keep this method in the back of your mind.

Retrieving the names of all Adults
Let's focus on the previous example. If you were to do the same with map and filter, you would do something like:

peopleList.filter((person) => person.age >= 18)
          .map((adult) => adult.name);

With reduce, you'll do something like this:

peopleList.reduce((acc, person) => {
  if (person.age >= 18) acc.push(person.name);
  return acc;
}, []);

This way, you only use one loop instead of two. Is it less 'friendly'? You tell me.

There are a lot more uses of reduce, and ways of using it. I opted to use an arrow function, but you don't have to if you don't want to. In this CodePen you can find:

  • A comparison of the retrieval of the adults' names between map and filter, for in, and reduce;
  • Another example, where you sum all the numbers in a list. This is the typical example introduced when teaching reduce.

    The reason I'm not detailing it here is that I don't want the idea that "reduce is for sums only" to settle in your head. Yes, it is used for that, but not exclusively, as a teacher once told me.

    In this example, I also show how you can use reduce with regular functions and inline returns.
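In the spirit of that CodePen, a reduce with a regular (non-arrow) function and one with an inline return might look like this (my own sketch, not the CodePen's exact code):

```javascript
const numbers = [3, 1, 4, 1, 5];

// Same reduce, written with a regular function instead of an arrow.
function addToTotal(total, current) {
  return total + current;
}
const sum = numbers.reduce(addToTotal, 0);

// And with an arrow function using an inline (implicit) return.
const sumInline = numbers.reduce((total, current) => total + current, 0);

console.log(sum, sumInline); // 14 14
```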


Benchmarking iteration methods

Now, those who know me know that I am a bit obsessed with avoiding repetition and increasing speed and performance.

This is the origin of my article.

I wanted to test whether there was any performance difference between map and filter, and reduce. Then I decided to throw for in into the mix as well.

Always learning!

So I did an initial benchmark and I quickly found a problem:

Spreading slows the reduce method!

See, in the reducer function, I was returning the acc the following way:

return condition ? [...acc, item] : acc;

With short lists, this isn't much of a problem, but with colossal-sized lists, this syntax made the reduce method much slower (as you will see).

Imagine my surprise when I first developed the benchmark and saw that reduce (the method I'm advocating in this article) was much slower than map and filter! After some research, I found out that this was because, on each iteration, a new array is created and thousands of elements are spread onto it, so a strategy change was needed:

if(condition) acc.push(item);
return acc;

Now, whenever I employ reduce in my projects, instead of spreading, I push the item to the acc.

The benchmark

So, to develop my benchmark I created another CodePen where:

  • A list of 100,000 random numbers is created;
  • Each method runs a certain task 100 times, and each execution is timed;
  • An average is obtained for each method.

Regardless of the iteration method, the task itself is to obtain the binary-converted numbers smaller than 16. It's not a "real world" example, but it'll do for this purpose.
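A minimal sketch of that timing harness (the names and averaging logic here are mine, not the CodePen's exact code):

```javascript
// Build a list of 100,000 random integers between 0 and 99.
const list = Array.from({ length: 100_000 }, () => Math.floor(Math.random() * 100));

// The task: binary strings of the numbers smaller than 16.
const task = (arr) =>
  arr.reduce((acc, n) => {
    if (n < 16) acc.push(n.toString(2));
    return acc;
  }, []);

// Run the task several times and average the elapsed time.
function averageTime(fn, input, runs = 100) {
  let total = 0;
  for (let i = 0; i < runs; i++) {
    const start = performance.now();
    fn(input);
    total += performance.now() - start;
  }
  return total / runs;
}

console.log(`reduce avg: ${averageTime(task, list).toFixed(3)} ms`);
```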

You'll see that I added the reduce method with spread and push so that you can see the difference in performance. So let's see what the fastest method is.

Results: And the winner is...

Reduce! Of course, it's reduce. I wouldn't be making this article if it wasn't. 😄

Iteration methods time comparison

You'll notice I didn't add the reduce-with-spread time to the graphic, because it took 8.8 seconds (over 3000 times slower than with push)! All the other methods are measured in milliseconds.

As you can see, the difference between the three iteration methods is not very large, but it doesn't change the fact that map and filter are about 3 milliseconds slower than reduce. Of course, with smaller lists, the difference will be minimal.

Some things to consider about this benchmark are:

  • The code may seem overcomplicated, but that was the best way I found to avoid re-creating functions for each iteration or method;
  • I pushed the list's length to sizes that probably aren't realistic. Projects that deal with lists of thousands of elements are rare. But if yours does, and performance is a big issue on your website, reduce is your friend;
  • Dealing with big lists usually isn't done in pure JS. There are several ways to handle them, such as virtualization or pagination, so be warned;
  • The times I presented were obtained by running the code on CodePen, so in production, on a server, execution may be faster.
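On the pagination point, the idea is simply to work with one slice at a time instead of the whole list; a generic helper (my own sketch, not tied to any library) could look like this:

```javascript
// Return one page of `pageSize` items from a large list (pages start at 1).
function getPage(list, page, pageSize) {
  const start = (page - 1) * pageSize;
  return list.slice(start, start + pageSize);
}

const bigList = Array.from({ length: 100_000 }, (_, i) => i);
console.log(getPage(bigList, 2, 3)); // [3, 4, 5]
```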

That's all folks!
Thank you for reading my first post, and I hope you found it useful in some way (I don't refund the time you spent on it otherwise).


Top comments (14)

Jon Randy 🎖️ • Edited

Not looked in any great detail - but something a bit odd in your forEach test in your code... it isn't using forEach!!

You might also want to compare just using a normal for loop and accessing the array using a numeric index. That can very often be the fastest method if you're really concerned about speed.

For benchmarking, you might want to consider a pre-existing tool like perf.link (Perflink: JavaScript performance benchmarks you can share via URL).

I made a test of the different methods here - and using the index method is by far the fastest, with forEach and reduce jostling for 2nd place (sometimes one is faster, sometimes the other is) on Chrome (desktop), with Firefox (desktop) consistently putting reduce in 2nd place followed by forEach.

Mihail Malo • Edited

After the first run, №4 (filter.map) consistently outperforms even indexing!
Go figure!
Turns out it was due to the tiny array. With large input data, index absolutely dominates to this day.
And iterators (I changed №4 to the below for this picture) are extremely unoptimized in my mobile V8:

const mapFilterTest = () => people.values()
    .filter(p => p.age >= 18)
    .map(p => p.name)
    .toArray()
Diogo Almeida • Edited

Hi!
You're right! That really isn't a forEach, but more of a for in. I'll make that correction.
Also thank you for sharing the index alternative and the Perflink tool!
I guess I didn't think of using the for loop with the index since it's not something I would typically use.

John Watts

Never prematurely optimize - you save 3ms processing 100,000 numbers but the next developer that looks at the code will spend three minutes trying to figure out what's supposed to happen in your reduce code - that's going to cost the business a whole lot more than the difference in processing time, especially if they don't figure it out and introduce a bug.

In a real application those milliseconds will be nothing compared to latencies caused by network and database calls but the clarity gained from using filter/map is always priceless.

All of that said, your observations around the spread operator are the real gem in this article and worth sharing. Keep up the good work but only fix the problems that actually exist :)

AndrewBarrell1

I'm not sure about the TLDR "use reduce instead of map and filter".

The whole point of using map and filter is that you separate the logic into parts: the what (the condition to determine what you need) and the how (the translation).

For a tiny benefit in an abnormal situation you made the code much less intuitive at a glance. I don't think this is a good blanket takeaway.

Writing readable and maintainable code should be the first port of call unless time and/or memory optimisation is a specific requirement to the task. Over optimising for time for a project where it doesn't matter if something takes 200ms or 2 minutes at the expense of making you or your successor have to work longer to make sense of it in the future is not ideal.

Also, why did you use for in? It's not the most ideal one; it has some pitfalls, unless that has been corrected in recent updates. Either use for of or a normal for loop (if arr.forEach is out of the question).

teamtoyumi

Great article! You made reduce sound so much simpler! I also liked how you called out that "reduce is not for sums only", because yes the naming of reducer, total/acc can make it so easy to misunderstand

Karlis Melderis

I find .reduce hard to reason about, and I've too often seen code where people create new arrays and objects inside .reduce instead of mutating the accumulated value.

Hence we made an agreement to default to for .. of loops.

Maybe it does end up being a couple more lines of code, but reasoning about the flow is easier.

Chu Lục Ninh • Edited

If you really want to do the kind of operation that does many things in a single iteration, check out generators. You can build generator versions of map and filter, then use those versions to lazily compute as you iterate through the items. That achieves the logic separation of map and filter while keeping performance and memory usage optimized.
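Generator-based filter and map helpers might look like this (a sketch of the idea, not the commenter's code; the people data is made up):

```javascript
// Lazily yield only the items passing the predicate.
function* filterGen(iterable, predicate) {
  for (const item of iterable) {
    if (predicate(item)) yield item;
  }
}

// Lazily yield the transformed items.
function* mapGen(iterable, transform) {
  for (const item of iterable) yield transform(item);
}

const people = [
  { name: "Ana", age: 25 },
  { name: "Rui", age: 12 },
];

// Nothing runs until the result is consumed; the two steps stay
// logically separate, yet the list is still walked only once.
const adultNames = [...mapGen(filterGen(people, (p) => p.age >= 18), (p) => p.name)];
console.log(adultNames); // ["Ana"]
```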

Big Will

TLDRs generally appear at the beginning of articles.

G1itcher

Using reduce instead of map or filter robs your code of a clear intent narrative, which is arguably more important in the majority of cases than performance gains.

It also doesn't help that in my experience many developers seem to have difficulty parsing what a reduce function is doing.

Revenity

That is slower than a normal for loop:

for (let i = 0, { length } = arr; i < length; ++i) {
}

Nobody uses for in for arrays; it doesn't make any sense.

Prashant Verma

This one is marginally faster.

Revenity

Not really marginally if it gets JITed

James

You lost me as soon as you talked about accessing arrays with for..in rather than for..of.