JavaScript Patterns - Wrangling arrays like a boss, with Array#reduce 👊

8 Comments

Hi there. Nice article! Indeed, reduce is always a better choice than more generic methods, like forEach or map/filter combinations, for cases when you need to generate one single value or object. And more importantly, it improves the readability of your code, as your fellow developer knows right away what's going on. For this specific case, I have a different approach for you:

var emails = nodes.reduce(function(store, node) {
    return node.followers.reduce(function(store, follower) {
        follower.email && (store[follower.email] = true);
        return store;
    }, store);
}, {});

console.log(Object.keys(emails));

https://jsfiddle.net/epp9zwj0/6/ (a bit shorter)

Cool! I think it can have an additional benefit in case it's necessary to keep emails in the same order as nodes. While object properties also preserve insertion order, it is not guaranteed.
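
If the order matters, one way to make it explicit is to accumulate into a Set instead of a plain object (just a sketch of the same nested-reduce idea; a Set preserves insertion order):

var emailSet = nodes.reduce(function(set, node) {
    return node.followers.reduce(function(set, follower) {
        // only collect followers that actually have an email
        if (follower.email) {
            set.add(follower.email);
        }
        return set;
    }, set);
}, new Set());

console.log(Array.from(emailSet)); // unique emails, in the order they were first seen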

You got me thinking this morning, so I was looking for potential optimizations. I'm far from a guru on optimizations, but I thought I'd take your idea and extend it into the followers node. forEach is notably a slow function, so I thought: why not use a reducer?

https://jsperf.com/multi-reducer-comparison

Looks like a clear win: we increase readability, separate the concerns a bit, and make each reducer a little more testable.
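
The shape I benchmarked is roughly the following (just a sketch; the reducer names are only illustrative):

// one reducer folds a single node's followers into the store,
// the other applies it across all nodes - each is easy to test on its own
const followerEmailReducer = (store, follower) => {
    if (follower.email) {
        store[follower.email] = true;
    }
    return store;
};

const nodeReducer = (store, node) => node.followers.reduce(followerEmailReducer, store);

const emails = Object.keys(nodes.reduce(nodeReducer, {}));
console.log(emails);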

Cheers!

Brilliantly done! :) 👍

I personally feel that in this case the improved code can be more readable, and it doesn't need any other libraries if you use the full ES6 potential:

nodes
.map(node => node.followers.map(user => user.email))    // get all emails in the format [['abc@abc.com', 'ijk@ijk.com', ...], ['abc@abc.com', ...], ...]
.reduce((emails, result) => [...emails, ...result], []) // flatMap-like step: flatten all the arrays into one
.filter((email, i, self) => self.indexOf(email) === i)  // keep only unique items
.filter(mail => mail)                                   // remove undefined or null

https://jsfiddle.net/ae1nof9p/

There's an error in example 2 for rewriting the map function; it should be:

const squaresOfNumbers = numArray => numArray.reduce(
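
(The snippet seems to have been cut off here; presumably the full corrected version reads something like this:)

const squaresOfNumbers = numArray => numArray.reduce(
    (acc, item) => {
        acc.push(item * item);
        return acc;
    },
    []
);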

For what it's worth, I actually think your older code is more readable. Here is how I would have improved it:

const followers = (nodes) => {
  return _(nodes)
    .flatMap(node => node.followers)
    .filter(follower => follower.email)
    .map(follower => follower.email)
    .uniq()
    .value(); // unwrap the lodash chain into a plain array
};

I think some would disagree with this, but I personally favour readability in code, and I think your old code is easier to reason about. As a bonus, the time required to complete this is also shorter.

Nice solution, much clearer.

getSetOfFollowerEmails doesn't return anything. Good article. Array#reduce is underused. If I see one more var isTrue; and a for loop, I'm going to lose it.

Hahah, I hear you! And, thank you! Added the missing return. :)

About "one reduce to rule them all" - I recommend to have a look at Lodash chaining https://lodash.com/docs/4.17.4#lodash

#2's reducer function needs to return acc, because acc.push returns the new array length rather than the accumulator:

const squaresOfNumbers = numArray => numArray.reduce(
    (acc, item) => {
        acc.push(item * item);
        return acc;
    },
    []
);

Why not just use concat instead of push? It makes it nice and compact and easy to follow along:

const squaresOfNumbers = numArray => numArray.reduce((acc, item) => acc.concat(item * item), [])

The above operations can be performed even with a simple forEach, with less redundancy and less confusion around the accumulator, but that is not the real problem. Coding in FP style doesn't mean writing multiple functionalities into a single reduce or forEach; it means separating the functionality into multiple functions which each have a single predefined meaning (filter filters out some elements, map converts one type to another, etc.).

When we use reduce for filtering, mapping, flattening, etc., it's much harder to understand the code at a glance.

Also, the typical JavaScript filter/map methods and lodash utils don't natively support lazy chaining of functions; each step walks the whole array and produces an intermediate one. Java 8 streams support such chaining without reducing performance.

Similarly, there are multiple utils/libraries which support chaining multiple functions and executing them vertically (element by element), so performance is still maintained.

The functionality: filter the persons with age > 30 and map each one to an employee object.

Eg:

var persons = [
    { name: 'abc', age: 23, profession: 'hhh' },
    { name: 'def', age: 35, profession: 'xxx' },
    { name: 'xyz', age: 40, profession: 'aaa' }
];

var employees = Stream(persons)
    .filter(person => {
        console.log('filter', person);
        return person.age > 30;
    })
    .map(person => {
        console.log('map', person);
        return {
            name: person.name,
            profession: person.profession
        };
    })
    .toArray();

console.log(employees);

The above code is written using the Stream.js library: winterbe.github.io/streamjs

The above code runs through the pipeline exactly 3 times (once per person) to finish the filtering and mapping, while lodash or other chaining utils take 3 + 2 = 5 iterations for the above example.

Even a typical forEach loop with an if condition works faster than lodash chaining.
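
For reference, the plain forEach version of the same example would look roughly like this (just a sketch):

// single pass: each person is visited exactly once, no intermediate arrays
var employees = [];
persons.forEach(function(person) {
    if (person.age > 30) {
        employees.push({
            name: person.name,
            profession: person.profession
        });
    }
});
console.log(employees);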
