Let's say you have the following code:
const groceryList = ['apple', 'orange', 'celery', 'pineapple'];
let fiber;
const modifiedList = groceryList.map(item => {
  if (item !== 'celery') { return `${item} juice`; }
  fiber = item;
});
// modifiedList = ['apple juice', 'orange juice', undefined, 'pineapple juice']
// (the callback returns nothing for 'celery', so map leaves undefined in that slot)
// fiber = 'celery'
[other code that consumes these variables]
I want to avoid iterating through the entire list again just to handle one case, but this solution seems less than ideal. I'm wondering if there's an industry-standard approach, or if this is considered bad practice?
Usually mapping over an array implies you are transforming each element into a new one, so maybe .reduce is a better fit here: you reduce over the array and turn it into something different.
I'd say that setting fiber = item when the callback doesn't return anything isn't great. You are really doing two discrete operations: one transforms the array, the other picks out the single 'celery' item. Two passes are fine for that. The second pass could just .filter() the array for 'celery', or better, .find() it, which stops as soon as the item turns up.
To me, combining the operations into that one callback is not as clear as doing them separately. Performance shouldn't be a concern until it actually is one, IMO.
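A minimal sketch of that two-pass version, reusing the variable names from the question (.filter/.map for the juices, .find for the celery):

const groceryList = ['apple', 'orange', 'celery', 'pineapple'];

// Pass 1: drop celery, then transform what remains.
const modifiedList = groceryList
  .filter(item => item !== 'celery')
  .map(item => `${item} juice`);

// Pass 2: pick out the one special item; .find stops at the first match.
const fiber = groceryList.find(item => item === 'celery');

console.log(modifiedList); // ['apple juice', 'orange juice', 'pineapple juice']
console.log(fiber);        // 'celery'

Each callback now does exactly one thing, and neither touches any variable outside its own scope.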
Peter Scheler
JS enthusiast
A solution with reduce could look like:
const groceryList = ['apple', 'orange', 'celery', 'pineapple'];
const { fiber, juices: modifiedList } = groceryList.reduce((result, item) => {
  if (item !== 'celery') {
    result.juices.push(`${item} juice`);
  } else {
    result.fiber = item;
  }
  return result;
}, { fiber: undefined, juices: [] });
// modifiedList = ['apple juice', 'orange juice', 'pineapple juice']
// fiber = 'celery'

No side effects. Nice and clean.