[javascript] How to skip over an element in .map()?

How can I skip an array element in .map?

My code:

var sources = images.map(function (img) {
    if(img.src.split('.').pop() === "json"){ // if extension is .json
        return null; // skip
    }
    else{
        return img.src;
    }
});

This will return:

["img.png", null, "img.png"]

This question is related to javascript

The answer is


Here's a utility method (ES5-compatible) which only maps non-null values (it hides the call to reduce):

function mapNonNull(arr, cb) {
    return arr.reduce(function (accumulator, value, index, arr) {
        var result = cb.call(null, value, index, arr);
        if (result != null) {
            accumulator.push(result);
        }

        return accumulator;
    }, []);
}

var result = mapNonNull(["a", "b", "c"], function (value) {
    return value === "b" ? null : value; // exclude "b"
});

console.log(result); // ["a", "c"]


Why not just use a forEach loop?

let arr = ['a', 'b', 'c', 'd', 'e'];
let filtered = [];

arr.forEach(x => {
  if (!x.includes('b')) filtered.push(x);
});

console.log(filtered); // filtered === ['a', 'c', 'd', 'e']

Or, even simpler, use filter:

const arr = ['a', 'b', 'c', 'd', 'e'];
const filtered = arr.filter(x => !x.includes('b')); // ['a','c','d','e'];

To extrapolate on Felix Kling's comment, you can use .filter() like this:

var sources = images.map(function (img) {
  if(img.src.split('.').pop() === "json") { // if extension is .json
    return null; // skip
  } else {
    return img.src;
  }
}).filter(Boolean);

That will remove falsy values from the array returned by .map().

You could simplify it further like this:

var sources = images.map(function (img) {
  if(img.src.split('.').pop() !== "json") { // if extension is not .json
    return img.src;
  }
}).filter(Boolean);

Or even as a one-liner using an arrow function, object destructuring and the && operator:

var sources = images.map(({ src }) => src.split('.').pop() !== "json" && src).filter(Boolean);

TLDR: You can first filter your array and then perform your map, but this requires two passes over the array (filter returns an array to map). Since this array is small, it is a very small performance cost. You can also do a simple reduce. However, if you want to re-imagine how this can be done with a single pass over the array (or any data type), you can use an idea called "transducers", made popular by Rich Hickey.
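For reference, a minimal sketch of the two-pass filter-then-map version mentioned above, reusing the images array from the question:

var sources = images
  .filter(function (img) { return img.src.split('.').pop() !== "json"; }) // first pass: drop .json entries
  .map(function (img) { return img.src; });                               // second pass: extract src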

Answer:

We should avoid ever-growing dot chaining over the array, [].map(fn1).filter(f2)..., since that approach creates an intermediate array in memory for every chained call.

The best approach operates on the actual reducing function so there is only one pass of data and no extra arrays.

The reducing function is the function passed into reduce: it takes an accumulator and an input from the source and returns something that looks like the accumulator.

// 1. create a concat reducing function that can be passed into `reduce`
const concat = (acc, input) => acc.concat([input])

// note that [1,2,3].reduce(concat, []) would return [1,2,3]

// transforming your reducing function by mapping
// 2. create a generic mapping function that can take a reducing function and return another reducing function
const mapping = (changeInput) => (reducing) => (acc, input) => reducing(acc, changeInput(input))

// 3. create your map function that operates on an input
const getSrc = (x) => x.src
const mappingSrc = mapping(getSrc)

// 4. now we can use our `mappingSrc` function to transform our original reducing function `concat` to get another reducing function
const inputSources = [{src:'one.html'}, {src:'two.txt'}, {src:'three.json'}]
inputSources.reduce(mappingSrc(concat), [])
// -> ['one.html', 'two.txt', 'three.json']

// remember this is really essentially just
// inputSources.reduce((acc, x) => acc.concat([x.src]), [])


// transforming your reducing function by filtering
// 5. create a generic filtering function that can take a reducing function and return another reducing function
const filtering = (predicate) => (reducing) => (acc, input) => (predicate(input) ? reducing(acc, input): acc)

// 6. create your filter function that operates on an input
const filterJsonAndLoad = (img) => {
  console.log(img)
  if(img.src.split('.').pop() === 'json') {
    // game.loadSprite(...);
    return false;
  } else {
    return true;
  }
}
const filteringJson = filtering(filterJsonAndLoad)

// 7. notice the type of input and output of these functions
// concat is a reducing function,
// mappingSrc transforms and returns a reducing function
// filteringJson transforms and returns a reducing function
// these functions that transform reducing functions are "transducers", termed by Rich Hickey
// source: http://clojure.com/blog/2012/05/15/anatomy-of-reducer.html
// we can pass this all into reduce! and without any intermediate arrays

const sources = inputSources.reduce(filteringJson(mappingSrc(concat)), []);
// [ 'one.html', 'two.txt' ]

// ==================================
// 8. BONUS: compose all the functions
// You can decide to create a composing function which takes an infinite number of transducers to
// operate on your reducing function to compose a computed accumulator without ever creating that
// intermediate array
const composeAll = (...args) => (x) => {
  const fns = args
  var i = fns.length
  while (i--) {
    x = fns[i].call(this, x);
  }
  return x
}

const doABunchOfStuff = composeAll(
    filtering((x) => x.src.split('.').pop() !== 'json'),
    mapping((x) => x.src),
    mapping((x) => x.toUpperCase()),
    mapping((x) => x + '!!!')
)

const sources2 = inputSources.reduce(doABunchOfStuff(concat), [])
// ['ONE.HTML!!!', 'TWO.TXT!!!']

Resources: Rich Hickey's transducers post


I use .forEach to iterate over the array and push each valid result into a results array, then use that. With this solution the array is not looped over twice.
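A minimal sketch of that approach (assuming the same images array from the question):

var results = [];
images.forEach(function (img) {
    if (img.src.split('.').pop() !== "json") { // skip .json sources
        results.push(img.src);
    }
});
// results now holds only the non-JSON sources, built in a single pass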


Since 2019, Array.prototype.flatMap is a good option.

images.flatMap(({src}) => src.endsWith('.json') ? [] : src);

From MDN:

flatMap can be used as a way to add and remove items (modify the number of items) during a map. In other words, it allows you to map many items to many items (by handling each input item separately), rather than always one-to-one. In this sense, it works like the opposite of filter. Simply return a 1-element array to keep the item, a multiple-element array to add items, or a 0-element array to remove the item.
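For example, with a sample images array (its shape is assumed from the question), the flatMap one-liner keeps only the non-JSON sources:

const images = [{ src: 'img.png' }, { src: 'data.json' }, { src: 'photo.png' }];
const sources = images.flatMap(({ src }) => src.endsWith('.json') ? [] : src);
console.log(sources); // ["img.png", "photo.png"]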


var sources = images.map(function (img) {
    if(img.src.split('.').pop() === "json"){ // if extension is .json
        return null; // skip
    }
    else{
        return img.src;
    }
}).filter(Boolean);

The .filter(Boolean) will filter out any falsy values in the given array, which in your case are the nulls.


Here's a fun solution:

/**
 * Filter-map. Like map, but skips undefined values.
 *
 * @param callback
 */
function fmap(callback) {
    return this.reduce((accum, ...args) => {
        let x = callback(...args);
        if(x !== undefined) {
            accum.push(x);
        }
        return accum;
    }, []);
}

Use with the bind operator:

[1,2,-1,3]::fmap(x => x > 0 ? x * 2 : undefined); // [2,4,6]
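Note that the bind operator (::) is still a proposal and needs a transpiler plugin; without it, the same fmap can be invoked explicitly with .call:

fmap.call([1, 2, -1, 3], x => x > 0 ? x * 2 : undefined); // [2, 4, 6]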

Here is an updated version of the code provided by @theprtk. It is cleaned up a little to show the generalized version while keeping an example.

Note: I'd add this as a comment to his post but I don't have enough reputation yet

/**
 * @see http://clojure.com/blog/2012/05/15/anatomy-of-reducer.html
 * @description functions that transform reducing functions
 */
const transduce = {
  /** a generic map() that can take a reducing() & return another reducing() */
  map: changeInput => reducing => (acc, input) =>
    reducing(acc, changeInput(input)),
  /** a generic filter() that can take a reducing() & return another reducing() */
  filter: predicate => reducing => (acc, input) =>
    predicate(input) ? reducing(acc, input) : acc,
  /**
   * a composing() that can take an infinite # transducers to operate on
   *  reducing functions to compose a computed accumulator without ever creating
   *  that intermediate array
   */
  compose: (...args) => x => {
    const fns = args;
    var i = fns.length;
    while (i--) x = fns[i].call(this, x);
    return x;
  },
};

const example = {
  data: [{ src: 'file.html' }, { src: 'file.txt' }, { src: 'file.json' }],
  /** note: `[1,2,3].reduce(concat, [])` -> `[1,2,3]` */
  concat: (acc, input) => acc.concat([input]),
  getSrc: x => x.src,
  filterJson: x => x.src.split('.').pop() !== 'json',
};

/** step 1: create a reducing() that can be passed into `reduce` */
const reduceFn = example.concat;
/** step 2: transforming your reducing function by mapping */
const mapFn = transduce.map(example.getSrc);
/** step 3: create your filter() that operates on an input */
const filterFn = transduce.filter(example.filterJson);
/** step 4: aggregate your transformations */
const composeFn = transduce.compose(
  filterFn,
  mapFn,
  transduce.map(x => x.toUpperCase() + '!'), // new mapping()
);

/**
 * Expected example output
 *  Note: each is wrapped in `example.data.reduce(x, [])`
 *  1: ['file.html', 'file.txt', 'file.json']
 *  2:  ['file.html', 'file.txt']
 *  3: ['FILE.HTML!', 'FILE.TXT!']
 */
const exampleFns = {
  transducers: [
    mapFn(reduceFn),
    filterFn(mapFn(reduceFn)),
    composeFn(reduceFn),
  ],
  raw: [
    (acc, x) => acc.concat([x.src]),
    (acc, x) => acc.concat(x.src.split('.').pop() !== 'json' ? [x.src] : []),
    (acc, x) => acc.concat(x.src.split('.').pop() !== 'json' ? [x.src.toUpperCase() + '!'] : []),
  ],
};
const execExample = (currentValue, index) =>
  console.log('Example ' + index, example.data.reduce(currentValue, []));

exampleFns.raw.forEach(execExample);
exampleFns.transducers.forEach(execExample);

Answer sans superfluous edge cases:

const thingsWithoutNulls = things.reduce((acc, thing) => {
  if (thing !== null) {
    acc.push(thing);
  }
  return acc;
}, [])

To skip values that are null or undefined in one line (ES5/ES6):

//will return array of src values
images.filter(p => p.src).map(p => p.src); // p = property


//in your condition
images.filter(p => p.src.split('.').pop() !== "json").map(p => p.src);

I think the simplest way to skip some elements of an array is by using the filter() method.

By using this method (ES5) together with ES6 syntax, you can write your code in one line, and it will return what you want:

let images = [{src: 'img.png'}, {src: 'j1.json'}, {src: 'img.png'}, {src: 'j2.json'}];

let sources = images.filter(img => img.src.slice(-4) != 'json').map(img => img.src);

console.log(sources);


You can use the filter() method after your map() call. For example, in your case:

var sources = images.map(function (img) {
  if(img.src.split('.').pop() === "json"){ // if extension is .json
    return null; // skip
  }
  else {
    return img.src;
  }
});

Then apply the filter() method:

const sourceFiltered = sources.filter(item => item)

Then only the non-null items remain in the new array sourceFiltered.


You can do this:

var sources = [];
images.map(function (img) {
    if(img.src.split('.').pop() !== "json"){ // if extension is not .json
        sources.push(img.src); // just push valid value
    }
});