[javascript] Remove duplicates from an array of objects in JavaScript

I have an object that contains an array of objects.

things = new Object();

things.thing = new Array();

things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});

I'm wondering what is the best method to remove duplicate objects from an array. So for example, things.thing would become...

{place:"here",name:"stuff"},
{place:"there",name:"morestuff"}

This question is related to javascript arrays object duplicates

One answer is to use lodash's uniqBy:


let data = [
  {
    'name': 'Amir',
    'surname': 'Rahnama'
  }, 
  {
    'name': 'Amir',
    'surname': 'Stevens'
  }
];
let non_duplicated_data = _.uniqBy(data, 'name');

Here is another technique to count duplicates and remove them easily from your data array. dupsCount holds the number of duplicates found. Sort your data first, then remove duplicates in a single pass over adjacent elements; this is faster than scanning the whole array for each element.

var dupsCount = 0;
dataArray.sort(function (a, b) {
    var textA = a.name.toUpperCase();
    var textB = b.name.toUpperCase();
    return (textA < textB) ? -1 : (textA > textB) ? 1 : 0;
});
for (var i = 0; i < dataArray.length - 1; ) {
    if (dataArray[i].name == dataArray[i + 1].name) {
        dupsCount++;
        dataArray.splice(i, 1);
    } else {
        i++;
    }
}

var testArray = ['a','b','c','d','e','b','c','d'];

function removeDuplicatesFromArray(arr) {
    var obj = {};
    var uniqueArr = [];
    for (var i = 0; i < arr.length; i++) {
        if (!obj.hasOwnProperty(arr[i])) {
            obj[arr[i]] = arr[i];
            uniqueArr.push(arr[i]);
        }
    }
    return uniqueArr;
}

var newArr = removeDuplicatesFromArray(testArray);
console.log(newArr);

Output:- [ 'a', 'b', 'c', 'd', 'e' ]

If you want to de-duplicate your array based on multiple properties and not just one, you can use the uniqBy function of lodash, which can take a function as a second argument.

You will have this one-liner:

 _.uniqBy(array, e => `${e.place}${e.name}`)

var str = [
    {"item_id": 1},
    {"item_id": 2},
    {"item_id": 2}
];

var obj = [];
for (var x in str) {
    if (check(str[x].item_id)) {
        obj.push(str[x]);
    }
}

function check(id) {
    var flag = 0;
    for (var y in obj) {
        if (obj[y].item_id === id) {
            flag = 1;
        }
    }
    return flag === 0;
}

console.log(obj);

str is an array of objects, some of which share the same value (in this small example, two objects have an item_id of 2). check(id) is a function that checks whether any object with the same item_id has already been collected: it returns false if one exists, otherwise true. Based on that result, each object is pushed into a new array obj. The output of the above code is [{"item_id":1},{"item_id":2}].


This is a simple way to remove duplicates from an array of objects.

I work with data a lot and this is useful for me.

const data = [{name: 'AAA'}, {name: 'AAA'}, {name: 'BBB'}, {name: 'AAA'}];

function removeDuplicity(datas) {
    return datas.filter((item, index, arr) => {
        const c = arr.map(item => item.name);
        return index === c.indexOf(item.name);
    });
}

console.log(removeDuplicity(data));

will print into the console:

[{ name: "AAA" }, { name: "BBB" }]

You can use Object.values() combined with Array.prototype.reduce():

const things = new Object();

things.thing = new Array();

things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});

const result = Object.values(things.thing.reduce((a, c) => (a[`${c.place}${c.name}`] = c, a), {}));

console.log(result);


A TypeScript solution

This will remove duplicate objects and also keep the element type in the signature. Note that the JSON round-trip only works for plain, serializable objects, and property order matters for equality.

function removeDuplicateObjects<T>(array: T[]): T[] {
  return [...new Set(array.map(s => JSON.stringify(s)))]
    .map(s => JSON.parse(s) as T);
}

If you can use Javascript libraries such as underscore or lodash, I recommend having a look at _.uniq function in their libraries. From lodash:

_.uniq(array, [isSorted=false], [callback=_.identity], [thisArg])

Basically, you pass in the array (here an array of object literals) and the attribute by which you want to remove duplicates from the original data array, like this:

var data = [{'name': 'Amir', 'surname': 'Rahnama'}, {'name': 'Amir', 'surname': 'Stevens'}];
var non_duplicated_data = _.uniq(data, 'name');

UPDATE: Lodash now has introduced a .uniqBy as well.
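If you cannot pull in lodash, a minimal plain-JS stand-in for _.uniqBy keyed by a single property might look like this (the helper name and sample data are illustrative, not part of lodash):

```javascript
// A sketch of uniqBy(array, key): keeps the first object seen for each key value.
function uniqBy(array, key) {
  const seen = new Set();
  return array.filter(item => {
    if (seen.has(item[key])) return false; // already kept one with this key
    seen.add(item[key]);
    return true;
  });
}

const data = [
  { name: 'Amir', surname: 'Rahnama' },
  { name: 'Amir', surname: 'Stevens' }
];
console.log(uniqBy(data, 'name')); // keeps only the first 'Amir'
```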


The simplest way is to use filter:

var uniq = {};
var arr = [{"id":"1"},{"id":"1"},{"id":"2"}];
var arrFiltered = arr.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true));
console.log('arrFiltered', arrFiltered);


Here is a solution using JavaScript's filter function that is quite easy. Let's say you have an array like this.

var duplicatesArray = ['AKASH','AKASH','NAVIN','HARISH','NAVIN','HARISH','AKASH','MANJULIKA','AKASH','TAPASWENI','MANJULIKA','HARISH','TAPASWENI','AKASH','MANISH','HARISH','TAPASWENI','MANJULIKA','MANISH'];

The filter function will allow you to create a new array, using a callback function once for each element in the array. So you could set up the unique array like this.

var uniqueArray = duplicatesArray.filter(function(elem, pos) {return duplicatesArray.indexOf(elem) == pos;});

In this scenario your unique array will run through all of the values in the duplicate array. The elem variable represents the value of the element in the array, pos is its 0-indexed position in the array (0, 1, 2, ...), and duplicatesArray.indexOf(elem) is the index of the first occurrence of that element in the original array. Because the element 'AKASH' is duplicated, when we loop through all of the elements in duplicatesArray and push them to uniqueArray, the first time we hit 'AKASH', pos is 0 and indexOf(elem) is 0 as well, so 'AKASH' gets pushed to uniqueArray. The second time we hit 'AKASH', pos is 1 but indexOf(elem) is still 0 (because indexOf only finds the first instance of an element), so the duplicate is not pushed. Therefore, uniqueArray contains only unique values.



For those searching for a readable and simple solution, here is my version:

function removeDuplicatesFromArrayByProp(originalArray, prop) {
    let results = {};
    for (let i = 0; i < originalArray.length; i++) {
        results[originalArray[i][prop]] = originalArray[i];
    }
    return Object.values(results);
}

I had this exact same requirement, to remove duplicate objects in a array, based on duplicates on a single field. I found the code here: Javascript: Remove Duplicates from Array of Objects

So in my example, I'm removing any object from the array that has a duplicate licenseNum string value.

var arrayWithDuplicates = [
    {"type":"LICENSE", "licenseNum": "12345", state:"NV"},
    {"type":"LICENSE", "licenseNum": "A7846", state:"CA"},
    {"type":"LICENSE", "licenseNum": "12345", state:"OR"},
    {"type":"LICENSE", "licenseNum": "10849", state:"CA"},
    {"type":"LICENSE", "licenseNum": "B7037", state:"WA"},
    {"type":"LICENSE", "licenseNum": "12345", state:"NM"}
];

function removeDuplicates(originalArray, prop) {
     var newArray = [];
     var lookupObject  = {};

     for(var i in originalArray) {
        lookupObject[originalArray[i][prop]] = originalArray[i];
     }

     for(i in lookupObject) {
         newArray.push(lookupObject[i]);
     }
      return newArray;
 }

var uniqueArray = removeDuplicates(arrayWithDuplicates, "licenseNum");
console.log("uniqueArray is: " + JSON.stringify(uniqueArray));

The results:

uniqueArray is:

[{"type":"LICENSE","licenseNum":"10849","state":"CA"},
{"type":"LICENSE","licenseNum":"12345","state":"NM"},
{"type":"LICENSE","licenseNum":"A7846","state":"CA"},
{"type":"LICENSE","licenseNum":"B7037","state":"WA"}]

const objectsMap = new Map();
const placesName = [
  { place: "here", name: "stuff" },
  { place: "there", name: "morestuff" },
  { place: "there", name: "morestuff" },
];
placesName.forEach((object) => {
  objectsMap.set(object.place, object);
});
console.log([...objectsMap.values()]);

Here is my solution. It searches for duplicates based on object.prop, and when it finds a duplicate object it replaces its value in array1 with the array2 value:

function mergeSecondArrayIntoFirstArrayByProperty(array1, array2) {
    for (var i = 0; i < array2.length; i++) {
        var found = false;
        for (var j = 0; j < array1.length; j++) {
            if (array2[i].prop === array1[j].prop) { // if item exist in array1
                array1[j] = array2[i]; // replace it in array1 with array2 value
                found = true;
            }
        }
        if (!found) // if item in array2 not found in array1, add it to array1
            array1.push(array2[i]);

    }
    return array1;
}

If you are using Lodash library you can use the below function as well. It should remove duplicate objects.

var objects = [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }, { 'x': 1, 'y': 2 }];
_.uniqWith(objects, _.isEqual);

One liner using Set

var things = new Object();

things.thing = new Array();

things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});

// assign things.thing to myData for brevity
var myData = things.thing;

things.thing = Array.from(new Set(myData.map(JSON.stringify))).map(JSON.parse);

console.log(things.thing);

Explanation:

  1. new Set(myData.map(JSON.stringify)) creates a Set object using the stringified myData elements.
  2. Set object will ensure that every element is unique.
  3. Then I create an array based on the elements of the created set using Array.from.
  4. Finally, I use JSON.parse to convert each stringified element back to an object.

 const things = [
  {place:"here",name:"stuff"},
  {place:"there",name:"morestuff"},
  {place:"there",name:"morestuff"}
];
const filteredArr = things.reduce((thing, current) => {
  const x = thing.find(item => item.place === current.place);
  if (!x) {
    return thing.concat([current]);
  } else {
    return thing;
  }
}, []);
console.log(filteredArr)

Solution via the Set object | unique according to data type

const seen = new Set();
 const things = [
  {place:"here",name:"stuff"},
  {place:"there",name:"morestuff"},
  {place:"there",name:"morestuff"}
];

const filteredArr = things.filter(el => {
  const duplicate = seen.has(el.place);
  seen.add(el.place);
  return !duplicate;
});
console.log(filteredArr)

Set Object Features

Each value in a Set has to be unique; value equality is checked.

The purpose of the Set object is to store unique values according to data type, whether primitive values or object references. It has four very useful instance methods: add, clear, has and delete.

add method

Pushes a unique value into the collection by default and preserves its data type. That means it prevents pushing a duplicate item into the collection, and it distinguishes values by data type.

has method

Sometimes you need to check whether a data item exists in the collection. It is a handy method to check for a unique id or item, respecting data type.

delete method

Removes a specific item from the collection, matching it by value and data type.

clear method

Removes all items from the collection, leaving it as an empty Set.

The Set object also has iteration methods and more features.

Better read from here: Set - JavaScript | MDN
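The four instance methods above can be seen together in a short sketch (the values here are illustrative):

```javascript
// Set compares primitives by value *and* by type: 1 and "1" are different.
const s = new Set();
s.add(1);
s.add("1"); // different data type, so both are kept
s.add(1);   // duplicate primitive, silently ignored
console.log(s.size);   // 2
console.log(s.has(1)); // true
s.delete("1");         // remove one specific item
console.log(s.size);   // 1
s.clear();             // empty the whole collection
console.log(s.size);   // 0
```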


var things = new Object();

things.thing = new Array();

things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});
console.log(things);

function removeDuplicate(result, id) {
    let duplicate = {};
    return result.filter(ele => !duplicate[ele[id]] && (duplicate[ele[id]] = true));
}

let resolverarray = removeDuplicate(things.thing, 'place');
console.log(resolverarray);


Using ES6+ in a single line you can get a unique list of objects by key:

const unique = [...new Map(arr.map(item => [item[key], item])).values()]

It can be put into a function:

function getUniqueListBy(arr, key) {
    return [...new Map(arr.map(item => [item[key], item])).values()]
}

Here is a working example:

const arr = [
    {place: "here",  name: "x", other: "other stuff1" },
    {place: "there", name: "x", other: "other stuff2" },
    {place: "here",  name: "y", other: "other stuff4" },
    {place: "here",  name: "z", other: "other stuff5" }
]

function getUniqueListBy(arr, key) {
    return [...new Map(arr.map(item => [item[key], item])).values()]
}

const arr1 = getUniqueListBy(arr, 'place')

console.log("Unique by place")
console.log(JSON.stringify(arr1))

console.log("\nUnique by name")
const arr2 = getUniqueListBy(arr, 'name')

console.log(JSON.stringify(arr2))

How does it work

First the array is remapped in a way that it can be used as an input for a Map.

arr.map(item => [item[key], item]);

which means each item of the array will be transformed into another array with 2 elements: the selected key as the first element and the entire initial item as the second element. This is called an entry (e.g. array entries, map entries). And here is the official doc with an example showing how to add array entries in the Map constructor.

Example when key is place:

[["here", {place: "here",  name: "x", other: "other stuff1" }], ...]

Secondly, we pass this modified array to the Map constructor, and here is the magic happening: Map will eliminate duplicate keys, keeping only the last inserted value for each key. Note: Map keeps the order of insertion. (Check the difference between Map and object.)

new Map(entry array just mapped above)

Third we use the map values to retrieve the original items, but this time without duplicates.

new Map(mappedArr).values()

And last one is to add those values into a fresh new array so that it can look as the initial structure and return that:

return [...new Map(mappedArr).values()]


 npm i lodash

 let non_duplicated_data = _.uniqBy(pendingDeposits, v => [v.stellarAccount, v.externalTransactionId].join());

function dupData() {
  var arr = [{ comment: ["a", "a", "bbb", "xyz", "bbb"] }];
  var comment = arr[0].comment; // the array to scan
  var newData = [];
  comment.forEach(function (val, index) {
    // if another occurrence of val exists later, val is duplicated
    if (comment.indexOf(val, index + 1) > -1) {
      if (newData.indexOf(val) === -1) { newData.push(val); }
    }
  });
  return newData; // note: this collects the duplicated values, e.g. ["a", "bbb"]
}

Considering lodash.uniqWith

var objects = [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }, { 'x': 1, 'y': 2 }];

_.uniqWith(objects, _.isEqual);
// => [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }]

Simple solution with ES6 'reduce' and 'find' array helper methods

Works efficiently and perfectly fine!

"use strict";

var things = new Object();
things.thing = new Array();
things.thing.push({
    place: "here",
    name: "stuff"
});
things.thing.push({
    place: "there",
    name: "morestuff"
});
things.thing.push({
    place: "there",
    name: "morestuff"
});

// the logic is here

function removeDup(something) {
    return something.thing.reduce(function (prev, ele) {
        var found = prev.find(function (fele) {
            return ele.place === fele.place && ele.name === fele.name;
        });
        if (!found) {
            prev.push(ele);
        }
        return prev;
    }, []);
}
console.log(removeDup(things));

If you can wait to eliminate the duplicates until after all the additions, the typical approach is to first sort the array and then eliminate duplicates. The sorting avoids the N * N approach of scanning the array for each element as you walk through them.

The "eliminate duplicates" function is usually called unique or uniq. Some existing implementations may combine the two steps, e.g., prototype's uniq

This post has few ideas to try (and some to avoid :-) ) if your library doesn't already have one! Personally I find this one the most straight forward:

    function unique(a){
        a.sort();
        for(var i = 1; i < a.length; ){
            if(a[i-1] == a[i]){
                a.splice(i, 1);
            } else {
                i++;
            }
        }
        return a;
    }  

    // Provide your own comparison
    function unique(a, compareFunc){
        a.sort( compareFunc );
        for(var i = 1; i < a.length; ){
            if( compareFunc(a[i-1], a[i]) === 0){
                a.splice(i, 1);
            } else {
                i++;
            }
        }
        return a;
    }

If you don't mind your unique array being sorted afterwards, this would be an efficient solution:

things.thing
  .sort((a, b) => a.place.localeCompare(b.place))
  .filter((current, index, array) =>
    index === 0 || current.place !== array[index - 1].place)
This way, you only have to compare the current element with the previous element in the array. Sorting once before filtering (O(n*log(n))) is cheaper than searching for a duplicate in the entire array for every array element (O(n²)).


Have you heard of the Lodash library? I recommend this utility when you don't really want to write the logic yourself, and would rather use already-present code that is optimised and reliable.

Consider making an array like this

things.thing.push({place:"utopia",name:"unicorn"});
things.thing.push({place:"jade_palace",name:"po"});
things.thing.push({place:"jade_palace",name:"tigress"});
things.thing.push({place:"utopia",name:"flying_reindeer"});
things.thing.push({place:"panda_village",name:"po"});

Note that if you want to keep one attribute unique, you may very well do that by using lodash library. Here, you may use _.uniqBy

_.uniqBy(array, [iteratee=_.identity])

This method is like _.uniq (which returns a duplicate-free version of an array, in which only the first occurrence of each element is kept) except that it accepts iteratee which is invoked for each element in array to generate the criterion by which uniqueness is computed.

So, for example, if you want to return an array having unique attribute of 'place'

_.uniqBy(things.thing, 'place')

Similarly, if you want unique attribute as 'name'

_.uniqBy(things.thing, 'name')

Hope this helps.

Cheers!


es6 magic in one line... readable at that!

// returns the items of `a` whose 'prop' value does not appear in `b`
// (concat with `b` afterwards if you want the full de-duplicated union)
const removeDuplicatesWith = (a, b, prop) =>
  a.filter(x => !b.find(y => x[prop] === y[prop]));

Shortest one liners for ES6+

Find unique id's in an array.

arr.filter((v,i,a)=>a.findIndex(t=>(t.id === v.id))===i)

Unique by multiple properties ( place and name )

arr.filter((v,i,a)=>a.findIndex(t=>(t.place === v.place && t.name===v.name))===i)

Unique by all properties (This will be slow for large arrays)

arr.filter((v,i,a)=>a.findIndex(t=>(JSON.stringify(t) === JSON.stringify(v)))===i)

Keep the last occurrence.

arr.slice().reverse().filter((v,i,a)=>a.findIndex(t=>(t.id === v.id))===i).reverse()

Another way would be to use the reduce function with a new array as the accumulator. If there is already a thing with the same name in the accumulator array, don't add it again.

let list = things.thing;
list = list.reduce((accumulator, thing) => {
    if (!accumulator.filter((duplicate) => thing.name === duplicate.name)[0]) {
        accumulator.push(thing);
    }
    return accumulator;
}, []);
things.thing = list;

I'm adding this answer, because I couldn't find nice, readable es6 solution (I use babel to handle arrow functions) that's compatible with Internet Explorer 11. The problem is IE11 doesn't have Map.values() or Set.values() without polyfill. For the same reason I used filter()[0] to get first element instead of find().


You can also create a generic function which will filter the array based on the object key you pass to the function

function getUnique(arr, comp) {

  return arr
    .map(e => e[comp])
    .map((e, i, final) => final.indexOf(e) === i && i)  // store the indexes of the unique objects
    .filter(e => arr[e]).map(e => arr[e]); // eliminate the dead (false) entries & map back to the objects

}

and you can call the function like this,

getUnique(things.thing,'name') // to filter on basis of name

getUnique(things.thing,'place') // to filter on basis of place

Continuing to explore ES6 ways of removing duplicates from an array of objects: setting the thisArg argument of Array.prototype.filter to new Set provides a decent alternative:

const things = [
  {place:"here",name:"stuff"},
  {place:"there",name:"morestuff"},
  {place:"there",name:"morestuff"}
];

const filtered = things.filter(function({place, name}) {
  const key = `${place}${name}`;
  return !this.has(key) && this.add(key);
}, new Set);

console.log(filtered);

However, it will not work with arrow functions () =>, as this is bound to their lexical scope.


I believe a combination of reduce with JSON.stringify to perfectly compare Objects and selectively adding those who are not already in the accumulator is an elegant way.

Keep in mind that JSON.stringify might become a performance issue in extreme cases where the array has many Objects and they are complex, BUT for majority of the time, this is the shortest way to go IMHO.

var collection = [{a:1},{a:2},{a:1},{a:3}]

var filtered = collection.reduce((filtered, item) => {
  if( !filtered.some(filteredItem => JSON.stringify(filteredItem) == JSON.stringify(item)) )
    filtered.push(item)
  return filtered
}, [])

console.log(filtered)

Another way of writing the same (but less efficient):

collection.reduce((filtered, item) => 
  filtered.some(filteredItem => 
    JSON.stringify(filteredItem ) == JSON.stringify(item)) 
      ? filtered
      : [...filtered, item]
, [])

If you strictly want to remove duplicates based on one property, you can reduce the array into an object keyed by the place property. Since an object can only have unique keys, you can then take the values to get back an array:

const unique = Object.values(things.thing.reduce((o, t) => ({ ...o, [t.place]: t }), {}))

removeDuplicates() takes in an array of objects and returns a new array without any duplicate objects (based on the id property).

const allTests = [
  {name: 'Test1', id: '1'}, 
  {name: 'Test3', id: '3'},
  {name: 'Test2', id: '2'},
  {name: 'Test2', id: '2'},
  {name: 'Test3', id: '3'}
];

function removeDuplicates(array) {
  let uniq = {};
  return array.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true))
}

removeDuplicates(allTests);

Expected outcome:

[
  {name: 'Test1', id: '1'}, 
  {name: 'Test3', id: '3'},
  {name: 'Test2', id: '2'}
];

First, we set the value of variable uniq to an empty object.

Next, we filter through the array of objects. Filter creates a new array with all elements that pass the test implemented by the provided function.

return array.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true));

Above, we use the short-circuiting functionality of &&. If the left side of the && evaluates to true, then it returns the value on the right of the &&. If the left side is false, it returns what is on the left side of the &&.

For each object(obj) we check uniq for a property named the value of obj.id (In this case, on the first iteration it would check for the property '1'.) We want the opposite of what it returns (either true or false) which is why we use the ! in !uniq[obj.id]. If uniq has the id property already, it returns true which evaluates to false (!) telling the filter function NOT to add that obj. However, if it does not find the obj.id property, it returns false which then evaluates to true (!) and returns everything to the right of the &&, or (uniq[obj.id] = true). This is a truthy value, telling the filter method to add that obj to the returned array, and it also adds the property {1: true} to uniq. This ensures that any other obj instance with that same id will not be added again.
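The short-circuit behaviour described above can be seen in isolation (the uniq object and the '1' key here are illustrative):

```javascript
// First check: key unseen, so the right side runs and records it.
// Second check: key already true, so the && short-circuits to false.
const uniq = {};
console.log(!uniq['1'] && (uniq['1'] = true)); // true  -> filter keeps the object
console.log(!uniq['1'] && (uniq['1'] = true)); // false -> filter drops the duplicate
```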


This is a generic way of doing this: you pass in a function that tests whether two elements of an array are considered equal. In this case, it compares the values of the name and place properties of the two objects being compared.

ES5 answer

function removeDuplicates(arr, equals) {
    var originalArr = arr.slice(0);
    var i, len, val;
    arr.length = 0;

    for (i = 0, len = originalArr.length; i < len; ++i) {
        val = originalArr[i];
        if (!arr.some(function(item) { return equals(item, val); })) {
            arr.push(val);
        }
    }
}

function thingsEqual(thing1, thing2) {
    return thing1.place === thing2.place
        && thing1.name === thing2.name;
}

var things = [
  {place:"here",name:"stuff"},
  {place:"there",name:"morestuff"},
  {place:"there",name:"morestuff"}
];

removeDuplicates(things, thingsEqual);
console.log(things);

Original ES3 answer

function arrayContains(arr, val, equals) {
    var i = arr.length;
    while (i--) {
        if ( equals(arr[i], val) ) {
            return true;
        }
    }
    return false;
}

function removeDuplicates(arr, equals) {
    var originalArr = arr.slice(0);
    var i, len, j, val;
    arr.length = 0;

    for (i = 0, len = originalArr.length; i < len; ++i) {
        val = originalArr[i];
        if (!arrayContains(arr, val, equals)) {
            arr.push(val);
        }
    }
}

function thingsEqual(thing1, thing2) {
    return thing1.place === thing2.place
        && thing1.name === thing2.name;
}

removeDuplicates(things.thing, thingsEqual);

Here is a solution for ES6 where you only want to keep the last item. This solution is functional and Airbnb style compliant.

const things = {
  thing: [
    { place: 'here', name: 'stuff' },
    { place: 'there', name: 'morestuff1' },
    { place: 'there', name: 'morestuff2' }, 
  ],
};

const removeDuplicates = (array, key) => {
  return array.reduce((arr, item) => {
    const removed = arr.filter(i => i[key] !== item[key]);
    return [...removed, item];
  }, []);
};

console.log(removeDuplicates(things.thing, 'place'));
// > [{ place: 'here', name: 'stuff' }, { place: 'there', name: 'morestuff2' }]

const uniqueElements = (arr, fn) => arr.reduce((acc, v) => {
    if (!acc.some(x => fn(v, x))) { acc.push(v); }
    return acc;
}, []);

const stuff = [
    {place:"here",name:"stuff"},
    {place:"there",name:"morestuff"},
    {place:"there",name:"morestuff"},
];

const unique = uniqueElements(stuff, (a,b) => a.place === b.place && a.name === b.name );
//console.log( unique );

[{
    "place": "here",
    "name": "stuff"
  },
  {
    "place": "there",
    "name": "morestuff"
}]

How about with some es6 magic?

things.thing = things.thing.filter((thing, index, self) =>
  index === self.findIndex((t) => (
    t.place === thing.place && t.name === thing.name
  ))
)


A more generic solution would be:

const uniqueArray = things.thing.filter((thing, index) => {
  const _thing = JSON.stringify(thing);
  return index === things.thing.findIndex(obj => {
    return JSON.stringify(obj) === _thing;
  });
});




This will remove the duplicate object without passing any key.

uniqueArray = a => [...new Set(a.map(o => JSON.stringify(o)))].map(s => JSON.parse(s));

var objects = [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }, { 'x': 1, 'y': 2 }];

var unique = uniqueArray(objects);
console.log('Original Object', objects);
console.log('Unique', unique);


let myData = [{place:"here",name:"stuff"},
 {place:"there",name:"morestuff"},
 {place:"there",name:"morestuff"}];

let q = [...new Map(myData.map(obj => [JSON.stringify(obj), obj])).values()];

console.log(q)

One-liner using ES6 and new Map().

// assign things.thing to myData
let myData = things.thing;

[...new Map(myData.map(obj => [JSON.stringify(obj), obj])).values()];

Details:-

  1. Doing .map() on the data list converts each individual object into a [key, value] pair array (length = 2); the first element (key) is the stringified version of the object and the second (value) is the object itself.
  2. Adding the array list created above to new Map() makes the stringified object the key, and any addition of the same key overrides the already existing entry.
  3. Using .values() gives a MapIterator with all the values in the Map (the objects in our case).
  4. Finally, the spread ... operator gives a new Array with the values from the above step.

To add one more to the list. Using ES6 and Array.reduce with Array.find.
In this example filtering objects based on a guid property.

let filtered = array.reduce((accumulator, current) => {
  if (! accumulator.find(({guid}) => guid === current.guid)) {
    accumulator.push(current);
  }
  return accumulator;
}, []);

Extending this one to allow selection of a property and compress it into a one liner:

const uniqify = (array, key) => array.reduce((prev, curr) => prev.find(a => a[key] === curr[key]) ? prev : prev.push(curr) && prev, []);

To use it pass an array of objects and the name of the key you wish to de-dupe on as a string value:

const result = uniqify(myArrayOfObjects, 'guid')

This answer will probably not be found by anyone but here is a short ES6 way with a better runtime than the 60+ answers that already exist:

let ids = array.map(o => o.id)
let filtered = array.filter(({id}, index) => !ids.includes(id, index+1))

Example:

let arr = [{id: 1, name: 'one'}, {id: 2, name: 'two'}, {id: 1, name: 'one'}]

let ids = arr.map(o => o.id)
let filtered = arr.filter(({id}, index) => !ids.includes(id, index + 1))

console.log(filtered)

How it works:

Array.filter() removes all duplicate objects by checking if the previously mapped id-array includes the current id ({id} destructs the object into only its id). To only filter out actual duplicates, it is using Array.includes()'s second parameter fromIndex with index + 1 which will ignore the current object and all previous.

Since every iteration of the filter callback method will only search the array beginning at the current index + 1, this also dramatically reduces the runtime because only objects not previously filtered get checked.

This obviously also works for any other key that is not called id or even multiple or all keys.
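The fromIndex mechanic can be seen in isolation (the ids array here is illustrative):

```javascript
// includes(value, fromIndex) only searches from fromIndex onward,
// so each element is compared only against the elements after it.
const ids = [1, 2, 1];
console.log(ids.includes(1, 1)); // true: another 1 exists after index 0
console.log(ids.includes(2, 2)); // false: no 2 after index 1
```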


Another option would be to create a custom indexOf function, which compares the values of your chosen property for each object and wrap this in a reduce function.

var uniq = redundant_array.reduce(function (a, b) {
    function indexOfProperty(arr, obj) {
        for (var i = 0; i < arr.length; i++) {
            if (arr[i].property == obj.property) {
                return i;
            }
        }
        return -1;
    }

    if (indexOfProperty(a, b) < 0) a.push(b);
    return a;
}, []);
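As written, redundant_array and .property are placeholders for your own data and field. A runnable sketch of the same reduce approach, assuming place is the property being compared:

```javascript
// Same reduce/indexOf idea, but runnable: dedupe on the place property.
const redundantArray = [
  { place: "here", name: "stuff" },
  { place: "there", name: "morestuff" },
  { place: "there", name: "morestuff" }
];

const uniq = redundantArray.reduce(function (acc, curr) {
  // Linear scan for an object already in acc whose place matches curr's.
  function indexOfProperty(arr, obj) {
    for (var i = 0; i < arr.length; i++) {
      if (arr[i].place === obj.place) return i;
    }
    return -1;
  }
  if (indexOfProperty(acc, curr) < 0) acc.push(curr);
  return acc;
}, []);

console.log(uniq.length); // 2
```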

  • This solution is generic for any kind of object and checks every (key, value) pair of the objects in the array.
  • It uses a temporary object as a hash table to check whether the entire object has already appeared as a key.
  • If the string representation of an object has been seen before, that duplicate is skipped.

var arrOfDup = [{'id':123, 'name':'name', 'desc':'some desc'},
                {'id':125, 'name':'another name', 'desc':'another desc'},
                {'id':123, 'name':'name', 'desc':'some desc'},
                {'id':125, 'name':'another name', 'desc':'another desc'},
                {'id':125, 'name':'another name', 'desc':'another desc'}];

function removeDupes(dupeArray){
  let temp = {};
  let tempArray = [];
  dupeArray.forEach((item) => {
    let key = JSON.stringify(item);
    if (!temp[key]) {       // first time this exact object is seen
      temp[key] = item;
      tempArray.push(item);
    }                       // later duplicates are skipped
  });
  return tempArray;
}

arrOfDup = removeDupes(arrOfDup);

arrOfDup.forEach((item, pos) => {
  console.log(`item in array at position ${pos} is ${JSON.stringify(item)}`);
});


If you find yourself frequently needing to remove duplicate objects from arrays based on particular fields, it might be worth creating a distinct(array, predicate) function that you can import from anywhere in your project. Usage would look like:

const things = [{place:"here",name:"stuff"}, ...];
const distinctThings = distinct(things, thing => thing.place);

The distinct function can use any of the implementations given in the many good answers above. The easiest one uses findIndex:

const distinct = (items, predicate) => items.filter((uniqueItem, index) =>
    items.findIndex(item =>
        predicate(item) === predicate(uniqueItem)) === index);

What about this:

function dedupe(arr, compFn){
    let res = [];
    if (!compFn) compFn = (a, b) => a === b;
    arr.forEach(a => { if (!res.find(b => compFn(a, b))) res.push(a); });
    return res;
}
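A quick usage sketch, passing a custom comparator that deduplicates on the place property only:

```javascript
function dedupe(arr, compFn) {
  let res = [];
  if (!compFn) compFn = (a, b) => a === b;
  arr.forEach(a => { if (!res.find(b => compFn(a, b))) res.push(a); });
  return res;
}

const things = [
  { place: "here", name: "stuff" },
  { place: "there", name: "morestuff" },
  { place: "there", name: "morestuff" }
];

// Compare objects by their place property only.
const unique = dedupe(things, (a, b) => a.place === b.place);
console.log(unique.length); // 2
```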

You could also use a Map:

const dedupThings = Array.from(things.thing.reduce((m, t) => m.set(t.place, t), new Map()).values());

Full sample:

const things = new Object();

things.thing = new Array();

things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});

const dedupThings = Array.from(things.thing.reduce((m, t) => m.set(t.place, t), new Map()).values());

console.log(JSON.stringify(dedupThings, null, 4));

Result:

[
    {
        "place": "here",
        "name": "stuff"
    },
    {
        "place": "there",
        "name": "morestuff"
    }
]

    function genFilterData(arr, key, key1) {
      let data = [];
      data = [...new Map(arr.map((x) => [x[key] || x[key1], x])).values()];
    
      const makeData = [];
      for (let i = 0; i < data.length; i += 1) {
        makeData.push({ [key]: data[i][key], [key1]: data[i][key1] });
      }
    
      return makeData;
    }
    const arr = [
      {make: "here1", makeText: 'hj', k: 9, l: 99},
      {make: "here", makeText: 'hj', k: 9, l: 9},
      {make: "here", makeText: 'hj', k: 9, l: 9}];

    const finalData = genFilterData(arr, 'make', 'makeText');

    console.log(finalData);

I think the best approach is using reduce and a Map object. This is a single-line solution.

const data = [
  {id: 1, name: 'David'},
  {id: 2, name: 'Mark'},
  {id: 2, name: 'Lora'},
  {id: 4, name: 'Tyler'},
  {id: 4, name: 'Donald'},
  {id: 5, name: 'Adrian'},
  {id: 6, name: 'Michael'}
]

const uniqueData = [...data.reduce((map, obj) => map.set(obj.id, obj), new Map()).values()];

console.log(uniqueData)

/*
  In `map.set(obj.id, obj)`:

  'obj.id' is the key (don't worry, we read back only the values with the .values() method),
  'obj' is the whole object.
*/


function filterDuplicateQueries(queries){
    let uniqueQueries = [];
    queries.forEach((l) => {
        let alreadyExist = false;
        uniqueQueries.forEach((k) => {
            if (k.query == l.query) {
                alreadyExist = true;
            }
        });
        if (!alreadyExist) {
            uniqueQueries.push(l);
        }
    });
    return uniqueQueries;
}

If you need a unique array based on multiple properties in the object, you can do this with map, combining the properties of the object:

    var makeKey = function(element){
        var string = ''
        for (var key in element){
            string += element[key]
        }
        return string
    }
    var hash = array.map(makeKey)
    array = array.filter(function(element, index){
        return hash.indexOf(makeKey(element)) == index
    })

If you don't want to specify a list of properties:

function removeDuplicates(myArr) {
  var props = Object.keys(myArr[0])
  return myArr.filter((item, index, self) =>
    index === self.findIndex((t) => (
      props.every(prop => {
        return t[prop] === item[prop]
      })
    ))
  )
}

OBS! Not compatible with IE11.
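If IE11 support matters, the same keep-the-first-match logic can be written in plain ES5, since only the arrow functions and Array.findIndex are the blockers (Object.keys, filter, and every are all ES5). A minimal sketch; removeDuplicatesES5 is a hypothetical name:

```javascript
// ES5-only rewrite: no arrow functions, no Array.findIndex.
function removeDuplicatesES5(myArr) {
  var props = Object.keys(myArr[0]);
  return myArr.filter(function (item, index) {
    // Find the first element equal to item on every property and
    // keep item only if it is that first occurrence.
    for (var i = 0; i < myArr.length; i++) {
      var allEqual = props.every(function (prop) {
        return myArr[i][prop] === item[prop];
      });
      if (allEqual) return i === index;
    }
    return true; // unreachable: item always matches itself
  });
}

console.log(removeDuplicatesES5([{ a: 1 }, { a: 1 }, { a: 2 }]));
// [ { a: 1 }, { a: 2 } ]
```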


Keep it simple. Fancy is good, but unreadable code is useless. Enjoy :-)

var a = [
 {
  executiveId: 6873702,
  largePhotoCircle: null,
  name: "John A. Cuomo",
  photoURL: null,
  primaryCompany: "VSE CORP",
  primaryTitle: "Chief Executive Officer, President and Director"
 },
 {
  executiveId: 6873702,
  largePhotoCircle: null,
  name: "John A. Cuomo",
  photoURL: null,
  primaryCompany: "VSE CORP",
  primaryTitle: "Chief Executive Officer, President and Director"
 },
 {
  executiveId: 6873703,
  largePhotoCircle: null,
  name: "John A. Cuomo",
  photoURL: null,
  primaryCompany: "VSE CORP",
  primaryTitle: "Chief Executive Officer, President and Director"
 }
];

function filterDuplicate(myArr, prop) {
  // Format - (1)

  // return myArr.filter((obj, pos, arr) => {
  //     return arr.map(mapObj => mapObj[prop]).indexOf(obj[prop]) === pos;
  // });

  // Format - (2)
  var res = {};
  for (var elem of myArr) {
    res[elem[prop]] = elem; // last duplicate for each prop value wins
  }
  return Object.values(res);
}

let finalRes = filterDuplicate(a, "executiveId");
console.log("finalResults : ", finalRes);


Here is an ES6 one-liner:

let arr = [
  {id:1,name:"sravan ganji"},
  {id:2,name:"pinky"},
  {id:4,name:"mammu"},
  {id:3,name:"sanju"},
  {id:3,name:"ram"},
];

console.log(Object.values(arr.reduce((acc,cur)=>Object.assign(acc,{[cur.id]:cur}),{})))


Dang, kids, let's crush this thing down, why don't we?

let uniqIds = {}, source = [{id:'a'},{id:'b'},{id:'c'},{id:'b'},{id:'a'},{id:'d'}];
let filtered = source.filter(obj => !uniqIds[obj.id] && (uniqIds[obj.id] = true));
console.log(filtered);
// EXPECTED: [{id:'a'},{id:'b'},{id:'c'},{id:'d'}];


I know there are a ton of answers to this question already, but bear with me...

Some of the objects in your array may have additional properties that you are not interested in, or you simply want to find the unique objects considering only a subset of the properties.

Consider the array below. Say you want to find the unique objects in this array considering only propOne and propTwo, and ignore any other properties that may be there.

The expected result should include only the first and last objects. So here goes the code:

const array = [{
    propOne: 'a',
    propTwo: 'b',
    propThree: 'I have no part in this...'
},
{
    propOne: 'a',
    propTwo: 'b',
    someOtherProperty: 'no one cares about this...'
},
{
    propOne: 'x',
    propTwo: 'y',
    yetAnotherJunk: 'I am valueless really',
    noOneHasThis: 'I have something no one has'
}];

const uniques = [...new Set(
    array.map(x => JSON.stringify((({ propOne, propTwo }) => ({ propOne, propTwo }))(x))))
].map(JSON.parse);

console.log(uniques);


The problem can be simplified to removing duplicates from the thing array.

You can implement a faster O(n) solution (assuming native key lookup is negligible) by using an object to both maintain unique criteria as keys and store the associated values.

Basically, the idea is to store all objects by their unique key, so that duplicates overwrite themselves:

const thing = [{ place: "here", name:"stuff" }, { place: "there", name:"morestuff" }, { place: "there", name:"morestuff" } ]

const uniques = {}
for (const t of thing) {
  const key = t.place + '$' + t.name  // Or whatever string criteria you want, which can be generified as Object.keys(t).join("$")
  uniques[key] = t                    // Last duplicate wins
}
const uniqueThing = Object.values(uniques)
console.log(uniqueThing)


Generic for any array of objects:

/**
* Remove duplicated values without losing information
*/
const removeValues = (items, key) => {
  let tmp = {};

  items.forEach(item => {
    tmp[item[key]] = (!tmp[item[key]]) ? item : Object.assign(tmp[item[key]], item);
  });
  items = [];
  Object.keys(tmp).forEach(key => items.push(tmp[key]));

  return items;
}
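A usage sketch of the function above (with the final loop condensed into a map). Note that because duplicates are merged with Object.assign, fields from later duplicates are folded into (and can overwrite) the first occurrence:

```javascript
const removeValues = (items, key) => {
  let tmp = {};
  items.forEach(item => {
    tmp[item[key]] = (!tmp[item[key]]) ? item : Object.assign(tmp[item[key]], item);
  });
  return Object.keys(tmp).map(k => tmp[k]);
};

// Duplicates by id are merged rather than dropped:
const rows = [
  { id: 1, name: 'a' },
  { id: 1, city: 'x' }, // folded into the first id-1 entry
  { id: 2, name: 'b' }
];

console.log(removeValues(rows, 'id'));
// [ { id: 1, name: 'a', city: 'x' }, { id: 2, name: 'b' } ]
```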

Hope it helps someone.


You can convert the array objects into strings so they can be compared, add the strings to a Set so the comparable duplicates will be automatically removed and then convert each of the strings back into objects.

It might not be as performant as other answers, but it's readable.

const things = {};

things.thing = [];
things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});

const uniqueArray = (arr) => {

  const stringifiedArray = arr.map((item) => JSON.stringify(item));
  const set = new Set(stringifiedArray);

  return Array.from(set).map((item) => JSON.parse(item));
}

const uniqueThings = uniqueArray(things.thing);

console.log(uniqueThings);

My two cents here. If you know the properties are in the same order, you can stringify the elements, remove the dupes from the resulting string array, and parse it back into objects. Something like this:

var things = new Object();

things.thing = new Array();

things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});
  
let stringified = things.thing.map(i=>JSON.stringify(i));
let unique =  stringified.filter((k, idx)=> stringified.indexOf(k) === idx)
                         .map(j=> JSON.parse(j))
console.log(unique);


This way works well for me:

function arrayUnique(arr, uniqueKey) {
  const flagList = new Set()
  return arr.filter(function(item) {
    if (!flagList.has(item[uniqueKey])) {
      flagList.add(item[uniqueKey])
      return true
    }
  })
}
const data = [
  {
    name: 'Kyle',
    occupation: 'Fashion Designer'
  },
  {
    name: 'Kyle',
    occupation: 'Fashion Designer'
  },
  {
    name: 'Emily',
    occupation: 'Web Designer'
  },
  {
    name: 'Melissa',
    occupation: 'Fashion Designer'
  },
  {
    name: 'Tom',
    occupation: 'Web Developer'
  },
  {
    name: 'Tom',
    occupation: 'Web Developer'
  }
]
console.table(arrayUnique(data, 'name')) // works well

Printout:

+------------------------------------------+
¦ (index) ¦   name    ¦     occupation     ¦
+---------+-----------+--------------------¦
¦    0    ¦  'Kyle'   ¦ 'Fashion Designer' ¦
¦    1    ¦  'Emily'  ¦   'Web Designer'   ¦
¦    2    ¦ 'Melissa' ¦ 'Fashion Designer' ¦
¦    3    ¦   'Tom'   ¦  'Web Developer'   ¦
+------------------------------------------+

ES5:

function arrayUnique(arr, uniqueKey) {
  const flagList = []
  return arr.filter(function(item) {
    if (flagList.indexOf(item[uniqueKey]) === -1) {
      flagList.push(item[uniqueKey])
      return true
    }
  })
}

These two ways are simpler and more understandable.


Here's another option to do it using Array iterating methods if you need comparison only by one field of an object:

    function uniq(a, param){
        return a.filter(function(item, pos, array){
            return array.map(function(mapItem){ return mapItem[param]; }).indexOf(item[param]) === pos;
        })
    }

    uniq(things.thing, 'place');
