I believe this is the best way to do it:
// build a lookup table (LUT) object keyed by value, then take its keys
var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200],
    reduced = Object.keys(myArray.reduce((p,c) => (p[c] = true,p),{}));
console.log(reduced); // ["100", "200"]
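One caveat worth flagging before the benchmarks (a note of mine; it doesn't change the timings): Object.keys always returns strings, so the deduped values come back stringified. The Map version further down keeps the original types:

// Object keys are coerced to strings:
console.log(Object.keys({100: true, 200: true}));             // ["100", "200"]
// Map keys keep their type:
console.log([...new Map([[100, true], [200, true]]).keys()]); // [100, 200]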
OK... even though this one is O(n) while the others are O(n²), I was curious to see a benchmark comparison between this reduce / lookup table (LUT) approach and the filter/indexOf combo (I chose Jeetendra's very nice implementation, https://stackoverflow.com/a/37441144/4543207, sketched below after the results). I prepared a 100K item array filled with random positive integers in the range 0-9999 and removed the duplicates. I repeated the test 10 times, and the averaged results show that filter/indexOf is no match for the LUT in performance:
- In Firefox v47: reduce & lut: 14.85ms vs filter & indexOf: 2836ms
- In Chrome v51: reduce & lut: 23.90ms vs filter & indexOf: 1066ms
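For reference, the filter/indexOf combo under test boils down to this pattern (my paraphrase; see the linked answer for Jeetendra's exact code):

// keep an element only if its first occurrence is at the current index;
// indexOf rescans the array from the start for every element, hence O(n^2)
var dedupedFilter = myArray.filter((v, i, a) => a.indexOf(v) === i);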
Well, OK, so far so good. But let's do it properly this time, in ES6 style. It looks so cool..! But as of now, how it will perform against the powerful LUT solution is a mystery to me. Let's first see the code and then benchmark it.
// collect the values as Map keys (Map keys are unique), then spread them back into an array
var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200],
    reduced = [...myArray.reduce((p,c) => p.set(c,true),new Map()).keys()];
console.log(reduced); // [100, 200] (numbers this time, not strings)
Wow, that was short..! But how about the performance..? It's beautiful... Now that the dead weight of filter/indexOf is off our shoulders, I can test an array of 1M random items of positive integers in the range 0..99999 and take the average of 10 consecutive tests. I can say that this time it's a real match. See the result for yourself :)
var ranar = [],
    red1 = a => Object.keys(a.reduce((p,c) => (p[c] = true,p),{})),
    red2 = a => [...a.reduce((p,c) => p.set(c,true),new Map()).keys()],
    avg1 = [],
    avg2 = [],
    ts = 0,
    te = 0,
    res1 = [],
    res2 = [],
    count = 10;
for (var i = 0; i < count; i++){
  // a fresh 1M-item array of random integers in 0..99999 for every run
  ranar = (new Array(1000000).fill(true)).map(e => Math.floor(Math.random()*100000));
  ts = performance.now();
  res1 = red1(ranar);
  te = performance.now();
  avg1.push(te-ts);
  ts = performance.now();
  res2 = red2(ranar);
  te = performance.now();
  avg2.push(te-ts);
}
avg1 = avg1.reduce((p,c) => p+c)/count;
avg2 = avg2.reduce((p,c) => p+c)/count;
console.log("reduce & lut took: " + avg1 + "msec");
console.log("map & spread took: " + avg2 + "msec");
Which one would you use..? Well, not so fast...! Don't be deceived: Map was playing away from home. Now look... in all of the above cases we fill an array of size n with numbers from a range much smaller than n. For instance, if we have an array of size 100 and we fill it with random numbers 0..9, there are definite duplicates and "almost" definitely each number has a duplicate. But what if we fill that size-100 array with random numbers 0..9999? Hardly any duplicates at all. So let's now see Map playing at home: this time an array of 100K items, but the random number range is 0..100M. We will do 100 consecutive tests to average the results. OK, let's see the bets..! <- no typo
var ranar = [],
    red1 = a => Object.keys(a.reduce((p,c) => (p[c] = true,p),{})),
    red2 = a => [...a.reduce((p,c) => p.set(c,true),new Map()).keys()],
    avg1 = [],
    avg2 = [],
    ts = 0,
    te = 0,
    res1 = [],
    res2 = [],
    count = 100;
for (var i = 0; i < count; i++){
  // 100K items this time, but drawn from the huge range 0..100M (few duplicates)
  ranar = (new Array(100000).fill(true)).map(e => Math.floor(Math.random()*100000000));
  ts = performance.now();
  res1 = red1(ranar);
  te = performance.now();
  avg1.push(te-ts);
  ts = performance.now();
  res2 = red2(ranar);
  te = performance.now();
  avg2.push(te-ts);
}
avg1 = avg1.reduce((p,c) => p+c)/count;
avg2 = avg2.reduce((p,c) => p+c)/count;
console.log("reduce & lut took: " + avg1 + "msec");
console.log("map & spread took: " + avg2 + "msec");
Now this is the spectacular comeback of Map()..! Maybe now you can make a better decision when you want to remove the dupes.
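To make the density point concrete, here is a quick sketch (my illustration, not part of the benchmark) showing how many unique values survive in each regime:

// dense duplicates: 100 draws from only 10 possible values -> ~10 unique
var dense = Array.from({length: 100}, () => Math.floor(Math.random()*10));
// sparse duplicates: 100 draws from 10000 possible values -> ~99-100 unique
var sparse = Array.from({length: 100}, () => Math.floor(Math.random()*10000));
console.log(new Set(dense).size, new Set(sparse).size);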
Well, OK, we are all happy now. But the lead role always comes last, with some applause. I am sure some of you wonder what the Set object would do. Now that we are open to ES6 and we know Map is the winner of the previous games, let us compare Map with Set in a final. A typical Real Madrid vs Barcelona game this time... or is it? Let's see who wins El Clásico :)
var ranar = [],
    red1 = a => [...a.reduce((p,c) => p.set(c,true),new Map()).keys()],
    red2 = a => Array.from(new Set(a)),
    avg1 = [],
    avg2 = [],
    ts = 0,
    te = 0,
    res1 = [],
    res2 = [],
    count = 100;
for (var i = 0; i < count; i++){
  // 100K items again (as in the previous test), range 0..10M
  ranar = (new Array(100000).fill(true)).map(e => Math.floor(Math.random()*10000000));
  ts = performance.now();
  res1 = red1(ranar);
  te = performance.now();
  avg1.push(te-ts);
  ts = performance.now();
  res2 = red2(ranar);
  te = performance.now();
  avg2.push(te-ts);
}
avg1 = avg1.reduce((p,c) => p+c)/count;
avg2 = avg2.reduce((p,c) => p+c)/count;
console.log("map & spread took: " + avg1 + "msec");
console.log("set & A.from took: " + avg2 + "msec");
Wow... man..! Well, unexpectedly, it didn't turn out to be an El Clásico at all. More like Barcelona FC against CA Osasuna :))
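So if you just came here to get rid of the dupes, the takeaway from all these matches is the one-liner below; by the benchmarks above, it is also the fastest of the bunch on these engines:

var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200];
// a Set keeps only unique values; the spread turns it back into an array
var unique = [...new Set(myArray)]; // [100, 200]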