
Removing Duplicates from a JavaScript Array ('Deduping')

When working with arrays in JavaScript, we often encounter duplicate values, whether handling user input, processing API responses, or manipulating datasets. Removing these duplicates (also known as 'deduping') helps ensure data integrity and improves efficiency.
In this article, I explore different methods for removing duplicates from a JavaScript array and compare their advantages, disadvantages, and possible use cases.
Using the Set Object to Remove Duplicates
One of the simplest and most efficient ways to remove duplicates is by using the Set object. A Set automatically enforces uniqueness, making it ideal for deduping arrays of primitive values. For example:
```javascript
const numbers = [1, 2, 2, 3, 4, 4, 5];
const uniqueNumbers = [...new Set(numbers)];
console.log(uniqueNumbers); // [1, 2, 3, 4, 5]
```

This approach is concise, performs well, and runs in O(n) time because each element is inserted into the Set only once. However, it only works for arrays of primitive values such as numbers or strings. When working with arrays of objects, we need a different strategy.
Removing Duplicates from an Array of Objects
When dealing with objects, we cannot use Set directly because it works by comparing object references rather than their actual values. Instead, we can use Array.prototype.filter with findIndex to ensure uniqueness based on a specific property.
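To make the reference-comparison pitfall concrete, here is a quick sketch (the variable names are illustrative): two objects with identical contents are still distinct references, so a Set keeps both of them.

```javascript
const a = { id: 1, name: "Maddie" };
const b = { id: 1, name: "Maddie" };

// Identical contents, but different references: the Set keeps both.
const objectSet = new Set([a, b]);
console.log(objectSet.size); // 2

// Only a repeated *reference* is collapsed.
const sameReference = new Set([a, a]);
console.log(sameReference.size); // 1
```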
In practice, that looks like this:

```javascript
const people = [
  { id: 1, name: "Maddie" },
  { id: 2, name: "Bob" },
  { id: 1, name: "Maddie" }
];

const uniquePeople = people.filter(
  (person, index, self) => index === self.findIndex(p => p.id === person.id)
);

console.log(uniquePeople);
// [{ id: 1, name: "Maddie" }, { id: 2, name: "Bob" }]
```

This way, only the first occurrence of an object with a given id is retained, effectively deduping the array. However, findIndex iterates through the array for each element, making this method O(n²) in the worst case, so it can become very slow for large datasets.
Deduping Arrays Using reduce
Another approach to deduping involves using Array.prototype.reduce to build a unique array by checking for existing values as we iterate, like this:
```javascript
const numbers = [1, 2, 2, 3, 4, 4, 5];

const uniqueNumbers = numbers.reduce((acc, num) => {
  if (!acc.includes(num)) {
    acc.push(num);
  }
  return acc;
}, []);

console.log(uniqueNumbers); // [1, 2, 3, 4, 5]
```

This method offers more control over how duplicates are handled but, like the filter approach, is O(n²) in complexity, because .includes() scans acc linearly on every iteration. As a result, this approach is also not terribly efficient for large arrays.
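As an aside, if we like the reduce style but want linear time, one variant (not part of the comparison below) is to track already-seen values in a Set, so each membership check is O(1) instead of a linear scan:

```javascript
const numbers = [1, 2, 2, 3, 4, 4, 5];

// A Set gives O(1) lookups, keeping the whole pass at O(n).
const seen = new Set();
const dedupedNumbers = numbers.reduce((acc, num) => {
  if (!seen.has(num)) {
    seen.add(num);
    acc.push(num);
  }
  return acc;
}, []);

console.log(dedupedNumbers); // [1, 2, 3, 4, 5]
```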
Performance Comparison: Set vs. filter vs. reduce
Here's an overview of each deduping method; efficiency depends on the dataset size and the type of values stored in the array:
| Method | Time Complexity | Best For | Drawbacks |
|---|---|---|---|
| Set | O(n) | Arrays of primitives (numbers, strings) | Does not work for objects |
| filter with findIndex | O(n²) | Small arrays of objects | Slow for large datasets |
| reduce with includes | O(n²) | Custom logic for small datasets | Inefficient for large datasets |
For small arrays, the performance difference is negligible. However, as the dataset grows, Set becomes significantly faster than filter or reduce. If we need to dedupe objects efficiently, a Map-based approach is often a better alternative:

```javascript
const uniquePeople = Array.from(
  new Map(people.map(person => [person.id, person])).values()
);
```

Like Set, this runs in O(n) time, making it much more efficient for large datasets than filter.
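One behavioral difference worth knowing: because setting an existing Map key overwrites its value, this approach keeps the last occurrence of each id, whereas filter with findIndex keeps the first. A quick sketch, reusing the people shape from above with a hypothetical renamed duplicate to make the difference visible:

```javascript
const people = [
  { id: 1, name: "Maddie" },
  { id: 2, name: "Bob" },
  { id: 1, name: "Madison" } // same id as the first entry, different name
];

// Map keys are unique; re-setting an existing key overwrites its value
// while preserving the original insertion position, so the *last*
// object wins for each id.
const uniquePeople = Array.from(
  new Map(people.map(person => [person.id, person])).values()
);

console.log(uniquePeople);
// [{ id: 1, name: "Madison" }, { id: 2, name: "Bob" }]
```

If keeping the first occurrence matters, either stick with the filter approach or iterate and only set a key when it is not already present in the Map.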
Wrapping up
Removing duplicates, or deduping, is an essential operation when working with arrays in JavaScript. Depending on the type of data and performance needs, we can choose between Set, filter, reduce, or a Map-based approach to achieve the desired result efficiently.
Key Takeaways
- The Set object is the fastest and simplest way to remove duplicates from arrays of primitive values.
- When working with objects, filter with findIndex is an option but can be slow for large datasets.
- reduce allows for more custom deduplication logic but is inefficient for large arrays.
- Using a Map provides a better alternative for efficiently deduping large arrays of objects.
By understanding these techniques, we can choose the best approach for removing duplicates efficiently, keeping our JavaScript arrays clean and performant.