We'll dissect a function laden with ES6+ syntax, then show how a humble ES5 implementation is not only easier to read, but also more efficient.
Today, a student came to a mentor session wanting to dissect the following function:
function omit(obj, keyToOmit) {
  return Object.entries(obj).reduce(
    (newObj, [ key, value ]) => (key === keyToOmit ? newObj : { ...newObj, [key]: value }),
    {}
  );
}
You could say that it functions like filter, but for objects. The intended usage is to pass in an object as obj and a string as keyToOmit, and then get back a new object with all but the omitted key/value pair.
Like this:
const dog = { plays: true, guards: true, barksAllNight: true };
let dogUpgrade = omit(dog, 'barksAllNight');
// ahh, now we can get some sleep.
// dogUpgrade => { plays: true, guards: true }
To understand how it works, we went through the exercise of rewriting it as a less showy, but much clearer, implementation that filters a key/value pair out of the object.
You may be wondering why we didn't just do something like delete dog[keyToOmit]. Good question! It's because we don't want to 'mutate' the object that was originally passed into the function. Like the original omit function, we wanted to return a fresh object with all of the original key/value pairs except the one we want to filter out.
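To make the difference concrete, here's a quick sketch of what delete does to an object passed in this way (using a throwaway variable so we don't touch the dog from above):

var mutatedDog = { plays: true, guards: true, barksAllNight: true };
delete mutatedDog.barksAllNight; // changes the object in place
// mutatedDog => { plays: true, guards: true }
// There is no untouched copy left; the caller's object itself was changed.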
Simple Version
Here is the rewrite we came up with. If you were having any trouble following the original function, see how this one compares when it comes to readability:
function omit(obj, keyToOmit) {
  var result = {}; // start with an empty object (don't mutate the original obj)
  // iterate and only add key/val pairs that are not the keyToOmit
  for (var key in obj) {
    if (key !== keyToOmit) {
      result[key] = obj[key];
    }
  }
  return result;
}
Using a for...in loop to iterate through the keys of the object, we simply compare each key to the keyToOmit, and if it doesn't match, we add that key/value pair to our result object before returning it at the end. (One caveat: for...in also visits inherited enumerable properties, so for objects with a non-default prototype you may want a hasOwnProperty check inside the loop.)
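A quick sanity check with the dog from earlier shows the rewrite behaves the same way and leaves the original object untouched:

var dog = { plays: true, guards: true, barksAllNight: true };
var dogUpgrade = omit(dog, 'barksAllNight');
// dogUpgrade => { plays: true, guards: true }
// dog => { plays: true, guards: true, barksAllNight: true } (still intact)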
So now that we've reverse-engineered the function and produced a basic implementation, we set out to decode the syntactic sugar sprinkled inside the original omit function.
Stunt Version
Object.entries() is a fantastic method that returns nested arrays consisting of the key/value pairs of the object passed to it as an argument. Invoking Object.entries on our dog would evaluate to:
var entries = Object.entries(dog);
// entries => [["plays", true], ["guards", true], ["barksAllNight", true]]
Now that the key/value pairs are laid out as an array of pairs, we can use the HOF (higher-order function) reduce to boil them down to a single final result.
The function signature of the callback in reduce is (accumulator, currentValue, currentIndex, array). The last 2 are optional and not used in this function, so we'll ignore them for now, but it's good to know they exist. You can read more about reduce on MDN.
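As a quick refresher, here's a minimal reduce example that boils an array of numbers down to a sum, with all four callback parameters declared (only the first two are actually used):

var total = [1, 2, 3].reduce(function (accumulator, currentValue, currentIndex, array) {
  return accumulator + currentValue; // the running sum is carried between iterations
}, 0); // 0 is the initial value of the accumulator
// total => 6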
So, since our callback only uses the first two of those parameters, what are they in the context of the omit function?
[["plays", true], ["guards", true], ["barksAllNight", true]].reduce(
(newObj, [ key, value ]) => (key === keyToOmit ? newObj : { ...newObj, [key]: value }),
{}
);
The first parameter, newObj, corresponds to the accumulator. The second is written as [ key, value ]. This is a nice use of array destructuring assignment, which allows for a concise assignment of 2 variables at once, corresponding to the array indices of the current value being passed to the reduce callback.
Here's an example:
var [a, b] = ['first', 'second'];
// a => 'first'
// b => 'second'
So, since the reduce callback receives another 2-value array (a key and a value from the dog object) with each iteration, we can use destructuring to assign the variables key and value at once. Nice.
Now, inside the => expression, we can refer to newObj, key, and value. The ternary operator is set up so that if key is exactly equal to keyToOmit, we just return the accumulator unchanged. Otherwise, we use the object spread operator to succinctly copy the key/value pairs from the accumulator and amend them with a new key/value pair. The [key]: value syntax is yet another ES6+ trick: it allows using a computed property name for the key. It evaluates the expression between the brackets, which is the key variable, and the string it evaluates to becomes the computed key that is added to the accumulator. The value of this key/value pair is the value of the value variable.
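Here's a tiny illustration of both tricks in isolation (the variable names are just for this example):

var key = 'guards';
var value = true;
var accumulated = { plays: true };

// The expression in brackets is evaluated, and its result becomes the property name:
var next = { ...accumulated, [key]: value };
// next => { plays: true, guards: true }
// accumulated is untouched; spread copied its pairs into a brand-new object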
After the reduce completes, the entire expression evaluates to the new object that is now missing the keyToOmit, and that object is returned from the function.
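To make that concrete, here's roughly what the accumulator looks like on each pass over our dog's entries, with keyToOmit set to 'barksAllNight':

// Pass 1: key = 'plays'         => accumulator becomes { plays: true }
// Pass 2: key = 'guards'        => accumulator becomes { plays: true, guards: true }
// Pass 3: key = 'barksAllNight' => matches keyToOmit, accumulator returned unchanged
// Final result: { plays: true, guards: true }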
It's super clever coding, but used at scale, it's quite inefficient.
Let's take a look!
Our lowly ES5 implementation is less likely to get us labelled as 1337, but let's consider its runtime complexity.
- enter the function
- set up the results object
- iterate through the enumerable properties of the object once
  - make a comparison of each key with the keyToOmit
- return the results object
Now let's break down the steps required for the 'stunt JavaScript™️' version:
- enter the function
- derive Object.entries by:
  - iterating through the enumerable properties and assigning the key/value pairs to nested arrays
- set up the reduce function
- iterate again through the derived nested array and:
  - make the comparison of each key with the keyToOmit
  - if it matches, return the accumulator unchanged
  - otherwise iterate again due to the object spread operator! (I think this pushes the time complexity closer to O(n^2))
- return the evaluated result of the reduce function.
I think that both are technically O(n) runtime complexity: if you graphed them, both lines would be linear, but the stunt version's would be steeper in terms of operations per input size. It is also definitely less efficient in terms of space complexity (auxiliary space). And as a careful reader pointed out, if the object spread operator truly iterates through each key/value pair of the accumulator again, then we have a nested for-loop, an efficiency downgrade closer to O(n^2).
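A rough mental model for where that extra work comes from (this is an approximation of what the spread does, not the exact spec semantics): every pass builds a brand-new object and re-copies everything the accumulator already holds.

// Roughly what { ...newObj, [key]: value } amounts to on a single pass:
var newObj = { plays: true };                            // the accumulator so far
var next = Object.assign({}, newObj, { guards: true });  // copy every existing pair, then add one
// next => { plays: true, guards: true }
// With n keys, later passes each copy up to n pairs, so the copies add up across the whole reduce.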
Takeaway
For a simple helper function, this time difference is immaterial, but I still think it's good to get into the habit of thinking about algorithmic complexity, so you build up intuition about runtimes and are ready to write efficient code in the situations where it does matter.
Final words
As a teacher / mentor of newer engineers, I get a lot of input about what kinds of information are confusing to people who are earlier in their programming journeys. I like to write about these topics to share these learnings with a wider community.
I hope you've enjoyed coming along on this code exploration 🔎. More to come, so + FOLLOW and I'll see you further on down the code road!
Both functions side-by-side:
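(Both are named omit in this post, so keep only one definition at a time if you paste these into a script.)

// Stunt version (ES6+)
function omit(obj, keyToOmit) {
  return Object.entries(obj).reduce(
    (newObj, [ key, value ]) => (key === keyToOmit ? newObj : { ...newObj, [key]: value }),
    {}
  );
}

// Simple version (ES5)
function omit(obj, keyToOmit) {
  var result = {};
  for (var key in obj) {
    if (key !== keyToOmit) {
      result[key] = obj[key];
    }
  }
  return result;
}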