Suggestion for reversing object.entries?
Closed this issue · 24 comments
Let's say I use Object.entries() on an object and modify the resulting array; is there a convenient function for turning the array back into an object? For example:
Object.entries({ a: 1, b: 2 })
.map(([key, val]) => ([key, val + 1]))
.TO_OBJECT_METHOD_HERE()
as a shorthand for:
Object.entries({ a: 1, b: 2 })
.map(([key, val]) => ([key, val + 1]))
.reduce((obj, [k, v]) => ({ ...obj, [k]: v }), {})
Indeed, your reduce is the current way to reconstruct a POJO from an array of entries. However, also yes, using Maps is generally preferred over using objects as dicts.
If one existed, it would absolutely never be on Object.prototype or Array.prototype, but it certainly might be Object.from or Object.fromEntries. However, there are no current plans for this proposal that I'm aware of.
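A userland helper with roughly the shape such a method could take looks like this (just a sketch to illustrate the idea, not a spec; the name fromEntries is only borrowed from the suggestion above):
// Sketch: build a plain object from any iterable of [key, value] pairs.
const fromEntries = (entries) => {
  const obj = {};
  for (const [key, value] of entries) {
    obj[key] = value;
  }
  return obj;
};
// Usage with the example from the question:
fromEntries(Object.entries({ a: 1, b: 2 }).map(([key, val]) => [key, val + 1]));
// => { a: 2, b: 3 }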
Example of using a Map for the same effect:
new Map(Array.from(mapObj).map(([k, v]) => [k, v + 1]))
@olalonde it's not mainly designed for that purpose - imo its primary purpose is to produce an array for easy iteration and reflection over objects. Reconstructing a plain object from entries isn't a common use case, and new Map is the only API in the language that accepts entries and constructs an object from them, so there's very little precedent.
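For illustration, that one precedent is simply (a minimal sketch):
// new Map accepts any iterable of [key, value] pairs,
// so the output of Object.entries() can be fed to it directly.
const map = new Map(Object.entries({ a: 1, b: 2 }));
map.get('a'); // => 1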
@ljharb gotcha.
Reconstructing a plain object from entries isn't a common use case
It isn't right now, but if you're trying to write functional-ish JavaScript, iterating over object keys to create a new updated object is a bit awkward without .map, which is why I typically do something like Object.keys().map().reduce(...). The reduce(...) part is not very DRY (always the same sh...), hence why I was asking if by any chance there was a reverse operation for Object.entries :)
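To make that concrete, here's roughly the pattern I mean (a sketch; the transform itself is made up for illustration):
// Bump every value of a plain object by one, keys unchanged.
const incrementValues = (obj) =>
  Object.keys(obj)
    .map((key) => [key, obj[key] + 1])
    .reduce((acc, [k, v]) => ({ ...acc, [k]: v }), {});
incrementValues({ a: 1, b: 2 }); // => { a: 2, b: 3 }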
There's some wishful thinking involved in calling the use of plain objects as dictionaries uncommon, though; iteration over object keys is a selling point of utility libraries like lodash and Ramda, so I'd suggest looking into those.
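For example, with lodash 4 (Ramda has equivalent toPairs / fromPairs helpers); this is just a sketch of the lodash route:
const _ = require('lodash');
// Entries round trip with lodash:
_.fromPairs(_.toPairs({ a: 1, b: 2 }).map(([k, v]) => [k, v + 1])); // => { a: 2, b: 3 }
// Or, more directly, map over the values:
_.mapValues({ a: 1, b: 2 }, (v) => v + 1); // => { a: 2, b: 3 }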
Thanks, I'm aware of those libraries but try to stick to plain JS wherever possible. By the way, why do you keep assuming I'm using objects as dictionaries? I don't believe I am.
It's kind of a loose distinction, since technically every object is a dictionary; to give an idea, the Map article on MDN has criteria for when you'd want to use a Map, and iterating over the values with map() fits some of them, hence calling it a dictionary.
Here's the code:
// initialise models
import initHashItem from './models/hash-item'
import initTree from './models/tree'
const initializers = {
  HashItem: initHashItem,
  Tree: initTree,
}
export default (services) => {
  const { bookshelf } = services
  return Object.entries(initializers).map(([modelName, initModel]) => {
    const Model = initModel(services)
    bookshelf.model(modelName, Model)
    return [modelName, Model]
  }).reduce((obj, [k, v]) => ({ ...obj, [k]: v }), {})
}
I just have 2 models at the moment so it seems a bit overkill, but as I add models, it's nice being able to just add an import statement and an object key.
Using a Map, this works:
const initializers = new Map(Object.entries({
  HashItem: initHashItem,
  Tree: initTree,
}));
export default (services) => {
  const { bookshelf } = services;
  return new Map(Array.from(initializers.entries()).map(([modelName, initModel]) => {
    const Model = initModel(services);
    bookshelf.model(modelName, Model);
    return [modelName, Model];
  }));
};
it means that your module will export a Map instead of an object, but that's a more robust API anyways.
.entries() is the default iterator for Map, so it's redundant in initializers.entries().
it means that your module will export a Map instead of an object, but that's a more robust API anyways
I use my models all over my code; having to type new models.get('HashItem')({ id: 'someid' }) would be unbearable, and I fail to see how using a Map is more robust in this case. It's not like I'm storing an indefinite amount of models that can grow at runtime or anything like that. I wonder why you think it's a good idea to put classes in a Map vs a POJO; it would just make the rest of my code more verbose with no tangible benefit as far as I can tell.
The best option may be to make a helper function in a utility module.
This method continues to be needed; for example, I need to filter object values before converting the object to JSON, so it makes little sense to convert it to a Map, and I end up needing a helper function for something so basic that it should be provided by the language.
There isn't even a terse way to convert objects to maps, or to iterate maps, or literal syntax for maps; maps are simply cumbersome to use and a poor replacement for POJOs.
Edit: it should be noted that a proposal for Object.fromEntries now exists.
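For the JSON case, a sketch of what that helper ends up looking like (the names here are illustrative only):
// Drop null/undefined values from a plain object before serialising it.
const fromEntries = (entries) => entries.reduce((obj, [k, v]) => ((obj[k] = v), obj), {});
const payload = { id: 1, name: 'tree', parent: null };
JSON.stringify(fromEntries(Object.entries(payload).filter(([, v]) => v != null)));
// => '{"id":1,"name":"tree"}'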
You can also set it up in a utility in your own code:
const objectify = (obj, [k, v]) => ({ ...obj, [k]: v });
Usage:
.reduce(objectify, {});
Another solution:
const objectify = arr => Object.assign(...arr.map(([k,v]) => ({[k]: v})))
Usage:
objectify([["foo", 1], ["bar", 2]])
This is conceptually too slow. Maybe the browsers can understand and bypass this, but I've tested it, and it's not good, not good at all...
const objectify = ( obj, [ k, v ] ) => ( { ...obj, [ k ]: v } );
If you write clean JS you should never use ... anyway, and never ever in a loop...
Normally the algorithm should be:
const objectify = ( obj, [ k, v ] ) => ( obj[ k ] = v, obj );
or
const objectify = ( obj, [ k, v ] ) => {
  obj[ k ] = v;
  return obj;
};
Imagine yourself having to code the loop in C: you would never malloc an object temporarily just to free it at the end of the loop, etc.
This isn't C; it's a memory-managed language, and in actual practice (like in actual concrete usage), with arrays of less than hundreds of millions of entries, the speed of this is utterly insignificant with or without the spread.
It's not because the time difference is imperceptible that we have to code as if computer science never existed...
Saying "it's fast enough" is right, but not ecological, because somewhere a slow algorithm will drain 10 million phone batteries a little too much, and all those batteries will be plugged in too soon because of this poor algorithm.
And getting an object-copy function like this wrong is really bad, because we can imagine it being called a lot.
That's why I posted my variant: it has the same time complexity as the fastest solution, O(n) (where n is the number of attributes), instead of O(n^2) for the one that uses ... and reduce.
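A rough way to see where the n^2 comes from (this only counts the property copies done by the spread, on the assumption that each { ...obj } copies everything accumulated so far):
const entries = Object.entries({ a: 1, b: 2, c: 3, d: 4 });
let copies = 0;
entries.reduce((obj, [k, v]) => {
  copies += Object.keys(obj).length; // properties re-copied by the spread
  return { ...obj, [k]: v };
}, {});
console.log(copies); // 0 + 1 + 2 + 3 = 6 for 4 entries, ~n^2/2 in general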
Your version is maybe better conceptually, you're right, but you're still creating a lot of temporary objects for nothing, plus a final array somewhere for the ... itself.
We should never code like this at a low level, and this kind of common function is low-level.
With this code you can compare the different methods:
const objectifyV1 = ( obj, [ k, v ] ) => ( { ...obj, [ k ]: v } );
const objectifyV2 = ( obj, [ k, v ] ) => ( obj[ k ] = v, obj );
const objectifyV3 = arr => Object.assign( ...arr.map( ( [ k, v ] ) => ( { [ k ]: v } ) ) );
const nbLoops = 1 * 1000 * 1000;
const kv = [
  [ "qwe", 123 ],
  [ "asd", 456 ],
  [ "zxc", 789 ],
];
let t0;
t0 = performance.now();
for ( let i = 0; i < nbLoops; ++i ) {
  kv.reduce(objectifyV1, {});
}
console.log( "v1", performance.now() - t0 );
t0 = performance.now();
for ( let i = 0; i < nbLoops; ++i ) {
  kv.reduce(objectifyV2, {});
}
console.log( "v2", performance.now() - t0 );
t0 = performance.now();
for ( let i = 0; i < nbLoops; ++i ) {
  objectifyV3(kv);
}
console.log( "v3", performance.now() - t0 );
v1 866.4000000571832
v2 198.39999999385327
v3 758.4999999962747
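To reproduce this outside a browser console, performance is assumed to come from Node's perf_hooks module:
// Only needed when running the benchmark in Node rather than a browser.
const { performance } = require('perf_hooks');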