deepmerge and deepmerge.all behave differently with certain options
ArtskydJ opened this issue · 4 comments
deepmerge(a, b, opts) and deepmerge.all([a, b], opts) behave differently given certain options.
Repro:
const deepmerge = require(`deepmerge`)
const data1 = {
	a: false,
	b: true,
	c: false,
	d: true,
}
const data2 = {
	a: false,
	b: true,
	c: true,
	d: false,
}
const deepmerge_opts = {
	customMerge: key => (a, b) => a && b,
	isMergeableObject: () => true,
}
const log = o => console.log(JSON.stringify(o))
log(deepmerge(data1, data2, deepmerge_opts)) // {"a":false,"b":true,"c":false,"d":false} (EXPECTED)
log(deepmerge.all([ data1, data2 ], deepmerge_opts)) // {"a":false,"b":true,"c":true,"d":false} (NOT EXPECTED)
I think the root cause is that deepmerge(data1, {}, opts) is assumed to have no effect, which is essentially what happens in the reduce bit of deepmergeAll.
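For context, my recollection is that deepmerge.all reduces the whole array onto an empty object, so every element, including the first, goes through a merge step against the accumulator. Roughly this (paraphrased from memory, not the exact source):

function deepmergeAll(array, options) {
	// Paraphrased sketch of the current behaviour: the initial accumulator is {},
	// so the very first step is effectively a merge of data1 against an empty object,
	// which with these options isn't the no-op it's assumed to be.
	return array.reduce((prev, next) => deepmerge(prev, next, options), {})
}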
Here's a potential fix, but it might not work right if the array has fewer than 2 elements... I'm not sure.
function deepmergeAll(array, options) {
	if (!Array.isArray(array)) {
		throw new Error(`first argument should be an array`)
	}
	return array.slice(1).reduce((prev, next) => deepmerge(prev, next, options), array[0] || {})
}
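To spell out the edge cases I'm unsure about (reasoning from the snippet above, untested): with an empty array, array[0] is undefined, so the || {} fallback returns {}; with a single element, slice(1) is empty, so reduce just returns array[0] and deepmerge is never called, meaning the element wouldn't be cloned.

deepmergeAll([], deepmerge_opts)        // presumably {} via the || {} fallback
deepmergeAll([ data1 ], deepmerge_opts) // presumably data1 itself, uncloned, since reduce
                                        // over an empty slice returns the initial value as-is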
Oh hey, that's interesting.
I think for arrays passed in with 2+ elements, the nicest solution would be to just not pass in an initial value to reduce at all, since if you don't supply one, reduce uses the first element in the array.

If we made that change, I think we'd have to handle the 0 and 1 element cases specifically – if the array has 0 elements, return {}; if it has 1 element, return deepmerge({}, array[0], options).
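Something like this, maybe (just a sketch of that idea, untested):

function deepmergeAll(array, options) {
	if (!Array.isArray(array)) {
		throw new Error(`first argument should be an array`)
	}
	if (array.length === 0) {
		return {}
	}
	if (array.length === 1) {
		// Run the single element through deepmerge anyway so it still gets cloned
		return deepmerge({}, array[0], options)
	}
	// No initial value: reduce starts from array[0], so
	// deepmerge.all([a, b], opts) becomes exactly deepmerge(a, b, opts)
	return array.reduce((prev, next) => deepmerge(prev, next, options))
}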