add deepmerge to benchmark
ar53n opened this issue · 3 comments
I added deepmerge and ran the benchmark, and I got these results:
Validation:
✘ JSON.stringify (FAILED @ "initial copy")
✘ fast-clone (FAILED @ "initial copy")
✔ lodash
✔ clone-deep
✘ deep-copy (FAILED @ "initial copy")
✔ deepcopy
✔ klona
✘ deepmerge (FAILED @ "initial copy")
Benchmark:
JSON.stringify x 20,873 ops/sec ±3.54% (80 runs sampled)
fast-clone x 8,542 ops/sec ±15.12% (61 runs sampled)
lodash x 21,840 ops/sec ±7.48% (80 runs sampled)
clone-deep x 43,664 ops/sec ±6.99% (75 runs sampled)
deep-copy x 65,898 ops/sec ±5.09% (81 runs sampled)
deepcopy x 15,026 ops/sec ±2.96% (83 runs sampled)
klona x 149,326 ops/sec ±3.57% (84 runs sampled)
deepmerge x 18,183,850 ops/sec ±3.00% (80 runs sampled)
Hmm.. I think you added the item incorrectly? It requires you to pass two parameters: a target and a source. Otherwise, deepmerge
is just performing a NOOP – hence the failed validation and crazy speed 😄 (though I would have believed it!)
Your bench item should look like this:
bench.add('deepmerge', () => deepmerge([], INPUT));
When I run this, I get this – notice the passing validation:
Validation:
✘ JSON.stringify (FAILED @ "initial copy")
✘ fast-clone (FAILED @ "initial copy")
✔ deepmerge
✔ lodash
✔ clone-deep
✘ deep-copy (FAILED @ "initial copy")
✔ deepcopy
✔ klona
Benchmark:
JSON.stringify x 38,008 ops/sec ±0.67% (87 runs sampled)
fast-clone x 24,130 ops/sec ±0.62% (93 runs sampled)
deepmerge x 29,913 ops/sec ±0.66% (96 runs sampled)
lodash x 40,275 ops/sec ±1.20% (93 runs sampled)
clone-deep x 84,841 ops/sec ±0.11% (96 runs sampled)
deep-copy x 115,552 ops/sec ±0.12% (97 runs sampled)
deepcopy x 23,948 ops/sec ±0.55% (98 runs sampled)
klona x 266,543 ops/sec ±0.26% (97 runs sampled)
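Side note: you can also verify outside the bench harness that the two-argument form actually produces a copy. A quick sketch below – the INPUT object is just a made-up stand-in for the real bench fixture, not the actual one:

const deepmerge = require('deepmerge');

// made-up stand-in for the benchmark's INPUT fixture (assumed shape)
const INPUT = { str: 'hello', nested: { list: [1, 2, 3] } };

// merging the source into a fresh, empty target yields a brand-new deep copy
const copy = deepmerge([], INPUT);

console.log(copy !== INPUT);               // true  -> new top-level object
console.log(copy.nested !== INPUT.nested); // true  -> nested values are new too
console.log(JSON.stringify(copy) === JSON.stringify(INPUT)); // true -> same contents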
@lukeed Thanks for your comment. Indeed, I forgot that this is a merge and not a clone, so they are different tools.
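For example (a made-up snippet, just to illustrate the merge-vs-clone difference for anyone else landing here):

const deepmerge = require('deepmerge');
const { klona } = require('klona'); // named export as of klona v2

const target = { a: 1 };
const source = { b: { c: 2 } };

// deepmerge combines two inputs into a new object
console.log(deepmerge(target, source)); // { a: 1, b: { c: 2 } }

// klona takes a single input and returns a deep copy of it
console.log(klona(source)); // { b: { c: 2 } }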
Correct. Deepmerge is great, but I didn't include it because, even though it can match the behavior, it's meant to serve a different purpose.
If Duff wants to be listed here, I'll happily do so, but I'm closing this in the meantime.
Thanks 👍