Question
I have two lists of nodes, which are shaped like so:
interface TreeNode {
  data: {
    name: string,
    sharedProp: boolean,
    oldProp: boolean
  },
  children: TreeNode[],
  parents: TreeNode[],
  thereAreSomeShallowProps: any,
}
The full dataset would be an array of `TreeNode`.
What I'd like is a function with which I can traverse down this tree, merging changes from a changes tree into the base tree. Some of the features it'd need:
- Match the values of a specified key (in this case, supporting multilevel keys), and merge the matches — possibly with `flatten` and `groupBy`
- When the values of the object are arrays, recurse
- Resistant to circular references
- Able to work with very large objects (over 100k nodes, at least)
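Roughly, the kind of function I'm imagining might look like this (a sketch only — `matchKey` and `mergeByKey` are hypothetical names, and this ignores circular references and the `parents` back-links):

```javascript
// Hypothetical sketch: match nodes on a nested key (here data.name)
// and merge the matches, recursing into children. Not cycle-safe;
// for illustration of the requirements only.
const matchKey = (node) => node.data.name

const mergeByKey = (baseNodes, changeNodes) =>
  baseNodes.map((b) => {
    const match = changeNodes.find((c) => matchKey(c) === matchKey(b))
    if (match === undefined) return b
    return {
      ...b,
      data: {...b.data, ...match.data},
      children: mergeByKey(b.children || [], match.children || []),
    }
  })
```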
Some of the functions I've looked at (but am unsure how to string together to build the function I want):

- `applySpec`
- `groupBy`
- `mergeWithKey`
- `mergeDeepWithKey`

Here is a sandbox to check out, with some tests that should explain better what I'm trying to achieve.
Answer 1:
While this may not be the best approach, it's one we can build easily with tools we have around the house. (In my case, with ones I wrote in another StackOverflow answer.) I freely used Ramda functions here, as the question was tagged Ramda (disclaimer: I'm a Ramda author), but below I show an alternate version that builds the required utility functions from scratch.
This makes the assumption that your changes object will be and/or will include sparse arrays. If not, how do you plan on matching things up?
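(As a small illustration of that assumption: `Object .keys` skips the holes of a sparse array, so a hole naturally means "no change at this index".)

```javascript
// Holes in a sparse changes array mean "leave this index alone":
// Object.keys skips them, so they never generate change paths.
const deltas = [{a: 9}, , {a: 7}]       // index 1 is a hole
const touched = Object .keys (deltas)   // → ['0', '2']
```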
Here is my approach:
// Helper or utility functions
function * getPaths (o, p = []) {
  if (Object (o) !== o || Object .keys (o) .length == 0) yield p
  if (Object (o) === o)
    for (let k of Object .keys (o))
      yield * getPaths (o [k], [... p, Number .isInteger (Number (k)) ? Number (k) : k])
}

const allPaths = (o) => [... getPaths (o)]

// Main function
const applyChanges = (obj, changes) =>
  reduce ((o, p) => assocPath (p, path (p, changes), o), obj, allPaths (changes))
// Sample data
const base = [
  {a: 1, b: {c: 11, d: [{e: 100}, {e: 111}]}},
  {a: 2, b: {c: 22, d: [{e: 200}, {e: 222}]}},
  {a: 3, b: {c: 33, d: [{e: 300}, {e: 333}]}},
]

const deltas = [
  {a: 8, b: {d: [ , {e: 888}]}},
  ,
  {b: {c: 99, d: [{e: 999}, ]}},
]

// Demonstration
console .log (
  applyChanges (base, deltas)
)
<script src="//cdnjs.cloudflare.com/ajax/libs/ramda/0.27.0/ramda.js"></script>
<script> const {reduce, assocPath, path} = R </script>
`allPaths` finds the paths to all leaf nodes in an object, with array indices shown as numbers and other keys as strings. For instance,

const foo = {a: 42, b: {c: 12, d: [{e: 10}, {e: 20}]}}
allPaths (foo) //=> [["a"], ["b", "c"], ["b", "d", 0, "e"], ["b", "d", 1, "e"]]

That is just a thin wrapper around the generator function `getPaths`, which does the actual recursive heavy lifting. We could write a plain recursive version of this, but generators often make it simpler to write such traversals.
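One concrete benefit of the generator form (restating `getPaths` from above so the snippet stands alone) is that paths can be consumed lazily:

```javascript
// getPaths, copied from the answer above
function * getPaths (o, p = []) {
  if (Object (o) !== o || Object .keys (o) .length == 0) yield p
  if (Object (o) === o)
    for (let k of Object .keys (o))
      yield * getPaths (o [k], [...p, Number .isInteger (Number (k)) ? Number (k) : k])
}

// Because it's a generator, we can stop early without materializing
// every path -- potentially useful for the very large trees in the question:
const firstPath = getPaths ({a: {b: 1}, c: 2}) .next () .value  // → ['a', 'b']
```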
With a list of paths in the changes object, we can then apply the values to make a new copy of our main object. This is done in `applyChanges`, our main function. It finds the paths in the `changes` object, and then uses Ramda's `assocPath` and `reduce` to fold them into our main object.

Here we might have some inefficiencies in speed and memory, for two reasons. For speed, we're chasing down the value at each path when we call `path (p, changes)`, but we'd already done the appropriate traversal in `getPaths`. There would probably be some savings in reporting a different structure, with both path and value, out of `getPaths` and then using them in `applyChanges`. This does not affect the algorithmic complexity, just the coefficients, and I wouldn't worry about it unless it turned out to cause measurable problems. As to memory, this style of `reduce` with `assocPath` involves creating new objects on every iteration. Given that there is important structural sharing, this may not be a big deal, but for a large `changes` object, this might conceivably be an issue. (These would not be major worries for me, but I keep such sorts of things in the back of my head.)
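A sketch of that alternative structure (my own variant, not code from the answer): have the generator yield `[path, value]` pairs, so the fold never needs to call `path` at all:

```javascript
// Variant of getPaths that yields [path, value] pairs, so
// applyChanges can fold the values directly instead of re-walking
// the changes object with `path` for every leaf.
function * getEntries (o, p = []) {
  if (Object (o) !== o || Object .keys (o) .length == 0) yield [p, o]
  if (Object (o) === o)
    for (let k of Object .keys (o))
      yield * getEntries (o [k], [...p, Number .isInteger (Number (k)) ? Number (k) : k])
}

// applyChanges would then become (with assocPath as above):
// const applyChanges = (obj, changes) =>
//   [...getEntries (changes)] .reduce ((o, [p, v]) => assocPath (p, v, o), obj)
```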
Without Ramda

Because I tend to think in Ramda, I wrote the above using Ramda tools. But there are only a few functions involved. `R.reduce` can trivially be replaced in this case by `Array.prototype.reduce`, and we can write our own versions of `R.assocPath` and `R.path` fairly easily. Here is another version that uses no library:
// Utility functions
const isInt = Number .isInteger

const path = (ps = [], obj = {}) =>
  ps .reduce ((o, p) => (o || {}) [p], obj)

const assoc = (prop, val, obj) =>
  isInt (prop) && Array .isArray (obj)
    ? [...obj .slice (0, prop), val, ...obj .slice (prop + 1)]
    : {...obj, [prop]: val}

const assocPath = ([p = undefined, ...ps], val, obj) =>
  p == undefined
    ? obj
    : ps .length == 0
      ? assoc (p, val, obj)
      // build any missing intermediate node without mutating obj
      : assoc (p, assocPath (ps, val, obj [p] || (isInt (ps [0]) ? [] : {})), obj)

// Helper functions
function * getPaths (o, p = []) {
  if (Object (o) !== o || Object .keys (o) .length == 0) yield p
  if (Object (o) === o)
    for (let k of Object .keys (o))
      yield * getPaths (o [k], [...p, isInt (Number (k)) ? Number (k) : k])
}

const allPaths = (o) => [...getPaths (o)]

// Main function
const applyChanges = (obj, changes) =>
  allPaths (changes) .reduce ((o, p) => assocPath (p, path (p, changes), o), obj)
// Sample data
const base = [
  {a: 1, b: {c: 11, d: [{e: 100}, {e: 111}]}},
  {a: 2, b: {c: 22, d: [{e: 200}, {e: 222}]}},
  {a: 3, b: {c: 33, d: [{e: 300}, {e: 333}]}},
]

const deltas = [
  {a: 8, b: {d: [ , {e: 888}]}},
  ,
  {b: {c: 99, d: [{e: 999}, ]}},
]

// Demonstration
console .log (
  applyChanges (base, deltas)
)
Direct Approach
These two versions both use a fairly indirect approach to the problem. I happened to have handy these tools that let me build the main function quickly. But I'm sure there is a more direct recursive approach. If I find time, I'll look to create one.
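For what it's worth, here is my guess at what such a direct version might look like (my own sketch, not the answer author's code): walk base and changes in parallel, preferring change values at the leaves. Like the versions above, it matches by position in sparse arrays rather than by key:

```javascript
// A direct recursive merge: copy base, then overwrite only the keys
// present in changes (Object.keys skips holes in sparse arrays),
// recursing wherever both sides are objects/arrays.
const isObj = (x) => Object (x) === x

const mergeChanges = (base, changes) => {
  if (!isObj (changes)) return changes === undefined ? base : changes
  if (!isObj (base)) return changes
  const out = Array .isArray (base) ? [...base] : {...base}
  for (const k of Object .keys (changes)) {
    out [k] = mergeChanges (base [k], changes [k])
  }
  return out
}
```

With the sample `base` and `deltas` above, this produces the same result as `applyChanges`, without building the intermediate path list.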
Source: https://stackoverflow.com/questions/60333488/ramda-recursive-merge-based-on-keys-match