Sometimes a value changes to an equivalent one. There are several approaches to cutting off such degenerate recalculations:
👯 Every: Reaction to every action
🆔 Identity: Comparison by reference
🎭 Equality: Structural comparison
🔀 Reconcile: Structural reconciliation
👯 Every
In libraries like RxJS, every emitted value is treated as a separate event, so reactions fire even when the value has not actually changed.
777 != 777
To prevent this, you have to write extra code by hand, which is easy to forget and easy to get wrong.
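For example, in RxJS the usual remedy is to add distinctUntilChanged manually (a minimal sketch, assuming RxJS 7 style imports):

import { Subject, distinctUntilChanged } from 'rxjs'

const value$ = new Subject< number >()

value$
    .pipe( distinctUntilChanged() )  // without this, every next( 777 ) would re-trigger the reaction
    .subscribe( value => console.log( 'react to', value ) )

value$.next( 777 )  // reaction fires
value$.next( 777 )  // same value, skipped thanks to distinctUntilChanged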
🆔 Identity
Many libraries do compare values: if the state has not changed, reactions are skipped. But if it has changed, even to an equivalent value, they still fire.
777 == 777
[ 1, 2, 3 ] != [ 1, 2, 3 ]
If filtering produced a new array with the same content, we most likely do not need to run a cascade of recalculations. But manually keeping track of every such place is hardly realistic.
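A minimal illustration: two identical filter calls produce equal contents but distinct references, so an identity check still sees a change where there is none.

const source = [ 1, 2, 3, 4 ]

const prev = source.filter( n => n < 4 )  // [ 1, 2, 3 ]
const next = source.filter( n => n < 4 )  // [ 1, 2, 3 ] again, but a fresh array

console.log( prev === next )  // false: reference comparison triggers the cascade anyway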
🎭 Equality
The most advanced libraries, like $mol_wire, do a deep comparison of the new and old value. In some others, such as CellX, you can enable it in the settings.
777 == 777
[ 1, 2, 3 ] == [ 1, 2, 3 ]
[ 1, 2, 3 ] != [ 3, 2, 1 ]
This allows unnecessary calculations to be cut off as early as possible: at the moment the change is made, rather than at the moment the freshly generated VDOM is rendered into the real DOM (as often happens in React), only to discover that there is nothing in the DOM to change.
Deep comparison is, of course, a more expensive operation than simply comparing two references. However, sooner or later you will have to compare all the contents anyway, and it is much faster to do it while the data is still in one place than after it has scattered across a thousand components during rendering.
However, if the data has changed, it will travel further through the application and be deeply compared again and again, which is wasteful. Therefore, it is important to cache the result of a deep comparison for each pair of compared objects.
🔀 Reconcile
Finally, you can go even further and not just compare values deeply, but also reconcile them to preserve references to old objects when they are equivalent to new ones.
const A = { foo: 1, bar: [] }
const B = { foo: 2, bar: [] }
reconcile( A, B )
assert( B.foo === 2 )
assert( B.bar === A.bar )
As you can see, A and B here are different objects, but the bar property kept its old reference. This is good from the GC's point of view: we reuse an object that already lives in the old generation of the garbage collector, while the object from the young generation is discarded during reconciliation, which is very cheap.
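A minimal sketch of how such reconciliation might work (a hypothetical helper, not the actual $mol API): walk the new object and substitute old sub-values wherever they turn out to be equivalent.

function reconcile( prev: unknown, next: unknown ): unknown {

    if( prev === next ) return prev
    if( typeof prev !== 'object' || prev === null ) return next
    if( typeof next !== 'object' || next === null ) return next
    if( Object.getPrototypeOf( prev ) !== Object.getPrototypeOf( next ) ) return next

    const source = prev as Record< string, unknown >
    const target = next as Record< string, unknown >

    // Reuse old sub-objects wherever the new ones are equivalent to them.
    let same = Object.keys( source ).length === Object.keys( target ).length
    for( const key of Object.keys( target ) ) {
        target[ key ] = reconcile( source[ key ], target[ key ] )
        if( target[ key ] !== source[ key ] ) same = false
    }

    // If every property matched, hand back the old object itself.
    return same ? prev : next

}

Even this naive version has to mutate the new object in place, which is exactly where the problems described below begin.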
In addition, when the component that renders bar is checked again in the future, the check will be extremely fast, because the old and new values will be identical. On the other hand, if the objects still differ, re-verification will once again walk through the whole object, so caching is needed here as well. But...
Rewriting the fields of a new object with values from the old one is not always a safe operation. For example, you cannot pull such tricks with DOM elements: at best it will not work, and at worst it will break things entirely. In some cases you will get an exception when trying to modify an object; sometimes the changes will simply be ignored; and sometimes setters will run and do something unexpected.
In addition, if the objects are equivalent, we still have to compare them deeply first: if they match, we return the old one, and if not, we start mutating the new one. Alternatively, we could start mutating the new one right away, only to discover at the end that every property matched, so we should have returned the old one after all.
In short, this approach is not the fastest or most reliable, so in $mol_wire we abandoned it in favor of a deep comparison with caching.
Fast Deep Comparison
Some objects (Value Objects, for example) can be compared structurally, while others (DOM elements or business entities, for example) cannot. How do we tell them apart?
Standard types (arrays, plain objects, regular expressions, etc.) can simply be detected and compared structurally.
With custom types it is a little more complicated. By default we do not take risks and compare them by reference. But if an object declares a Symbol.toPrimitive method, we treat it as serializable, which means such objects can be compared through their serialized representations.
class Moment {

    iso8601: string
    timestamp: number
    native: Date

    // Declaring Symbol.toPrimitive marks the object as serializable,
    // so it can be compared through its primitive representation.
    [ Symbol.toPrimitive ]( mode: 'number' | 'string' | 'default' ) {
        switch( mode ) {
            case 'number': return this.timestamp
            case 'string': return this.iso8601
            case 'default': return this.iso8601
        }
    }

}
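In a hedged sketch, the comparison of two such serializable objects can then be reduced to comparing their string forms (compareSerializable is a hypothetical helper, not the library's actual API):

// Hypothetical helper: objects that declare Symbol.toPrimitive are compared
// through their serialized representation, everything else by reference.
function compareSerializable( left: object, right: object ): boolean {
    if( !( Symbol.toPrimitive in left ) || !( Symbol.toPrimitive in right ) ) return left === right
    return String( left ) === String( right )  // invokes [ Symbol.toPrimitive ]( 'string' )
}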
If we have compared two deep structures and found that they differ, it would be wasteful to compare their subtrees again and again when we pass those subtrees around later, because we have already done that work. Therefore, we cache the result of a deep comparison of each pair of objects in a double WeakMap, which guarantees the cache is automatically cleared as the objects are collected by the garbage collector.
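A minimal sketch of such a pair cache (the exact internals of $mol_compare_deep may differ): the outer WeakMap is keyed by the left object, the inner one by the right.

const cache = new WeakMap< object, WeakMap< object, boolean > >()

function cached( left: object, right: object ): boolean | undefined {
    return cache.get( left )?.get( right )
}

function remember( left: object, right: object, equal: boolean ) {
    let inner = cache.get( left )
    if( !inner ) cache.set( left, inner = new WeakMap() )
    inner.set( right, equal )
}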
After comparison, either the left or the right object is usually discarded. If the left one is discarded, the entire inner cache and all its data are freed, so when a value actually changes we keep nothing extra in memory. If the right one is discarded, only its entry is freed while the cache itself remains for future comparisons, which avoids allocating new caches when equivalent values arrive frequently.
Finally, the data may contain cyclic references. At the very least, the comparison must not fall into an infinite loop; ideally, cycles should be compared correctly as well.
For example, the following two objects are structurally equivalent:
const left = { id: 'leaf', kids: [] }
left.kids.push({ id: 'son', parent: left })
const right = { id: 'leaf', kids: [] }
right.kids.push({ id: 'son', parent: right })
It turns out that handling circular references is not hard at all once we already have the cache. First we record in it that the pair of objects is equivalent, and then dive deeper. If we encounter the same pair again, we take the value from the cache and move on. If we find differences somewhere along the way, we later correct the cache entry to record that the objects are not equivalent after all.
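Combined with the pair cache sketched above, the cycle-safe walk could look roughly like this (again, a simplified sketch rather than the actual $mol_compare_deep code):

function deepEqual( left: unknown, right: unknown ): boolean {

    if( left === right ) return true
    if( typeof left !== 'object' || left === null ) return false
    if( typeof right !== 'object' || right === null ) return false

    const known = cached( left, right )
    if( known !== undefined ) return known

    // Optimistically mark the pair as equal before diving in,
    // so meeting the same pair again short-circuits instead of looping forever.
    remember( left, right, true )

    const leftKeys = Object.keys( left )
    const rightKeys = Object.keys( right )

    let equal = leftKeys.length === rightKeys.length
    for( const key of leftKeys ) {
        if( !equal ) break
        equal = key in right && deepEqual( ( left as any )[ key ], ( right as any )[ key ] )
    }

    // Correct the cache if differences were found after all.
    remember( left, right, equal )
    return equal

}

Running it over the left and right objects above terminates and returns true, because the second visit of the same pair is answered from the cache.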
As a result, we got the $mol_compare_deep library: 1 kilobyte in size and several times faster than any alternative published on NPM.