Let's start with a definition: a transducer is a function that takes a reducer function and returns a reducer function.
A reducer is a binary function that takes an accumulator and a value and returns an accumulator. A reducer can be executed with a reduce function (note: all functions are curried, but I've left the currying out, along with the definitions of pipe and compose, for the sake of readability - you can see them in the live demo):
const reduce = (reducer, init, data) => {
  let result = init;
  for (const item of data) {
    result = reducer(result, item);
  }
  return result;
}
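For reference, here's a minimal single-argument sketch of pipe and compose as they're used below (the real, curried helpers are in the live demo):
// pipe applies functions left to right, compose right to left
const pipe = (...fns) => x => fns.reduce((acc, fn) => fn(acc), x);
const compose = (...fns) => x => fns.reduceRight((acc, fn) => fn(acc), x);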
With reduce we can implement map and filter functions:
const mapReducer = xf => (acc, item) => [...acc, xf(item)];
const map = (xf, arr) => reduce(mapReducer(xf), [], arr);
const filterReducer = predicate => (acc, item) =>
  predicate(item) ? [...acc, item] : acc;
const filter = (predicate, arr) => reduce(filterReducer(predicate), [], arr);
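A quick check that these behave as expected:
map(n => n * 2, [1, 2, 3]); // -> [2, 4, 6]
filter(n => n % 2 === 0, [1, 2, 3, 4]); // -> [2, 4]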
As we can see, there are clear similarities between map and filter, and both of those functions work only with arrays. Another disadvantage is that when we compose the two, each step creates a temporary array that gets passed to the next function.
const even = n => n % 2 === 0;
const double = n => n * 2;
const doubleEven = pipe(filter(even), map(double));
doubleEven([1,2,3,4,5]);
// first we get [2, 4] from filter
// then final result: [4, 8]
Transducers help us solve both of those concerns: when we use a transducer, no temporary arrays are created, and we can generalize our functions to work with more than just arrays. Transducers are generally executed by passing them to a transduce function:
const transduce = (xform, reducer, init, data) =>
  reduce(xform(reducer), init, data);
const mapping = (xf, reducer) => (acc, item) => reducer(acc, xf(item));
const filtering = (predicate, reducer) => (acc, item) =>
  predicate(item) ? reducer(acc, item) : acc;
const arrReducer = (acc, item) => [...acc, item];
const transformer = compose(filtering(even), mapping(double));
const performantDoubleEven = transduce(transformer, arrReducer, []);
performantDoubleEven([1, 2, 3, 4, 5]); // -> [4, 8] with no temporary arrays created
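Because the innermost reducer decides how results are accumulated, the same transformer can build something other than an array - for example, a number (a sketch with a summing reducer):
const sumReducer = (acc, item) => acc + item;
transduce(transformer, sumReducer, 0, [1, 2, 3, 4, 5]); // -> 12 (4 + 8), still no temporary arrays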
We can even define the array map and filter in terms of transduce, because it's so composable:
const map = (xf, data) => transduce(mapping(xf), arrReducer, [], data);
const filter = (predicate, data) => transduce(filtering(predicate), arrReducer, [], data);
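These behave exactly like the earlier reduce-based versions:
map(double, [1, 2, 3]); // -> [2, 4, 6]
filter(even, [1, 2, 3, 4]); // -> [2, 4]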
Live version if you'd like to run the code: https://runkit.com/marzelin/transducers
Does my reasoning make sense?
Your understanding is correct but incomplete.
In addition to the concepts you've described, transducers can do the following:
- Support an early-exit semantic
- Support a completion semantic
- Be stateful
- Support an init value for the step function.
So for instance, an implementation in JavaScript would need to do this:
// Ensure reduce preserves early termination
let called = 0;
let updatesCalled = map(a => { called += 1; return a; });
let hasTwo = reduce(compose(take(2), updatesCalled)(append), [1,2,3]).toString();
console.assert(hasTwo === '1,2', hasTwo);
console.assert(called === 2, called);
Here, because of the call to take, the reducing operation bails out early.
It needs to be able to (optionally) call the step function with no arguments for an initial value:
// handles lack of initial value
let mapDouble = map(n => n * 2);
console.assert(reduce(mapDouble(sum), [1,2]) === 6);
Here a call to sum with no arguments returns the additive identity (zero) to seed the reduction.
In order to accomplish this, here's a helper function:
const addArities = (defaultValue, reducer) => (...args) => {
  switch (args.length) {
    case 0: return typeof defaultValue === 'function' ? defaultValue() : defaultValue;
    case 1: return args[0];
    default: return reducer(...args);
  }
};
This takes an initial value (or a function that can provide one) and a reducer, and returns a reducer that can seed its own initial value:
const sum = addArities(0, (a, b) => a + b);
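It's also how append from the first example is defined - a minimal sketch, assuming it accumulates into an array:
// hypothetical stand-in: zero args yields a fresh array, two args appends
const append = addArities(() => [], (acc, item) => [...acc, item]);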
Now both sum and append have the proper semantics. For a stateful transducer, look at take (including its helper functions):
// Minimal stand-ins for helpers the library defines elsewhere (assumed definitions):
const DONE = Symbol('done');
const NON_ITER = new TypeError('transduce requires an iterable collection');
const exists = x => x !== undefined && x !== null;
// Denotes early completion
class _Wrapped {
  constructor (val) { this[DONE] = val }
};
const isReduced = a => a instanceof _Wrapped;
// ensures reduced for bubbling
const reduced = a => a instanceof _Wrapped ? a : new _Wrapped(a);
const unWrap = a => isReduced(a) ? a[DONE] : a;
const enforceArgumentContract = f => (xform, reducer, accum, input, state) => {
  // initialization: no input means "produce an initial value"
  if (!exists(input)) return reducer();
  // early termination: bubble the wrapped accumulator up untouched
  if (isReduced(accum)) return accum;
  return f(xform, reducer, accum, input, state);
};
/*
 * factory
 *
 * Helper for creating transducers.
 *
 * Takes a step process and an initial state, and returns a function that
 * takes a transforming function and returns a transducer. The transducer
 * takes a reducing function, an optional collection, and an optional
 * initial value. If no collection is passed it returns a modified
 * reducing function; otherwise it reduces the collection.
 */
const factory = (process, initState) => xform => (reducer, coll, initValue) => {
  let state = {};
  state.value = typeof initState === 'function' ? initState() : initState;
  let step = enforceArgumentContract(process);
  let trans = (accum, input) => step(xform, reducer, accum, input, state);
  if (coll === undefined) {
    return trans; // return transducer
  } else if (typeof coll[Symbol.iterator] === 'function') {
    return unWrap(reduce(...[trans, coll, initValue].filter(exists)));
  } else {
    throw NON_ITER;
  }
};
const take = factory((n, reducer, accum, input, state) => {
  if (state.value >= n) {
    return reduced(accum);
  } else {
    state.value += 1;
  }
  return reducer(accum, input);
}, () => 0);
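A usage sketch (assuming the append stand-in above, and a reduce that, like this library's, seeds a missing initial value by calling the reducer with no arguments):
take(2)(append, [1, 2, 3, 4]); // -> [1, 2]; items past the second are never touched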
If you want to see all of this in action, I made a little library a while back. Although I ignored the interop protocol from Cognitect (I just wanted to get the concepts), I did try to implement the semantics as accurately as possible, based on Rich Hickey's talks from Strange Loop and Conj.
Source: https://stackoverflow.com/questions/52274362/is-my-understanding-of-transducers-correct