Transducers - higher order functions on steroids
How to leverage the higher order function pattern to solve expensive operations.
Oct 03, 2020
#javascript

We humans are completely domain blind. Even if we recognise a pattern in one domain, we usually fail to apply it in another. That is a huge problem with learning. For example, we know how to use the higher-order function pattern to enhance a React component or add side effects to otherwise pure functions, yet we might fail to recognise how to optimise operations on large datasets. I am definitely one of these people.

# Transducers

Transducers are meant to solve the problem of iterating through a collection many times. Imagine you have an array. We will talk mostly about arrays, but you can substitute other data types.

```js
const array = [1, 2];
array.map(x => x + 1); // [2, 3]
```

Pretty easy. We map over an array and return a new one. Our array contains only two elements, so no problem. Let's complicate this. Assume we not only need the increment-by-one functionality but also a lot of other logic.

```js
const array = [1, 2];
const increment = (x) => x + 1;

array
  .map(increment)
  .map(increment)
  .map(increment)
  .map(increment)
  .map(increment);
```

Due to pure laziness I used increment everywhere, but imagine they are different functions. This time a new array is created on every map, and every map iterates through the whole array again.
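To make the cost visible, here is a small sketch; the `countingIncrement` wrapper and the `iterations` counter are my additions, not part of the original example:

```js
const array = [1, 2];
const increment = (x) => x + 1;

let iterations = 0;
const countingIncrement = (x) => {
  iterations += 1; // count every callback invocation
  return increment(x);
};

// Five chained maps: five intermediate arrays, five full passes.
const result = array
  .map(countingIncrement)
  .map(countingIncrement)
  .map(countingIncrement)
  .map(countingIncrement)
  .map(countingIncrement);

// 2 elements * 5 passes = 10 invocations.
console.log(result, iterations); // [6, 7] 10
```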

If our array is big, this can start to be a problem. No problem: every increment, or any other function we might pass to map as a callback, has the same signature:

```ts
type MapCallback<A, B> = (a: A) => B;
```

That makes them composable. In other words, the code below does the same as the code above, minus all the new array creations and iterations.

```js
const array = [1, 2];
const increment = (x) => x + 1;

array.map(compose(increment, increment, increment, increment, increment));
```

If you are not familiar with compose, it is basically a function that feeds the result of the previous function to the next one, evaluating from right to left.

```js
const compose = (...functions) => (x) =>
  functions.reduceRight((result, fn) => fn(result), x);

const increment = (x) => x + 1;
const double = (x) => x * 2;

const resultWithCompose = compose(increment, double)(1); // 3
const resultWithClassicalCompose = increment(double(1)); // 3
```

Here resultWithCompose and resultWithClassicalCompose produce the same result. With compose, calling it with increment and double first gives us back a function. Then we call that function with 1. It will first double it, and then increment it.

The same goes for resultWithClassicalCompose; the order is just more visible there.

The reason we use compose instead of nesting calls by hand is scalability. It would be tedious to write 500 functions like that.

So far so good. But what happens when we put a filter into the game?

```js
const array = [1, 2];
const increment = (x) => x + 1;

array
  .map(increment)
  .map(increment)
  .filter(x => x > 3)
  .map(increment)
  .map(increment);
```

We cannot compose this. Why? Because filter has a different signature than map.

```ts
type MapCallback<A, B> = (a: A) => B;
type FilterCallback<A> = (a: A) => boolean;
```

Filter expects a boolean as a return value. We are in a bit of trouble. How can we solve this? By making map and filter have the same signature. How? With reducers.
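As a quick preview of why reducers unify the two, here is a hand-rolled sketch that expresses a map step and a filter step as one reducer in a single pass; the combined reducer and the sample data are my own illustration:

```js
const array = [1, 2, 3, 4];

// Both operations expressed as a single reducer: one pass, one result array.
const mapAndFilter = array.reduce((sum, item) => {
  const incremented = item + 1; // the "map" step
  if (incremented > 3) {        // the "filter" step
    return sum.concat(incremented);
  }
  return sum;
}, []);

console.log(mapAndFilter); // [4, 5]
```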

## Reducers to the rescue

Ok, we have seen the problem. We use transducers with large datasets to optimise performance. That means we are never ever going to use them for arrays with a length of, for example, 100. Never.

Scaling is very important. What works in your family does not work in a corporate environment. You probably never paid your Mum for making you dinner. Neither would you accept a job without a contract, on pure trust. Scaling.

Just like that, we only reap the benefits of transducers with large datasets. On small ones the speed benefit is negligible, not to mention the complexity overhead. Transducers are not the easiest to comprehend even if you know functional programming techniques.
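If you want to check the performance claim yourself, here is a rough benchmark sketch; the `SIZE` constant, the timer labels, and the three-map workload are arbitrary choices of mine, and the numbers will vary by machine:

```js
const SIZE = 1_000_000; // arbitrary; adjust to your machine
const bigArray = Array.from({ length: SIZE }, (_, i) => i);
const increment = (x) => x + 1;
const compose = (...fns) => (x) =>
  fns.reduceRight((result, fn) => fn(result), x);

console.time("three chained maps");
const chained = bigArray.map(increment).map(increment).map(increment);
console.timeEnd("three chained maps");

console.time("one composed map");
const composed = bigArray.map(compose(increment, increment, increment));
console.timeEnd("one composed map");
```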

## What are transducers?

That is all good, but what are transducers? Transducers are higher-order reducers. Easy. Is it? No.

## Reducify map.

Map applies a transformation function to an array and gives back a new one. We can try to recreate it using reduce.

```js
const array = [1, 2];
const increment = (x) => x + 1;

const map = (mapFn) => {
  return (sum, item) => {
    return sum.concat(mapFn(item));
  };
};

const reducerFn = map(increment);
array.reduce(reducerFn, []); // [2, 3]
```

So, this map function takes a callback, just as the normal one does, but returns a function that is a reducer. It is a reducer because it receives a sum and an item, and gives back a new sum. In this case the sum is an array.

```ts
type ReducerFunction<Sum, Item> = (sum: Sum, item: Item) => Sum;
```

So calling map with a callback gives us back a reducer function. That reducer can be passed to reduce, along with an empty array as the second argument.

At this point, this works just fine.

## Generalizing map

Map and reduce should work on other collections, so we can generalize the map function.

```js
const array = [1, 2];
const increment = (x) => x + 1;

const map = (mapFn) => (aggregator) => {
  return (sum, item) => {
    return aggregator(sum, mapFn(item));
  };
};

const reducerFn = map(increment)((sum, item) => sum.concat(item));
array.reduce(reducerFn, []); // [2, 3]
```

What happened? We generalized our map function a bit further through closures and partial application.

The sum.concat is only viable for arrays, so we supply it from the outside.

Notice this: the aggregator is a reducer function because it receives a sum and an item and returns a new sum. Ok. But our map function not only takes a reducer, it also returns one. The returned function has the same signature. That makes it a higher-order reducer.

Also, that makes it a perfect candidate for composition.

```js
const compose = (...functions) => (x) =>
  functions.reduceRight((result, fn) => fn(result), x);

const array = [1, 2];
const increment = (x) => x + 1;
const double = (x) => x * 2;

const map = (mapFn) => (aggregator) => {
  return (sum, item) => {
    return aggregator(sum, mapFn(item));
  };
};

const incrementCallback = map(increment);
const doubleCallback = map(double);

const reducerFn = compose(incrementCallback, doubleCallback)((sum, item) => sum.concat(item));
array.reduce(reducerFn, []); // [4, 6]
```

That is the most confusing step, I guess. Both incrementCallback and doubleCallback are transducers (higher-order reducers) because they take a reducer and return a new reducer.

They do not take the values of sum and item. What we are piping through compose here are not normal values but functions. That is the mindf**k here.

Functions are first-class citizens in js. Therefore we can pass functions as arguments (as we did with (sum, item) => sum.concat(item)) and return them.

Let's dissect this.

```js
const reducerFn = compose(incrementCallback, doubleCallback)((sum, item) => sum.concat(item));

// Imagine this is doubleCallback.
// We pass ((sum, item) => sum.concat(item)) as the argument. That means the
// aggregator here equals ((sum, item) => sum.concat(item)).
// If we follow the pipeline, doubleCallback will return a function that will
// be passed to incrementCallback. That function is:
// (sum, item) => {
//   return aggregator(sum, mapFn(item));
// };
const map = (mapFn) => (aggregator) => {
  return (sum, item) => {
    // we can replace aggregator with ((sum, item) => sum.concat(item)):
    // return sum.concat(mapFn(item));
    return aggregator(sum, mapFn(item));
  };
};

// Imagine this is incrementCallback.
// Here the aggregator is the return value of the previous function, doubleCallback:
// (sum, item) => {
//   return aggregator(sum, mapFn(item));
// };
// So here the aggregator is not the original ((sum, item) => sum.concat(item)).
```

This is a bit hard to swallow. The main takeaway is that we are piping functions through on each step, so the next function receives the previous one as an argument and can call it.
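To see the order concretely, here is a runnable sketch using the same compose and map definitions as above. Note the classic transducer gotcha: because increment's reducer ends up outermost, each item is incremented first and doubled second, even though compose evaluates right to left:

```js
const compose = (...fns) => (x) =>
  fns.reduceRight((result, fn) => fn(result), x);

const map = (mapFn) => (aggregator) => (sum, item) =>
  aggregator(sum, mapFn(item));

const increment = (x) => x + 1;
const double = (x) => x * 2;
const concat = (sum, item) => sum.concat(item);

// map(double) receives concat first, then map(increment) wraps that reducer,
// so data flows increment -> double.
const reducerFn = compose(map(increment), map(double))(concat);

const traced = [1, 2].reduce(reducerFn, []);
console.log(traced); // [4, 6], i.e. (1+1)*2 and (2+1)*2
```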

## Filter

From here, it is just one tiny step to our solution. We need to elevate filter to the level of reducers.

```js
const compose = (...functions) => (x) =>
  functions.reduceRight((result, fn) => fn(result), x);

const array = [1, 2];
const increment = (x) => x + 1;
const double = (x) => x * 2;

const map = (mapFn) => (aggregator) => {
  return (sum, item) => {
    return aggregator(sum, mapFn(item));
  };
};

const filter = (predicate) => (aggregator) => {
  return (sum, item) => {
    if (predicate(item)) {
      return aggregator(sum, item);
    }
    return sum;
  };
};

const incrementCallback = map(increment);
const doubleCallback = map(double);
const filterTwo = filter((x) => x === 2);

const reducerFn = compose(filterTwo, incrementCallback, doubleCallback)((sum, item) => sum.concat(item));
array.reduce(reducerFn, []); // [6]
```

By creating a reducer-based implementation we have the same reducer => reducer interface. It works a bit differently than map, but that is expected. The magic is in compose again: filterTwo filters out everything except 2. The awesome thing is that the maps will not even run on anything else, only on 2. No unnecessary iteration, no new array creation.
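One way to convince yourself that the maps skip filtered-out items is to instrument the map callback; the `countedIncrement` wrapper and the `mapCalls` counter are my additions for demonstration:

```js
const compose = (...fns) => (x) =>
  fns.reduceRight((result, fn) => fn(result), x);

const map = (mapFn) => (aggregator) => (sum, item) =>
  aggregator(sum, mapFn(item));

const filter = (predicate) => (aggregator) => (sum, item) =>
  predicate(item) ? aggregator(sum, item) : sum;

let mapCalls = 0;
const countedIncrement = (x) => {
  mapCalls += 1; // count how often the map step actually runs
  return x + 1;
};

const reducerFn = compose(
  filter((x) => x === 2), // outermost: runs first, drops everything but 2
  map(countedIncrement),
  map((x) => x * 2)
)((sum, item) => sum.concat(item));

const filtered = [1, 2, 3, 4].reduce(reducerFn, []);
console.log(filtered, mapCalls); // [6] 1 — increment ran only for the item 2
```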

## Conclusion

Transducers are awesome, if you need them. Even if you don't use them, it is good to see what you can accomplish with function composition. If you like this pattern, check out Redux middleware. It is the same idea.
