Since it launched, js-coroutines has been able to run standard jobs like parsing and stringifying JSON, or compressing data, in idle time - splitting the work over multiple frames so that everything stays smooth at 60fps. It now has the ability to build functional pipelines too:
```javascript
const process = pipe(
    parseAsync,
    mapAsync.with((v) => ({ ...v, total: v.units * v.price })),
    stringifyAsync,
    compressAsync
)
```
Here is a dummy routine that parses some JSON, works out a total for each item, stringifies the result back to JSON, and compresses it.
We can then call this pipeline with our data:
```javascript
const compressedData = await process(inputJSON)
```
The `pipe` function creates an asynchronous process that, in conjunction with the standard js-coroutines functions, runs all of the jobs collaboratively on the main thread, ensuring that there is enough time left over for animations and interaction.
We can also just insert our own calculations that we'd like to split up:
```javascript
const process = pipe(
    parseAsync,
    function* (data) {
        let i = 0
        const output = []
        for (const item of data) {
            output.push({
                ...item,
                total: item.units * item.price,
                score: complexScore(item)
            })
            if (i++ % 100 === 0) yield
        }
        return output
    },
    tap(console.log),
    stringifyAsync
)
```
Here we put a generator function into the pipeline and make sure we call `yield` now and again. This `yield` checks whether there is enough time left in the current frame to continue, and if not it schedules the resumption of the function on the next idle period.
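As a rough sketch - assuming the library's `yielding` helper, which wraps a plain function so the iteration yields every few calls (8 by default) - the same per-item work could also be expressed without hand-rolling the counter. The data and `complexScore` below are made up purely for illustration:

```javascript
import { mapAsync, yielding } from "js-coroutines"

// Hypothetical data and scoring function, standing in for the real ones
const items = Array.from({ length: 50000 }, (_, i) => ({ units: i % 9, price: 2.5 }))
const complexScore = (item) => Math.sqrt(item.units * item.price)

// yielding() wraps the plain callback so control is handed back to the
// scheduler every few items (8 by default), keeping the main thread free
const scored = await mapAsync(
    items,
    yielding((item) => ({
        ...item,
        total: item.units * item.price,
        score: complexScore(item)
    }))
)
```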
New functions
| Function | Parameters | Purpose |
|---|---|---|
| pipe | ...functions - each can be an async function, a normal function or a generator; a function takes the current value of the pipeline and processes it | Creates an async function to execute the pipeline |
| tap | function(current){...} | Adds a function to the pipeline that receives the current value but does not return its result. You can use it to cause side effects like logging or saving. The pipeline pauses execution until the function is complete. |
| branch | function(current){...} | Adds a function to the pipeline that receives the current value. You can use it to cause side effects like logging or saving. The pipeline DOES NOT pause execution, so a new continuation is formed from this point forwards. |
| repeat | function, times | Creates a function that executes the specified function a number of times |
| call | function, ...params | Enables calling another function that will take the current value of the pipeline but needs extra parameters. The parameters supplied will be appended to the current value of the pipeline. |
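To make the table concrete, here is a rough usage sketch - `applyTax` and `saveDraft` are hypothetical helpers that exist only for this example:

```javascript
import {
    pipe,
    tap,
    branch,
    call,
    parseAsync,
    stringifyAsync
} from "js-coroutines"

// Hypothetical helpers, defined only for this sketch
const applyTax = (items, rate) =>
    items.map((item) => ({ ...item, total: item.units * item.price * (1 + rate) }))
const saveDraft = async (items) => {
    /* e.g. persist the intermediate result somewhere */
}

const process = pipe(
    parseAsync,           // JSON text -> array of items
    call(applyTax, 0.2),  // current value is passed first, 0.2 is appended as an extra parameter
    tap(console.log),     // side effect - the pipeline waits for it to complete
    branch(saveDraft),    // side effect - the pipeline does NOT wait, it continues immediately
    stringifyAsync        // back to JSON text
)

const output = await process(inputJSON) // inputJSON as before
```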
Demo
Top comments (9)
Hello,
Great work! I am using the coroutines in the project I am working on. Even though most of the data manipulation is done with Redux-Saga and/or Reselect, there are various situations in which the coroutines are highly useful.
Congrats on the home page too. Very helpful explanations and examples. And certainly the "pipe" is most welcome for the functional approach.
Still, I have not found a way to keep the animation smooth while fetching data from the server. I make heavy use of "useSWR" and the animation halts. It seems there is no room left for the animation frame :). If you have any suggestions that would be great.
Anyway keep up the good work!
Hey, I've not used it, but it looks useful :) Are you letting it parse JSON for you? If you are, I'd suggest using it to get the plain text and then use `parseAsync`, as parsing can be very slow - do feel free to create an issue on GitHub and we can see what's possible.

Thanks. Actually I do some minor data manipulation on arrival (e.g. lodash/fp "keyBy" and so on), then dispatch to the Redux store. I'll dig into it a bit deeper and if needed I will make an issue on GitHub as you suggested.
Thanks again!
Ok that keyBy might be an issue if the data is huge. You/I/we could do a version of keyBy that uses reduceAsync and that would split it up over frames. Fundamentally that would probably look like this:
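A rough sketch (keyByAsync and getKey are just illustrative names, and the reduceAsync signature is assumed to mirror Array.reduce):

```javascript
import { reduceAsync } from "js-coroutines"

// Sketch only - builds a single-value lookup keyed by getKey(item),
// with reduceAsync splitting the work over idle time
const keyByAsync = (array, getKey) =>
    reduceAsync(
        array,
        (lookup, item) => {
            lookup[getKey(item)] = item
            return lookup
        },
        {}
    )

// e.g. const byId = await keyByAsync(rows, (row) => row.id)
```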
That's making a single-value lookup rather than a key-to-array map - but it sounds like this is what you are after. It's a useful function, I may clean it up and add it along with a few others like groupBy. Basically anything that takes a while benefits from going through one of the async functions, which yield every 8 iterations (by default) to check there is still time.
Yes! It has to be this. The keyBy is really needed, as the reducer was built that way for O(1) access with selectors and so on... I will try to put up something here to test, but I am pretty sure that your keyByAsync would be much more reliable than mine :) So I will stay tuned for the update. Thanks a lot!
Totally makes sense to me. Ok I'll sort it over the next few days.
Reminds me of streams, or RxJS - I don't know if there is any conceptual overlap, just my hunch :)
I've never used RxJS, but I'm beginning to see some overlaps. I guess the main thing here is the use of generator functions to split a more "commonish" pattern up over multiple frames to maintain interactivity. I'm wondering about trying to get a hook into a more serious library that has additional features for functional programming but would benefit from collaborative multi-tasking.
Article on how js-coroutines works:
60fps Javascript while you stringify, parse, process, compress and filter 100Mbs of data
Mike Talbot ・ May 25 ・ 9 min read