Taskq does not rely on any module system: no bundling with Rollup, Webpack, etc., and no require() calls. That was the plan.
- Arguments: you import a script that has an IIFE with a function enclosed in it:
!function(){
  function someF(a, b, c){
  }
}();
With Taskq I can pass those a, b, c.
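A minimal sketch of what I mean (illustrative only; the queue name and the registration step are made-up, not Taskq's actual internals): the script hands its inner function to a shared queue, and the loader later invokes it with whatever arguments it has resolved.

```javascript
// Hedged sketch, not Taskq's real mechanism: a loaded script registers its
// inner function, and the loader supplies a, b, c itself.
const queue = []; // stand-in for the loader's task list

// What a loaded script might look like:
!function () {
  function someF(a, b, c) {
    return a + b + c;
  }
  queue.push(someF); // hand the inner function over to the loader
}();

// The loader side: pull the function out and pass the correct arguments.
const result = queue.map(fn => fn(1, 2, 3));
console.log(result); // → [6]
```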
- Promise.all resolves as soon as all the promises in the array are resolved; there is no concept of order there. It is also a one-dimensional array, whereas in real life you have script1 --> script2 --> script3, and if script2 has internal imports at run time in the browser, script3 should wait. Plain await won't work either, because if one of the dynamic scripts within script3 etc. keeps importing itself, you get stuck. You need a system to throttle those.
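A small illustration of that first point (the names and delays are made up): Promise.all keeps the array order in its *results*, but says nothing about when each promise settles, so it cannot express a script2-before-script3 dependency.

```javascript
// Promise.all has no concept of completion order, only of input order.
const settled = [];
const fakeScript = (name, ms) =>
  new Promise(res => setTimeout(() => { settled.push(name); res(name); }, ms));

// script1 is slowest, script2 fastest: nothing here makes script3
// wait for script2's internal imports.
const allDone = Promise.all([
  fakeScript('script1', 30),
  fakeScript('script2', 5),
  fakeScript('script3', 15),
]).then(results => {
  console.log(results); // input order:      ['script1', 'script2', 'script3']
  console.log(settled); // completion order: ['script2', 'script3', 'script1']
});
```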
- You have taken that for granted in Taskq? Yes, it is handled like that. But in your case you are not holding pointers to the exported variables anywhere, so the generator can't reach them.
- Recursive imports are cases where a script imports itself from within itself. If this happens you need to pause the main thread, allow the script to import itself while throttling this behaviour, and allow it to reach exported variables meanwhile. Plain await resolves as fast as it can; you cannot throttle that without returning another Promise, which means less economy of motion.
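A toy sketch of the detection half of this (illustrative only; `load` and the registry are hypothetical names, not Taskq's API, and the real pausing/throttling is omitted): an in-flight set notices a script importing itself and skips the re-entry instead of recursing forever.

```javascript
// Hedged sketch: guard against a script that keeps importing itself.
const loaded = new Set();
const inFlight = new Set();

function load(name, registry) {
  // A recursive or repeated import is detected and skipped, not re-run.
  if (loaded.has(name) || inFlight.has(name)) return;
  inFlight.add(name);
  registry[name](dep => load(dep, registry)); // the script body may request further imports
  inFlight.delete(name);
  loaded.add(name);
}

const calls = [];
const registry = {
  a: imp => { calls.push('a'); imp('a'); imp('b'); }, // 'a' imports itself, then 'b'
  b: imp => { calls.push('b'); },
};

load('a', registry);
console.log(calls); // → ['a', 'b'] — the self-import of 'a' did not loop
```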
- That is not very relevant; refer to point 1.
- I'm not using a module system. I do a series of Schwartzian transforms to sort the dependencies of the main thread. Then I can halt the main thread if some dynamic imports are seen.
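To make that concrete, here is a generic Schwartzian transform (decorate, sort, undecorate) applied to a made-up dependency list; the script names and the depth key are assumptions for illustration, not Taskq's actual sort.

```javascript
// Decorate-sort-undecorate over a hypothetical dependency list.
const scripts = [
  { name: 'chart', deps: ['dom', 'utils'] },
  { name: 'utils', deps: [] },
  { name: 'dom',   deps: ['utils'] },
];

const byName = Object.fromEntries(scripts.map(s => [s.name, s]));

// depth(): how deep a script's dependency chain goes (0 = no deps).
const depth = s =>
  s.deps.length === 0 ? 0 : 1 + Math.max(...s.deps.map(d => depth(byName[d])));

const loadOrder = scripts
  .map(s => [depth(s), s])      // decorate with the precomputed sort key
  .sort((a, b) => a[0] - b[0])  // sort once on the key, not on recomputed deps
  .map(([, s]) => s.name);      // undecorate

console.log(loadOrder); // → ['utils', 'dom', 'chart']
```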
- Thenables do not take care of pausing/resuming their own execution unless you return another Promise to resolve/reject, whereas in my case I can recycle a single resolver through the whole main thread, which I think is better memory-wise. Imagine you have 600 dynamic script loads to run: would you return 600 Promises, or a single object you can recycle?
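A rough sketch of the recycling idea (assumed names, not Taskq's real internals): instead of allocating one promise per task, a single continuation is threaded through every task, and each task resumes the queue by calling it.

```javascript
// One shared continuation is recycled through all tasks; no per-task
// promise allocation is needed.
function runQueue(tasks) {
  let i = 0;
  function next() {
    if (i < tasks.length) tasks[i++](next); // every task gets the same `next`
  }
  next();
}

const log = [];
runQueue([
  done => { log.push('a'); done(); },
  done => { log.push('b'); done(); },
  done => { log.push('c'); done(); }, // a task could also defer done() to pause the queue
]);
console.log(log); // → ['a', 'b', 'c']
```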
As for RxJS, I think it is an amaaazing library! But it is an entire concept of wrapping everything up in its own terminology, including Ajax calls and even browser events, so that it can provide control flow. It is an entirely separate layer. So, for example, someone could use it within tasks pushed to Taskq! In Taskq there are four things to learn: export, push, perform, load.
Whereas in my case, I wanted someone to be able to have, say, 10 async script tags in their HTML; the browser would fetch them asap, and Taskq would sort their dependencies, get their inner functions, pass the correct arguments, and execute them, pausing if, for instance, within one script I realize the client's browser is super old and I need to load one more script, then resuming the rest of those initial async scripts.
This way I could have a modular system without touching a single piece of module bundlers or polyfills, working from IE9+ at 6 kB.
In general I find bundlers quite an antipattern. Some people lose their minds when they see 100 scripts; I lose my mind when I see a single 500 kB bundle.js with tens of polyfills that I haven't looked into in detail.
For some projects I do use a bundler, but for larger projects I want to see, on both the development side and the client side, what is going on at all times (I am willing to give away some loading time here). At some point I can always combine Taskq and a bundler: Taskq would deal with the bits I push to it, and I could still have the rest of the 'heavy cannons' in my app.
Right now everyone is into concepts like bundling, tree shaking, static analysis, etc., and I think with the wide adoption of HTTP/2 this pattern 'might' revert, as things that were bottlenecks won't be anymore.
And lastly, the idea of Taskq stems not just from ordering asynchronous calls, but also from using all the asynchronous task queues available in the browser. So far we have had three: setTimeout/setInterval (which, for obvious reasons, I cannot use), requestAnimationFrame (to throttle), and Promises (to offload even more). ~17 ms throttling is a reasonable time frame for the parts of an application that do not need to load asap. I used this sort of throttling for data visualization for years, which I spoke about in a bit of detail in one of my articles in IEEE:
ieeexplore.ieee.org/document/8291800
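A minimal sketch of that ~17 ms throttling (assumed helper names, not Taskq's API): requestAnimationFrame in a browser, with a 17 ms setTimeout standing in elsewhere so the sketch also runs under Node.

```javascript
// Run at most one low-priority task per ~17 ms frame.
const frame = typeof requestAnimationFrame === 'function'
  ? requestAnimationFrame
  : cb => setTimeout(cb, 17); // non-browser fallback for this sketch

function drainThrottled(tasks, onDone) {
  (function step() {
    const task = tasks.shift();
    if (!task) return onDone && onDone();
    task();      // run one task...
    frame(step); // ...then yield until the next ~17 ms frame
  })();
}

const out = [];
drainThrottled(
  [() => out.push(1), () => out.push(2), () => out.push(3)],
  () => console.log(out) // → [1, 2, 3], spread across roughly three frames
);
```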
I talked too much... It was a great conversation, Matt, thank you!