First, any implementation built on forEach, map, etc. will at best be a 'wrapper', as those methods are themselves neither async nor parallel.
I can foresee a path where you might approximate parallelism by subdividing your iterable (an array or whatever) into sub-parts, iterating each sub-part sequentially, and then combining the results within a promise or another object.
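A minimal sketch of that idea (the helper names `chunk` and `parallelForEach` are mine, not standard APIs) might look like the following. Note that JavaScript stays single-threaded, so the "parallelism" here only interleaves async work rather than using multiple threads:

```javascript
// Split an array into sub-arrays of at most `size` elements.
function chunk(arr, size) {
  var out = [];
  for (var i = 0; i < arr.length; i += size) {
    out.push(arr.slice(i, i + size));
  }
  return out;
}

// Iterate each sub-array sequentially, but run the sub-arrays
// concurrently by combining their promises with Promise.all.
function parallelForEach(arr, f, size) {
  var parts = chunk(arr, size || 2);
  return Promise.all(parts.map(async function (part) {
    for (var i = 0; i < part.length; i++) {
      await f(part[i], i, part);
    }
  }));
}
```

For example, `parallelForEach([1, 2, 3, 4], async function (d) { console.log(d); }, 2)` processes the two halves concurrently while preserving order within each half.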
That being said, I liked the question, so I tried a strange experiment. This is still sequential, but unique in the sense that it bounces between the microtask queue (via await) and the browser's animation-frame callback queue:
Array.prototype.asyncIterate = async function (f, i, l) {
  i = i || 0;
  // Slice on the first call so the array we were called on is never touched;
  // recursive calls already operate on the copy.
  var copy = i ? this : this.slice(),
      d = copy[i];
  l = l || copy.length;
  if (l === 0) return true; // nothing to do for an empty array
  // Wrap requestAnimationFrame in a promise so the await really waits
  // for the frame callback to invoke f before we recurse.
  await new Promise(function (resolve) {
    window.requestAnimationFrame(function () {
      f.call(copy, d, i, copy);
      resolve();
    });
  });
  if (i === l - 1) {
    return true;
  }
  return copy.asyncIterate(f, ++i, l);
};
Now you can:
var x = [1,2,3,4];
x.asyncIterate(function(d,i){console.log(i)});
//0
//1
...
The above does not mutate the array it is called on, and in the end it returns a promise that resolves to true. I did not do a performance test :) Eventually, if you write another function that slices the main array into sub-parts and calls asyncIterate on each, you might achieve the effect you want.
Also, don't modify native prototypes; I only did it here to illustrate this particular case.
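A version of the same experiment that leaves Array.prototype alone is a plain helper function. In this sketch setTimeout stands in for requestAnimationFrame, so it also runs outside a browser:

```javascript
// Same idea as asyncIterate above, but as a standalone helper instead
// of a patch on Array.prototype. setTimeout(..., 0) stands in for
// requestAnimationFrame so this sketch is not browser-only.
function asyncIterate(arr, f) {
  var copy = arr.slice(); // never mutate the caller's array
  function step(i) {
    if (i >= copy.length) return Promise.resolve(true);
    return new Promise(function (resolve) {
      setTimeout(function () {
        f.call(copy, copy[i], i, copy);
        resolve(step(i + 1)); // chain the next step
      }, 0);
    });
  }
  return step(0);
}
```

Usage is the same, minus the prototype: `asyncIterate([1, 2, 3, 4], function (d, i) { console.log(i); })` logs the indices one task at a time and finally resolves to true.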