I read thousands of user records from MongoDB into memory and loop over them to send emails. I'm worried that once I have many more users, looping over them with a for/while loop may hurt performance. How do you handle that case? Is there any trick that lets you loop over arrays in a non-blocking manner?
Just have a function that processes a limited number of records, then calls itself via setImmediate to handle the next batch of records. That gives Node time between batches to handle I/O events.
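A minimal sketch of that pattern (the names `processInBatches`, `BATCH_SIZE`, and the email callback are illustrative assumptions, not a specific library API):

```javascript
const BATCH_SIZE = 100; // tune to your workload

// Process `items` in fixed-size batches, deferring the next batch with
// setImmediate so pending I/O events can run in between.
function processInBatches(items, handleItem, onDone) {
  let index = 0;
  function nextBatch() {
    const end = Math.min(index + BATCH_SIZE, items.length);
    for (; index < end; index++) {
      handleItem(items[index]); // e.g. send or queue an email for this user
    }
    if (index < items.length) {
      setImmediate(nextBatch); // yield to the event loop, then continue
    } else if (onDone) {
      onDone();
    }
  }
  nextBatch();
}
```

Usage would look like `processInBatches(users, sendEmail, () => console.log('all sent'))`. The key point is that each batch is a separate turn of the event loop, so the server stays responsive to incoming requests.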
In ES6 this can also be done with generator functions. Personally I don't like generators, because a `function*` can have many yields, while a standard function has only one return statement.
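For comparison, here is a hedged sketch of the same batching idea with a generator: each `yield` hands back one batch, and a small driver resumes the generator via setImmediate so I/O can interleave (all names here are illustrative):

```javascript
// Yield the array one fixed-size batch at a time.
function* batches(items, size) {
  for (let i = 0; i < items.length; i += size) {
    yield items.slice(i, i + size);
  }
}

// Pull one batch per event-loop turn, resuming via setImmediate.
function drive(gen, handleBatch, onDone) {
  const { value, done } = gen.next();
  if (done) {
    if (onDone) onDone();
    return;
  }
  handleBatch(value);
  setImmediate(() => drive(gen, handleBatch, onDone));
}
```

Usage: `drive(batches(users, 100), sendEmailsForBatch, () => console.log('done'))`. Functionally this is equivalent to the self-calling function above; whether the generator version reads better is a matter of taste.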
Promises are what I use for such things.
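One way to express the batching with Promises and async/await, a sketch under the assumption that each record handler is synchronous or fire-and-forget (`pause` and `processAll` are made-up names):

```javascript
// A Promise that resolves on the next setImmediate turn,
// letting queued I/O events run before we continue.
const pause = () => new Promise((resolve) => setImmediate(resolve));

async function processAll(items, handleItem, batchSize = 100) {
  for (let i = 0; i < items.length; i++) {
    handleItem(items[i]);
    // After each full batch, yield to the event loop.
    if (i % batchSize === batchSize - 1) await pause();
  }
}
```

Calling `processAll(users, sendEmail)` returns a Promise that resolves once every record has been handled, which composes cleanly with the rest of a Promise-based codebase.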
When you have to run long tasks like this, you don't want your main server unresponsive while it processes all this data. A good solution could be running these tasks on a separate worker: put the tasks on a queue and let a worker handle them in chunks. If your tasks (or jobs) keep growing, you can spawn another worker to keep up with the load while your main web server stays responsive.
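A minimal in-process sketch of the queue-plus-worker idea. In a real deployment the queue would be persistent (e.g. Redis behind a job-queue library) and the worker would run as a separate process; the `JobQueue` and `startWorker` names here are illustrative assumptions:

```javascript
// A trivial FIFO job queue held in memory.
class JobQueue {
  constructor() { this.jobs = []; }
  enqueue(job) { this.jobs.push(job); }
  dequeueChunk(n) { return this.jobs.splice(0, n); }
  get size() { return this.jobs.length; }
}

// A worker loop: pull a chunk, handle it, then yield to the event
// loop before pulling the next chunk. Multiple workers could share
// a queue the same way if the queue were external.
function startWorker(queue, handleJob, chunkSize, onIdle) {
  function tick() {
    const chunk = queue.dequeueChunk(chunkSize);
    chunk.forEach(handleJob);
    if (queue.size > 0) setImmediate(tick);
    else if (onIdle) onIdle();
  }
  setImmediate(tick);
}
```

The web server's only job is to `enqueue` work and return immediately; the slow email-sending happens off the request path, and scaling is a matter of adding workers rather than making the loop faster.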