Thanks for the invite, Amulya.
I have never used Go or Python, so I can't comment on how Node.js stacks up against the other two. But you can certainly use Node.js to achieve this.
First, decide whether you really want to scrape other websites or whether some kind of API would do. The problem with scraping is that a page's layout can change at any time, and sites may block your IP if you make excessive requests. So if you are building something for production use, I would go with an API-based approach; if this is just for fun or internal use, scraping shouldn't be a problem. By the way, Cleartrip already has an API for checking flight prices, so you could use that.
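To make the fragility of scraping concrete, here is a minimal sketch (Node.js, stdlib only). The HTML snippet and the `price` class name are made up for illustration; in practice you would fetch the page with `https.get()` or an HTTP library, and the regex breaks the moment the site changes its markup:

```javascript
// Sample markup standing in for a fetched page (hypothetical structure).
const sampleHtml = `
  <div class="flight"><span class="price">4521</span></div>
  <div class="flight"><span class="price">5099</span></div>
`;

function extractPrices(html) {
  // Tied to the page's current class names; any redesign breaks this.
  const re = /<span class="price">([^<]+)<\/span>/g;
  const prices = [];
  let m;
  while ((m = re.exec(html)) !== null) prices.push(m[1]);
  return prices;
}

console.log(extractPrices(sampleHtml)); // [ '4521', '5099' ]
```

A real scraper would use a proper HTML parser (e.g. the cheerio module) instead of a regex, but the underlying coupling to the page layout is the same.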
Either way, you will need to do a fair amount of processing. As Jose suggested, the async module is a neat tool for parallelizing tasks. If you have a predefined set of websites to crawl, you can process each one in parallel, collect the results once all of them finish, and send the data to the client (see async.parallel()). If you go with an API instead, it should be pretty straightforward IMO.
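The fan-out/collect pattern looks roughly like this. async.parallel() from the async npm module does the same thing with callbacks; this sketch uses native Promises so it runs on stock Node.js. The site list and `fetchPrice()` are hypothetical stand-ins for real crawlers:

```javascript
// Hypothetical list of sites to crawl.
const sites = ['siteA', 'siteB', 'siteC'];

function fetchPrice(site) {
  // Stand-in for an HTTP request + parse step; resolves after a short delay.
  return new Promise((resolve) =>
    setTimeout(() => resolve({ site, price: 100 + site.length }), 10)
  );
}

async function collectPrices() {
  // All requests run concurrently; Promise.all preserves input order.
  return Promise.all(sites.map(fetchPrice));
}

collectPrices().then((results) => console.log(results));
```

With async.parallel() you would pass an array of task functions and get the collected results in the final callback; the shape of the solution is the same.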
Hope this helps. Let me know if you have any questions.