We have a service that receives over 10 million requests per day. The first image shows the structure of the current setup:

We have some problems now. The main part, which contains PHP and MySQL, is not scalable, and we cannot change that part at all. But we want to add a new layer in front of it.
In the next picture, you see a new model with Node.js, Redis, and MongoDB as a new layer. Every query that runs against MySQL is also run against MongoDB; in other words, we now have a MongoDB whose data is fully synced with MySQL. We want to put an auth layer in front of MySQL, built on MongoDB and Node.js, because this layer is scalable for us.
Node.js will forward only valid requests to the next part.
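To make the question concrete, here is a minimal sketch of the forwarding decision that layer would make. The upstream name and the in-memory `Set` standing in for the Redis-backed session store are assumptions for illustration only:

```typescript
// Hypothetical gateway decision for the new Node.js auth layer:
// forward only requests whose token is known; reject everything else
// before it ever reaches the legacy PHP/MySQL part.
type Decision =
  | { forward: true; upstream: string }
  | { forward: false; status: number };

// Assumed internal host name for the legacy backend.
const PHP_BACKEND = "http://legacy-php.internal";

function decide(token: string | undefined, sessions: Set<string>): Decision {
  // Missing or unknown token: reject at the edge, no load on MySQL.
  if (!token || !sessions.has(token)) return { forward: false, status: 401 };
  return { forward: true, upstream: PHP_BACKEND };
}
```

In the real layer, `sessions.has(token)` would be a Redis lookup, so the scalable front tier absorbs all invalid traffic and only authenticated requests hit the non-scalable backend.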

Is Node.js a good choice here? Or is it better to use Go instead? Or Scala, Elixir, or Rust? Is this model efficient in general? Will it work? What is your suggestion?
If the service just does what you described, I would use Go or Node.
I love Rust, but it gives you way more options to worry about, and Go is straightforward: just using the standard HTTP library already gives you a highly available web server handling on the order of 155k requests a second from a small binary. (Go's memory footprint is way bigger than Rust's, but let's go for the low-hanging fruit before you start thinking about memory models.)
I don't understand the MongoDB part unless you have a lot of writes, because reads are easy for MySQL. For heavy writes you could think about switching to TokuDB for higher insert rates... Personally, I think what you're trying sounds like a CQRS approach.
But I don't understand everything you want to achieve; maybe you're storing a lot of documents, in which case MongoDB is obviously a very good choice. But I would have to really look through your application and measure it. :)
But from a pure programming / problem-domain point of view, weighing language complexity and learning curve against output: Node is a very good choice if you want to keep it easy, Go if you want it a bit more challenging but with higher rewards, and Rust if you are really willing to walk the extra mile and use 40 MB to do what other systems do with 400 MB...
Rust also doesn't have Go's potential GC bottleneck, but 10 million requests per day (10,000,000 / 86,400 ≈ 116 per second on average) shouldn't be a problem for Go.
Matt Strom
Software Engineer, TypeScript ninja
Node is quite sufficient as a web server. It can handle a large number of requests very efficiently, especially if it is horizontally scaled (which you should do). Go and Rust (I don't know about Elixir) would certainly be faster servers, but something makes me think the Node web servers are not the bottleneck.
If the authentication process is the main bottleneck, perhaps you could use JSON Web Tokens (JWTs) to alleviate the load on the database. Using JWTs allows the client to send back a cryptographically signed token with each request that can be verified at the web server, decreasing the number of calls to the database.
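To illustrate the idea, here is a minimal HS256 JWT sign/verify sketch using only Node's built-in `crypto` module (in production you would use a maintained library and also validate claims such as expiry; the function names here are hypothetical):

```typescript
import { createHmac, timingSafeEqual } from "crypto";

// Encode a buffer as base64url without padding, per the JWT format.
const b64url = (buf: Buffer): string =>
  buf.toString("base64").replace(/\+/g, "-").replace(/\//g, "_").replace(/=+$/, "");

// Issue an HS256-signed token: header.payload.signature.
function signJwt(payload: object, secret: string): string {
  const header = b64url(Buffer.from(JSON.stringify({ alg: "HS256", typ: "JWT" })));
  const body = b64url(Buffer.from(JSON.stringify(payload)));
  const sig = b64url(createHmac("sha256", secret).update(`${header}.${body}`).digest());
  return `${header}.${body}.${sig}`;
}

// Verify the signature locally at the web server -- no database round trip.
function verifyJwt(token: string, secret: string): object | null {
  const [header, body, sig] = token.split(".");
  const expected = b64url(createHmac("sha256", secret).update(`${header}.${body}`).digest());
  if (sig.length !== expected.length ||
      !timingSafeEqual(Buffer.from(sig), Buffer.from(expected))) return null;
  return JSON.parse(Buffer.from(body, "base64").toString());
}
```

The database is only touched once, at login, to issue the token; every subsequent request is authenticated by the HMAC check alone.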
The Redis-Mongo-MySQL interconnect is a bit weird, though not necessarily wrong. Why all three? What is Mongo's purpose?
There are some other strategies for improving database throughput: sharding helps with horizontal scalability, and if your database requests are mostly reads, you can use read replicas on MySQL.
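With read replicas, the application (or a proxy) routes reads and writes to different hosts. A minimal round-robin splitter could look like this; the host names are hypothetical placeholders:

```typescript
// Hypothetical read/write splitter: SELECTs go to a replica,
// everything else (INSERT/UPDATE/DELETE/DDL) goes to the primary.
const PRIMARY = "mysql://primary.internal:3306/app";
const REPLICAS = [
  "mysql://replica-1.internal:3306/app",
  "mysql://replica-2.internal:3306/app",
];

let next = 0;
function routeQuery(sql: string): string {
  const isRead = /^\s*select\b/i.test(sql);
  if (!isRead) return PRIMARY;
  next = (next + 1) % REPLICAS.length; // round-robin across replicas
  return REPLICAS[next];
}
```

Note that replicas lag slightly behind the primary, so reads that must see a just-written row should still be sent to the primary.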