There are a few ways to handle requests that require heavy processing. I will base my answer on the assumption that you are using Node.js as your front-end server.
First you have to decide whether the request that requires processing will be synchronous (the user waits for the server's response) or asynchronous (the user sends the request and forgets about it; at some point the server finishes processing and notifies the client, via WebSocket or long polling).
Based on the answer to that question, the architecture of your back end will be different. If you choose the synchronous option, you will need to ensure that the response does not take long to generate, so I think the best choice in that case is to build a compiled library, wrap it for Node.js, and make the call directly from Node. If you choose the asynchronous option, you can use queues to distribute the processing load to other processes and leave the Node.js server free to handle other requests.
When using a queue, the Node.js server would put messages containing the information needed for the processing into the queue, and other processes (workers) would monitor the queue, pick up those messages, and actually do the processing.
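To make the producer/worker pattern concrete, here is a minimal sketch using a plain in-memory array as a stand-in for a real queue (a real deployment would use one of the brokers mentioned below); the function names `enqueue` and `workerTick` are illustrative, not from any library:

```javascript
// In-memory stand-in for a message queue, just to show the pattern.
const queue = [];

// The Node server "produces": it pushes a message describing the job
// instead of doing the heavy work itself, so it stays free for requests.
function enqueue(job) {
  queue.push(job);
}

// A worker "consumes": it picks a message off the queue and does the
// actual processing.
function workerTick(results) {
  const job = queue.shift();
  if (job) {
    // Stand-in for heavy processing: a simple computation.
    results.push({ id: job.id, output: job.n * 2 });
  }
}

const results = [];
enqueue({ id: 1, n: 21 });
enqueue({ id: 2, n: 50 });
workerTick(results);
workerTick(results);
console.log(results); // [{ id: 1, output: 42 }, { id: 2, output: 100 }]
```

With a real broker, `enqueue` becomes a publish call and `workerTick` becomes a consumer loop running in a separate process, but the division of responsibility is the same.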
You can also use queues with synchronous requests, but you will have to make sure you always have enough workers to drain the queue quickly.
Some examples of queues are ZeroMQ, RabbitMQ, IronMQ, SQS (AWS)... There are many others; you have to see which one best meets your requirements.
Some advantages of the queued approach are:
- Decoupling between the modules
- Easy to scale
- Most queues have an HTTP interface, which makes them simple to use from any language. (ZeroMQ is an exception because it is a library rather than a standalone program, but it has wrappers for many languages, so in practice it is the same.)
- Queues cushion the impact of request peaks.
As for the synchronous vs. asynchronous part, I will go a little deeper to avoid confusion.
There are two levels at which you should choose whether a call is synchronous or asynchronous. The first level is the client's request to the server: will the client wait for the answer, or will it make the request and go do other things while the server processes it? The second level is how the server handles the operation it must perform upon receiving a request. In the synchronous case, the server can serve one request per process/thread; in the asynchronous case, the server can serve multiple requests per thread.
In the case of Node, because it runs JavaScript on a single thread, the platform forces you to use asynchronous I/O for server operations. It turns out that this model can serve more requests than the synchronous one-thread-per-request model.
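The "multiple requests per thread" point can be seen in a small runnable sketch; here `setTimeout` stands in for any asynchronous I/O (a database query, a file read, a call to a worker):

```javascript
// One JavaScript thread interleaving several simulated "requests".
const order = [];

function handleRequest(id, ioMs) {
  order.push(`start ${id}`);
  return new Promise((resolve) => {
    setTimeout(() => {           // non-blocking wait: the thread stays free
      order.push(`finish ${id}`);
      resolve();
    }, ioMs);
  });
}

// Three requests arrive "at the same time". All three start before any
// finishes, even though there is only one thread executing JavaScript.
Promise.all([
  handleRequest('A', 30),
  handleRequest('B', 10),
  handleRequest('C', 20),
]).then(() => console.log(order));
```

A thread-per-request server would need three threads to get the same overlap; Node gets it by never blocking the thread while the I/O is in flight.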
When I asked whether the request would be asynchronous or synchronous in your case, I was referring to the first level, not the second. Since you use Node, all your internal server operations that perform I/O will be asynchronous. What you should decide is whether your user will receive the result immediately, in the response to the HTTP request, or later, through one of these 3 options:
- Polling the server to check whether the operation has completed.
- Receiving a notification via websocket.
- When the page is refreshed.
If the answer helped, could you mark it as accepted?
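The first option (polling) boils down to the server keeping a status per job. A minimal sketch of that server-side bookkeeping, with illustrative names (`acceptJob`, `completeJob`, `pollJob` are not from any framework):

```javascript
// Per-job status store the polling endpoint would read from.
const jobs = new Map();

// Called when the HTTP request arrives: record the job and answer
// immediately; the heavy work goes to the queue, not this function.
function acceptJob(id) {
  jobs.set(id, { status: 'processing', result: null });
}

// Called by the worker when the heavy processing finishes.
function completeJob(id, result) {
  jobs.set(id, { status: 'done', result });
}

// What a "GET /jobs/:id/status" handler would return to the polling client.
function pollJob(id) {
  return jobs.get(id) || { status: 'unknown', result: null };
}

acceptJob('job-1');
console.log(pollJob('job-1').status); // 'processing'
completeJob('job-1', 42);
console.log(pollJob('job-1')); // { status: 'done', result: 42 }
```

The WebSocket option replaces the client's repeated `pollJob` calls with a single push from the server when `completeJob` runs.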
– Vinicius Zaramella
In the case of an asynchronous request, the process would be blocked until the task finishes and the response is sent to the client, so it should finish as soon as possible, right? Would this block only the current request, or block the entire server from receiving requests? As I said, I don't know much about Node.js, but this server locking is a consequence of Node's single-threaded nature, right? Or am I confusing things here? And in the case of asynchronous requests with long polling, what effect would that have on other requests in general?
– valterrodri
@valterrodri, I updated my answer.
– Vinicius Zaramella
First, a correction: at the beginning of my previous comment I wrote "asynchronous request" where I meant "synchronous". Second, I assume that by "client waiting" you mean the system/site user, right? In that case, yes: the client would remain on the page, in the state it was in at the time of the request. What leaves me confused is the server side. Since Node.js is single-threaded, I wanted to know an efficient way to handle requests that require heavy processing without blocking the reception of new requests. That would be the whole point of routing these "heavy" requests through Node.
– valterrodri
The request will go through Node, but as soon as it hits Node you have the heavy processing run asynchronously, via queues. Node keeps receiving requests, and when it gets the processing result it sends it to the user without blocking anything.
– Vinicius Zaramella
Since almost every function in Node is asynchronous, it will not block... unless you do a `while(true)` or something very time-consuming. But if you're using Node that way, you're doing it wrong.
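The `while(true)` warning can be demonstrated directly: synchronous CPU work is the one thing that does block Node, because a pending timer (or an incoming request) cannot be serviced until the thread is released:

```javascript
// A timer set for 0 ms cannot fire until synchronous code yields the thread.
const events = [];

setTimeout(() => events.push('timer fired'), 0);

// Synchronous busy work -- the "while(true)" case, bounded to ~50 ms here.
const start = Date.now();
while (Date.now() - start < 50) { /* hog the single thread */ }
events.push('busy loop done');

// By the time the event loop runs again, the busy loop has already finished:
setImmediate(() => console.log(events)); // ['busy loop done', 'timer fired']
```

This is exactly why the heavy processing has to be moved out of the request path, into queue workers or worker threads.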
– Vinicius Zaramella
So even when calling the external application from within Node, the processing remains asynchronous, right? Does modelling the system this way allow it to scale, say, worldwide, with billions of simultaneous requests?
– valterrodri
Correct. To scale worldwide you will need to make the server stateless as well so you can have multiple instances of Node running smoothly.
– Vinicius Zaramella
Okay, I get it. So, since all requests would go through Node anyway, the following question arises: is Node's http module reliable enough to keep an application of that scale running?
– valterrodri
Yes, but that's another question... Normally you will want to put Nginx in front of Node's HTTP server to have more control over static-file caching and to make it easier to run multiple Node processes on one instance.
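For reference, an illustrative Nginx reverse-proxy fragment for that setup; all paths, ports, and names here are examples, not a recommended production config:

```nginx
# Serve static files directly and proxy everything else to the Node
# processes on the same machine.
upstream node_app {
    server 127.0.0.1:3000;
    server 127.0.0.1:3001;   # one entry per Node process
}

server {
    listen 80;

    # Static files never touch Node.
    location /static/ {
        root /var/www/myapp;
        expires 7d;
    }

    location / {
        proxy_pass http://node_app;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```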
– Vinicius Zaramella
Okay. I think I now have a better basis to continue my research. I'll ask if anything else comes up... Thanks for the answers.
– valterrodri
@valterrodri, you are welcome. If my answer helped, could you mark it as accepted? Thank you.
– Vinicius Zaramella