What is the best way to make multiple requests to an API?


I came to ask this question thinking it would be beneficial for the community to have a canonical way of making several HTTP requests to a particular API.

More specifically, this question is what motivated me to ask mine.

What is the best way to make multiple requests to an API? (Take into account minimizing the total time needed to get all the responses.)

To be a little more concrete: how can I get the first 10 pages of the Stack Exchange API, where https://api.stackexchange.com/2.2/answers?page=1&pagesize=10&order=desc&sort=activity&site=stackoverflow is the first page?

1 answer

Utilizing Tasks

The idea is to fire off the requests and collect them in an array of tasks, then wait for all the responses with Task.WaitAll:

using System;
using System.Net.Http;
using System.Threading.Tasks;

var tasks = new Task<string>[10];
var api = "https://api.stackexchange.com/2.2/answers?page={0}&pagesize=10&site=stackoverflow";

using (var client = new HttpClient())
{
    for (int i = 1; i < 11; ++i)
    {
        var url = string.Format(api, i);
        // Start the request and chain the reading of the body;
        // Unwrap flattens Task<Task<string>> into Task<string>.
        tasks[i - 1] = client.GetAsync(url)
            .ContinueWith(t => t.Result.Content.ReadAsStringAsync())
            .Unwrap();
    }

    // Block until every request has completed, then print the bodies.
    Task.WaitAll(tasks);
    foreach (var resposta in tasks)
    {
        Console.WriteLine(resposta.Result);
    }
}

In fact, this other question discusses this very approach.

An alternative that is (according to a quick measurement) more efficient than the one above, and possibly easier to understand, uses the await keyword:

// 'api' is the same format string as above and 'paginas' is the number of pages (10 here);
// this block has to run inside an async method, since it uses await and return.
var tasks = new List<Task<HttpResponseMessage>>();
using (var client = new HttpClient())
{
    for (int i = 1; i <= paginas; ++i)
    {
        var url = string.Format(api, i);
        // Start every request without awaiting it yet.
        tasks.Add(client.GetAsync(url));
    }

    // Await all the responses at once.
    var respostas = await Task.WhenAll(tasks);
    var todas = new List<string>();
    foreach (var resposta in respostas)
    {
        todas.Add(await resposta.Content.ReadAsStringAsync());
    }
    return todas;
}
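Since the block above only compiles inside an async method, here is a minimal, self-contained sketch of one way to wrap and call it. The method name ObterPaginasAsync, the class Programa and the Main around it are my own assumptions, not part of the original answer:

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

class Programa
{
    // Hypothetical wrapper around the await-based block above.
    static async Task<List<string>> ObterPaginasAsync(string api, int paginas)
    {
        var tasks = new List<Task<HttpResponseMessage>>();
        using (var client = new HttpClient())
        {
            for (int i = 1; i <= paginas; ++i)
            {
                var url = string.Format(api, i);
                tasks.Add(client.GetAsync(url)); // start every request without awaiting yet
            }

            var respostas = await Task.WhenAll(tasks); // wait for all responses at once
            var todas = new List<string>();
            foreach (var resposta in respostas)
            {
                todas.Add(await resposta.Content.ReadAsStringAsync());
            }
            return todas;
        }
    }

    static async Task Main()
    {
        var api = "https://api.stackexchange.com/2.2/answers?page={0}&pagesize=10&site=stackoverflow";
        var todas = await ObterPaginasAsync(api, 10);
        Console.WriteLine($"Received {todas.Count} pages");
    }
}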

But never write code like the following:

var respostas = new List<string>();
using (var client = new HttpClient())
{
    for (int i = 1; i <= paginas; ++i)
    {
        var url = string.Format(api, i);
        // Each request is awaited before the next one starts, so they run one at a time.
        var resposta = await client.GetAsync(url);
        respostas.Add(await resposta.Content.ReadAsStringAsync());
    }
}
return respostas;

With this code the requests are not made in parallel, so the whole thing takes much longer.
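To make the difference concrete, here is a rough timing sketch. ObterPaginasAsync refers to the hypothetical wrapper sketched above, ObterPaginasSequencialAsync is an equally hypothetical wrapper around the sequential loop, and the actual numbers depend entirely on the network and the API:

// Requires: using System.Diagnostics; (plus the usings from the previous sketch)

// Hypothetical wrapper around the sequential (await-inside-the-loop) version above.
static async Task<List<string>> ObterPaginasSequencialAsync(string api, int paginas)
{
    var respostas = new List<string>();
    using (var client = new HttpClient())
    {
        for (int i = 1; i <= paginas; ++i)
        {
            var resposta = await client.GetAsync(string.Format(api, i));
            respostas.Add(await resposta.Content.ReadAsStringAsync());
        }
    }
    return respostas;
}

static async Task CompararAsync(string api)
{
    var cronometro = Stopwatch.StartNew();
    await ObterPaginasSequencialAsync(api, 10); // one request at a time
    Console.WriteLine($"Sequential: {cronometro.ElapsedMilliseconds} ms");

    cronometro.Restart();
    await ObterPaginasAsync(api, 10); // all requests started up front (sketch above)
    Console.WriteLine($"Parallel:   {cronometro.ElapsedMilliseconds} ms");
}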

  • Bruno, I advise you to use the async/await keywords

  • @Tobymosque await would only be necessary if I chose to use Task.WhenAll instead of Task.WaitAll. If I use await inside the for loop it will take 10x longer to execute the code.

  • you can try using Parallel.For(0, 10, async i => { ... return await client...Unwrap() });

  • @Tobymosque I ran an experiment and it looks like .NET is smarter than I expected. Instead of taking 10x longer it takes "just" <number of cores> times longer. I’m going to post an alternative where I got a slightly shorter execution time and that uses await.

  • my concern is if the API is slow to respond, whether because of instability in your application, in the transport layer, or in the API itself. In that case await can help you maintain the scalability of your application by preventing resources from getting stuck unnecessarily.

  • @Tobymosque I’m not sure I follow you. Waiting for all the requests at once is better than waiting for them one by one, unless there are memory concerns. Transport failures or a less responsive server can occur with either 1 request or n requests, and nothing prevents that.

  • I just pointed out the reason why I mentioned async/await; in any case, your second alternative seems very good to me.

