When to use success: function() and .done(function()) in asynchronous requests?


20

Simply put, I can write an asynchronous request like:

$.ajax({
    url: url,
    dataType: 'json',
    type: 'GET',
    success: function (_user){
        alert(_user);
    }
});

which alerts the returned _user. Likewise, I can write:

$.ajax({
    url: url,
    dataType: 'json',
    type: 'GET'
}).done(function(_user){
    alert(_user);
});

which alerts exactly the same result. I know that success: function() in the first case is a callback, executed when the request succeeds, while fail and always run as their names suggest.

Speaking of names, it is possible to infer that the chainable .done() method runs when the request is done. I also understand that $.ajax returns a Deferred object, which is exactly where .done() lives, hence the difference in syntax between it and success.

Anyway, minimal example aside, in the vast majority of cases I do not understand when to use each approach - the callback or the chained method - since I get the expected result with both. I also know a third way to get this result, but it involves the async: false option, which falls outside the scope of the question.

So when should I use .done(), and when should I use the success: function() callback? I would like practical examples and, if possible, examples whose results differ depending on which approach is used.

  • 4

It doesn't answer this directly, but there is some explanation here: http://stackoverflow.com/questions/14754619/jquery-ajax-success-callback-function-definition/14754681#14754681

  • 3

@War I had already read that answer, and it is excellent. I even thought about getting into the merits of resolve in my question, but I think that deserves a completely separate question. Anyway, good reference =)

  • 3

I don't know of a case that specifically requires one or the other. And I think using promises, i.e. done(), makes the code easier to read and brings the language closer to OO.

  • 2

@Leonardoleal, I agree. I really like the OO-like approach, especially now with ES6 around the corner; besides, it makes it possible to separate the responsibilities of fetching and handling the data, in this case in particular.

3 answers

15

The practical results are the same. The difference is mainly in code style.

Promises (in jQuery, implemented as Deferred objects) are a widely used model for handling asynchronous operations, in JavaScript and in other languages. They make it easier to work with asynchronous operations, especially when you need to deal with the outcome of more than one operation, whether in sequence or in parallel.

Think, for example, of a chain of asynchronous operations where each operation depends on the result of the previous one, with the callback of the first operation starting the second, and so on. It quickly turns into "callback hell". The typical consequence is code that becomes less readable, forming an "arrow" pointing to the right:

[image: deeply nested callbacks forming an arrow to the right - callback hell]
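
A minimal sketch of the kind of code such a figure usually shows (the functions a through f and their results are hypothetical, each one accepting a callback):

a(function (resA) {
    b(resA, function (resB) {
        c(resB, function (resC) {
            d(resC, function (resD) {
                e(resD, function (resE) {
                    f(resE, function (resF) {
                        // six levels deep and drifting ever further to the right
                        console.log(resF);
                    });
                });
            });
        });
    });
});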

This is also very common with ifs, but since those are all synchronous it is simpler to solve, often by simply putting them in sequence instead of nesting them, or by joining multiple conditions in a single if.

Solving this problem with asynchronous operations is not so simple. You need to keep track of the state of these operations, which changes over time, and create a mechanism to record what to do depending on that state.

And that is basically what promises do, allowing you to turn the code in the figure into something like this:

a().then(b).then(c).then(d).then(e).then(f);

Another example, this time with jQuery, running a callback only when 3 ajax requests have been completed:

$.when(reqA, reqB, reqC).then(callback);
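
A slightly fuller sketch of that pattern (the URLs are hypothetical; when $.when receives $.ajax promises, each argument of the callback is an array in the form [data, textStatus, jqXHR]):

var reqA = $.ajax('/api/user');        // hypothetical endpoints
var reqB = $.ajax('/api/permissions');
var reqC = $.ajax('/api/config');

$.when(reqA, reqB, reqC).then(function (resA, resB, resC) {
    // all three succeeded; the data is the first item of each array
    console.log(resA[0], resB[0], resC[0]);
}, function (jqXHR) {
    // at least one of the three requests failed
    console.log('Failed:', jqXHR.statusText);
});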

Of course, you can create your own solutions to the readability-versus-asynchronicity problem, but promises already do so in a standardized way - that is, one that independent code modules are able to understand. Remember that asynchronous operations are not just HTTP requests. In jQuery, for example, it is also possible to get promises for other kinds of operations, such as completed animations, confirmed modal dialogs, etc. And with these promises you can express the flow of operations your code performs in a simpler way. In Node.js (that is, on the server), callback hell is even easier to run into, considering that essential operations such as database or file system access are asynchronous.
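
For example, a quick sketch of getting a promise from a jQuery animation (the #aviso element is hypothetical); .promise() returns a promise that resolves when the element's animation queue empties, so it can even be combined with ajax promises via $.when:

$('#aviso').fadeOut(400).promise().done(function () {
    // runs only after the fade-out animation has finished
    console.log('aviso hidden');
});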

  • 2

    "This is very common to occur with ifs as well, but in their case, which are all synchronous, it is simpler to solve" - excellent example

Excellent! I liked the mention of Node.js. +1

12


TL;DR: .done() is the modern way of using success, and it meets, more or less (*), the Promise specification. This means it can be chained in the jQuery style and protects execution in case of errors.


Before the concept of Promises and deferred callbacks emerged, the usual method was to pass an object with the necessary settings to the ajax method. So the callback went in that object, along with any future code that depended on the response, which had to start from inside the callback:

$.ajax({
    url: url,
    dataType: 'json',
    type: 'GET',
    success: function (_user){
        fazerUpdateTabela(_user);
        guardarUmCookie(_user);
        procurarNoutraAPIqqCoisa(_user.nif, function(dadosPessoais){
            verificarCartao(dadosPessoais.nrCartao, function(ver){
                if (!ver) fazerTudoDeNovo();
                // etc
            });
        });
        // etc
    }
});

If, for example, procurarNoutraAPIqqCoisa had yet another step to run after it completed, the chain of actions would become even more fragmented, and after a few lines it is already difficult to tell where the execution of the code comes from and where it is going.

Later, with the concept of Promises, it became possible to write code that reads like a skeleton of what will happen, with a more visual, easier-to-follow path. The example above could be adapted (with some internal adjustments in the functions) to this:

var ajax = $.ajax({
    url: url,
    dataType: 'json',
    type: 'GET'
});
ajax
    .done([guardarUmCookie, fazerUpdateTabela])
    // assuming procurarNoutraAPIqqCoisa and verificarCartao return promises
    .then(function (_user) {
        return procurarNoutraAPIqqCoisa(_user.nif);
    })
    .then(function (dadosPessoais) {
        return verificarCartao(dadosPessoais.nrCartao);
    })
    .fail(fazerTudoDeNovo);

* jQuery has had problems with the chaining of jQuery Deferreds. There is a bug/PR with a long discussion about it on GitHub. It seems that version 3 will fix this, but in my view too late, since browsers already offer a native version of the same idea.

The modern version is therefore more versatile and, as I mentioned, allows the chaining of functions that should run when the server response arrives. It accepts functions as arguments, but also arrays of functions, somewhat like Promise.all, which can be practical.
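
For instance, a small sketch of passing an array of callbacks to .done(), reusing the hypothetical function names from the example above:

var req = $.ajax({ url: url, dataType: 'json', type: 'GET' });

// both callbacks run in order, each receiving the same response data
req.done([guardarUmCookie, fazerUpdateTabela]);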

This version has a parallel API on the Deferred object for routing these chains to .done() and .fail(), with various methods, giving more flexibility to manage the application flow.

Another important aspect of Promises is that errors generated inside them do not do the same damage as before. An error (throw) inside a Promise causes it, and the rest of the chain, to be rejected, and the chain's .catch() is called. This is very useful to keep the code from crashing.
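
A minimal sketch of that behaviour with native Promises:

Promise.resolve(1)
    .then(function (n) {
        // an error thrown here does not crash the rest of the page...
        throw new Error('something failed');
    })
    .then(function () {
        // skipped: the chain is already rejected
        console.log('never runs');
    })
    .catch(function (err) {
        // ...it lands here instead
        console.log('caught:', err.message);
    });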

After everything I wrote above, and coming back to the question: which one should you use?

I prefer to use Promises and native Ajax. Nowadays this is already possible.

So the "normal" version with callbacks could be:

function _ajax(method, url, done) {
  var xhr = new XMLHttpRequest();
  xhr.open(method, url);
  xhr.onload = function () {
    // Node-style callback: error first (null on success), then the data
    done(null, xhr.response);
  };
  xhr.onerror = function () {
    // request failed: pass only the error argument
    done(xhr.response);
  };
  xhr.send();
}

// and to use it:
_ajax('GET', 'http://example.com', function (err, dados) {
  if (err) { console.log(err); }
  else console.log('A resposta é:', dados);
});

The version with Promises could then be a wrapper around this old version, giving it Promise powers:

function ajax(method, url) {
    return new Promise(function(resolve, reject) {
        _ajax(method, url, function(err, res) {
            if (err) reject(err);
            else resolve(res);
        });
    });
}

or, remaking it directly as a Promise (renamed here to ajax so it matches the usage below):

function ajax(method, url) {
    return new Promise(function (resolve, reject) {
        var xhr = new XMLHttpRequest();
        xhr.open(method, url);
        xhr.onload = function () {
            // onload also fires for HTTP errors (404, 500...), so check the status
            if (xhr.status >= 200 && xhr.status < 300) resolve(xhr.response);
            else reject(xhr);
        };
        xhr.onerror = function () {
            // network failure: reject with the xhr so err.statusText is available
            reject(xhr);
        };
        xhr.send();
    });
}

and then use with:

ajax('GET', 'http://sopt.moon')
    .then(function(dados) {
        console.log(dados);
    }).catch(function(err) {
        console.error('Oh não!!', err.statusText);
    });

With a few more adaptations it can also handle POST, PUT, etc. There is a complete polyfill for this on MDN.
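
As a rough illustration of that adaptation (a sketch only, with a hypothetical ajaxSend helper that assumes a JSON body):

function ajaxSend(method, url, body) {
    return new Promise(function (resolve, reject) {
        var xhr = new XMLHttpRequest();
        xhr.open(method, url);
        xhr.setRequestHeader('Content-Type', 'application/json');
        xhr.onload = function () {
            // treat HTTP errors (404, 500...) as rejections too
            if (xhr.status >= 200 && xhr.status < 300) resolve(xhr.response);
            else reject(xhr);
        };
        xhr.onerror = function () { reject(xhr); };
        xhr.send(body != null ? JSON.stringify(body) : null);
    });
}

// e.g. ajaxSend('POST', '/api/users', { nome: 'Ana' }).then(...)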


  • 1

I liked the vanilla example, really useful!

  • 1

I will mark this answer as the accepted one because of the pure JS example, although all of them turned out to be very good and didactic.

9

This has always been a big question for me too. I have done some research on it and I will share what I managed to absorb.

In the beginning, with $.ajax, the success callback was always used. But when the implementation of $.Deferreds arrived, which are return objects with more features, people started using done as the success callback.

Comparison of the success callback:

Before $.Deferreds:

$.ajax({
  url: 'url.php',
  type: 'POST',
  success: function(data) {
      alert("SUCESSO"); 
  }
});

After $.Deferreds:

$.ajax({
  url: 'url.php',
  type: 'POST'
}).done(function() { 
     alert("SUCESSO"); 
});

From what I could understand, the main advantage of using $.Deferreds is that you can have a single common function for different requests.
This can be shown in a very simplistic way:

function ajax_get_somethings(id) {

  return $.ajax({
    url: 'get_somethings.php',
    type: 'GET',
    data: {id: id},
    dataType: 'json'
  }).always(function() {
    // always show an alert
  })
  .fail(function() {
    // if it fails, request another id
  });

}

ajax_get_somethings(1).done(function(data) {
  // fetch something with this id and do a certain task
});

ajax_get_somethings(2).done(function(data) {
  // fetch something with this other id and do another task
});

And it does not end there; there is still a whole set of functions to be explored if necessary: https://api.jquery.com/category/deferred-object/
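
As a taste of that wider API, a small sketch using $.Deferred() directly (not tied to ajax at all):

var d = $.Deferred();

d.done(function (v) { console.log('resolved with', v); })
 .fail(function (e) { console.log('rejected with', e); });

console.log(d.state()); // "pending"
d.resolve(42);          // fires the done callbacks
console.log(d.state()); // "resolved"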

I hope I’ve helped.

Question reference: https://stackoverflow.com/questions/8840257/jquery-ajax-handling-continue-responses-success-vs-done

In addition, the separation of what is actually expected is more visible to the programmer: what to do when it works, and what to do when it fails. The only flaw I see (or missing feature) is that fail() does not receive parameters such as the status code, for example.

  • 1

This approach of parameterizing the request is very interesting. +1 for now.
