Data synchronization on Android

I'm working on a project that needs to store a large amount of data from a JSON API in Android's SQLite, around 20,000 records, so the app can work offline without problems.

The problem is that synchronizing this volume of data takes far too long in the app, around 10 to 15 minutes, and it is unreasonable to expect the user to wait that long for the download.

I have also taken a look at Couchbase Lite.

My question is: does SQLite handle a large number of records well? If so, is there a way to shorten the time it takes to load the data? And how would I save the entire JSON inside Couchbase?


Edited

Source for parsing the JSON and inserting it into the database

final DBController sett = new DBController(getBaseContext());
Calendar c = Calendar.getInstance();
SimpleDateFormat df = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

final String formattedDate = df.format(c.getTime());

RequestQueue queue = Volley.newRequestQueue(this);
JsonObjectRequest jsonObjReq = new JsonObjectRequest(Request.Method.GET, url, null, new Response.Listener<JSONObject>() {
    @Override
    public void onResponse(JSONObject response) {
        try {
            JSONObject json = response.getJSONObject("DATA");
            // org.json's keys() returns Iterator<String>, so no cast is needed
            Iterator<String> keys = json.keys();

            while (keys.hasNext()) {
                String currentDynamicKey = keys.next();
                if (!currentDynamicKey.equals("SYSTEM") && !currentDynamicKey.equals("SA3")) {
                    JSONObject currentDynamicValue = json.getJSONObject(currentDynamicKey);

                    JSONArray records = currentDynamicValue.getJSONArray("RECORD");
                    JSONArray dbo = currentDynamicValue.getJSONObject("SETTINGS").getJSONArray("DBO");
                    String createTable = dbo.getString(0);

                    createTable = createTable.replace("\u0000", "");
                    createTable = createTable.replace("\\u0000", "");

                    sett.dropTable(currentDynamicKey);
                    sett.createTable(createTable);

                    for (int i = 0; i < records.length(); i++) {
                        JSONObject record = records.getJSONObject(i);
                        // Fetch the key list once instead of calling names() on every iteration
                        JSONArray names = record.names();
                        Map<String, String> dataArray = new HashMap<String, String>(record.length());

                        for (int x = 0; x < names.length(); x++) {
                            String key = names.getString(x);
                            String value = record.getString(key);

                            dataArray.put(key, value);
                        }

                        sett.insertDataHash(currentDynamicKey, dataArray);
                    }
                }
            }
        } catch (JSONException e) {
            e.printStackTrace();
            Log.e("---[Sync error]---", e.getMessage());
        }
    }
}, new Response.ErrorListener() {
    @Override
    public void onErrorResponse(VolleyError error) {
        Log.d("---[Volley]---", "Error: " + error.getMessage());
        hidepDialog();
    }
}){
    @Override
    public Map<String, String> getHeaders() throws AuthFailureError {
        HashMap<String, String> headers = new HashMap<String, String>();
        headers.put("Accept", "application/json");
        headers.put("Content-type", "application/json");
        headers.put("Token", appToken);
        headers.put("DateTime", formattedDate);
        headers.put("Mac", deviceMac);
        headers.put("user", user);
        return headers;
    }
};
queue.add(jsonObjReq);
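A likely cause of the slowness is that each `insertDataHash` call runs in its own implicit SQLite transaction, so every record pays the cost of a disk sync. Wrapping the whole batch in a single transaction (on Android, `db.beginTransaction()` / `db.setTransactionSuccessful()` / `db.endTransaction()` around the loop) typically speeds bulk inserts up by orders of magnitude. A minimal sketch of the same principle using the `sqlite3` command-line shell; the table and column names are made up for illustration:

```shell
#!/bin/sh
# Batch many INSERTs inside one explicit transaction so SQLite syncs to
# disk once, not once per row. Table/column names are illustrative.
DB=/tmp/sync_demo.db
rm -f "$DB"

{
  echo "CREATE TABLE TABELA1 (CAMPO1 TEXT, CAMPO2 TEXT, CAMPO3 TEXT);"
  echo "BEGIN TRANSACTION;"
  i=0
  while [ $i -lt 1000 ]; do
    echo "INSERT INTO TABELA1 VALUES ('A$i', 'B$i', 'C$i');"
    i=$((i + 1))
  done
  echo "COMMIT;"
  echo "SELECT COUNT(*) FROM TABELA1;"
} | sqlite3 "$DB"
# prints 1000
```

Without the `BEGIN`/`COMMIT` pair, each INSERT would be its own transaction, which is exactly the pattern per-record insert helpers tend to produce.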

JSON

{
  "RET": true,
  "DATA": {
    "SYSTEM": {
      "SYNC": true
    },
    "TABELA1": {
      "RECORD": [
        {
          "CAMPO1": "AAAAA",
          "CAMPO2": "00000",
          "CAMPO3": "AAA00"
        },
        {
          "CAMPO1": "BBBBBB",
          "CAMPO2": "111111",
          "CAMPO3": "BBB111"
        }
      ]
    },
    "TABELA2": {
      "RECORD": [
        {
          "CAMPO1": "CCCCCC",
          "CAMPO2": "000000",
          "CAMPO3": "CCCC00"
        },
        {
          "CAMPO1": "DDDDDD",
          "CAMPO2": "222222",
          "CAMPO3": "DDD222"
        }
      ]
    }    
  }
}
  • What also matters is how you transform the JSON into SQL. I use SQLite and import 8,000 records in under a minute.

  • Dude, 20,000 records is trivial for SQLite. One question, though: if you already have the data in JSON format, why move it into SQLite when you could just as well work with the JSON itself?

  • Which part is slow: saving to the SQLite database, or downloading the data?

  • The delay is in saving to the database; the download is fast.

  • Since it is a very large amount of data, another option is to build a .db file from the JSON and ship that file inside the app. That way you can load into SQLite on demand.

  • @Scussel, post a snippet of the code that parses your JSON into objects and inserts them into the database so we can analyze it.

  • @seamusd I edited the post with the code snippet and the JSON format.

  • Have you tried Realm? Its inserts are well optimized. Comparison: http://db-engines.com/en/system/Realm%3BSQLite Documentation: https://realm.io/docs/java/latest/

  • @Marcogiovanni, can you give me some hints on how to improve my code?

  • @Scussel, you can use the LoganSquare library to turn your JSON into objects; you define a model in your app and it generates the parsing code at compile time. Then you can use an ORM, or build the SQL by hand from the model...


2 answers

2

I am currently working on a project where we have two tables, one with no fewer than 200,000 records and another with 2.5 million. That's right, millions. In my case, the user does an "initial load" before heading out for field work. This load is done over Wi-Fi as follows: I run the queries against the Oracle database, save the results to CSV files, then use SQLite's ".import" command to load the CSV files into the SQLite database (https://www.sqlite.org/cvstrac/wiki?p=ImportingFiles). After importing the data, I copy the database file to the Android device (the normal serialization and transfer process). On Android, even with this mass of data, performance is not hurt in any way.
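The desktop-side import step described above can be sketched with the `sqlite3` shell; the file paths, table, and column names below are assumptions for illustration:

```shell
#!/bin/sh
# Import a CSV export (no header row) into a SQLite table via the
# .import dot-command, then verify the row count. Paths are illustrative.
DB=/tmp/import_demo.db
CSV=/tmp/tabela1.csv
rm -f "$DB" "$CSV"

printf 'AAAAA,00000,AAA00\nBBBBBB,111111,BBB111\n' > "$CSV"

sqlite3 "$DB" <<EOF
CREATE TABLE TABELA1 (CAMPO1 TEXT, CAMPO2 TEXT, CAMPO3 TEXT);
.mode csv
.import $CSV TABELA1
SELECT COUNT(*) FROM TABELA1;
EOF
# prints 2
```

The resulting .db file can then be copied to the device and opened directly by Android's SQLite, which avoids parsing and inserting row by row on the phone.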

  • Okay, I like the idea. I changed the routine so that I download all the JSON from the API into a temp file and then work on top of that file. Can I do something similar to that?

  • And to add: my case is the same, the user does an initial data load so they can work offline.

2


I was able to solve the slowness by changing the API: instead of returning the fields and building the INSERT on the device, the API now returns the INSERT statements ready to run on the device's SQLite. Sync time dropped to 30 seconds.
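When the server ships ready-made SQL like this, it also helps if it uses SQLite's multi-row `INSERT ... VALUES (...), (...)` syntax (supported since SQLite 3.7.11), so a single statement inserts many records. A hedged sketch with the `sqlite3` shell, using the table and values from the question's sample JSON:

```shell
#!/bin/sh
# Run a server-prepared multi-row INSERT: one statement, many records.
DB=/tmp/ready_demo.db
rm -f "$DB"

sqlite3 "$DB" <<'EOF'
CREATE TABLE TABELA2 (CAMPO1 TEXT, CAMPO2 TEXT, CAMPO3 TEXT);
INSERT INTO TABELA2 VALUES
  ('CCCCCC', '000000', 'CCCC00'),
  ('DDDDDD', '222222', 'DDD222');
SELECT COUNT(*) FROM TABELA2;
EOF
# prints 2
```

Fewer statements means less parsing overhead on the device, which compounds with wrapping the whole batch in one transaction.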
