How to transform an array from one dimension to two in Java?

I have a one-dimensional array of integers, like this:

int[] a = new int[]{1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12};

And I want to turn it into a two-dimensional array, for example:

[1, 2, 3],
[4, 5, 6],
[7, 8, 9],
[10, 11, 12]

How do I do this transformation in Java?

1 answer

Note that there are several ways to turn a one-dimensional array into a two-dimensional one. Here are some alternative representations of the example:

[1, 2],       [1, 2, 3, 4],       [1, 2, 3, 4, 5, 6],
[3, 4],       [5, 6, 7, 8],       [7, 8, 9, 10, 11, 12]
[5, 6],       [9, 10, 11, 12]
[7, 8],
[9, 10],
[11, 12]

These examples represent 6x2, 3x4, and 2x6 matrices, respectively. It is therefore necessary to know the desired width; the height can then be calculated from it and the length of the one-dimensional array.

static int[][] dimensionar_uma_em_duas(int[] matriz, int largura) {
    // The height follows from the total length and the desired width
    int altura = matriz.length / largura;
    int[][] ret = new int[altura][largura];
    for (int i = 0; i < matriz.length; i++) {
        // Integer division gives the row, the remainder gives the column
        ret[i / largura][i % largura] = matriz[i];
    }
    return ret;
}

The method dimensionar_uma_em_duas gets the height of the array by dividing its total length by the width, which is an input parameter. It then traverses the one-dimensional array in a single pass, using integer division and the modulo operator to place each element at the correct position in the new two-dimensional array.
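For instance, calling the method with the array from the question and a width of 3 reproduces the desired 4x3 matrix (a minimal, self-contained sketch; the class name Exemplo and the main method are just for demonstration):

```java
import java.util.Arrays;

public class Exemplo {
    static int[][] dimensionar_uma_em_duas(int[] matriz, int largura) {
        int altura = matriz.length / largura;
        int[][] ret = new int[altura][largura];
        for (int i = 0; i < matriz.length; i++) {
            ret[i / largura][i % largura] = matriz[i];
        }
        return ret;
    }

    public static void main(String[] args) {
        int[] a = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12};
        int[][] b = dimensionar_uma_em_duas(a, 3);
        // prints [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10, 11, 12]]
        System.out.println(Arrays.deepToString(b));
    }
}
```

Note that the method assumes the array length is an exact multiple of the width; any leftover elements beyond `altura * largura` would be silently dropped.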

For an array with height 4 and width 3, and the input of the first example, the calculations (i / largura) and (i % largura) yield the positions (0,0); (0,1); (0,2); (1,0); (1,1); (1,2); (2,0); (2,1); (2,2); (3,0); (3,1) and (3,2), respectively.
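That index arithmetic can be checked with a small loop (a hypothetical snippet, printing the target row and column for each index with largura = 3):

```java
public class Posicoes {
    public static void main(String[] args) {
        int largura = 3;
        for (int i = 0; i < 12; i++) {
            // Row = integer division, column = remainder
            System.out.println(i + " -> (" + (i / largura) + ", " + (i % largura) + ")");
        }
        // prints: 0 -> (0, 0), 1 -> (0, 1), 2 -> (0, 2), 3 -> (1, 0), ... 11 -> (3, 2)
    }
}
```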
