What happens when I convert int to char?

An int has 4 bytes, whereas a char has only 1 byte.

When I write this definition:

int a = 1000; // 1111101000
char b = (char) a;

I believe it will keep only 1 byte of the data, but what I want to know is whether it takes the byte from the beginning or the end.

4 answers

An int is 32 bits (on typical platforms), whereas a char is 8. In this conversion you keep only bits 0 through 7 of your int.

If the int storing 1000 is:

0000 0000 0000 0000 0000 0011 1110 1000 // equals 1000 in 32 bits
^                                     ^
31                                    0

Converting to char you will only have:

1110 1000 // equals 232 in 8 bits
^       ^
7       0

Hence, an int storing 1000, when converted to char, results in 232.
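
To see this in practice, here is a minimal sketch (it uses unsigned char so the narrowing is well defined; with a plain char an out-of-range value gives an implementation-defined result, as another answer explains):

#include <stdio.h>

int main(void) {
    int a = 1000;                        /* binary 1111101000 */
    unsigned char b = (unsigned char) a; /* keeps only the low 8 bits: 1110 1000 */
    printf("%d\n", b);                   /* prints 232 */
    return 0;
}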

  • Got it, thank you very much.

  • Note that there are processors that store their data in memory in a different order, so this same operation could give different results; see more at http://pt.wikipedia.org/wiki/Extremidade_%28sort%C3%A7%C3%A3o%29

It will take the starting byte, which in human notation corresponds to the least significant digits, i.e., the final digits.

I say this because in memory the first byte actually holds the least significant data and the last byte the most significant, whereas we humans write numbers the other way around.

Example:

0x12345678 <- human representation

0x78, 0x56, 0x34, 0x12 <- byte order in the memory of an x86 computer

As @pmg points out, there are architectures where this byte order is the reverse of the one shown above.
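
As a sketch of that layout, you can inspect the bytes of an int through an unsigned char pointer (the output shown in the comment assumes a little-endian x86 machine with a 4-byte int):

#include <stdio.h>

int main(void) {
    int x = 0x12345678;
    unsigned char *p = (unsigned char *) &x; /* view the int byte by byte */
    for (size_t i = 0; i < sizeof x; i++)
        printf("0x%02x ", p[i]);             /* little-endian x86: 0x78 0x56 0x34 0x12 */
    printf("\n");
    return 0;
}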

  • Thanks for the help.

  • You’re welcome... it’s always good to help!

  • Not all computers are 'little-endian' :)

  • @pmg: It’s true, that’s an important detail. In the case of this particular operation, it should not make a difference.

  • @pmg: You could bring over here one of those answers of yours about this from SOEN. =D

Referring explicitly to the C standard (I used the N draft):

6 Language

6.3 Conversions

6.3.1 Arithmetic operands

6.3.1.3 Signed and unsigned integers

1 When a value with integer type is converted to another integer type other than _Bool, if the value can be represented by the new type, it is unchanged.

2 Otherwise, if the new type is unsigned, the value is converted by repeatedly adding or subtracting one more than the maximum value that can be represented in the new type until the value is in the range of the new type.

3 Otherwise, the new type is signed and the value cannot be represented in it; either the result is implementation-defined or an implementation-defined signal is raised.

In short:

1 - If the value can be represented in the target type, it is preserved.
2 - If the new type is unsigned, the bits are truncated (as the other answers say).
3 - If the new type is signed and the value does not fit, the result is implementation-defined.

The third point is quite important: if you convert an int to a char, you may get unexpected results from your compiler. Although virtually every implementation truncates the bits, you should not assume that this always happens.

Prefer to convert from unsigned int to unsigned char when possible.
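
A small sketch of the difference between the two cases (it assumes an 8-bit char; the signed result shown is what most two's complement implementations produce, but the standard does not guarantee it):

#include <stdio.h>

int main(void) {
    int a = 1000;
    unsigned char u = (unsigned char) a; /* rule 2: always 1000 mod 256 = 232 */
    signed char s = (signed char) a;     /* rule 3: implementation-defined */
    printf("%u %d\n", (unsigned) u, (int) s); /* typically prints: 232 -24 */
    return 0;
}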

What can happen in the conversion from int to char is a mapping based on the ASCII table.

For example: [image: ASCII table]
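
As a sketch of that mapping (it assumes an ASCII execution character set, which is typical but not required by the standard):

#include <stdio.h>

int main(void) {
    int a = 65;
    char b = (char) a; /* 65 fits in a char, so the value is unchanged */
    printf("%c\n", b); /* prints 'A' on an ASCII system */
    return 0;
}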

  • Note: char has nothing to do with the ASCII table. It just so happens that most implementations use that table to represent characters, and that those characters fit in a char. Note 2: the ASCII table goes from 0 to 127; values 128 to 255 belong to the extended table, which is not compatible with UTF-8, for example.
