How can I ignore the usual rounding in divisions? For example:
4/3 = 1.33... ~ 1 | 5/3 = 1.66... ~ 2
Right?
I want to write a program in C where, in these cases, that rounding is not applied: the idea is to round a number up whenever it has a decimal part, regardless of whether the first decimal digit is greater than, less than, or equal to 5. Any ideas?
Code:
#include <stdio.h>

int main() {
    float n1 = 4.0;
    float n2 = 3.0;
    float result;
    result = n1 / n2; // res = 1.33 ~ 1
    // Here I want the result to be 2
    printf("Incremented result: %d\n", result);
    n1 = 5;
    result = n1 / n2; // res = 1.66 ~ 2
    // Here I want the result to be 2
    printf("Incremented result: %d\n", result);
}
In the first printf I need to increment to get the desired result; in the second I don't, since normal rounding already gives 2. What I want is: whenever the number has a decimal part, it should be rounded up, never down.
Show us what you've tried so we can understand what's wrong.
– Maniero
I put an example
– Vynstus
You do know that since "result" is a variable of type "int", it will never hold the decimal part, right?
– jsbueno