Two main causes can break programs under aggressive optimization (such as `-O3`):
- compiler bugs
- programmer mistakes
Compiler bugs occur because compiler developers make mistakes; there isn't much the user can do except file a report and wait for the problem to be fixed. This can happen, for example, when an optimization pass is implemented that does not handle every possible input/output case.
Programmer mistakes, on the other hand, usually stem from bad programming practices and are more common. I will give an example that I have found in real code:
Let's say you have a constant variable:

```c
int const a = 5;
```

And a function that receives an `int*` pointer:

```c
void f(int* x);
```

You can call the function by passing a pointer to `a`, casting away the `const`:

```c
f((int*)&a);
```
Now consider the following code snippet:

```c
int const a = 5;
f((int*)&a);
if (a != 5)
{
    // do something
} else
{
    // do something else
}
```
It seems plausible to me that aggressive optimization will eliminate the `if` branch: since `a` was declared constant, its value should still be `5`. It turns out, however, that the function `f` can change the value of `a` through its pointer.
Without optimizations, the compiler would emit instructions to reload the value of `a` and evaluate the condition, making the program work "as expected", so the optimization would appear to break the program. In fact, though, the error here is the programmer's: casting away the `const` of a constant variable and then modifying it results in undefined behavior according to the standard.
Compiler bugs come and go... On one hand, better ways of testing compilers are developed, reducing bugs; on the other, compilers compete aggressively over ever more striking optimization techniques, and sometimes new bugs are introduced. My feeling is that compilers are getting more robust and that the `-O3` flag (or its equivalent) is safe. (I notice that new optimization techniques are generally released behind a specific flag and only later incorporated into `-O3`.)
In my personal opinion, the biggest problem is that programmers still use techniques that rely on direct memory manipulation, tricks that step outside the language's abstraction and break the premises under which the compiler performs its optimizations.
+1, having defined undefined behavior (:p) is the greatest source of problems. I have written an answer with an example.
– Kahler
+1 for answering and mentioning undefined behavior. I chose @Kahler's answer because it has an example and is easier for beginners like me to understand.
– jlHertel
@jlHertel You did well.
– Maniero