Yes, sorry for not being precise: UB applies to executions. When I said "global" I meant global over that entire execution, so if your path ends up hitting undefined behavior, the implementation can go back and logically undo the entire execution, including the parts it shared with a well-defined execution or where you'd generally expect side effects to have happened.
No, that logically doesn't make sense. The program cannot know ahead of time whether it is going through a particular execution without actually performing all the side effects along that path first (which in this case would include the fflush()). The very difference between a "program" and a "program execution" is that an execution includes the interactions of the program with the external world (as defined by the standard; all of which I loosely called "inputs" in my previous comment). The interactions basically extend prefixes of the execution: the program performs its semantics according to the abstract machine and observes the responses from the external world. You don't have an "execution" of the program reaching the point of UB until the interactions (a.k.a. side effects) up to that point have actually occurred (and the system's responses have been observed so the execution can continue).
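To make that concrete, here is a minimal sketch (my own example, not code from this thread) of a program where whether an execution contains UB depends entirely on an interaction with the external world:

    #include <stdio.h>

    int main(void) {
        int d;
        if (scanf("%d", &d) != 1)   /* interaction with the external world */
            return 1;
        /* Only the executions in which the input is 0 contain UB (division
         * by zero). Which execution we are in cannot be known without first
         * performing the read and observing the response. */
        printf("%d\n", 100 / d);
        return 0;
    }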
P.S. Have you ever seen a single example of a compiler time-traveling UB through observable behavior like this? I sure haven't. If you have, I'd love to see it, because despite all the crazy ways compilers take advantage of UB, I've never seen C/C++ compilers actually behave as if this were somehow legal (if it's even logically possible).
Can the compiler not use that to assume that (x > 4) is false, because otherwise it triggers undefined behavior? And is it hence allowed to drop the entire branch?
The only real counter-argument I could see is "fflush might terminate the program, hence we need to run the function before we know whether UB will be triggered". I suppose once you call a function that the compiler cannot analyze (e.g. system calls, FFI) the compiler may not be certain that the function doesn't contain an exit() call.
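For concreteness, the kind of code in question presumably looks something like this (a hypothetical reconstruction, not the exact snippet being discussed): the true branch performs an observable side effect and then unconditionally hits UB, and the question is whether the compiler may conclude that x <= 4 and drop the whole branch, fflush() and all.

    #include <stdio.h>

    void demo(int x) {
        if (x > 4) {
            fflush(stdout);   /* observable side effect */
            int *p = 0;
            *p = 1;           /* unconditional UB on this path */
        }
    }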
That's right, I think. If you replace the "fflush()" (which should have an argument by the way) with "f()" and declare "void f(void);" then the test and the call appear in the binary. But if you declare "__attribute__((pure)) void f(void);" then the test and the call disappear.
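A sketch of that experiment, with the surrounding code being my guess rather than the exact snippet from the thread:

    void f(void);
    /* versus: __attribute__((pure)) void f(void);
     * (recent GCC may warn about 'pure' on a void function, but it still
     *  tells the optimizer the call has no side effects) */

    void demo(int x) {
        if (x > 4) {
            f();          /* with the plain declaration, the test and this
                           * call stay in the binary; with the pure
                           * declaration, both reportedly disappear */
            int *p = 0;
            *p = 1;       /* UB on this path */
        }
    }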
It seems this is correct, but the compiler is very quick to stop considering a function 'pure'. Even a simple call to puts() is already enough for the test and the call to be kept in the binary, probably because it has side effects, such as setting the value that ferror() would later return for the stream.
I wonder if we can find an example of a function that is externally observable to a user, but that is guaranteed to finish. Then, specifically, I wonder if the compiler can prove that the undefined behavior is guaranteed to happen and so elide the branch, demonstrating 'real', observable time travel.
> I wonder if we can find an example of a function that is externally observable to a user, but that is guaranteed to finish.
I don't think the standard has such a thing, but if it did, the closest thing would probably be a write to a volatile variable. You'd have to make sure the compiler sees the variable as having a side-effect in the first place (so it would probably need external linkage).
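A hedged sketch of what such a test could look like (the names are mine): the interesting question is whether the compiler is willing to elide the volatile store, which is observable behavior and guaranteed to complete, when UB unconditionally follows it.

    /* External linkage so the compiler must treat the store as observable. */
    volatile int observable;

    void demo(int x) {
        if (x > 4) {
            observable = 1;   /* volatile write: observable, guaranteed to finish */
            int *p = 0;
            *p = 1;           /* unconditional UB immediately after */
        }
    }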
> The only real counter-argument I could see is "fflush might terminate the program, hence we need to run the function before we know if UB will be triggered".
The thing to realize is that there is no such thing as "UB will be triggered". The only thing that exists is "UB is triggered", combined with the as-if rule, which allows modifications that don't affect what the standard considers observable behavior. Or in other words, the standard defines a program according to its observable behavior. People think it's time travel because they think of the program in terms of expressions and statements rather than side effects, but if you think of the program in terms of observable behaviors rather than the lines of code executing, you see that there's no time travel.
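A trivial illustration of that view (my own example): the standard only pins down the observable behavior below, not which source lines "execute" or in what order.

    #include <stdio.h>

    int main(void) {
        int a = 2 + 2;       /* no observable behavior of its own */
        int b = a * 10;      /* the compiler need not "execute" these lines */
        printf("%d\n", b);   /* all the standard requires is that "40\n"
                              * ends up on stdout */
        return 0;
    }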
The program still contains undefined behavior. It is probably a matter of optimization ordering whether the compiler catches the undefined behavior before it elides the useless statement.
But it is certainly 'legal' for the compiler to consider that statement to invoke undefined behavior, and prune any branch that is guaranteed to reach that statement.
"However, if any such execution contains an undefined operation, this International Standard places no requirement on the implementation executing that program with that input (not even with regard to operations preceding the first undefined operation)."
The "[...] executing that program with that input [...]" part maybe could be read as making it specific to a given UB triggering execution; but I'm no language lawyer :).
True, only executions of a program that exhibit undefined behavior are affected.
But the moment it is clear a program will exhibit undefined behavior, the compiler is already allowed to do whatever it wants. So if, 20 lines below an important function call, you will certainly call a function that will certainly cause undefined behavior, the important function call can already be left out.
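A hedged illustration of that claim (the function names are hypothetical):

    void important_work(void);   /* hypothetical: assumed to have observable effects */

    void demo(void) {
        important_work();        /* every execution of demo() ...            */
        int *p = 0;
        *p = 1;                  /* ... certainly reaches this UB, so the
                                  * standard places no requirements even on
                                  * the call above; it may legally be dropped */
    }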
I agree with your sentiment, but the way I square that with what I mentioned is that the compiler can undo side effects. As far as I am aware, there is nothing special about fflush in the standard that prevents going back to the state the program was in before it happened.
(I have never actually seen a compiler act on this, but I maintain that this is just because they're either unwilling or unable to exploit it. But there's a lot of UB that compilers do not exploit, so this isn't particularly concerning to me.)