"#define" defines a global variable?

I’ve always used a few #defines, but now this doubt has arisen: when I use a #define, am I creating a global variable? Is there any harm in using it this way? Example:

Write a program to read 10 numbers:

I put #define quantity 10 and use quantity in the for loop's parameters.
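
Something along these lines (a minimal sketch of what I mean; the variable names and input handling are just illustrative):

```c
#include <stdio.h>

#define quantity 10   /* the #define in question */

int main(void) {
    int numbers[quantity];

    /* quantity used in the for parameters */
    for (int i = 0; i < quantity; i++) {
        scanf("%d", &numbers[i]);
    }
    return 0;
}
```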

1 answer

No, not even close.

#define just says that one piece of text stands for another piece of text, nothing else. So everywhere your code contains that first text, it is replaced with the second text during the initial processing (the preprocessor phase).

There’s nothing variable about it. It isn’t even a constant, although it looks like one. So it has no scope either; that name does not actually exist in the code that is compiled. The replacement is done in all the code that follows the #define, i.e., from the point where it is found until the end of that compilation unit.
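
A small sketch to illustrate both points (the identifiers here are made up, not from the question): the name is replaced by the preprocessor before compilation proper, and it ignores the scoping rules a real variable would obey.

```c
#include <stdio.h>

void print_count(void) {
    #define quantity 10          /* written "inside" a function body... */
    printf("%d\n", quantity);    /* the compiler actually sees printf("%d\n", 10); */
}

int main(void) {
    print_count();
    /* ...yet still usable here: a macro has no block or function scope;
       the replacement applies from the line of the #define down to the
       end of this compilation unit */
    printf("%d\n", quantity);
    return 0;
}
```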

In the way you’re using it, it looks like a constant, so you have a name that is literally replaced with a value; the value only exists right there at that location in the code, and again there is no scope involved.
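
To make the difference concrete, here is a hypothetical snippet: a real variable has storage, a type and an address; the macro, after replacement, is just the literal 10 sitting in the source.

```c
#include <stdio.h>

#define quantity 10

int main(void) {
    int n = quantity;                 /* the compiler sees: int n = 10; */

    printf("%p\n", (void *)&n);       /* fine: a variable lives somewhere in memory */

    /* printf("%p\n", (void *)&quantity); */
    /* would not compile: after replacement this is &10, and a bare
       literal has no address, no storage, no scope of its own */
    return 0;
}
```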

For more modern code it is not recommended to use practically anything from the preprocessor, including #define. Not that it should never be used, but it’s best to avoid it when you can.
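
As a sketch of what you could do instead, assuming plain C here (in C++, a constexpr variable would be the usual choice): an enumeration constant gives the same convenience but with a real type and normal scoping.

```c
#include <stdio.h>

enum { quantity = 10 };   /* a true compile-time constant, no preprocessor involved */

int main(void) {
    int numbers[quantity];            /* still usable as an array size */

    for (int i = 0; i < quantity; i++) {
        if (scanf("%d", &numbers[i]) != 1) {
            return 1;                 /* bail out on bad input */
        }
    }
    for (int i = 0; i < quantity; i++) {
        printf("%d\n", numbers[i]);
    }
    return 0;
}
```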
