
Why use an int variable for a pin when const int, enum or #define makes much more sense

Why do people use a variable to specify a pin number when the pin is unlikely to change throughout the execution of the code?

I often see a plain int being used for a pin definition,

int led = 13;

when the use of a const int

const int led = 13;

or enum, or #define

#define LED 13

makes much more sense.

It even appears in tutorials on the Arduino site — for example, in Blink, the first tutorial that most people run.

I read somewhere that const int is preferred over #define. Why isn't this encouraged right from the beginning, rather than allowing people to develop bad habits? I noticed it a while back, but recently it has started to irritate me, hence the question.

In terms of memory and processing, is a const int, enum, or for that matter #define, better than a plain int — i.e. does it occupy less memory, get stored in different memory (flash, EEPROM, SRAM), execute faster, or compile more quickly?


This may appear to be a duplicate of Is it better to use #define or const int for constants?, but I am asking why people use variables, and how performance improves when they don't, rather than which type of constant is better.

Greenonline