Why use an int variable for a pin, when const int, enum, or #define makes much more sense?

Why do people use a variable to specify a pin number when the pin is unlikely to change throughout the execution of the code?

Many times I see an int being used for a pin definition,

int led = 13;

when the use of a const int

const int led = 13;

or an enum, or #define

#define LED 13

makes much more sense.
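
For completeness, since an enum is mentioned above without an example, an enum version might look something like this (a minimal sketch; the name LED_PIN is only illustrative):

enum { LED_PIN = 13 };  // anonymous enum: LED_PIN is a compile-time integer constant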

This pattern even appears in tutorials on the Arduino site, for example in Blink, the first sketch that most people run.

I have read that const int is preferred over #define. Why isn't the better practice encouraged right from the outset, rather than letting people develop bad habits?
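
One commonly cited reason for that preference, sketched here with illustrative names of my own, is that a #define is blind textual substitution with no type or scope, so an unparenthesised expression can expand in surprising ways, whereas a const int is a typed value the compiler evaluates once:

#define DELAY_MS 500 + 500        // textual substitution: no type, no scope
const int delayMs = 500 + 500;    // typed constant, evaluated once by the compiler

void setup() {}

void loop() {
  delay(DELAY_MS * 2);  // expands to 500 + 500 * 2 = 1500, probably not intended
  delay(delayMs * 2);   // (500 + 500) * 2 = 2000, as expected
}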

In terms of memory and processing, is a const int, an enum, or for that matter a #define, better than a plain int? That is, does it occupy less memory, get stored in a different kind of memory (flash, EEPROM, SRAM), execute faster, or compile more quickly?
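
One way to probe this yourself, offered as a suggested experiment rather than an answer, is to compile two otherwise identical sketches and compare the flash and SRAM figures the Arduino IDE reports after each build:

int led = 13;           // Variant A: a plain global int may occupy SRAM
// const int led = 13;  // Variant B: the compiler can usually fold this away

void setup() {
  pinMode(led, OUTPUT);
}

void loop() {
  digitalWrite(led, HIGH);
}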


This may appear to be a duplicate of Is it better to use #define or const int for constants?, but I am addressing why people use variables at all, and how performance improves when they don't, rather than which type of constant is better.

Greenonline