How are 0 and 1 signals actually encoded in today's communication and storage systems? Do the two values cost the same to encode, or differently? My intuitive guess is that there must always be some difference. Examples I can think of include the following (a toy cost sketch follows the list):
- Morse code (yes, Morse doesn't sound modern, but I am not familiar with the state-of-the-art physical technologies): if dots map to 0 and dashes to 1, then a 1 takes longer to transmit than a 0, since a dash lasts three times as long as a dot.
- Radio systems: 1 may be encoded at a high frequency while 0 is at a low frequency (frequency-shift keying). So presumably they cost different amounts of energy to transmit.
- SSD writes: perhaps writing "0"s requires more power/delay than "1"s, since (as I understand it) an erased NAND flash page reads as all "1"s by default, and programming a cell to "0" requires applying electric power.
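
To pin down what I mean by "cost", here is a minimal Python sketch of the three examples as toy per-bit cost models. Everything in it is an assumption for illustration: the dot/dash/gap durations, the per-symbol energies `e0`/`e1`, and the `e_program` constant are made-up numbers, not real hardware parameters.

```python
# Toy per-bit cost models for the three examples above. All constants
# (durations, energies) are made-up illustrative numbers, not
# measurements from real hardware.

def morse_like_time(bits: str, dot=1.0, dash=3.0, gap=1.0) -> float:
    """Time-asymmetric code: map '0' to a dot (1 unit) and '1' to a dash
    (3 units), with a 1-unit gap between symbols, as in standard Morse
    timing. Longer runs of 1s take longer to send."""
    symbol_time = {"0": dot, "1": dash}
    gaps = max(len(bits) - 1, 0)
    return sum(symbol_time[b] for b in bits) + gap * gaps

def bfsk_energy(bits: str, e0=1.0, e1=1.0) -> float:
    """Binary FSK: each bit is a constant-amplitude burst at one of two
    frequencies. With equal symbol durations and amplitudes, both bits
    carry the same energy, so the default parameters make this symmetric."""
    return sum(e0 if b == "0" else e1 for b in bits)

def nand_program_energy(bits: str, e_program=1.0) -> float:
    """NAND flash, to first order: an erased page reads as all 1s, and a
    program operation pulls selected cells to 0, so only the 0s incur
    program energy in this model."""
    return e_program * bits.count("0")

if __name__ == "__main__":
    for bits in ("0000", "1111", "0101"):
        print(f"{bits}: morse_time={morse_like_time(bits):g}, "
              f"fsk_energy={bfsk_energy(bits):g}, "
              f"nand_energy={nand_program_energy(bits):g}")
```

Even this toy version shows what I am curious about: one scheme is asymmetric in time (Morse-like), one comes out symmetric in energy (FSK with a constant envelope), and one is asymmetric in which value triggers the expensive operation (flash programming).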
Time and power are two metrics that would meaningfully differentiate the two signals, but is there anything else interesting? In any case, I would really appreciate technical references (papers, URLs) that either corroborate or refute these thoughts.
Thanks!