From a Data Comms textbook:
These are the approximate square waves (Fig 3.7a and 3.7b):

This is where the effect of changes in bandwidth on data rate is determined:
The problem is that the explanation/calculation makes no sense to me. I don't see anything in it that ties the bandwidth of the channel to the data rate that was eventually calculated. All they did was take the toggling rate (frequency) of the square wave and multiply it by 2 to get the data rate; nothing explains how increasing the bandwidth increased the data rate.
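Just to show what I mean by "multiply it by 2", here is my own reading of that arithmetic (the numbers below are mine, not the book's):

```python
# A square wave toggling at f Hz has one high level and one low level per
# cycle, so each full cycle can carry 2 bits (one 1 and one 0).
f = 500                  # toggling frequency of the square wave, in Hz (my example)
bits_per_cycle = 2       # one high + one low interval per cycle
data_rate = bits_per_cycle * f
print(data_rate)         # 1000 bps
```

That step I can follow; it's the jump from channel bandwidth to this number that I don't see.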
What I understood previously is that the main reason increasing bandwidth allows a higher data rate is that a higher bandwidth allows transmission of more frequency components, which in turn lets the receiver better distinguish between a 1 and a 0, ultimately allowing the data source to increase its toggling rate (i.e. data rate) since it knows the receiver can handle it. My questions are:
- What have I missed in the textbook's explanation, if anything; or is it just a poor explanation?
- Is there anything wrong with my understanding of the bandwidth-data rate relationship?
- And just to clarify: the bandwidth considered here is the bandwidth of the channel, not that of the input signal? (Though I know the channel's bandwidth does affect the signal's bandwidth by limiting it.)
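For what it's worth, here is a quick numeric sketch (my own, not from the book) of the intuition in my second paragraph: a square wave rebuilt from only its first few odd harmonics, as if a band-limited channel had cut off the rest. The more harmonics that get through, the closer the received waveform is to the ideal levels.

```python
import numpy as np

def square_approx(t, f0, n_harmonics):
    """Partial Fourier series of an ideal +/-1 square wave:
    (4/pi) * sum over the first n_harmonics odd n of sin(2*pi*n*f0*t)/n."""
    y = np.zeros_like(t)
    for k in range(n_harmonics):
        n = 2 * k + 1
        y += (4.0 / np.pi) * np.sin(2.0 * np.pi * n * f0 * t) / n
    return y

f0 = 1.0                                 # toggling frequency (Hz), arbitrary
t = (np.arange(1000) + 0.5) / 1000.0     # one full period, avoiding the jump instants
ideal = np.sign(np.sin(2.0 * np.pi * f0 * t))

# Keeping more harmonics (i.e. more channel bandwidth) shrinks the error,
# so a receiver can tell the high and low levels apart more reliably.
for n in (1, 3, 5):
    approx = square_approx(t, f0, n)
    rms = np.sqrt(np.mean((approx - ideal) ** 2))
    print(f"{n} harmonic(s): rms error = {rms:.3f}")
```

Running this, the rms error drops as each extra harmonic is included, which is what I assumed the figures 3.7a/b were illustrating.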
Thanks.

