I believe quantum computers are a form of analog computer, if an analog computer is defined as one that computes with continuous variables.
There is indeed continuous-variable quantum computation, but usually the point of quantum computers is that the quanta have quantized states, i.e. not continuous ones. So let's start with: this, in its generality, is wrong, but that doesn't invalidate the question.
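To make the distinction concrete (standard textbook notation, not anything specific to the question): a qubit has continuous amplitudes, but measuring it yields only discrete outcomes,

$$|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,$$

with a computational-basis measurement returning $0$ or $1$ with probabilities $|\alpha|^2$ and $|\beta|^2$. A genuinely continuous-variable mode, by contrast, has states labelled by a real quadrature value $x \in \mathbb{R}$.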
However, their errors are modeled as bit errors.
No, not usually; that's definitely wrong. (The result of a quantum computation might be something you can interpret as bits, but that is not how errors are modelled, unless you treat the quantum computer as a black box with bits coming out, in which case all you can sensibly do is assign error probabilities to those bits.) Physically, errors on qubits are continuous, e.g. small unwanted rotations or decoherence, and are modelled as quantum channels; it is only error correction that effectively discretizes them into a finite set of Pauli errors.
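To see why a classical bit-error model falls short, here is a minimal sketch (assuming NumPy; the rotation angle and the initial state are illustrative choices, not from the question) of a continuous over-rotation error on one qubit:

```python
# Minimal sketch: a continuous qubit error is not simply a probabilistic bit flip.
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])  # Pauli-X, the quantum analogue of a bit flip

theta = 0.1  # small error angle -- any real value is possible (continuous!)
error = np.cos(theta / 2) * I - 1j * np.sin(theta / 2) * X  # R_x(theta) rotation

psi = np.array([1, 0])   # qubit prepared in |0>
corrupted = error @ psi  # state after the continuous error

# The result is a superposition, not "|0>, flipped with some probability":
print(corrupted)               # [cos(theta/2), -1j*sin(theta/2)]
# Only a measurement (e.g. a syndrome measurement in an error-correcting code)
# projects this onto the discrete alternatives "no error" vs. "bit flip":
print(np.abs(corrupted) ** 2)  # [cos^2(theta/2), sin^2(theta/2)]
```

This is exactly what quantum error correction exploits: syndrome measurements collapse any such continuous error onto a discrete, correctable set of Pauli errors, which is why the corrected logic can look bit-like even though the underlying errors are not.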
I am not familiar with analog computers and have found no reference that systematically discusses how errors may be modeled in an analog computer.
Wikipedia would be your friend; the analog computer article is well sourced.
Anyway, you don't care about classical analog computers but about quantum computers.
So, read an introductory book on quantum computing. Errors and error correction are so central to quantum computers that any textbook covers them early on. The go-to book is "Mike & Ike", formally:
M. A. Nielsen and I. L. Chuang, Quantum Computation and Quantum Information: 10th Anniversary Edition. Cambridge University Press, 2010.
The introductory chapter can be downloaded from one of the authors' websites and would have resolved quite a few of these misconceptions.