I'm trying to determine the time complexity of adding $n$ numbers that each have a bit length of $n\log n$. I'm confused because I've sometimes seen the addition of two $n$-bit numbers described as requiring $O(n)$ time, and in other places as $O(1)$ time, especially in cases where multiplication is involved in other steps.
So I know my addition problem takes $\Omega(n)$ time, since there are $n$ numbers to process. But is the time complexity actually $O(n^2\log n)$? I'm thinking that since the bit length of each number is comparable to (in fact larger than) the number of numbers being added, the cost of each individual addition should push the total above linear. Thanks.
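To make my reasoning concrete, here is the count I have in mind, assuming the bit-complexity model in which adding two $b$-bit numbers costs $O(b)$, and noting that the running sum stays at $O(n\log n)$ bits (it picks up at most an extra $\log n$ bits over the summands):

$$\underbrace{(n-1)}_{\text{additions}} \times \underbrace{O(n\log n)}_{\text{bits per operand}} = O(n^2\log n).$$

Is this the right way to count it, or does one of the other cost models I've seen apply here?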