Let's suppose we have this expression:
(i % 1000000) * 1000 + ms
where i is an always-increasing integer and ms is the millisecond part of the current time (ranging 0..999). The expression is evaluated at random intervals, and since i only ever increases, i itself always takes unique values; intuitively, though, the result of the expression can still repeat, because i is reduced modulo 1,000,000. How can I show this in an acceptable form? Is there a way to express the probability that the next call will produce a duplicate? A small simulation sketch of what I mean follows.
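
For illustration, here is a minimal sketch (Python, chosen only for the demo) that simulates the calls; the call count and the uniformly random ms are assumptions made just for this example:

    import random

    N_CALLS = 2_500_000          # assumption: enough calls that i wraps past 1,000,000
    seen = set()
    duplicates = 0

    for i in range(N_CALLS):
        ms = random.randrange(1000)          # assumed uniform millisecond part, 0..999
        value = (i % 1_000_000) * 1000 + ms  # the expression in question
        if value in seen:
            duplicates += 1                  # this value was already produced earlier
        else:
            seen.add(value)

    print(f"calls: {N_CALLS}, duplicates: {duplicates}")

Under these assumptions, no duplicate can occur before i wraps past 1,000,000; after that, each call revisits a residue of i seen before, and the value repeats only if ms also matches, which is the kind of probability statement I am trying to pin down.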