Is this code "efficient" as the question asks? Normally in computing we use "efficient" to mean algorithmic efficiency: that is, the rate at which the resources used by the program grow as a function of the input, usually expressed in big-O notation.

The question says, "Python is not efficient", but according to the usual view of efficiency, the programming language does not matter: efficiency is a property of the algorithm, not of the language it's implemented in.

Now, what's the runtime of survivor2? Again, looking at the time complexity page on the Python Wiki, we can see that the "get slice" operation takes time \$ O(k) \$ where \$ k \$ is the number of items in the slice. In this case each slice is half the length of persons, so the runtime is $$ O\left({n \over 2}\right) + O\left({n \over 4}\right) + O\left({n \over 8}\right) + \dots $$ which is \$ O(n) \$, since the geometric series \$ {n \over 2} + {n \over 4} + {n \over 8} + \dots \$ sums to less than \$ n \$. Again, we can check that experimentally:

Log-log plot of n against runtime in seconds
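
A plot like that can be produced with a small timing harness along these lines (a minimal sketch, assuming matplotlib and the survivor2 defined earlier in the answer; it's not the exact code used for the graphs here):

from timeit import timeit
import matplotlib.pyplot as plt

def plot_runtime(f, ns, repeats=5):
    """Plot the best-of-repeats runtime of f(n) against n on log-log axes."""
    times = [min(timeit(lambda: f(n), number=1) for _ in range(repeats))
             for n in ns]
    plt.loglog(ns, times, marker='o')
    plt.xlabel('n')
    plt.ylabel('runtime (seconds)')
    plt.show()

# For example: plot_runtime(survivor2, [2**k for k in range(10, 24)])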

5. Making it polylogarithmic

def survivor3(n):
    """Return the survivor of a circular firing squad of n people."""
    first = 1  # position of the first survivor of the rounds so far
    gap = 1    # spacing between the survivors of the rounds so far
    while n > 1:
        gap *= 2               # each round eliminates every other person
        n, odd = divmod(n, 2)  # halve the circle, remembering the parity
        if odd:
            # When the count is odd, the last person shoots the first,
            # so the first survivor advances by one (doubled) gap.
            first += gap
    return first
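
One way to gain confidence in survivor3 is to cross-check it against a direct simulation of the circle (a minimal sketch using collections.deque; the simulation itself takes \$ O(n) \$ per call):

from collections import deque

def survivor_simulated(n):
    """Brute-force check: simulate the shooting directly."""
    queue = deque(range(1, n + 1))
    while len(queue) > 1:
        queue.rotate(-1)  # the person at the front shoots the next one...
        queue.popleft()   # ...who is removed from the circle
    return queue[0]

assert all(survivor_simulated(n) == survivor3(n) for n in range(1, 500))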

Log-log plot of n against runtime in seconds

You can see that for each increase of n by a factor of 16, the runtime increases by a roughly constant amount, which is what we expect for a polylogarithmic algorithm.

Semi-log plot of n against runtime in seconds

Notice that the graph shows the runtime scaling roughly proportionally to \$ \log n \$ and not to \$ (\log n)^2 \$. That's because the values of \$ n \$ are small (less than \$ 2^{64} \$) and so in this range the arithmetic operations are effectively \$ O(1) \$. We'd have to use much larger values of \$ n \$ to show the asymptotic behaviour.

Is survivor3 as efficient as possible? Well, we could squeeze out one of the factors of \$ O(\log n) \$ by being cleverer about how we manipulate the values in the loop (each operation on first only affects one bit of the result), making the algorithm \$ O(\log n) \$. We can't possibly make it any faster than that, because it takes \$ Ω(\log n) \$ just to output the answer.
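
For reference, here's a sketch of one such \$ O(\log n) \$ computation (not necessarily the bit manipulation meant above), using the closed form \$ J(2^m + \ell) = 2\ell + 1 \$ for this case of the Josephus problem (see the further reading below):

def survivor_closed_form(n):
    """Return the survivor via the closed form J(2**m + l) == 2*l + 1."""
    top = 1 << (n.bit_length() - 1)  # highest power of two <= n
    return 2 * (n - top) + 1         # agrees with survivor3 for all n >= 1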

6. Further reading

This problem is (a simple case of) the well-known Josephus problem.
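
In the notation of that problem, this is the \$ k = 2 \$ case, which has a well-known closed form: writing \$ n = 2^m + \ell \$ with \$ 0 \le \ell < 2^m \$, the survivor is $$ J(n) = 2\ell + 1, $$ which is exactly what survivor3 computes one bit at a time.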
