Crost · 3y

By any chance can someone explain logarithms to me and how they relate to complexity growth? I can do the calculation but it's only from memorisation. I just don't understand how it relates to a graph.

Comments
  • 11
    The logarithm is the reverse of a power. If
    a = b^c, then
    c = log_b(a).

    As for complexity growth: if you have an algorithm like binary search, then in each step you cut the interval in half until your interval size is 1.

    Looking at that process in reverse, i.e. when you are at interval size 1 after doing n steps, what was the original search list size N? Well, the reverse of cutting in half is doubling, so the original size N was:
    N = 2^n.

    Now if you solve that equation for the number of steps n you need for a given size N, you get:
    n = log_2(N).
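
    A minimal sketch of that counting idea (Python, with a made-up binary_search_steps helper, purely as an illustration):

        def binary_search_steps(sorted_list, target):
            # Standard binary search, but also counting the halving steps.
            lo, hi = 0, len(sorted_list) - 1
            steps = 0
            while lo <= hi:
                steps += 1
                mid = (lo + hi) // 2
                if sorted_list[mid] == target:
                    return mid, steps
                if sorted_list[mid] < target:
                    lo = mid + 1
                else:
                    hi = mid - 1
            return -1, steps

        # A list of size N = 16 needs at most log_2(16) + 1 = 5 steps.
        index, steps = binary_search_steps(list(range(16)), 15)
        print(index, steps)  # 15 5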
  • 2
    @Fast-Nop that was a very good explanation!
  • 2
    @Fast-Nop thank you. I lost you on the 3rd paragraph though.

    I think I understand now though. The equation itself I have memorised, and I memorised binary search a long time ago.

    So O(log2 N), because with a set of 4 elements the worst case would be:

    Searching for 9.
    1,3,7,9
    7,9
    9
    So 2 steps

    Searching for 8
    1,2,3,4,5,6,7,8
    5,6,7,8
    7,8
    8
    3 steps

    As N doubles, the number of steps increases by one.

    So in the first example N has a length of 4 and it takes 2 steps.

    log2 4 = 2

    Second example N has a length of 8 and it takes 3 steps.

    log2 8 = 3

    But we don't care about specific inputs for big O, and the input can change, so it's log2 N.

    Plotting it on a graph makes sense to me now, I think: we are plotting how the number of steps increases as N grows, and I was able to give some numbers above for the steps as N grows....
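
    Something like this (Python, just to sanity-check the pattern) prints the step count next to log2 N:

        import math

        # Halve N until the interval size is 1, counting the steps,
        # then compare the count with log2(N) as N doubles.
        for N in [4, 8, 16, 32, 64]:
            size, steps = N, 0
            while size > 1:
                size //= 2
                steps += 1
            print(N, steps, math.log2(N))
        # 4 2 2.0, 8 3 3.0, 16 4 4.0 ... one extra step per doubling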

    Does this sound right?
  • 3
    @craig939393 Yes, that's it. The next thing is that big O notation doesn't care about constant factors or added constants.

    For example, O(N*constant) is the same as O(N) because we only look at how stuff scales with growing N, not at actual runtime. Similarly, O(N + constant) is O(N).

    Interestingly, O(N² + N) is just O(N²) because for large N, N² is so much larger than N that only the N² is important.
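
    A quick numeric check of that (Python, illustrative):

        # For large N, the N term barely contributes next to N².
        for N in [10, 1_000, 1_000_000]:
            total = N**2 + N
            print(N, total, N**2 / total)
        # The N²/total ratio approaches 1: 0.909..., 0.999..., 0.999999...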

    Now, the logarithm of a number to base A is the same as the logarithm to base B, up to a constant factor. For example:
    log_2(32) = 5, log_10(32) ≈ 1.5, factor ≈ 3.32.
    log_2(256) = 8, log_10(256) ≈ 2.4, factor ≈ 3.32.

    So for big O notation, it doesn't matter whether you use log2, log10 or any other log, because the factor between any two logs is a constant. That's why we write O(log N), not O(log2 N).
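
    To see that constant factor in action (Python, illustrative):

        import math

        # The ratio log2(N) / log10(N) is the same for every N:
        # it is log2(10) ≈ 3.3219, a constant.
        for N in [32, 256, 10_000, 1_000_000]:
            print(N, math.log2(N) / math.log10(N))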
  • 2
    @Fast-Nop so, last question if it's ok (embarrassing, as I should know all this): you said different log bases differ only by a constant factor. Can you go into more detail about what this means, please?

    When you wrote that, I was just wondering whether log3 would be a thing, since I don't think I've ever seen it...
  • 1
    @craig939393 If you have two logarithm bases A and B and you look at log_A(N) and log_B(N), then:
    log_A(N) = log_B(N) / log_B(A)

    log_B(A) does not depend on N, so it is a constant factor. That's why it is ignored in big O notation, and why the logarithm to any base is equivalent when we look at how an algorithm scales.

    What this says in plain English: for a logarithmic algorithm, each doubling of your N adds the same amount of time to your runtime. It's not important what exact amount; it's about the qualitative behaviour.
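
    And since log3 came up: a quick check of the change-of-base identity with base 3 (Python, illustrative):

        import math

        # Change of base: log_A(N) == log_B(N) / log_B(A), here A = 3, B = 2.
        for N in [10, 100, 12345]:
            direct = math.log(N, 3)              # log base 3 directly
            via_2 = math.log2(N) / math.log2(3)  # computed via base 2
            print(N, direct, via_2)              # both columns match (up to float rounding)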
  • 2
    @Fast-Nop Time to hit the maths books 😅.

    Thanks for the explanation.
  • 1
    @Fast-Nop
    You should become a maths teacher - the pupils out there need you!
  • 1
    @Oktokolo Back then as a uni student, I made some money giving private maths lessons to high school pupils. Provided that the client was willing to put in some work, I would usually get them from a failing grade to average. :-)