Comments
-
@atheist
What you're saying is a little over my head. Would you explain?
I wasn't sure if it was genuinely calculating the log or if it was a fluke. I'm interested to know why a set of seemingly random numbers seems to converge on the log, though.
Also, is this actually fast, or is the standard logarithm algorithm faster (aside from hardware acceleration)?
-
@atheist
> You could use a bisection search to calculate the log algorithmically, would outperform this by an order of magnitude I'd guess
Just read about bisection search (thanks). That's an interesting approach, if it works the way I think it does. It's sort of like the old logarithm lookup tables, the ones that used fractions to pick intervals for interpolation, yeah?
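For what it's worth, a minimal sketch of the bisection idea (bisecting on the inverse function, exp; the name `bisect_log` is made up here):

```python
import math

def bisect_log(p, tol=1e-12):
    """Estimate ln(p) for p > 1 by bisecting on x where exp(x) = p.

    exp is monotonic, so the bracket [lo, hi] always contains ln(p)
    and halves every iteration: one extra bit of precision per step.
    """
    lo, hi = 0.0, p  # ln(p) < p for every p > 1
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if math.exp(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

Roughly 50 iterations reach ~1e-12 for small p, which is why a deterministic search should beat a sampling approach whose precision only grows like 1/sqrt(samples).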
> I'm guessing the randomness gives the benefit that it can get results that are accurate to sub integer precision.
Yeah, but for some n it diverges from the actual logarithm after a while, and it makes me wonder why.
I was thinking of the Quake fast inverse square root hack, the one with the magic number and how it was derived. If we generated a large enough set of N's and calculated the average of their error from the actual log of each, we could come up with a rough constant that minimizes the error for most values of N here.
-
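That error-averaging idea can be sketched like this (the `rough_log` estimator below is a made-up stand-in, not the rAvg from the pastebin): for a purely additive correction, the constant that minimizes mean squared error is simply the mean residual.

```python
import math

def rough_log(n):
    """Hypothetical stand-in estimator: bit length scaled to base e.
    Systematically overshoots ln(n); NOT the pastebin's rAvg."""
    return n.bit_length() * math.log(2)

def fit_offset(ns, estimator):
    """The additive constant c minimizing the mean squared error of
    estimator(n) + c against ln(n) is the mean of the residuals."""
    residuals = [math.log(n) - estimator(n) for n in ns]
    return sum(residuals) / len(residuals)

ns = range(2, 10_000)
c = fit_offset(ns, rough_log)  # one "magic" constant for this range
```

A multiplicative factor (or both at once) could be fitted the same way; it's an ordinary least-squares exercise over the error surface.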
It also makes me wonder why, regardless of the series generated, each N settles on a finer and finer precision number that is reproducible.
You would think that, given the same N and a unique stream of integers, the estimated log would vary slightly between runs, but it doesn't. It just increases the precision of the output with the number of runs.
I'm not familiar with statistics though, so eh.
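That behaviour is just the law of large numbers: different random streams give slightly different estimates, but the sample mean of every stream converges on the same expected value, with a standard error shrinking like 1/sqrt(samples). A toy demonstration (averaging uniform draws, whose true mean is 0.5):

```python
import random
from statistics import mean, stdev

def sample_mean(k, seed):
    """Mean of k uniform(0, 1) draws from an independent stream."""
    rng = random.Random(seed)
    return mean(rng.random() for _ in range(k))

# Twenty independent streams at two sample sizes: both clusters
# centre on 0.5, but the spread at k=100_000 is roughly
# sqrt(1000) times tighter than at k=100.
small = [sample_mean(100, s) for s in range(20)]
big = [sample_mean(100_000, s) for s in range(20)]
```

So more samples don't change *which* number each N settles on, only how many digits of it are pinned down; the run-to-run variation sinks below the printed precision.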
Related Rants
"Fast" random-number/sample based estimation of logarithms:
https://pastebin.com/niVB57Ay
The result of rAvg(p) is usually "pretty close" to log(p)/2
rAvg(p, 1000) seems to be the golden number. 100 is a little low, but I've already pasted the code. Eh.
I don't know why it works, or whether the average results are actually considered "close" for natural logarithms.
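For comparison (not necessarily what the pastebin does), one textbook random-sample estimator of a logarithm uses ln(p) = ∫₁ᵖ dx/x and averages 1/x over uniform draws:

```python
import math
import random

def mc_log(p, k=1000, seed=None):
    """Monte Carlo estimate of ln(p) for p > 1.

    ln(p) is the area under 1/x from 1 to p, so the average of
    (p - 1)/x over x ~ uniform(1, p) is an unbiased estimator.
    """
    rng = random.Random(seed)
    width = p - 1
    return width / k * sum(1 / rng.uniform(1, p) for _ in range(k))
```

Like any Monte Carlo mean, its error shrinks as 1/sqrt(k), which would match "1000 samples beats 100"; a deterministic method like bisection gains a digit every few iterations instead.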
random
logarithms
code