Comments
This is totally true.
I remember reading an article with a Venn diagram showing how various parts of math, not just statistics and machine learning, all fit together.
The whole comparison of weights and activations to a neuron is blown way out of proportion.
I still envision it as a 3D/nD height map that water (the input data) is poured over. The height map is then manipulated, similar to using Newton's method, so the water doesn't pool in a local minimum and lock into a specific answer when a broader one exists. Hence why the nodes must be randomized before training starts. True learning should be able to start from the same values and still train to the correct solution.
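The "water pooling in a local minimum" picture maps onto plain gradient descent on a loss surface. A minimal sketch (the toy function and all names here are my own, not from the rant) of how the starting point decides which valley you settle into, which is the usual motivation for random initialization and random restarts:

```python
def f(x):
    # A 1-D "height map" with two valleys: a shallow local minimum
    # near x ≈ 1.13 and a deeper global minimum near x ≈ -1.30.
    return x**4 - 3 * x**2 + x

def grad(x):
    # Analytic derivative of f.
    return 4 * x**3 - 6 * x + 1

def descend(x, lr=0.01, steps=2000):
    # Plain gradient descent: repeatedly step downhill along the slope.
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Same update rule, different starting points, different answers:
right = descend(2.0)   # settles in the shallow valley near 1.13
left = descend(-2.0)   # settles in the deep valley near -1.30
```

Starting from the same point every time would always land in the same valley; sampling several random starts and keeping the lowest result is the crude fix the rant is gesturing at.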
I could be wrong, but it's a rant. IMHO, neurons have little to do with neural networks. It's statistics and training. Even with heuristics that help neural networks figure out how to train, people had to have designed those heuristics. So it may beat chess; I don't see it coding or defeating the Turing test. 🤔😡
Related Rants
Polished statistics
joke/meme
ai
memes
machine learning