64 Comments
  • 1
    This is totally true.
    I remember reading an article that showed a Venn diagram of how various parts of math, not just statistics and machine learning, all fit together.
  • 1
    The whole comparison of weights and activations to a neuron is blown way out of proportion.

    I still envision it as a 3D/nD height map with water (the input data) poured over it. Then the height map is manipulated, similar to using Newton's approximation, so the water doesn't pool in a local minimum and lock into a specific answer when a broader one exists. Hence why the nodes must be randomized before training starts. True learning should start with the same values and still be able to train to the correct solution.

    I could be wrong, and this is a rant, but IMHO neurons have little to do with neural networks. It's statistics and training. Even with the heuristics that help neural networks figure out how to train, people had to have trained those heuristics in the first place. So it may beat chess, but I don't see it writing code or beating the Turing test. 🤔😡
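
    A rough sketch of the height-map picture this comment describes, assuming plain gradient descent on a toy 1-D loss; the loss function, learning rate, and starting range are illustrative choices, not anything from the article or the thread. Different random starting points settle into different basins, which is the usual argument for randomizing the initial weights.

        import random

        def loss(x):
            # Bumpy "height map": a deep basin near x ~ -1.3 and a shallower one near x ~ 1.2.
            return x**4 - 3 * x**2 + x

        def grad(x, eps=1e-5):
            # Numerical gradient via central differences keeps the example self-contained.
            return (loss(x + eps) - loss(x - eps)) / (2 * eps)

        def descend(x, lr=0.05, steps=500):
            # Plain gradient descent: repeatedly step downhill from the starting point.
            for _ in range(steps):
                x -= lr * grad(x)
            return x

        random.seed(0)
        for start in (random.uniform(-2, 2) for _ in range(5)):
            end = descend(start)
            print(f"start {start:+.2f} -> settles at {end:+.2f} with loss {loss(end):+.3f}")

    Starts on the right of the bump end up in the shallow local minimum, while starts on the left reach the deeper one, i.e. the "water pooling" the comment refers to.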
  • 0
    (calculus 2)++
  • 0
    That's about it