My AP Computer Science teacher changed my life in 2001. Until then, I wasn't considering a career in development. He challenged me to write a back-propagating multi-layer perceptron neural network and an RPN calculator. Every day, he made me think about how to solve problems at an algorithmic level. He was brutally honest, and he is one of the reasons why I'm a team manager now. RIP, your legacy won't be forgotten.
-
I recently went through a very detailed and well-explained Python-based project/lesson by Karpathy called micrograd: a tiny scalar-valued autograd engine with a neural net built on top of it.
The project above is built in Python. For learning purposes, I wanted to see how such a network might be implemented in TypeScript and came up with the 🤖 micrograd-ts repository - https://github.com/trekhleb/... (along with a demo - https://trekhleb.dev/micrograd-ts/ - of how the network may be trained).
Trying to build something on your own very often gives you a much better understanding of a topic. So this was a good exercise, especially considering that the whole thing is just ~200 lines of TypeScript with no external dependencies.
The micrograd-ts repository might be useful for those who want to get a basic understanding of how neural networks work, using a TypeScript environment for experimentation.
With that being said, let me give you some more information about the project.
## Project structure
- [micrograd/](https://github.com/trekhleb/...) — this folder is the core of the repo
  - [engine.ts](https://github.com/trekhleb/...) — the scalar `Value` class that supports basic math operations like `add`, `sub`, `div`, `mul`, `pow`, `exp`, `tanh` and has a `backward()` method that calculates the derivatives of the expression, which is what drives the back-propagation flow (see the sketch after this list).
  - [nn.ts](https://github.com/trekhleb/...) — the `Neuron`, `Layer`, and `MLP` (multi-layer perceptron) classes that implement a neural network on top of the differentiable scalar `Value`s.
- [demo/](https://github.com/trekhleb/...) — a demo React application to experiment with the micrograd code
  - [src/demos/](https://github.com/trekhleb/...) — several playgrounds where you can experiment with the `Neuron`, `Layer`, and `MLP` classes.
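
To make the engine.ts idea concrete, here is a minimal sketch of such a scalar `Value` class. The method names (`add`, `mul`, `tanh`, `backward`) come from the list above; the exact constructor shape and the `grad` field are my own illustration, not necessarily the repo's real API:

```typescript
// Minimal scalar autograd Value, in the spirit of engine.ts.
// Method names follow the repo description; the constructor shape
// and `grad` field here are illustrative assumptions.
class Value {
  grad = 0;
  private backwardFn: () => void = () => {};

  constructor(public data: number, private children: Value[] = []) {}

  add(other: Value): Value {
    const out = new Value(this.data + other.data, [this, other]);
    out.backwardFn = () => {
      this.grad += out.grad; // d(a+b)/da = 1
      other.grad += out.grad; // d(a+b)/db = 1
    };
    return out;
  }

  mul(other: Value): Value {
    const out = new Value(this.data * other.data, [this, other]);
    out.backwardFn = () => {
      this.grad += other.data * out.grad; // d(a*b)/da = b
      other.grad += this.data * out.grad; // d(a*b)/db = a
    };
    return out;
  }

  tanh(): Value {
    const t = Math.tanh(this.data);
    const out = new Value(t, [this]);
    out.backwardFn = () => {
      this.grad += (1 - t * t) * out.grad; // d(tanh x)/dx = 1 - tanh^2 x
    };
    return out;
  }

  // Backward pass: topologically sort the expression graph, then
  // apply the chain rule from the output back to the leaves.
  backward(): void {
    const topo: Value[] = [];
    const visited = new Set<Value>();
    const build = (v: Value) => {
      if (visited.has(v)) return;
      visited.add(v);
      v.children.forEach(build);
      topo.push(v);
    };
    build(this);
    this.grad = 1;
    for (const v of topo.reverse()) v.backwardFn();
  }
}

// Usage: gradients of y = tanh(x * w + b) with respect to the inputs.
const x = new Value(2), w = new Value(-0.5), b = new Value(1);
const y = x.mul(w).add(b).tanh();
y.backward();
console.log(y.data, x.grad, w.grad);
```

The `Neuron`, `Layer`, and `MLP` classes in nn.ts then compose such `Value`s: a neuron is a weighted sum of `Value` inputs plus a bias, passed through an activation like `tanh`, so `backward()` on the loss gives every weight its gradient.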
## Demo (online)
To see the online demo/playground, check the following link:
🔗 https://trekhleb.dev/micrograd-ts/
-
So, I feel wayyy behind the tech curve right now.
The SSD implementations you see online are still just a bunch of separate sort-of chaos machines that contain the standard perceptron-like model of a weight, cost, and bias, right? They just kind of infer their values by training like any other neural network, in separate parts, with pieces of output data generated by other parts of the network fed into them, right?
I mean, it's implemented with PyTorch, so it's basically a really big array of tuples, in a sense, that are manipulated in a specific way.
And then CNNs just feed data back into another trained piece of the model, right?
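For reference, the "weight and bias" unit in question is roughly the following. This is a standalone sketch, not taken from any actual SSD codebase; real detectors stack these as convolutional filters and learn the weights and bias by training:

```typescript
// Rough sketch of a perceptron-like unit: a weighted sum of inputs
// plus a bias, squashed by an activation function. The weights and
// bias here are made-up numbers standing in for trained values.
function neuron(inputs: number[], weights: number[], bias: number): number {
  const sum = inputs.reduce((acc, x, i) => acc + x * weights[i], bias);
  return Math.tanh(sum);
}

console.log(neuron([0.5, -1.2], [0.8, 0.3], 0.1)); // ≈ 0.139
```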
I'm curious because object classification is about the ONLY thing I've seen work even close to properly lol
there is just so much fraud these days. sigh.
and so many lamentable tech choices and attempts... like node lol