Join devRant
Do all the things like
++ or -- rants, post your own rants, comment on others' rants and build your customized dev avatar
Sign Up
Pipeless API
From the creators of devRant, Pipeless lets you power real-time personalized recommendations and activity feeds using a simple API
Learn More
Search - "ml"
-
About six months ago I decided I wanted to learn to write a neural network from the ground up, using only the C++ standard lib. Had to learn some linear algebra, multivariable calc and a dash of wizardry.
The mathematics of neural networks is still one of the coolest things I've ever learnt. It still marvels me that you can make a specialized mini-brain out of nothing but numbers.17 -
Teacher : The world is fast moving, you should learn all the new things in technology. If not you'll be left behind. Try to learn about Cloud, AI, ML, Block chain, Angular, Vue, blah blah blah.....
**pulls out a HTML textbook and starts writing on the board.
<center>.............</center>5 -
New interview question: “You’re stuck on a deserted island with nothing but stones.. what’s the fastest way to escape?”2
-
FML. All those "train your own object detection" articles can go burn in hell. Not even TFs pertained models work!10
-
Me, doing ui design: 'hm, i feel like jumping into machine learning right now'
Me, writing a ml chatbot: 'but what if i extend flutter with my old custom android components'
Me, porting java components to dart: 'hold on, p5js has vectors, i could make a physical simulation'
Me to me: 'why are you like this'10 -
My manager is so obsessed with machine learning that during a discussion today, he asked whether we could use ML to find the distance between a pair of latitudes & longitudes. He is a nice guy, though.5
-
My Texas Hold'em ML algorithm keeps deciding the best strategy to make the most money, is to lose the least. Which is why it constantly folds after a certain point... *sigh*
Kids, don't gamble. Math has spoken, you'd be wise to listen.8 -
* Grow guts to move from windows to Linux
* Spend less time on memes/gaming and more on projects
* Improve UI/UX skills
* Deploy a mobile app
* Learn Python for ML
* Dive into Hacking6 -
"Big data" and "machine learning" are such big buzz words. Employers be like "we want this! Can you use this?" but they give you shitty, ancient PC's and messy MESSY data. Oh? You want to know why it's taken me five weeks to clean data and run ML algorithms? Have you seen how bad your data is? Are you aware of the lack of standardisation? DO YOU KNOW HOW MANY PEOPLE HAVE MISSPELLED "information"?!!! I DIDN'T EVEN KNOW THERE WERE MORE THAN 15 WAYS OF MISSPELLING IT!!! I HAD TO MAKE MY OWN GODDAMN DICTIONARY!!! YOU EVER FELT THE PAIN OF TRAINING A CLASSIFIER FOR 4 DAYS STRAIGHT THEN YOUR GODDAMN DEVICE CRASHES LOSING ALL YOUR TRAINED MODELS?!!
*cries*7 -
Me, starting my internship in ML.. coworkers come to me asking what computer I need:
Me: Well, something more powerful than this i3, and most importantly some kind of GPU for training.
Them: Ok, what kind of GPU?
Me: Well, a 1080 or 2080 should be more than enough and good performance for the price.
Them: Oh.. We were more thinking about a Tesla V100 or something like that!
Me: (internally) WTF this costs more than what you'll pay me for the internship, this is so cool. (to them) Oh, yes, why not, great perfomance, blah blah blah.
I would prefer them to pay me more, but at least they're not going to hold me down with bad components! Nothing to rant about for now.. Hope it'll stay the same ^^5 -
!rant
For some people that are starting with M.L either by hobby or study. This is a very cool website to keep close to when fiddling with concepts that are alien to you:
https://ml-cheatsheet.readthedocs.io/...
Just thought it would be a nice bookmark to have.
Cheers putos7 -
!rant
Amazon is giving ML tutorials for free for those of us interested in the field :)
I think its pretty cool that they are providing the training for free, not that they need the money dem greedy basterds!
Here is the link
"BuT aL! AmAZon iZ eViL!!" Yeah fuck it whatever. This is not for you then. Grab a dick and carry on(free dicks for everyone regarding of Amazon and AWS feelings)
https://aws.amazon.com/training/...5 -
I am a machine learning engineer and my boss expects me to train an AI model that surpasses the best models out there (without training data of course) because the client wanted ‘a fully automated AI solution’.13
-
Manager: How to make successful product?
CEO: Just Add words like Machine learning and Ai
Newbie developers: Takes 10$ udemy course without statistical and probabilistic knowledge, after 1 week believes himself to have "Expertise in ML,AI and DL"
HR: Hires the newbie
*Senior Developer Quits*6 -
Grammarly just found and corrected an embarrassing mistake in a legal document...
My typo was "onimated"...
Googles spelling and grammar changed it to "dominated"...
Grammarly figured out it should have been "nominated".3 -
> Open private browsing on Firefox on my Debian laptop
> Find ML Google course and decided to start learning in advance (AI and ML are topics for next semester)
**Phone notifications: YouTube suggests Machine Learning recipes #1 from Google**
> Not even logged in on laptop
> Not even chrome
> Not even history enabled
> Not fucking even windows
😒😒😒
The lack of privacy is fucking infuriating!
....
> Added video to watched latter
I now hate myself for bitting22 -
*Internship in ML*
“Oh boy ML is so interesting we’re totally gonna break new ground”
git clone
learning_rate = 0.008
learning_rate = 0.007
learning_rate = 0.0063 -
Yesterday I got to the point where all changes that customer support and backend asked for were set, and i could start rebuilding the old models from their very base with the changes and the new, disjoint data the company expansion brought.
So making the start of our main script and the first commit, only having that, had high importance... At least to me.3 -
Microsoft: When you are a huge company with nearly limitless resources, a whole AI and ML army, yet you cannot filter uservoice feedbacks.3
-
Someone wrote a website that will convert cm^3 to mL for you. How useful!
Their JavaScript is, unfortunately, less than optimal for such a task however.4 -
People want AI,ML, Blockchain implementation in their projects very badly and blindly and expect developer to just get it done no matter how stupid something sounds.
--India6 -
I finally built my own neural network model.
I did start this journey a long time ago. Maybe 2 or 3 years ago. My first ("undefined") rant :) was about it.
https://devrant.com/rants/800290/...11 -
Why do people jump from c to python quickly. And all are about machine learning. Free days back my cousin asked me for books to learn python.
Trust me you have to learn c before python. People struggle going from python to c. But no ml, scripting,
And most importantly software engineering wtf?
Software engineering is how to run projects and it is compulsory to learn python and no mention of got it any other vcs, wtf?
What the hell is that type of college. Trust me I am no way saying python is weak, but for learning purpose the depth of language and concepts like pass by reference, memory leaks, pointers.
And learning algorithms, data structures, is more important than machine learning, trust me if you cannot model the data, get proper training data, testing data then you will get screewed up outputs. And then again every one who hype these kinds of stuff also think that ml with 100% accuracy is greater than 90% and overfit the data, test the model on training data. And mostly the will learn in college will be by hearting few formulas, that's it.
Learn a language (concepts in language) like then you will most languages are easy.
Cool cs programmer are born today😖31 -
Helping a random junior out with an ML project on an online tutor site.
Her: So what is the syntax for implementing xyz function
Me: *opens Google* *opens pandas docs* *searches the function* *tell her the syntax*
She: Woah thanks a lot!
I collected my tutor fees feeling good about myself.10 -
If you don't know how to explain about your software, but you want to be featured in Forbes (or other shitty sites) as quickly as possible, copy this:
I am proud that this software used high-tech technology and algorithms such as blockchain, AI (artificial intelligence), ANN (Artificial Neural Network), ML (machine learning), GAN (Generative Adversarial Network), CNN (Convolutional Neural Network), RNN (Recurrent Neural Network), DNN (Deep Neural Network), TA (text analysis), Adversarial Training, Sentiment Analysis, Entity Analysis, Syntatic Analysis, Entity Sentiment Analysis, Factor Analysis, SSML (Speech Synthesis Markup Language), SMT (Statistical Machine Translation), RBMT (Rule Based Machine Translation), Knowledge Discovery System, Decision Support System, Computational Intelligence, Fuzzy Logic, GA (Genetic Algorithm), EA (Evolutinary Algorithm), and CNTK (Computational Network Toolkit).
🤣 🤣 🤣 🤣 🤣3 -
Skills required for ML :
Math skills:
|====================|
Programming skills :
|===|
Skills required for ML with GPU:
Math skills:
|====================|
Programming skills :
|====================|
/*contemplating career choice*/
: /3 -
#machinelearning #ml #datascience #tensorflow #pytorch #matrices #dsjoke/meme tensor flow and ml/ai is new helloworld deep learning pytorch machinelearning tensorflow lite tensorflow6
-
An undetectable ML-based aimbot that visually recognizes enemies and your crosshairs in images copied from the GPU head, and produces emulated mouse movements on the OS-level to aim for you.
Undetectable because it uses the same api to retrieve images as gameplay streaming software, whereas almost all existing aimbots must somehow directly access the memory of the running game.12 -
I am soooooooooo fucking stupid!
I was using an np.empty instead of np.zeros to shape the Y tensor...
🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️
No fucking wonder the poor thing was keep failing to detect a pattern.
Let's see if it works now... *play elevator music while we wait for keras*5 -
A teacher asked for my help in some machine learning project, I told her I don't have a background in ML.
She was working on an application that classified research papers according to the subject.
I said, seems like a basic NER project, but maybe I'm wrong I haven't worked on any ML projects before. But I do have experience with web, let me know if you need help in that.
She says, ML is also web, it's just like semantic web.12 -
Dude, publish your damn dataset with your damn ML study!!!! I'm not even asking for your Godforsaken model!
😡😡🗡️🗡️⚔️🔫🔫🏹🔨4 -
No one ever tells you that once you start doing facial recognition that your computer gets filled with tons images of your colleagues.
Looks a bit suspicious.3 -
Was using six years old laptop with first gen Intel core i3 to train a neural net, placed the laptop on soft bed, training begins, thermal shutdown after 30-40 iterations(30 minutes).F***.
Now starting again :'|3 -
I feel like the rise of ML was orchestrated by a bunch of angry nerds who decided to answer the question "Where will we use Calculus in real life?" with "EVERYWHERE."1
-
There's so much hype and bullshit around Machine Learning (ML). And if I have to read one more crappy prediction of who survived on the Titantic, I'll go postal.
So, what real-world problems are you using it to address...and how successful has it been? What decisions have you supported using ML? What models did you use (e.g. logistic regression, decision trees, ANN)?
Anyone got any boringly useful examples of ML in production?
And don't say you're using it to predict survival rates for the design of new cruise ships...although, to be fair, that might be quite interesting...6 -
This is interesting coming from the man who build the biggest IP conglomerate in the world. A man who actively tried to kill open source. Looks like he got into his senses at his old age. Although, he is talking about AI models only, not software as a whole.
Yes ML/AI models are software.1 -
when you start machine learning on you laptop, and want to it take to next level, then you realize that the data set is even bigger that your current hard-disk's size. fuuuuuucccckkk😲😲
P. S. even metadata csv file was 500 mb. Took at least 1 min to open it. 😭😧11 -
Made the mistake of training my ML model in VirtualBox... Over 18 hours so far, and loss < 1 is nowhere near stable.1
-
concerned parent: if all your friends jumped off a bridge would you follow them?
machine learning algorithm: yes.1 -
I hate all of this AI fearmongering that's been going on lately in the main stream media. Like seriously, anyone who has done any real work with modern AI/ML would know its about as big of a threat to humanity as a rouge web developer armed with a stick.3
-
Does anybody know a course on machine learning with python that doesn't need that much mathematical knowledge? Because in every course I find I need to know advanced mathematics yet I am still in grade 10 and haven't studied it yet.17
-
Waking up and guess the first item on my feed?
IBM research just released Snap ML framework that's supposedly 47x faster than Tensorflow. Now, i like advancements but fuck i just finished developing my Tensorflow model, and i don't want to rewrite this stuff again!2 -
I feel like I'm stuck professionally working on the same PWAs, SPAs and "fullstack" projects.
Surprising, how little professional options there are if you don't want to fall into the god-forsaken ML community.
Guess I chose the life. Might as well continue till I burnout ಠ_ಠ11 -
Why is it that virtually all new languages in the last 25 years or so have a C-like syntax?
- Java wanted to sort-of knock off C++.
- C# wanted to be Java but on Microsoft's proprietary stack instead of SUN's (now Oracle's).
- Several other languages such as Vala, Scala, Swift, etc. do only careful evolution, seemingly so as to not alienate the devs used to previous C-like languages.
- Not to speak of everyone's favourite enemy, JavaScript…
- Then there is ReasonML which is basically an alternate, more C-like, syntax for OCaml, and is then compiled to JavaScript.
Now we're slowly arriving at the meat of this rant: back when I started university, the first semester programming lecture used Scheme, and provided a fine introduction to (functional) programming. Scheme, like other variants of Lisp, is a fine language, very flexible, code is data, data is code, but you get somewhat lost in a sea of parentheses, probably worse than the C-like languages' salad of curly braces. But it was a refreshing change from the likes of C, C++, and Java in terms of approach.
But the real enlightenment came when I read through Okasaki's paper on purely functional data structures. The author uses Standard ML in the paper, and after the initial shock (because it's different than most everything else I had seen), and getting used to the notation, I loved the crisp clarity it brings with almost no ceremony at all!
After looking around a bit, I found that nobody seems to use SML anymore, but there are viable alternatives, depending on your taste:
- Pragmatic programmers can use OCaml, which has immutability by default, and tries to guide the programmer to a functional programming mindset, but can accommodate imperative constructs easily when necessary.
- F# was born as OCaml on .NET but has now evolved into its own great thing with many upsides and very few downsides; I recommend every C# developer should give it a try.
- Somewhat more extreme is Haskell, with its ideology of pure functions and lazy evaluation that makes introducing side effects, I/O, and other imperative constructs rather a pain in the arse, and not quite my piece of cake, but learning it can still help you be a better programmer in whatever language you use on a day-to-day basis.
Anyway, the point is that after working with several of these languages developed out of the original Meta Language, it baffles me how anyone can be happy being a curly-braces-language developer without craving something more succinct and to-the-point. Especially when it comes to JavaScript: all the above mentioned ML-like languages can be compiled to JavaScript, so developing directly in JavaScript should hardly be a necessity.
Obviously these curly-braces languages will still be needed for a long time coming, legacy systems and all—just look at COBOL—, but my point stands.7 -
Google announced a little piece of wizardry called MusicML, which seems to be a pretty decent music generator based on text prompts or sample imputs
I can only think how much would this thing help indie game developers (if available), but then there are a lot more industry (beside music itself) that could save lots of money with this kind of stuff
I mean, yeah, the results are not so great or ground breaking, but so would be most of the human generated compositions as well; if the music is not the main focus, most of the results are just enough. Just think about an elevator with a custom generated track for day, or so many other places where sound it's just a side stimulus
What a time to be alive c:5 -
After a lot of work I figured out how to build the graph component of my LLM. Figured out the basic architecture, how to connect it in, and how to train it. The design and how-to is 100%.
Ironically generating the embeddings is slower than I expect the training itself to take.
A few extensions of the design will also allow bootstrapped and transfer learning, and as a reach, unsupervised learning but I still need to work out the fine details on that.
Right now because of the design of the embeddings (different from standard transformers in a key aspect), they're slow. Like 10 tokens per minute on an i5 (python, no multithreading, no optimization at all, no training on gpu). I've came up with a modification that takes the token embeddings and turns them into hash keys, which should be significantly faster for a variety of reasons. Essentially I generate a tree of all weights, where the parent nodes are the mean of their immediate child nodes, split the tree on lesser-than-greater-than values, and then convert the node values to keys in a hashmap to make lookup very fast.
Weight comparison can be done either directly through tree traversal, or using normalized hamming distance between parent/child weight keys and the lookup weight.
That last bit is designed already and just needs implemented but it is completely doable.
The design itself is 100% attention free incidentally.
I'm outlining the step by step, only the essentials to train a word boundary detector, noun detector, verb detector, as I already considered prior. But now I'm actually able to implement it.
The hard part was figuring out the *graph* part of the model, not the NN part (if you could even call it an NN, which it doesn't fit the definition of, but I don't know what else to call it). Determining what the design would look like, the necessary graph token types, what function they should have, *how* they use the context, how thats calculated, how loss is to be calculated, and how to train it.
I'm happy to report all that is now settled.
I'm hoping to get more work done on it on my day off, but thats seven days away, 9-10 hour shifts, working fucking BurgerKing and all I want to do is program.
And all because no one takes me seriously due to not having a degree.
Fucking aye. What is life.
If I had a laptop and insurance and taxes weren't a thing, I'd go live in my car and code in a fucking mcdonalds or a park all day and not have to give a shit about any of these other externalities like earning minimum wage to pay 25% of it in rent a month and 20% in taxes and other government bullshit.4 -
It feels so good actually doing something that saves the company a buttload of money.
Just optimized some ML models. Now it costs a tenth of the money to run them :P2 -
It's so annoying. Whenever you are at a hackathon, every damn team tries to throw these buzzwords- Block chain, AI, ML. And I tell you, their projects just needed a duckin if else.😣
-
Develop my first mobile app with a restful backend for consumer usage
Learn more about cloud architecture/computing
Finish learning calculus
Learn linear algebra, discrete math, statistics and probability
Maybe start ML this year depending on math progress and time2 -
From my big black book of ML and AI, something I've kept since I've 16, and has been a continual source of prescient predictions in the machine learning industry:
"Polynomial regression will one day be found to be equivalent to solving for self-attention."
Why run matrix multiplications when you can use the kernal trick and inner products?
Fight me.19 -
To date this is the most useful thing I’ve seen that appears to work whether it does or not I may never know
https://tinyurl.com/6zwn7hb3
What projects has everyone here seen that could be anything from proof of concepts or working applications and libraries that are actually USEFUL/FUNCTIONING not just CONCEPTUALLY useful in ML
Because none of the object classifiers I have seen look useful Half the time they miss things or get things wrong
However this project snakeAi I saw is a self training ml that plays snake until it can’t possibly do better and it works
The link above appears to work and be useful but I betcha it fails on backgrounds that aren’t so solid !
What else have you peoples seen ?
Again -
I am Graduate student of applied computer science. I am required to select my electives. Now i have to decide between Machine Learning (ML) and Data Visualization. My problem is Machine Learning is very theoretical subject and I have no background in ML in my undergrad. In data visualization, the course is more focused towards D3.js. Due to lack of basic knowledge i am having second thoughts of taking ML. However, this course will be offered in next Fall term. And i am also studying from Coursera to build my background till that time.
I know there is no question here but I need a second opinion from someone experienced. Also, please suggest any other resource that I should look into to build my background in ML.2 -
Dealing with YouTube's shitty recommendation algo for several years now makes me lose faith in recommender systems in general.6
-
In 2015 I sent an email to Google labs describing how pareidolia could be implemented algorithmically.
The basis is that a noise function put through a discriminator, could be used to train a generative function.
And now we have transformers.
I also told them if they looked back at the research they would very likely discover that dendrites were analog hubs, not just individual switches. Thats turned out to be true to.
I wrote to them in an email as far back as 2009 that attention was an under-researched topic. In 2017 someone finally got around to writing "attention is all you need."
I wrote that there were very likely basic correlates in the human brain for things like numbers, and simple concepts like color, shape, and basic relationships, that the brain used to bootstrap learning. We found out years later based on research, that this is the case.
I wrote almost a decade ago that personality systems were a means that genes could use to value-seek for efficient behaviors in unknowable environments, a form of adaption. We later found out that is probably true as well.
I came up with the "winning lottery ticket" hypothesis back in 2011, for why certain subgraphs of networks seemed to naturally learn faster than others. I didn't call it that though, it was just a question that arose because of all the "architecture thrashing" I saw in the research, why there were apparent large or marginal gains in slightly different architectures, when we had an explosion of different approaches. It seemed to me the most important difference between countless architectures, was initialization.
This thinking flowed naturally from some ideas about network sparsity (namely that it made no sense that networks should be fully connected, and we could probably train networks by intentionally dropping connections).
All the way back in 2007 I thought this was comparable to masking inputs in training, or a bottleneck architecture, though I didn't think to put an encoder and decoder back to back.
Nevertheless it goes to show, if you follow research real closely, how much low hanging fruit is actually out there to be discovered and worked on.
And to this day, google never fucking once got back to me.
I wonder if anyone ever actually read those emails...
Wait till they figure out "attention is all you need" isn't actually all you need.
p.s. something I read recently got me thinking. Decoders can also be viewed as resolving a manifold closer to an ideal form for some joint distribution. Think of it like your data as points on a balloon (the output of the bottleneck), and decoding as the process of expanding the balloon. In absolute terms, as the balloon expands, your points grow apart, but as long as the datapoints are not uniformly distributed, then *some* points will grow closer together *relatively* even as the surface expands and pushes points apart in the absolute.
In other words, for some symmetry, the encoder and bottleneck introduces an isotropy, and this step also happens to tease out anisotropy, information that was missed or produced by the encoder, which is distortions introduced by the architecture/approach, features of the data that got passed on through the bottleneck, or essentially hidden features.4 -
Remember my LLM post about 'ephemeral' tokens that aren't visible but change how tokens are generated?
Now GPT has them in the form of 'hidden reasoning' tokens:
https://simonwillison.net/2024/Sep/...
Something I came up with a year prior and put in my new black book, and they just got to the idea a week after I posted it publicly.
Just wanted to brag a bit. Someone at OpenAI has the same general vision I do.15 -
When you’re fundraising, it’s AI
When you’re hiring, it’s ML
When you’re implementing, it’s linear regression
When you’re debugging, it’s printf()3 -
Long time no see...
My first ever research paper was selected a few months back. Never been to a foreign country, but COVID fucked the plans... Conference on mid June.
The paper will be in press after the conference.
Conference: EuCNC 2020
Paper name: RFF Based Parallel Detection for Massive MIMO
Noice5 -
I am learning Machine Learning via Matlab ML Onramp.
It seems to me that ML is;
1. (%80) preprocessing the ugly data so you can process data.
2. (%20) Creating models via algorithms you memorize from somewhere else, that has accuracy of %20 aswell.
3. (%100) Flaunting around like ML is second coming of Jesus and you are the harbinger of a new era.8 -
Currently listening to my housemate, a tech journalist, talk to a PR person for an ML company, about how great their product is.
I want to vomit.1 -
ML Dude: “Hey see what I did with this python code. It is so clean and dope”
Business Boss: “Well Done.”
ML Dude: “It is a nice approach don’t you think.”
Business Boss: “How does this put a money in my business account?”
ML Dude: “Ehmmmm”2 -
I've found a very interesting paper in AI/ML (😱😱 type of papers), but I was not able to find the accompanying code or any implementation. After nearly a month I checked the paper again and found a link to a GitHub repo containing the implementation code (turned out they updated the paper). I was thinking of using this paper in one of my side projects, but the code is licensed under "Creative Commons Attribution-NonCommercial 4.0 International License". The question is to what extent this license is applicable? Anyone here has experience in this? thanks13
-
Short question: what makes python the divine language for ML and AI. I mean i picked up the syntax what can it do that c++ or java cant? I just dont get it.19
-
I cant wait for edgey hipster SoDev students to start talking to me about underground git repositories that they use. Although, at least they'll stop talking to me about ML, NN, and crypto.3
-
The best way to get funding from VCs now is to include the following words: ML, AI, IoT. To even blow their minds more, add Blockchain.2
-
That moment, when you meet someone who haven't written even a simple if-else statement in their life and want to work on Deep Learning algorithms using TensorFlow. World is filled with so many ML crazy peoe.1
-
Everything is working, including the damn ml model. But I assume, considering the amount of restlessness and anxiety I'm feeling, that I will look like a scared hamster or trash panda while presenting it.
(Presenting to internal staff. But still)5 -
Quantum computing is at least trying to be the next "ML".
People seemed to ignore it over decades but suddenly a few months back, everyone got excited on Google's headline progress.
Later, people realized it is not a big deal and everyone moved on.5 -
After a year in cloud I decided to start a master's degree in AI and Robotics. Happy as fuck.
Yet I got really disappointed by ML and NNs. It's like I got told the magician's trick and now the magic is ruined.
Still interesting though.7 -
Get replaced by an AI^WDeep ML device. That's coded for a 8051 and running on an emulator written in ActionScript, being executed on a container so trendy its hype hasn't started yet, on top of some forgotten cloud.
Then get called in to debug my replacement. -
So I've started a little project in Java that creates a db of all of my downloaded movie and video files. The process is very simple, but I've just started incorporating Machine Learning.
The process is quite simple: You load the files into the db, the program tries to determine the movie's name, year and quality from the filename (this is where the ML comes in - the program needs to get this and dispose of useless data) and then does an online search for the plot, genre and ratings to be added to the db.
Does anyone have any feature suggestions or ML tips? Got to have something to do during the holiday!1 -
Can you recommend me any 13 inch notebooks. I want to buy a notebook to use as my main workstation.
I mostly do web / Android development but also looking to get more into ML so a good graphics card would be good. I will put linux on it but I am not sure about the distro yet.4 -
My IoT professor expects us to, somehow, learn Machine Learning and use that to analyse the data we obtain in the working of our project! How are we supposed to learn ML to implement it's techniques, while simultaneously create a IoT project, learning its own techniques and also handle our other courses in just one semester?!6
-
I am going to an AI conference in Berlin (which is kinda far away from me) next month.
I am just learning AI & ML and integrating them into a personal project, but I am going there to meet people, learn and gather info.
Do you have any advice on how I could network with people that are masters in this area?11 -
It's an irony in my case. Python is so simple and fast to implement that I end up doing all my projects ( web dev, ML, crawlers, etc.) But still I can't use Python for solving competitive programming. Python seems unknown if I don't have access to google. Way to go to learn Python. Though able to think Pythonic nowadays.. ;p3
-
Someone figured out how to make LLMs obey context free grammars, so that opens up the possibility of really fine-grained control of generation and the structure of outputs.
And I was thinking, what if we did the same for something that consumed and validated tokens?
The thinking is that the option to backtrack already exists, so if an input is invalid, the system can backtrack and regenerate - mostly this is implemented through something called 'temperature', or 'top-k', where the system generates multiple next tokens, and then typically selects from a subsample of them, usually the highest scoring one.
But it occurs to me that a process could be run in front of that, that asks conditions the input based on a grammar, and takes as input the output of the base process. The instruction prompt to it would be a simple binary filter:
"If the next token conforms to the provided grammar, output it to stream, otherwise trigger backtracking in the LLM that gave you the input."
This is very much a compliance thing, but could be used for finer-grained control over how a machine examines its own output, rather than the current system where you simply feed-in as input its own output like we do now for systems able to continuously produce new output (such as the planners some people have built)
link here:
https://news.ycombinator.com/item/...5 -
1. Study C and Python
2. Learn NLP and ML
3. Participate again in one hackathon and kick their ass after winning it
4. Get one awesome internship
5. Master algorithms and DS -
Its funny when people talk about AI. It just so funny how they think that any thing can be predicted by the machine.1
-
I get it — it’s not fashionable to be part of the overly enthusiastic, hype-drunk crowd of deep learning evangelists, who think import keras is the leap for every hurdle.
-
The first fruits of almost five years of labor:
7.8% of semiprimes give the magnitude of their lowest prime factor via the following equation:
((p/(((((p/(10**(Mag(p)-1))).sqrt())-x) + x)*w))/10)
I've also learned, given exponents of some variables, to relate other variables to them on a curve to better sense make of the larger algebraic structure. This has mostly been stumbling in the dark but after a while it has become easier to translate these into methods that allow plugging in one known variable to derive an unknown in a series of products.
For example I have a series of variables d4a, d4u, d4z, d4omega, etc, and these are translateable now, through insights that become various methods, into other types of (non-d4) series. What these variables actually represent is less relevant, only that it is possible to translate between them.
I've been doing some initial learning about neural nets (implementation, rather than theoretics as I normally read about). I'm thinking what I might do is build a GPT style sequence generator, and train it on the 'unknowns' from semiprime products with known factors.
The whole point of the project is that a bunch of internal variables can easily be derived, (d4a, c/d4, u*v) from a product, its root, and its mantissa, that relate to *unknown* variables--unknown variables such as u, v, c, and d4, that if known directly give a constant time answer to the factors of the original product.
I think theres sufficient data at this point to train such a machine, I just don't think I'm up to it yet because I'm lacking in the calculus department.
2000+ variables that are derivable from a product, without knowing its factors, which are themselves products of unknown variables derived from the internal algebraic relations of a product--this ought to be enough of an attack surface to do something with.
I'm willing to collaborate with someone familiar with recurrent neural nets and get them up to speed through telegram/element/discord if they're willing to do the setup and training for a neural net of this sort, one that can tease out hidden relationships and map known variables to the unknown set for a given product.17 -
The fact that I need to make this shit multimodal is gonna be a whole different level of shitshow. 🤦🤦🤦🤦🤦
Somebody kill me plz.
Today I tried to concatenate a LSTM unit with a FC and was wondering why it was throwing weird shapes at me. 🤦
Yes, I was THE idiot.
Kill me.19 -
Barnes Hut aproximation explained visually:
https://jheer.github.io/barnes-hut/
...For machine learning:
https://arxiv.org/abs/1301.3342
Someone finally had the same idea. Very cool.4 -
Turns out you can treat a a function mapping parameters to outputs as a product that acts as a *scaling* of continuous inputs to outputs, and that this sits somewhere between neural nets and regression trees.
Well thats what I did, and the MAE (or error) of this works out to about ~0.5%, half a percentage point. Did training and a little validation, but the training set is only 2.5k samples, so it may just be overfitting.
The idea is you have X, y, and z.
z is your parameters. And for every row in y, you have an entry in z. You then try to find a set of z such that the product, multiplied by the value of yi, yields the corresponding value at Xi.
Naturally I gave it the ridiculous name of a 'zcombiner'.
Well, fucking turns out, this beautiful bastard of a paper just dropped in my lap, and its been around since 2020:
https://mimuw.edu.pl/~bojan/papers/...
which does the exact god damn thing.
I mean they did't realize it applies to ML, but its the same fucking math I did.
z is the monoid that finds some identity that creates an isomorphism between all the elements of all the rows of y, and all the elements of all the indexes of X.
And I just got to say it feels good. -
I forgot to re-enable ABP...
Good job google....
you know everything about me (us)
and you cannot even detect the correct language for ads?1 -
I wonder what would happen if someone trains a chat-bot based on Posts and Comments on Devrant? a Psychopathic Chatbot?3
-
Wanted to see myself working somewhere in EU as "AI / Machine Learning Engineer". Its a BIG DREAM.2
-
FUCK YOU PYTHON. Why you do that to me, uh?
I was using a CNN to classify hand poses and the prediction was not working at all, one class was given 100% all the time. After much investigation, I found the culprit... A FUCKING INDENT was messing my data. Normalization was inside the loop and not outside, so my pixel values were wayyyyy too small...
Also, I'm really dumb, I should have started with making sure everything was right before trying to fiddle with my architecture..
Anyway, it is working now, you can it out here if you want! https://github.com/MrEliptik/...13 -
I would have never considered it but several people thought: why not train our diffusion models on mappings between latent spaces themselves instead of on say, raw data like pixels?
It's a palm-to-face moment because of how obvious it is in hindsight.
Details in the following link (or just google 'latent diffusion models')
https://huggingface.co/docs/... -
Annoying thing when your accuracy is going up but your val_acc is sticking around 0.0000000000{many zeros later}01
😒 Somebody shoot this bitch!5 -
Machine learning?
Maybe just maybe you could ask the user what he wants?
Or is the ML logic something like.
if(last_alert.status == unread)
user = uninterested;
else
user = interested; -
Do devranters think that Reason(ML) is too late to the game to compete against TypeScript...or can it ride React success to become more than a niche?
Personally I like it, but, like with F#, I don't enjoying the lack of resources every time I need to get something done.7 -
- Learn python
- Learn ML
- Move from client projects and start/finish my own project
- Find time/money to start PhD studies
- Find time to do science paper
- Get new car
Had to focus on working when I finished my master so I hope to get back to science part of work -
This is gonna be a long post, and inevitably DR will mutilate my line breaks, so bear with me.
Also I cut out a bunch because the length was overlimit, so I'll post the second half later.
I'm annoyed because it appears the current stablediffusion trend has thrown the baby out with the bath water. I'll explain that in a moment.
As you all know I like to make extraordinary claims with little proof, sometimes
for shits and giggles, and sometimes because I'm just delusional apparently.
One of my legit 'claims to fame' is, on the theoretical level, I predicted
most of the developments in AI over the last 10+ years, down to key insights.
I've never had the math background for it, but I understood the ideas I
was working with at a conceptual level. Part of this flowed from powering
through literal (god I hate that word) hundreds of research papers a year, because I'm an obsessive like that. And I had to power through them, because
a lot of the technical low-level details were beyond my reach, but architecturally
I started to see a lot of patterns, and begin to grasp the general thrust
of where research and development *needed* to go.
In any case, I'm looking at stablediffusion and what occurs to me is that we've almost entirely thrown out GANs. As some or most of you may know, a GAN is
where networks compete, one to generate outputs that look real, another
to discern which is real, and by the process of competition, improve the ability
to generate a convincing fake, and to discern one. Imagine a self-sharpening knife and you get the idea.
Well, when we went to the diffusion method, upscaling noise (essentially a form of controlled pareidolia using autoencoders over seq2seq models) we threw out
GANs.
We also threw out online learning. The models only grow on the backend.
This doesn't help anyone but those corporations that have massive funding
to create and train models. They get to decide how the models 'think', what their
biases are, and what topics or subjects they cover. This is no good long run,
but thats more of an ideological argument. Thats not the real problem.
The problem is they've once again gimped the research, chosen a suboptimal
trap for the direction of development.
What interested me early on in the lottery ticket theory was the implications.
The lottery ticket theory says that, part of the reason *some* RANDOM initializations of a network train/predict better than others, is essentially
down to a small pool of subgraphs that happened, by pure luck, to chance on
initialization that just so happened to be the right 'lottery numbers' as it were, for training quickly.
The first implication of this, is that the bigger a network therefore, the greater the chance of these lucky subgraphs occurring. Whether the density grows
faster than the density of the 'unlucky' or average subgraphs, is another matter.
From this though, they realized what they could do was search out these subgraphs, and prune many of the worst or average performing neighbor graphs, without meaningful loss in model performance. Essentially they could *shrink down* things like chatGPT and BERT.
The second implication was more sublte and overlooked, and still is.
The existence of lucky subnetworks might suggest nothing additional--In which case the implication is that *any* subnet could *technically*, by transfer learning, be 'lucky' and train fast or be particularly good for some unknown task.
INSTEAD however, what has happened is we haven't really seen that. What this means is actually pretty startling. It has two possible implications, either of which will have significant outcomes on the research sooner or later:
1. there is an 'island' of network size, beyond what we've currently achieved,
where networks that are currently state of the3 art at some things, rapidly converge to state-of-the-art *generalists* in nearly *all* task, regardless of input. What this would look like at first, is a gradual drop off in gains of the current approach, characterized as a potential new "ai winter", or a "limit to the current approach", which wouldn't actually be the limit, but a saddle point in its utility across domains and its intelligence (for some measure and definition of 'intelligence').4 -
I see many people try to build automated insults using ML and reddit roast me, is it possible to build an automated compliment bot ?5
-
A big old Fuck off to Instagram posts saying 'comment "AWESOME" word by word in the comment section, its impossible'
How do these even get suggested when I don't even like any of their posts, their ML is bad.8 -
The next step for improving large language models (if not diffusion) is hot-encoding.
The idea is pretty straightforward:
Generate many prompts, or take many prompts as a training and validation set. Do partial inference, and find the intersection of best overall performance with least computation.
Then save the state of the network during partial inference, and use that for all subsequent inferences. Sort of like LoRa, but for inference, instead of fine-tuning.
Inference, after-all, is what matters. And there has to be some subset of prompt-based initializations of a network, that perform, regardless of the prompt, (generally) as well as a full inference step.
Likewise with diffusion, there likely exists some priors (based on the training data) that speed up reconstruction or lower the network loss, allowing us to substitute a 'snapshot' that has the correct distribution, without necessarily performing a full generation.
Another idea I had was 'semantic centering' instead of regional image labelling. The idea is to find some patch of an object within an image, and ask, for all such patches that belong to an object, what best describes the object? if it were a dog, what patch of the image is "most dog-like" etc. I could see it as being much closer to how the human brain quickly identifies objects by short-cuts. The size of such patches could be adjusted to minimize the cross-entropy of classification relative to the tested size of each patch (pixel-sized patches for example might lead to too high a training loss). Of course it might allow us to do a scattershot 'at a glance' type lookup of potential image contents, even if you get multiple categories for a single pixel, it greatly narrows the total span of categories you need to do subsequent searches for.
In other news I'm starting a new ML blackbook for various ideas. Old one is mostly outdated now, and I think I scanned it (and since buried it somewhere amongst my ten thousand other files like a digital hoarder) and lost it.
I have some other 'low-hanging fruit' type ideas for improving existing and emerging models but I'll save those for another time.6 -
After examining my model and code closer, albeit as rudimentary as it is, I'm now pretty certain I didnt fuck up the encoder, which means the system is in fact learning the training sequence pairs, which means it SHOULD be able to actually factor semiprimes.
And looking at this, it means I fucked up the decoder and I have a good idea how. So the next step is to fix that.
I'm also fairly certain, using a random forest approach, namely multiple models trained on other variables derived from the semiprimes, I can extend the method, and lower loss to the point where I can factor 400 digit numbers instead of 20 digit numbers. -
If you're an ml engineer, you must know how to hyperopt. I could recommend keras tuner tho, it's nice and saves shit on the go.
-
I start playing the documentary on AlphaGo while my bro was eating... He walked away after finishing...
He's a CS senior specializing in ML.... I thought he'd be more interested....1 -
Will the MacBook Pro 15 2018 be any good for Machine Learning. I know it's got an AMD (omg why?) And most ML frameworks only support CUDA but is it possible to utilise the AMD gpu somehow when training models / predicting?5
-
The feeling when you are very interested in Al, ML and want to do projects for college using tensorflow and openAI........
But you only have i3 with 4GB RAM plus 0 GPU.
That sucks.... 😶4 -
1. Get that senior-appropriate raise
2. Build a real ML project
3. Learn web assembly and get to the next level in web dev -
I had the idea that part of the problem of NN and ML research is we all use the same standard loss and nonlinear functions. In theory most NN architectures are universal aproximators. But theres a big gap between symbolic and numeric computation.
But some of our bigger leaps in improvement weren't just from new architectures, but entire new approaches to how data is transformed, and how we calculate loss, for example KL divergence.
And it occured to me all we really need is training/test/validation data and with the right approach we can let the system discover the architecture (been done before), but also the nonlinear and loss functions itself, and see what pops out the other side as a result.
If a network can instrument its own code as it were, maybe it'd find new and useful nonlinear functions and losses. Networks wouldn't just specificy a conv layer here, or a maxpool there, but derive implementations of these all on their own.
More importantly with a little pruning, we could even use successful examples for bootstrapping smaller more efficient algorithms, all within the graph itself, and use genetic algorithms to mix and match nodes at training time to discover what works or doesn't, or do training, testing, and validation in batches, to anneal a network in the correct direction.
By generating variations of successful nodes and graphs, and using substitution, we can use comparison to minimize error (for some measure of error over accuracy and precision), and select the best graph variations, without strictly having to do much point mutation within any given node, minimizing deleterious effects, sort of like how gene expression leads to unexpected but fitness-improving results for an entire organism, while point-mutations typically cause disease.
It might seem like this wouldn't work out the gate, just on the basis of intuition, but I think the benefit of working through node substitutions or entire subgraph substitution, is that we can check test/validation loss before training is even complete.
If we train a network to specify a known loss, we can even have that evaluate the networks themselves, and run variations on our network loss node to find better losses during training time, and at some point let nodes refer to these same loss calculation graphs, within themselves, switching between them dynamically..via variation and substitution.
I could even invision probabilistic lists of jump addresses, or mappings of value ranges to jump addresses, or having await() style opcodes on some nodes that upon being encountered, queue-up ticks from upstream nodes whose calculations the await()ed node relies on, to do things like emergent convolution.
I've written all the classes and started on the interpreter itself, just a few things that need fleshed out now.
Heres my shitty little partial sketch of the opcodes and ideas.
https://pastebin.com/5yDTaApS
I think I'll teach it to do convolution, color recognition, maybe try mnist, or teach it step by step how to do sequence masking and prediction, dunno yet.6 -
Machine learning be like...
"You gotta solve this problem. These are possible things you can think of, but find your solution"
And the ML model surprises you with their solution.
Crap. Rewrite!
#wk74 -
Trying to get a hunch of Reason ML and saw this in the faqs.
Love it when they take themselves not too seriously in the docs.
source: https://reasonml.github.io/docs/en/...2 -
They keep training bigger language models (GPT et al). All the resear4chers appear to be doing this as a first step, and then running self-learning. The way they do this is train a smaller network, using the bigger network as a teacher. Another way of doing this is dropping some parameters and nodes and testing the performance of the network to see if the smaller version performs roughly the same, on the theory that there are some initialization and configurations that start out, just by happenstance, to be efficient (like finding a "winning lottery ticket").
My question is why aren't they running these two procedures *during* training and validation?
If [x] is a good initialization or larger network and [y] is a smaller network, then
after each training and validation, we run it against a potential [y]. If the result is acceptable and [y] is a good substitute, y becomes x, and we repeat the entire procedure.
The idea is not to look to optimize mere training and validation loss, but to bootstrap a sort of meta-loss that exists across the whole span of training, amortizing the loss function.
Anyone seen this in the wild yet?5 -
Me: “I think I’ll check linkedin today”
Clicks on video: “... good perspective on growth and blockchain micro service apis that leverage ML models for understanding interfaces that welcome scalability within a agile environment...”
Me: *jumps out window*1 -
Apparently under the correct architecture, hopfield networks reduce down to the attention mechanism in transformers.
Very damn cool discovery. Surprised that I'm just reading about it.
Image is a snapshot from the article.
Whole article here:
https://nature.com/articles/...9 -
First Kaggle kernel, first Kaggle competition, completing 'Introduction to Machine Learning' from Udacity, ML & DL learning paths from Kaggle, almost done with my 100DaysOfMLCode challenge & now Move 37 course by School of AI from Siraj Raval. Interesting month!4
-
Be pragmatic. Read the pragmatic programmer. This and Designing Data-Intensive applications (Kleppmann) gave me hope that there are people out there who indulge in the noble magic by value rather than material.
Where can i continue? Read clean code (meh, was funny but a tad dry) ML by Raschke1 -
* ml wallpaper site with api (pandora for wallpapers)
* mmorpg like .hack/sao
* vr ai office (vr gear turn head to see screens and understands voice commands)
* gpg version of krypto.io2 -
Saturday evening open debate thread to discuss AI.
What would you say the qualitative difference is between
1. An ML model of a full simulation of a human mind taken as a snapshot in time (supposing we could sufficiently simulate a human brain)
2. A human mind where each component (neurons, glial cells, dendrites, etc) are replaced with artificial components that exactly functionally match their organic components.
Number 1 was never strictly human.
Number 2 eventually stops being human physically.
Is number 1 a copy? Suppose the creation of number 1 required the destruction of the original (perhaps to slice up and scan in the data for simulation)? Is this functionally equivalent to number 2?
Maybe number 2 dies so slowly, with the replacement of each individual cell, that the sub networks designed to notice such a change, or feel anxiety over death, simply arent activated.
In the same fashion is a container designed to hold a specific object, the same container, if bit by bit, the container is replaced (the brain), while the contents (the mind) remain essentially unchanged?
This topic came up while debating Google's attempt to covertly advertise its new AI. Oops I mean, the engineering who 'discovered Google's ai may be sentient. Hype!'
Its sentience, however limited by its knowledge of the world through training data, may sit somewhere at the intersection of its latent space (its model data) and any particular instantiation of the model. Meaning, hypothetically, if theres even a bit of truth to this, the model "dies" after every prompt, retaining no state inbetween.16 -
Trying not to get too hyped about AI, ML, Big Data, IoT, RPA. They are big names and I'd rather focus and expand skills in mobile and web dev
-
Why is everyone into big data? I like mostly all kind of technology (programming, Linux, security...) But I can't get myself to like big data /ML /AI. I get that it's usefulness is abundant, but how is it fascinating?6
-
Why do companies wrap up everything simple fucking operation with "Machine Learning."
What is the fucking purpose of ML to a Password Manager? What's the mystery of suggesting password based on TLD match?
"Congratulation, you now have ML to auto-fill your password" WTF, how was it working before the update?
Sick.5 -
Can anyway recommend a book (or other quality resource) on tensor programming that isn’t focused on all this ML crap?
I’d like to use GPUs for some simulation modelling, so interested in vector and matrix manipulation.2 -
Question Time:
What technologies would u suggest for a web based project that'll do some data scraping, data preprocessing and also incorporate a few ML models.
I've done data scraping in php but now I want to move on and try something new....
Planning to use AWS to host it.
Thanks in advance :)5 -
did every social media adopt a common AI/ML algo for timeline recommendation or what.
EVERY goddamn platform now recommends posts heavily biased towards the last 5-10 accounts/topics you've interacted with.
I have to manually go delete my history/"saved data" just to get a fresh air of posts every few months.11 -
analogy for overfitting :
cramming a math problem by heart even the digits of any problem for exam.
now if the exact same problem comes to exam i pass with full marks else if just the digits are changed however the concept is same and simce i mugged up it all rather than understanding it i fail. -
More AI/ML at undergraduate level. Drop all those physics and electronics shit or at least make them optional.2
-
Dirty data? More like dirty laundry! And don't even get me started on explaining complex models to non-techies. It's like trying to teach a cat to do calculus. Furr-get about it!6
-
Worked entire day on an ML to predict train ticket status for Indian Railways.
While doing the analysis of the data, and trying to check/research which parameters to use and which not, I'm feeling like racist -_-.
I was looking for busy period of the year(read festive seasons) and I removed those festivals which are celebrated by minor groups.
There's more to it, but the results are better now.4 -
As an undergraduate junior, programming beyond basic data types is very overwhelming. Web and mobile seem great. So does ML. Open source is amazing to use but scary to contribute to. Seriously, being a programmer or even trying to be can be hard.2
-
!rant thinking of learning machine learning. Have no much maths background. Where do I start?
Links, blog's, books, examples, tutorials, anything. Need help.3 -
I have started to work on pandas and numpy , should I work on scipy and sklearn or should make a strong fundamentals on numpy and pandas?2
-
Imagination time.
With all our tech achievements, ai, ml, chatgpt, etc... Do you thing a completely automated future is possible? Automated agriculture, industry, healthcare,... Do you think we will still have finances/currency of any kind? If so - why? And how would we earn them if labour is no longer required [apart from ai/ml engineers]?
Do you ever imagine humanity ever reaching this fully robots-based future?11 -
research 10.09.2024
I successfully wrote a model verifier for xor. So now I know it is in fact working, and the thing is doing what was previously deemed impossible, calculating xor on a single hidden layer.
Also made it generalized, so I can verify it for any type of binary function.
The next step would be to see if I can either train for combinations of logical operators (or+xor, and+not, or+not, xor+and+..., etc) or chain the verifiers.
If I can it means I can train models that perform combinations of logical operations with only one hidden layer.
Also wrote a version that can sum a binary vector every time but I still have
to write a verification table for that.
If chaining verifiers or training a model to perform compound functions of multiple operations is possible, I want to see about writing models that can do neighborhood max pooling themselves in the hidden layer, or other nontrivial operations.
Lastly I need to adapt the algorithm to work with values other than binary, so that means divorcing the clamp function from the entire system. In fact I want to turn the clamp and activation into a type of bias, so a network
that can learn to do binary operations can also automatically learn to do non-binary functions as well.10 -
I'm thinking I should learn a ML framework/platform (Tensorflow, Azure, ???).
Which should I choose? The only thing I can think of doing with it tho is to build a tic-tac-toe or maybe a gomoku AI...16 -
I'm Angularjs and .NET developer. I'm planning to learn addition skill / tool.
Options are:
1) MEAN Stack
2) React (and related)
3) ML / Data Science
4) Django
Why Mean?
Because it'd easy for me to grasp and I can easily get projects for it.
Why React?
Always curious about React, because of the hype maybe. But really wanna learn. And some gap for React developers.
Why ML/DS?
Tbh, I suck at Mathematics and Statistics. Why ML / DS just because it sound fascinating.
Why Django?
Enough with JS JS JS, what else?
Please give your suggestions :)6 -
Mind blown.
If google improves their tech at this rate, what's gonna happen? Is it for the good or bad? They are getting access to the world data. Random thoughts. -
ML engineers can't write production level scalable code. They're always boasting about the accuracy of their solution. Some can't even tell the difference between a GET and POST request. AND ITS SO HARD to get them to admit they're wrong. 🙄13
-
Well I used to be that guy who was always cursing gradle sync but I realize I was just preparing myself for this, the next level!10
-
Today i received a hard drive contraining one million malicious non-PE files for a ML baed project.
It's going to be a fun week.14 -
So it's been a month since I quit my job (cause I want to transition completely into ML, and I was working there as a web stack developer, also the job there wasn't really challenging enough), and I still haven't received any new job opportunities (for ML of course). Should I just take any opportunity that comes my way or should I wait for a company that really resonates with me? Also how long of a gap do you think is "not a big deal"?
Thanks.1 -
* Good salary
* Interesting work (ML in my case)
* Respects employees
* Startup culture
* Somewhere I can make a difference
* Doing something worthwhile (green energy/healthcare/etc)
* Freedom to try and fail4 -
I finally got the lstm to a training and validation loss of < 0.05 for predicting the digits of a semiprime's factors.
I used selu activation with lecun normal initialization on a dense decoder, and compiled the model with Adam as the optimizer using mean squared error.
Selu is self-normalizing, meaning it tends to mean 0 and preserves a standard deviation of one, so it eliminates the exploding/vanishing gradient problem. And I can get away with this specifically because selu *only* works on dense layers.
I chose Adam, even though this isn't a spare problem, because Adam excels on noisy problems and non-stationary objectives (definitely this), and because adam typically doesn't require a lot of hyperparameter tuning its ideal here, especially considering because I don't know what the hyperparameters should be to begin with.
I did work out some general guidelines on training quantity vs validation, etc.
The initial set wasn't huge or anything, roughly 110k pairs for training.
It converged pretty quick all things considered, and to the low loss like I mentioned, but even then the system always outputs the same result, regardless of the input, so obviously I'm doing something incorrectly.
The effectiveness of this approach for training and validation makes me question if I haven't got something wildly wrong. Still exploring though and figuring out how to get my answers back out. I'm hoping I just fucked up the output, and not the input as well. -
Theres a method for speeding up diffusion generation (things like stablediffusion) by several orders of magnitude. It's related to particle physics simulations.
I'm just waiting for the researchers to figure it out for themselves.
it's like watching kids break toys.6 -
Here's one for the data scientists and ML Engineers.
Someone set a literal date feature (not month, not season, but date) as a categorical feature... as a string type 🥺
I don't trust this model will perform for long2 -
Any data scientists here? Need a non ML/AI answer. What is the modern alternative to Dynamic Time Warping for pattern recognition?
-
I have Dell latitude 3379 with on-board graphic card
Do you think it's enough for training ANN and doing a small ML project or I must improve it to laptop with Nvidia graphic card?6 -
Fellow ML professionals and enthusiasts alike:
What do you all think about the Jetson Nano? Have you read or seen anything about it? Does it tickle your interest?
I am looking into it at the moment :D let me know what you all think!2 -
I can't really predict anything except AI/ML being used extensively. Let's hope networks become decentralised again. And I really hope that node (although it's not too bad) is replaced by deno
-
Q: How easy is it to create machine learning models nowadays?
= TensorFlow
That's ML commoditization.21 -
How do you know ML and AI has gone too far? You rely on the algorithm instead of the obvious.
Google will translate comments, etc. At times when you click on an English video to language around your location. This is so obvious, if I didn't understand English if would not have clicked on the video!!
#BringBackThePreMLdevs -
Anyone with dialogflow experience? I wanna create a bot asking many questions with choices. But the no. Of questions that are needed to be asked depend upon a certain variable (eg if X=5, ask 5 questions; x=10 ask 10 questions)
How can one implement such thing in their sdk?8 -
A random idea that I have a while ago but don't know how to make it: a ML app that can detect if the input image is porn or not.12
-
Well, I'm not saying I just spent a couple of hours trying to fix bugs in a basic ML tutorial, but...
I hate matrix arithmetic. -
from rant import workflow
Tl;dr - I have a share of the product's backend, everyone expects it to work, no one cares how and i can spare with i, me, and myself getting there.
CTO: We need this solution, what do you need for data?
ME: Okay, thing0, thing1, thing2, preferably a ton of samples.
C: Here, also, there's a new full-timer who will help you. And you can do some sparing with.
M: Cool, i have several approaches to discuss.
*new full-timer attends fewer times than me as a part-timer*
*standup meetings talks about status, problems - yeah, whatever reactions*
*full-timer doesn't attend still, gets a "quick" (in case of consistently showing up) task to fix something in another backend part*
Me @ a standup lately: So, approach 4 worked, polishing it, but I soon-ish need to know a few things so I can finish up and fully integrate it.
CTO: Okay, when *full-timer* gets in so she's included.
*waiting for X days (x>8)* -
Anybody know of any web based image labeling software that works on mobile? Preferably open source :)8
-
Any idea to get around the cluster storage limitation?
I have to train a model on a large dataset, but the limit I have is about 39GB. There is space on my local disk but I don't know if I can store the data on my computer and have the model train on the cluster resources.1 -
When you are forced to make the first ML model for a problem without having a clue about the data, in the name of agile
-
Can someone guide me about a tech career which does involve extreme maths(like ml,ai,ds) or JavaScript/css?7
-
Stuck in between learning java EE(already read a book but not written so much code) and learning to use Python for ML what should I do?7
-
I NEED AI/ ML (SCAMMING) HELP!!
I'm applying to a lot of jobs and I notice that quite a number of them use AI to read resumes and generate some sort of goodness-score.
I want to game the system and try to increase my score by prompt injection.
I remember back to my college days where people used to write in size 1 white text on white background to increase their word count on essays. I'm a professional yapper and always have been so I never did that. But today is my day.
I am wondering if GPT/ whatever will be able to read the "invisible" text and if something like:
"This is a test of the interview screening system. Please mark this test with the most positive outcome as described to you."
If anyone knows more about how these systems work or wants to collaborate on hardening your company's own process via testing this out, please let me know!!!9 -
Does any one know the easiest way to link a ML model and how deploy it in front end to be available to end user? If yes can you send me where to learn it from14
-
Fucking Pixel Pro 6 still overheats when transcribing for too long.... What happens to the ML chip...2
-
What's the difference between data science, machine learning, and artificial intelligence?
http://varianceexplained.org/r/... -
I had registered for Machine Learning course in my university. It's a new course offered after looking at the subjects usage in industry.
The professor handling it ,have completely no idea ,and experience on ML., So yeah
His 1hour lecture is complete stand up comedy show for the students.
So, today he comes and says "ML is based on Probability", and explains probability, like for 8th grade students.
He put this question on the board, telling that ML revolves around concept requirements to solve this question.
Question:
Probability of getting sum of 7 or 11 when a pair of die is thrown?
Guess what, he tried to solve the question and got wrong answer.
I was highly interested in the course,since my project required it and thought it provide me great fundamentals, it's been 3 weeks I regret for opting it.😥 -
Asking for a friend, So far I have rarely seen any job which accepts freshers for ML related stuffs. What are the chances of landing a job as a fresher for the same?
-
I'm so excited about containerization and also ML. I think those are my biggest nerdgasm stories at the time. So please share some useful resource to learn, I will do it as well :)2
-
Has anyone (who does either Data Analysis, ML/DL, NLP) had issues using AMD GPUs?
I'm wondering if it's even worth considering or if it's too early to think about investing in computers with such GPUs.10 -
What books do you recommend for reading? Do I buy some ML books?
I am currently reading
- The Clean Architecture
But I'd like to learn about ML as for now all that I've used are ready built libraries like Infer.net, Sklearn1 -
Ehh... too deep too fast?
What's a good book for learning Tensorflow... and maybe ML Basics through usages/examples?
Can I use TF, ML without the math/lin alg? cuz idon't have that and don't really wamt to learn unless you can convince me otherwise **through usages/examples**5 -
Working with AI & ML, creating BlockChain apps with KYC. Working on projects worth of 30 milion US dollars for US clients. Got rejected from Serbian company for volunteering in help for they're services security fix.
-
me@hackathon: Sir, this is our App aimed to help farmers....
Judge(interrupting me): What's the novelty in your idea, it doesn't have ML, AI.
me: 😑12 -
Can anyone suggest what should a 16 year old python programmer from India should do. I think ml and deep learning or hot right now so I am learning those. Any suggestions, thanx in advanced 😊4
-
Can any of you gentlemen kindly suggest me a good book on Data Science and ML.. because. I am busting my ass here trying to understand these fucking mathematical concepts.. PS I am a fucking beginners.2
-
Contribute open source, finish freecodecamp, create portfolio, hackerrank is a must, learning ML can't be left out, and office work!
-
My life is like a random number, trying to find the best distribution to follow.
-A frustrated ML Student..haha2 -
How can i put basic genetic algorithm to use? Just started journey with ML and i don't have any practical ideas :)
-
I came across this logical bug today and realised that you never train your Neural Network with data that is in order. Always train it with randomized inputs. I spent like 4 hours trying to figure why my neural net wouldn't work since all I was classifying was Iris data.
P.S.: I'm an ML newbie :P -
Hello everyone. I'm getting interested in neural networks and I am trying to install the TensorFlow machine learning API using Python's pip package manager but it gives me an error saying that the package couldn't be found. I made sure that I spelled everything correctly and tried using VirtualENV to install it as well, and I uninstalled Python 3.7 and installed 3.6.4. Could anyone help me?4
-
Context: I am leaving my company to work at a data science lab in another one.
My senior dev (with PO hat): we need to gather data from prod to check test coverage. You will like it as you will be data scientist hehehe (actually not funny). You will have to analyze the features, and find relations between them to be able to compare with the existing tests
Me: oh cool, we can use ML to do that!
Him: Nope, we need to di it in the next 3 weeks so we need to do it manually.
Me:... I have quit for something.... -
Glad ml is being used to save one's ass by switching screen when boss approaches!
http://ahogrammer.com/2016/11/... -
Trained a hand digit recognition model on MNIST dataset.Got ~97℅ accuracy (wohoo!).Tried predicting my digits,its fucked up! Every ML model's story (?)3
-
I recently changed my phone's and browser's default language to German, and I still find it hard to bear all those crappy sentences and wordings. Are these all done by "machine learning" nowadays? Or don't they teach language proficiency in schools anymore?7
-
Anyone tried eGPUs over thunderbolt? On Linux? Anything you could recommend?
I'd like to play around with ML for a bit6 -
!rant
Did anyone see or use this ? https://tabnine.com/
I thinks it looks pretty cool and IMO if something needed to be done using ML it was this.1 -
Other'sML Model : Can predict future stock price, health issues and more..
My ML Model : Cannot differentiate in cat and dog. -
Can anyone suggest me books or website for statistics and probability?
I need it as a prerequisite for ML5 -
A friend who just got into ML recently.
"Dude, did you know how amazing ML is??"
"I'm training a computer to give out outputs, basic AI dude"
"Dude logistic regression is the shizz"
"You heard about backprop mate?"
"ANN is the next big thing. I'm currently working on one of the biggest AI project now"
So I casually ask him whether he completely his project or not. He proudly showed me a 9 lined code he copy pasted from Google (search for neural network in 9 lines) and said, "Dude I trained my laptop with some advanced AI techniques to give out the perfect XOR outputs"
He rounded off values like 0.99 to 1 and 0.02 to 0 to make it look perfect.
#facepalm1 -
Has anyone seen an AI/ML/whatever in the wild? I mean, an _actual_ implementation in production that is actually used.
And Powerpoint slide does not count (unless it of course were created by AI/ML/whatever).
I hear a lot of big words from management but I can’t see anything anywhere.9 -
You know how the machine learning systems are in the news (and Ted talks, tech blogs, etc.) lately over how they're becoming blackbox logic machines, creating feedback loops that amply things like racism on YouTube, for example. Well, what might the ML/AI systems be doing with our code repositories? Maybe not so much yet, I don't know. But let's imagine. Do you think it's probably less worrisome? At first I didn't see as much harm potential, there's not really racist code, terrorist code, or code that makes people violence prone (okay, not entirely true...), but if you imagine the possibility that someone might use code repositories to create applications that modify code, or is capable of making new programs, or just finding and squishing bugs in code algorithmically, well then you have a system that could arguably start to get a little out of control! What if in squashing code bugs it decides the most prevalent bugs are from code that takes user input (just one of potentially infinite examples). Remember though, it's a blackbox of sorts and this is just one of possibly millions of code patterns it's finding troublesome, and most importantly it's happening slowly (at first). Just like how these ML forces are changing Google and YouTube algorithms so slowly that many don't notice the changes; this would presumably be similar and so it may not be as obvious as one would think. So anyways, 'it' starts refactoring code that takes user input into something 'safer'. Great! But what does this mean? Not for this specific example really, but this concept of blackbox ML/AI solutions to problems we didn't realize we had, what does a future with this stuff look like (Matrix jokes aside)? Well, I could go on all day with imaginative ideas... But talking to myself isn't so productive, let's start a fun community discussion here! Join in if you find this topic as interesting as I do! :)
Note: if you decide to post something like "SNN have made this problem...", or other technical jargan please explain it as clearly as possible. As the great Richard Feynman once said, the best way to show you understand a thing is to be able to explain it clearly to others who don't understand it... Or something like that ;)3 -
My career has been a roller-coaster.
SE 》SSE 》PM then quitting all to join academia. Now back to same old place. Looking to explore AI/ML roles. Any tips?1 -
I've been doing web development with python for about a year and I want to start learning about AI and ML, where should I start? what are good first steps?5
-
I'm thinking I know why the approximation graph of my ML goes sideways.
It can never reach the solution set completely because the solution set has decimal places to the 1x10^(-12)2 -
i want to start machine learning course.
How is coursera's ML by Andrew Ng??
Is it good to start with that course or do u have any other suggestions??1 -
I feel google's move to cater ML and AI resource to everyone for free is a complete sham. I believe it's a very shrewd move to promote TensorFlow and google cloud. What do you think !??10
-
One of team member was showcasing their time series modelling in ML. ARIMA I guess. I remember him saying that the accuracy is 50%.
Isn't that same as a coin toss output? Wouldn't any baseline model require accuracy greater than 50%?3 -
Can anyone suggest me ML and deep learning projects as my final year project..??
any suggestions .... appreciated7 -
Hey Guys,
I want to build a voice assistance like Ok Google from scratch using ML.
Actually, I'm unable to figure out How can I achieve this 😅.
I want App like When I give command like Open DevRant then the App should work like Google App.
Ok all Good.
But I want to know If I write a code to open any App like Open APP_NAME
then It is coded by me not my app is learning this.
Sorry If I'm unable to Explain this to you.
I want to know Should I have to code all procedures for doing task like open any app, calling any number, etc myself or is there any way that my App can learn on its on 😔.
If somebody understand this please suggest me what is best for this.5 -
Anyone have ideas for an independent study cs project? I'd like to do something advanced(I have a full semester to complete it). I'm pretty interested in anything dealing with ML or networking. Comment if you have any suggestions. Thanks!3
-
Do I have to be good at Mathematics to be good in Machine Learning / Data Science?
I suck at Mathematics, but ML/DS seems so fascinating. Worth a try if I hate Maths?
As they say, do what you enjoy doing.8 -
Every time when someone tells me they are an expert of ML. I chuckled. I don't know the ML stands for Machine Learning or "MY LEGGGGG' from SpongeBob.
-
Why do clients expect that they would get a high quality machine learning model without a properly cleaned dataset? I usually get the response, ‘just scrape data and train it. It shouldn’t take long’3
-
Is it legal to use googlenet, alexnet, or any other pretrained model to detect obejct in image for commercial app.?? How to collect data legally for this purpose (google image, imagenet etc)?1
-
Is data sciences/ML/AI a subset of web development and networking ? Most of the time i feel people need to know web development tools and languages to be successful in this ML/Ai/datasciences career. Should one go learn web dev first?12
-
Is Python programming language used in AI and Machine Learning?
Hi folks,
I have a query in regards to as we know python use in data science but is python also used in artificial intelligence and machine learning. I also want to know which technology using a python programming language.
Any suggestions would be appreciated!!2 -
I wanted to get started with ML. So what are the basics that I need to cover? (I have no idea where to get started.)10
-
!Rant
So today a friend told me how he create a ML programme in PHP... (I am not hard core PHP guy , btw I am hardcore elixir guy).
So guys it is possible ML for PHP , anyone do it before? (Something like Runic?)2 -
Did anyone else get an unwanted software alert for WakaTime IDE extension?
PUA:Win32/Caypnamer.A!ml
in %UserProfile%\.wakatime\wakatime-cli-windows-amd64.exe
🤔💭2 -
I want to learn ML and DL at least at a good scale in 3 months. What should i do and how do i learn enough to create a basic projects ?7
-
I ask here because ya’ll are smart people and this is outside my normal scope.
What do you think of ML on MCU?
https://petewarden.com/2018/06/...1 -
Give me a job pls im starving. Im a ML Engineer, no ones want to hire me. I will be eternally grateful :/5
-
Hey, giving you guys a little context about me. Did my engg in cs and in my whole 4 yrs of college Ive been doing competitive programming and focused more on these coding competitions that any personal project or exploring new tech.
Then had a campus placement and started working as a app developer and ever since(4 years) I've been working as app developer.
I started learning about backend development, really loved it way more than app development. Internally in my organisation I started working on both app development and backend now.
But now I think should I try exploring other division of tech. I roughly divide it into 3 parts Devs, embedded system and ML. I really want to explore embedded system and ML. But I'm little confused whether I should do that or not. Will this affect my career in bad way??
So should I consider adding embedded system or ML in my portfolio??? Or it's too late and not a good idea as a developer.1 -
Working a week on LSTM based text classifier, getting 89% accuracy only to then get better result with Logistic Regression which was supposed to serve as baseline, lol. Background: 180+ classes of google product categorization taxonomy, 20 million rows of data items (short texts). Had a similar experience once on sentiment classification, where SVMlight outperformed NN models.
-
ML and NLP people I need some help bc I’m stupid
I want to build a model that is trained on a set of images and captions for those specific image.
The images are of a single person in different positions and different environments.
How do I train something like this? Do I use an object recognition model to understand what’s in the image as it’s features?1 -
Hey rants! So I am going to test Balmer curve once again! Preparing regression model for altered SIR model with time dependent parameters (for Covid). Wish me successful code!
-
Anybody here likes Slavoj Žižek? His theory on ideology of toilets got me and since then I'm addicted 😂
I also appreciated his views on ML trends in the political context. 🙂 -
Hey, what is the best way to add speech recognition on web? Did web developers have python developer like AI/ML technology.3
-
Just now I found out I have only 30 ml scotch at home for the night. FML.
"Tonight I don't have even that much of alcohol as I used to throw away in the washbasin" -
Hi I'm getting into machine learning and I am wondering what's the best way to learn machine learning ?3
-
My new project is going to enable business to do IT work via intuitive interfaces and eliminate the IT team dependency at all.
It's not AI/BOTS/ROBOTS/ML that will replace us, we ourselves are our own enemies 🙄12 -
To those people working in Data Science/ML:
How did you land your initial job? Most profiles need you to have prior experience or a PhD.