having a hard time doing music these days
but jesus christ, sometimes i'm listening to my own shit and this is such a banga
https://youtube.com/watch/...
maybe i should send it to some dj friendz so they can play it in a rave
maybe the techno club
the thing is i feel like it's a bit of a one-off, i can't seem to make something that good these dayz
maybe i should go to the woods and candyflip
wtf am i doing here -
Polish military has the official "8 wounds" chevron that is given to those who sustained 8 battle wounds. Do you know why Americans don't have those? Because you have to be Polish to get wounded eight times in battle and still be alive enough to wear this thing on your uniform. Poles are built different.
-
I've been gone a hot minute.
I'm just sharpening sticks to fight in the robo-apocalypse.
John Connor's got nothing on pointy sticks. -
At $work, I just learned that a daemon on prod makes an SFTP connection to the same domain every 0.5 to 10 seconds, all day long, every single day. That’s a minimum of 8,640 connections per day!
The senior developer responsible for it had the dev skills of a junior and the management skills of a puppet, but she’s a “disadvantaged minority” and is great at stealing credit and throwing people under the bus. Naturally, she has been given multiple promotions and a team to lead… which she fills exclusively with other Indians, all of them at her skill level or below. (I used to do their code reviews and security reviews.)
When I asked one of the fintech managers (a former dev) about the crazy number of SFTP connections, he said “[Her team] did that intentionally, as it didn’t used to be that way. They must have had a reason” and cut me off.
Okay then.
Not my garden, not my fertilizer.
Just another day weeding the fields in hell. -
pls stop putting talking into music mixes. you're ruining my jive. I don't wanna hear your opinions. just play the math noises
-
You know what sucks? When AI appears smart but its explanation is so far over your head that you can't even tell whether it's bullshitting or not.
For reference, what the following does is decompose several runs of a network, take them as samples, then generate a distribution from those samples. It then applies a Fourier transform to the samples to get the frequency components of the network's derivatives (first and second order), in order to find winning subnetworks to tune, and it enforces a gaussian distribution in the process.
I sort of understand that, but the rest is basically rocket science to me.
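The part I do get, I can at least sketch. A hedged numpy version of what I think is going on (my own reconstruction, not the pastebin code; the finite differences standing in for the derivatives, the winning_subnetwork name, and the z-scoring as the "enforce a gaussian" step are all my assumptions):

import numpy as np

def winning_subnetwork(grad_runs, keep_frac=0.1):
    # grad_runs: (n_runs, n_weights) array, one row of gradient samples per run
    samples = np.asarray(grad_runs, dtype=float)

    # first- and second-order finite differences across runs stand in for the
    # "first and second order" derivatives mentioned above
    d1 = np.diff(samples, n=1, axis=0)
    d2 = np.diff(samples, n=2, axis=0)

    # Fourier transform along the sample axis -> frequency components per weight
    spec1 = np.abs(np.fft.rfft(d1, axis=0))
    spec2 = np.abs(np.fft.rfft(d2, axis=0))
    energy = spec1.sum(axis=0) + spec2.sum(axis=0)

    # "winning" subnetwork = the weights carrying the most spectral energy
    k = max(1, int(keep_frac * energy.size))
    winners = np.argsort(energy)[-k:]

    # z-score the kept samples, i.e. push them toward a standard gaussian
    kept = samples[:, winners]
    kept = (kept - kept.mean(axis=0)) / (kept.std(axis=0) + 1e-8)
    return winners, kept

# runs = np.random.randn(8, 1000)   # e.g. 8 runs of a 1000-weight toy network
# idx, shaped = winning_subnetwork(runs)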
It starts with an explanation of basic neural nets and goes from there. Most of the meat of the discussion is at the bottom.
https://pastebin.com/DLqe70uD4 -
Adaptive Latent Hypersurfaces
The idea is: rather than adjusting embedding latents, we learn a model that takes the context tokens as input and generates an efficient adapter or transform of the latents, so when the latents are grabbed for that same input, they produce outputs with much lower perplexity and loss.
This can be trained autoregressively.
This is similar in some respects to hypernetworks, but applied to embeddings.
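A rough sketch of what that could look like, assuming PyTorch; the mean-pooled context summary, the low-rank delta, and the LatentAdapter name are just one way to concretize the adapter, not a fixed design:

import torch
import torch.nn as nn

class LatentAdapter(nn.Module):
    # maps the context tokens to a low-rank transform of frozen embedding latents
    def __init__(self, vocab_size, d_model, rank=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)   # base latents, frozen
        self.embed.weight.requires_grad = False
        # hypernetwork-style head: context summary -> (U, V) defining a delta
        self.to_uv = nn.Linear(d_model, 2 * d_model * rank)
        self.rank, self.d_model = rank, d_model

    def forward(self, context_ids):                      # (B, T) token ids
        base = self.embed(context_ids)                   # (B, T, d)
        ctx = base.mean(dim=1)                           # pooled context summary
        u, v = self.to_uv(ctx).chunk(2, dim=-1)
        u = u.view(-1, self.d_model, self.rank)
        v = v.view(-1, self.rank, self.d_model)
        delta = torch.bmm(u, v)                          # (B, d, d) low-rank transform
        # adapt the latents for this context instead of editing them in place
        return base + torch.einsum('btd,bde->bte', base, delta)

Trained autoregressively with the usual LM loss, only the adapter head gets gradients; the base latents stay frozen.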
The thinking is that we shouldn't change latents directly, because any given vector will generally be orthogonal to any other, and changing the latents introduces variance for some subset of other inputs over some distribution that is partially or fully out-of-distribution with respect to the current training and verification data sets, ultimately leading to a plateau in loss-drop.
Therefore, by autoregressively taking an input, and learning a model that produces a transform on the latents of a token dictionary, we can avoid this ossification of global minima, by finding hypersurfaces that adapt the embeddings, rather than changing them directly.
The result is a network that essentially acts as a compressor of all relevant use cases, without leading to overfitting on in-distribution data and underfitting on out-of-distribution data. -
The biggest challenge of building a free energy device is figuring out where to hide the battery.
The biggest challenge of building an AI product is figuring out where to hide API calls to ChatGPT. -
Hey, been gone a hot minute from devrant, so I thought I'd say hi to Demolishun, atheist, Lensflare, Root, kobenz, score, jestdotty, figoore, cafecortado, typosaurus, and the raft of other people I've met along the way and got to know somewhat.
All of you have been really good.
And while I'm here, it's time for maaaaaaaaath.
So I decided to horribly mutilate the concept of bloom filters.
If you don't know what that is: you take two random numbers, m and p, both prime, where m < p, and generate two numbers a and b that define a function. That function is a hash.
Normally you'd have, say, five to ten different hashes.
A bloom filter lets you probabilistically say whether you've seen something before, with no false negatives.
It lets you do this very space efficiently, with some caveats.
Each hash function should be uniformly distributed (any input value is equally likely to map to any output value).
Then you interpret these output values as bit indexes.
So Hi might output [0, 1, 0, 0, 0]
while Hj outputs [0, 0, 0, 1, 0]
and Hk outputs [1, 0, 0, 0, 0]
producing [1, 1, 0, 1, 0]
And if your bloom filter has bits set in all those places, congratulations, you've seen that number before.
It's used by big companies like google to prevent re-indexing pages they've already seen, among other things.
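For anyone who hasn't seen one, a minimal sketch of a vanilla bloom filter before I start mangling it; the ((a*x + b) mod p) mod m hashes are the standard universal-hashing construction, and the parameter choices here are arbitrary:

import random

class BloomFilter:
    def __init__(self, m_bits=64, k=5, p=2_147_483_647):   # p: a large prime
        self.m, self.p = m_bits, p
        self.bits = 0
        # each hash function is fully defined by a random (a, b) pair
        self.hashes = [(random.randrange(1, p), random.randrange(0, p)) for _ in range(k)]

    def _indexes(self, x):
        return [((a * x + b) % self.p) % self.m for a, b in self.hashes]

    def add(self, x):
        for i in self._indexes(x):
            self.bits |= 1 << i           # set the bit at each hash index

    def probably_contains(self, x):
        return all((self.bits >> i) & 1 for i in self._indexes(x))

# bf = BloomFilter(); bf.add(731); bf.probably_contains(731) -> True
# no false negatives, occasionally a false positive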
Well I thought, what if instead of using it as a has-been-seen-before filter, we mangled its purpose until a square peg fit in a round hole?
Not long after, I went and wrote a script that 1. generates data, 2. generates a hash function to encode it, and 3. finds a hash function that reverses the encoding.
And it just works. Reversible hashes.
Of course you can't use it for compression strictly, not under normal circumstances, but these aren't normal circumstances.
The first thing I tried was finding a hash function h0, that predicts each subsequent value in a list given the previous value. This doesn't work because of hash collisions by default. A value like 731 might map to 64 in one place, and a later value might map to 453, so trying to invert the output to get the original sequence out would lead to branching. It occurs to me just now we might use a checkpointing system, with lookahead to see if a branch is the correct one, but I digress, I tried some other things first.
The next problem was 1. long sequences are slow to generate. I solved this by tuning the number of iterations of the outer and inner loop. We find h0 first, then h1, put all the inputs through h0 to generate an intermediate list, then put them through h1, and see if the output of h1 matches the original input. If it does, we return h0 and h1.
It turns out it can take inordinate amounts of time if h0 lands on a hash function that doesn't play well with h1, so the next step was 2. adding an error margin. Something fun happens here: if you allow a sequence generated by h1 (the decoder) to match *within* an error margin, under a certain error value, it'll find potential hash functions hn such that the outputs of h1 are *always* the same distance from their parent values in the original input to h0. This becomes our salt value k.
So our hash-function generator, called encoder_decoder() or 'ed' (lol, two-letter functions), also calculates the k value and outputs it along with the hash functions for our data.
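A stripped-down sketch of the idea (not the actual script; the prime, the ranges, and the try budget are placeholders): brute-force search for an h0 and h1 plus a constant salt k such that h1(h0(x)) == x + k for every value in the data. For arbitrary data it may well exhaust its budget and return None, which matches the run times I was complaining about:

import random

P = 101   # deliberately small prime so the search finishes quickly

def make_hash(a, b, m):
    # universal-hashing style: ((a*x + b) mod P) mod m
    return lambda x: ((a * x + b) % P) % m

def encoder_decoder(data, m=64, tries=200_000):
    # look for h0 (encoder), h1 (decoder) and a salt k with h1(h0(x)) == x + k
    for _ in range(tries):
        a0, b0, a1, b1 = (random.randrange(1, P) for _ in range(4))
        h0, h1 = make_hash(a0, b0, m), make_hash(a1, b1, m)
        diffs = {h1(h0(x)) - x for x in data}
        if len(diffs) == 1:               # constant offset -> that's the salt k
            return (a0, b0), (a1, b1), diffs.pop()
    return None                           # no luck within the budget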
This is all well and good, but what if we want to go further? With a few tweaks, plus taking the output values, converting them to binary, and left-padding each value with 0s, we can then calculate Shannon entropy in its most essential form.
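The entropy measurement itself is the boring part; this is roughly all there is to it (the 16-bit width is arbitrary, use whatever your values need):

from collections import Counter
from math import log2

def shannon_entropy(values, width=16):
    # left-pad each value to a fixed bit width, then measure the empirical
    # per-symbol entropy of the resulting bit string
    bits = "".join(format(v, f"0{width}b") for v in values)
    counts = Counter(bits)
    total = len(bits)
    return -sum(c / total * log2(c / total) for c in counts.values())

# compare shannon_entropy(original_values) against
# shannon_entropy(h1_outputs_with_salt) on the same number of values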
Turns out, with tens of thousands of values (and tens of thousands of bits), the output of h1 with the salt has a higher entropy than the original input. Meaning finding an h1 and h0 hash function for your data is equivalent to compression below the known Shannon limit.
By how much?
Approximately 0.15%
Of course this doesn't factor in the five numbers you need, a0, and b0 to define h0, a1, and b1 to define h1, and the salt value, so it probably works out to the same. I'd like to see what the savings are with even larger sets though.
Next I said, well what if we COULD compress our data further?
What if all we needed were the numbers to define our hash functions, a starting value, a salt, and a number to represent 'depth'?
What if we could rearrange this system so we *could* use the starting value to represent n subsequent elements of our input x?
And thats what I did.
We break the input into blocks of 15-25 items, b/c that's the fastest to work with and find hashes for.
We then follow the math to get a block, which is:
H0, H1, H2, H3, depth (how many items our 1st item will reproduce), & a starting value or 1st item in this slice of our input.
x goes into h0, giving us y. y goes into h1 -> z, z into h2 -> y, y into h3, giving us back x.
The rest is in the image.
Anyway good to see you all again. -
It’s not wise to play chess with a pigeon. It will throw all pieces away, shit on the board and tell everybody it won.
-
Questions in job applications have become a fucking joke.
I'm done with them; from now on I'm only answering stupid questions in the same fashion. -
🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡5
-
Curtains on, lights off, cats out
it's just me and my bonsai now
I don't even need a tight belt, I don't feel hungry anymore -
Sleeping well clears our mind, lets us approach a problem in a different way, and helps us find the solution. For some days I was worried about how to solve a couple of bugs, but this morning, after a good sleep, I magically realized how easy the solution was:
WON'T FIX -
At 14, my grandpa went on a boy scout trip to Brittany, France. He met my grandma near a fountain; they exchanged addresses and started communicating.
They exchanged letters for ten years. In the meantime, she had married a man and had a child. But the husband unfortunately died of tuberculosis.
So they met 10 years later, at the same fountain, and he brought her to Belgium to spend their life together.
RIP bonne maman, you were the best -
if someone in the group blames you for something, do you "accept responsibility" even if you didn't do the thing?
why or why not? -
pLeAsE dOnAtE tO hElP mE aFfOrD hOsTing
lolwut? you have a free-for-all project with no premium features. Users can’t create accounts. All of your code is JS, you don’t even need a server. Just host your stuff on Cloudflare. I have just about a gigabyte of data uploaded there on my free account, so if I can do it, so can you. Just use a CDN for your client-only, JS-only project.
domains though… yours is long and costs around 40 bucks a YEAR.
if you made a good product and want to make some money with it, that’s completely okay, so just say so: “I want money for my work, but I don’t want to take away features that were free and can be provided at no additional cost, so please donate”. Why lie? It’s not like people who won’t donate to you based on this justification will magically donate for “hosting”. -
Context/Prev: https://devrant.com/rants/9820310 (and the rants related to it)
So after accidentally using Arch for 5 months, and then accidentally using Gentoo for 7 months
(It was supposed to be 1 month per distro, but life happened and I didn't have the time and peace of mind for a switch)
Today I'm finally hopping to Slackware
I love the setup's retro/classic ncurses gui <3