Search - "state machine"
-
We're using a ticket system at work that a local company wrote specifically for IT-support companies. It's missing so many features that are essential to us, and they flat-out ignored the feature requests for them. I started dissecting their front-end code to find ways to get the site to do what we want and found a lot of ugly code.
Stuff like if(!confirm("blablabla") == false) and whole JavaScript libraries just to perform one task on one page that are loaded on every page you visit, complaining in the JS console that they are loaded in the wrong order. It also uses a websocket on a completely arbitrary port, making it impossible to work with if you are on a restricted wifi. They flat out lie about their customers not wanting an offline app, even though their communications platform, on which they got asked this question once again, got swarmed with big customers disagreeing, as the performance and design of the mobile webpage are just atrocious.
So I dig further and further, adding all the features we want into a userscript with a neat little 'custom namespace', and I make pretty good progress until I find a page that does asynchronous loading of its subpages all of a sudden. They never do that anywhere else. Injecting code into the overcomplicated jQuery mess that they call code is impossible to me, so I track changes via a MutationObserver (awesome stuff for userscripts, never heard of it before) and get that running too.
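For anyone who hasn't played with it: the core of the MutationObserver trick in a userscript is roughly this (the selector and the inject function are made up for illustration, obviously):

function injectCustomFeatures(el: HTMLElement): void {
  // whatever missing buttons/fields we bolt onto their UI
  el.insertAdjacentHTML("beforeend", "<button>our missing feature</button>");
}

const observer = new MutationObserver((mutations) => {
  for (const mutation of mutations) {
    for (const node of mutation.addedNodes) {
      if (node instanceof HTMLElement && node.matches(".ticket-detail")) {
        injectCustomFeatures(node); // re-inject on every async subpage load
      }
    }
  }
});

observer.observe(document.body, { childList: true, subtree: true });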
The userscript got such a volume of functions in such a short time that my boss even used it to demonstrate to them what we want and asked them why they couldn't do it in a reasonable timeframe.
All in all I'm pretty proud of the script, but I hate that software companies that write such a mess of code, in different coding styles all over the place, even get a foot in the door.
And that's just the code part: they very veeeery often just break stuff in updates that then require multiple hotfixes throughout the day after we complain about it. These errors even go so far as to break functionality completely or just throw 500s in our face. It really gives you the impression that they are not testing that thing at all.
And the worst: They actively encourage their trainees to write as much code as possible to get paid more than their contract says, so of course they just break stuff all the time to write as much as possible.
Where did I get that information, you ask? They state it on their fucking career page!
We also have a reverse proxy in front of that page that manages the HTTPS encryption and Let's Encrypt renewal. Guess what: they internally check if the certificate on the machine is valid, and the system refuses to work if it isn't. How do you upload a certificate to the system, you ask? You don't! You have to mail it to them so they can SSH into the system and install it manually. When will that be possible, you ask? SOON™.
At least after a while I got them to just disable the 'feature'.
While we are at 'features' (sorry for the bad structure): they have this genius 'smart redirect' feature that is supposed to throw you right back where you were once you're done editing something. Brilliant idea, how do they do it? Using a callback link like everyone else? Noooo. A server-side database entry that only gets correctly updated half of the time. Multitasking in multiple tabs (which the performance of that thing almost forces you to do) makes it a whole lot worse, but you are not protected from it even if you don't. Example: you did work on ticket A and save that. You get redirected to ticket B, which you worked on this morning, even though it's fucking 5 o'clock in the evening. So of course you get confused over whether you selected the right ticket to begin with. So you have to check that almost every time.
Alright, rant over.
Let's see if I need to make another one after their big 'all feature requests on hold, UI redesign, everything will be fixed and much better' update.
-
In an alternate universe, devs live in their own country.
They make their own rules and dictate how much they are paid. They maintain the entire world’s infrastructure.
They don't go to work, since their entire country is the workplace and guess what? Cold beers are free (a thank you from the beer company guys for coming up with all their inventory management systems)
Pizza is free too.
There is no government (laws are passed depending on upvotes on devRant )
No racism, sexism or any other ism-ending words. Devs just code.
Oh, and the state police, preferably known as keyboard warriors patrol the streets and offenders are punished by limited internet speeds. 😂. It is said some actually commit suicide because of this unbearable punishment.
Fuck yeah they have coffee farms. That's the only thing they don't accept as gratitude from other nations because those sons of bitches might fuck that up too.
And everyone drives teslas 😂
Okay I have to get back to work now. That multi-universe travel machine won't buy itself.
-
About a year ago I switched my job.
At the start everything seemed like magic. I was the IT director; I was finally able to call the shots on technologies, on new software architecture.
First step was to check the current state of the company.
"qqqq" as each pc password? Ok
No firewall from outside? Lovely
Servers running on Windows Server 2008? Spectacular
People leaving pc on after work and left the machine unlocked just not to type the password? Hell yeah
The IT dude playing games instead of working? But ofcourse
Plaintext passwords on a publicly accessible eshop? Naturally.
The list goes on and on.
After all this time, I'm working like crazy to fix every hole like that, and because it doesn't show results, I'm soon to lose my job. Well, better luck next time as an intern I guess :')
-
I absolutely love the email protocols.
IMAP:
x1 LOGIN user@domain password
x2 LIST "" "*"
x3 SELECT Inbox
x4 LOGOUT
Because a state machine is clearly too hard to implement in server software, clients must instead do the state machine thing and therefore it must be in the IMAP protocol.
SMTP:
I should be careful with this one since there's already more than enough spam on the interwebs, and it's a good thing that the "developers" of these email bombers don't know jack shit about the protocol. But suffice it to say that much like on a real letter, you have an envelope and a letter inside. You know these envelopes with a transparent window so you can print the address information on the letter? Or the "regular" envelopes where you write it on the envelope itself?
Yeah not with SMTP. Both your envelope and your letter have them, and they can be different. That's why you can have an email in your inbox that seemingly came from yourself. The mail server only checks for the envelope headers, and as long as everything checks out domain-wise and such, it will be accepted. Then the mail client checks the headers in the letter itself, the data field as far as the mail server is concerned (and it doesn't look at it). Can be something else, can be nothing at all. Emails can even be sent in the future or the past.
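To make it concrete, a session like this gets accepted just fine by plenty of servers (addresses made up), and the client happily shows the From: from the DATA part, not the envelope:

HELO example.com
MAIL FROM:<someone@example.com>
RCPT TO:<victim@example.org>
DATA
From: victim@example.org
To: victim@example.org
Subject: a letter from yourself

Hi me!
.
QUIT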
Postfix' main.cf:
You have this property "mynetworks" in /etc/postfix/main.cf where you'd imagine you put your own networks in, right? I dunno, to let Postfix discover what your networks are.. like it says on the tin? Haha, nope. This is a property that defines which networks are allowed no authentication at all to the mail server, and that is exactly what makes an open relay an open relay. If any one of the addresses in your networks (such as a gateway, every network has one) is also where your SMTP traffic flows into the mail server from, congrats the whole internet can now send through your mail server without authentication. And all because it was part of "your networks".
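I.e. something as innocent-looking as this (made-up example) is all it takes:

# /etc/postfix/main.cf
mynetworks = 127.0.0.0/8, 192.168.1.0/24
# if 192.168.1.1 is also the gateway that all inbound SMTP arrives through,
# every connection on the planet now looks like "my network": no auth, open relay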
Yeah when it comes to naming things, the protocol designers sure have room for improvement... And fuck email.
Oh, bonus one - STARTTLS:
So SMTP has this thing called STARTTLS where you can.. unlike mynetworks, actually start a TLS connection like it says on the tin. The problem is that almost every mail server uses self-signed certificates, so they're basically meaningless. You don't have a chain of trust. Also not everyone supports it *cough* government *cough*, so if you want to send email to those servers, your TLS policy must be opportunistic, not enforced. And as icing on the cake, if anything is wrong with the TLS connection (such as an MITM attack), the protocol will actively downgrade to plain. I dunno.. isn't that exactly what the MITM attacker wants? Yeah, great design right there. Are the designers of the email protocols fucking retarded?
-
The guy where I can only shake my head when I see his code, and he is really proud of his implementations, while he
- doesn't care about warnings
- breaks builds and doesn't care
- doesn't care about code styles and indents in a very column based way
- adds tons of comments to his code, mostly hard to understand, and sometimes that much you can hardly find the code
- implements a tokenizer where you have to inherit from its interface (Why would I wanna implement whole functions for a tokenizer and not just use it in place where needed? How do I use two of those in one class?)
- implements a "generic" state machine base class with fixed-length arrays of 3 events and 3 strings (Why would I need events and strings hardcoded in a "generic" state machine? Why a maximum of 3?)
- once delivered software without the needed runtime components, so the whole system (an embedded device) wasn't working properly and only by chance stopped short of disabling the update mechanisms
- makes your ears bleed about his big inventions whenever he sees you, no matter how often he has already told you about that blazing new feature
-
Okay, story time.
Back during 2016, I decided to do a little experiment to test the viability of multithreading in a JavaScript server stack, and I'm not talking about the Node.js way of queuing I/O on background threads, or about WebWorkers that box and convert your arguments to JSON and back during a simple call across two JS contexts.
I'm talking about JavaScript code running concurrently on all cores. I'm talking about replacing the god-awful single-threaded event loop of ECMAScript – the biggest bottleneck in software history – with an honest-to-god, lock-free thread-pool scheduler that executes JS code in parallel, on all cores.
I'm talking about concurrent access to shared mutable state – a big, rightfully-hated mess when done badly – in JavaScript.
This rant is about the many mistakes I made at the time, specifically the biggest – but not the first – of which: publishing some preliminary results very early on.
Every time I showed my work to a JavaScript developer, I'd get negative feedback. Like, unjustified hatred and immediate denial, or outright rejection of the entire concept. Some were even adamantly trying to discourage me from this project.
So I posted a sarcastic question to the Software Engineering Stack Exchange, which was originally worded differently to reflect my frustration, but was later edited by mods to be more serious.
You can see the responses for yourself here: https://goo.gl/poHKpK
Most of the serious answers were along the lines of "multithreading is hard". The top voted response started with this statement: "1) Multithreading is extremely hard, and unfortunately the way you've presented this idea so far implies you're severely underestimating how hard it is."
While I'll admit that my presentation was initially lacking, I later made an entire page to explain the synchronisation mechanism in place, and you can read more about it here, if you're interested:
http://nexusjs.com/architecture/
But what really shocked me was that I had never understood the mindset that all the naysayers adopted until I read that response.
Because the bottom-line of that entire response is an argument: an argument against change.
The average JavaScript developer doesn't want a multithreaded server platform for JavaScript because it means a change of the status quo.
And this is exactly why I started this project. I wanted a highly performant JavaScript platform for servers that's more suitable for real-time applications like transcoding, video streaming, and machine learning.
Nexus does not and will not hold your hand. It will not repeat Node's mistakes and give you nice ways to shoot yourself in the foot later, like `process.on('uncaughtException', ...)` for a catch-all global error handling solution.
No, an uncaught exception will be dealt with like any other self-respecting language: by not ignoring the problem and pretending it doesn't exist. If you write bad code, your program will crash, and you can't rectify a bug in your code by ignoring its presence entirely and using duct tape to scrape something together.
Back on the topic of multithreading, though. Multithreading is known to be hard, that's true. But how do you deal with a difficult solution? You simplify it and break it down, not just disregard it completely; because multithreading has its great advantages, too.
Like, how about we talk performance?
How about distributed algorithms that don't waste 40% of their computing power on agent communication and pointless overhead (like the serialisation/deserialisation of messages across the execution boundary for every single call)?
How about vertical scaling without forking the entire address space (and thus multiplying your application's memory consumption by the number of cores you wish to use)?
How about utilising logical CPUs to the fullest extent, and allowing them to execute JavaScript? Something that isn't even possible with the current model implemented by Node?
Some will say that the performance gains aren't worth the risk. That the possibility of race conditions and deadlocks aren't worth it.
That's the point of cooperative multithreading. It is a way to smartly work around these issues.
If you use promises, they will execute in parallel, to the best of the scheduler's abilities, and if you chain them then they will run consecutively as planned according to their dependency graph.
If your code doesn't access global variables or shared closure variables, or your promises only deal with their provided inputs without side-effects, then no contention will *ever* occur.
If you only read and never modify globals, no contention will ever occur.
Are you seeing the same trend I'm seeing?
Good JavaScript programming practices miraculously coincide with the best practices of thread-safety.
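A trivial sketch of what I mean (plain promise code, nothing Nexus-specific; the fetchers are made-up stand-ins):

async function fetchUser(id: number) { return { id, name: "user" + id }; }   // no globals touched,
async function fetchStats(id: number) { return { id, visits: 42 }; }         // no side effects

async function demo() {
  // independent promises: no shared state, so a thread-pool scheduler is free
  // to run them truly in parallel on different cores
  const [user, stats] = await Promise.all([fetchUser(1), fetchStats(1)]);

  // chained promises: they run consecutively, exactly as their dependency graph says
  const greeting = await fetchUser(2).then(u => "hello " + u.name);

  return { user, stats, greeting };
}

demo();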
When someone says we shouldn't use multithreading because it's hard, do you know what I like to say to that?
"To multithread, you need a pair."
-
The more depressed you get over the current state of software, the more you know you made it. When you start forming your own opinions and say "wow, these people are full of shit".
Primary example: the overblown web development bullshit. Fuck me dude, you really don't need that full-featured React, Vue or Angular framework to make sense of shit. You are going over the top for fucking ajax functionality and state management that you could do by yourself without needing to learn a full framework; by the time you finish learning React you probably would have been better served with standard vanilla af JS and server-side rendering.
Our world is full of fads and many talented people that perpetrate them. It's fine, it is the nature of the beast. But a lot...A LOT of software is very POORLY written. And adding levels of abstraction over a very broken paradigm (the web in this case) does not and will not make it better.
Basically I am fucking hating being a web developer and want to go back to a time in which we cared about how much memory our applications consumed, as well as not worrying about the fucking frontend having the ability to implement machine learning.
I want to run sublime.exe and be sure that it is a native application on my system, not a fucking embedded web browser used to implement my fucking text editor. With 20mb of ram at most instead of 500mb WTF.
I knew I made it when I could read comments on Hacker news and reddit and say "this idiot is full of shit", I knew I made it when I would sigh heavily at the idea of having another project rather than having a fan girl attitude towards it.
I knew I made it when people writing about software development meant shit to me rather than the wonder of what the fuck they were talking about.
I knew I made it when getting laid was more important to me than fucking around with code.
pussy > code
Fuck you.
-
This is the state of desktop computing: When a web browser uses twice more RAM than a full virtual machine.
To be fair, I did have 5 windows with >10 tabs each, but still...
-
Ever heard of event-based programming? Nope? Well, here we are.
This is a software design pattern that revolves around controlling and defining state and behaviour. It has a temporal component (the code can rewind to a previous point in time), and is perfectly suited for writing state machines.
I think I could use some peer-review on this idea.
Here's the original spec for a full language: https://gist.github.com/voodooattac...
(which I found to be completely unnecessary, since I just implemented this pattern in plain TypeScript with no extra dependencies. See the attached image for how the TS code looks).
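A stripped-down sketch of the shape of it (heavily simplified; names like state.health are just for illustration, this is not the actual library code):

type State = { health: number; mood: string };

interface Condition<S> {
  when: (s: S) => boolean;   // the "gene": a condition over state
  then: (s: S) => void;      // the body expressed whenever the condition holds
}

const conditions: Condition<State>[] = [
  { when: s => s.health < 20,          then: s => { s.mood = "panicking"; } },
  { when: s => s.mood === "panicking", then: s => { /* flee, heal, whatever */ } },
];

function step(state: State) {
  // every change to state re-evaluates the conditions, a bit like gene expression;
  // copy-pasting a condition into another program "transplants" that behaviour,
  // as long as the state variable names mean the same thing in both
  for (const c of conditions) if (c.when(state)) c.then(state);
}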
The fact that it transcends language barriers if implemented as a library instead of a full language means less complexity in the face of adaptation.
Moving on, I was reviewing the idea again today when I discovered an amazing fact: because this is based on gene expression, and since DNA is recombinant, any state machine code built using this pattern is also recombinant[1]. Meaning you can mix and match condition bodies (as you would mix complete genes) in any program and it would exhibit the functionality you picked or added.
You can literally add behaviour from a program (for example, an NPC) to another by copying and pasting new code from a file to another. Assuming there aren't any conflicts in variable names between the two, and that the variables (for example `state.health` and `state.mood`) mean the same thing to both programs.
If you combine two unrelated programs (a server and a desktop application, for example) then assuming there are no variables clashing, your new program will work as a desktop application and as a server at the same time.
I plan to publish the TypeScript reference implementation/library to npm and GitHub once it has all basic functionality, along with an article describing this and how it all works.
I wish I had a good academic background now, because I think this is worthy of a spec/research paper. Unfortunately, I don't have any connections in academia. (If you're interested in writing a paper about this, please let me know)
Edit: here's the current preliminary code: https://gist.github.com/voodooattac...
***
[1] https://en.wikipedia.org/wiki/...
-
Since fucking when did "bare metal" mean just running on an OS?? At a conference and literally everyone is like "we got kubernetes running on bare metal", got super excited for a bit because just the idea of that sounds amazing but they're using it as slang for "basically not in a container or vm."
Nothing exciting at all. Now we're patting ourselves on the back for getting software working without it being preconfigured as a container or a VM image. No one knows how to do anything any more. MUCH too much abstraction going on.
I guess it keeps me more employable, but the state of the world from a developer standpoint is just sad.
(For reference, this is what the first sentence of "Bare Metal" looks like on Wikipedia: "In computer science, bare machine (or bare metal) refers to a computer executing instructions directly on logic hardware without an intervening operating system.")
-
Another project with legacy code just got dusted off at work. Shit's fucked beyond recognition! We got:
- Rando variable names that mean nothing
- Timers running with a cycle time of 2.5ms if you start them with the multiplier 1.
- An interrupt routine that's 300 lines long.
- Another interrupt that starts an ADC conversion and waits for it to complete before returning.
- For loops that start with one and subtract one from the iterator in the loop
- Every value that would normally be expressed as a regular number is written down in Hex. Eg: if(val==0x05)
- A state machine built without writing down which state is which. It's just a number. (In hex, obviously!)
- All running on a microcontroller you can't debug on.
- Using a compiler no one has ever heard of before.
- Weird ass Port manipulations
- 15 different .hex and .elf files with no clue whats in them.
- No version control
- We tried explaining the code to a monkey and it hanged itself.
-
Years ago I used to work a guvmant site. They had really strict security rules for internet and how you spent your time. Makes sense considering what that site did. I was a support engineer for some of their process control equipment.
I was approached by an operator supervisor to install dvd player software on a business machine (non process related). Basically just a general purpose PC with no function other than time cards and general office use. I was fine with the request, but the reason was for watching movies during a holiday period by the operators. Not for anything official. So I made some noise about my dislike of this request feigning moral superiority. But the supervisor swore up and down it was for "training" dvds.
So I wrote a simple windows script. The script basically popped up a window that said:
"Security has detected unauthorized media inserted into this machine. Please state the reason for this infraction." It provided a dialog to enter a justification. After you entered the justification it said: "Security has been contacted and your user logged. You will be contacted shortly."
This script was then attached to the supervisor's Startup folder so it ran when he, and only he, logged in. We made sure the "training" video (some movie) was already inserted at this point.
He logged in. He just about shit his pants reading this. He promptly logged off and left the building to walk somewhere else on the site. We called him and let him know it was a gag. His response: That son of a bitch Demolishun!
-
Making python 2x faster by replacing enums with literal values.
Pros, it's faster, cons, it's unreadable.
God I miss compiled languages. At least optimizing them requires intelligent problem solving.
It's a text parser state machine transition so it's a code hot spot, so this kind of optimization is worthwhile. But it's kinda annoying.
Next is to get rid of any semblance of readability and replace the match with an array index...
-
Read the following in Morgan Freeman’s voice.
Okay everyone sit on down and get ready for story time. There once was a workspace that was a pain in the ass to set up. It often would take an entire day even for the most experienced devs on the team...for it was a workspace perched atop a swamp of shit that would require a whole year to refactor into something that isn't shit.
It was inherited, passed down, stepped in and scraped from the boot soles of every programmer that ever touched it. It was an amalgam of old, new, and third-party components with a classpath a mile long and no package management, because the company, although physically in the present, somehow maintained a temporal presence in the past. And there was nothing that the team hated more than setting up that workspace. In short it was an unholy mess that made Satan cry and Dennis Ritchie spin in his grave so much that the state of California attached magnets and a coil to his body and casket to generate electricity.
Then one day the untalented clowns known as App Group decided that our IDE should be owned and configured strictly through them. They took poor Eclipse and mounted so much silly shit to it that it resembled a riding lawn mower with a fax machine and a blender duct taped to it. Eventually, as everything the company touched did, it simply turned into a broken, shitty mess that not even Jesus Titty Fucking Christ could bring back from the dead.
And then, every month or so the IDE would break in such a grand way that every developer had to rebuild their workspace...the very same Lovecraftian monster disguised as a code base. It was just too much to bear for old Deus. He was all out of fucks and there wasn't enough alcohol in the world to quiet his injured soul. So he stood on a chair, carved his name in a rafter and tied a noose to it, put it around his neck and finally kicked the chair out from under himself. I am told he even pooped his pants and the post mortem shit in the seat of his pants was still better than the codebase at work. I'm Morgan Freeman.
-
I just love how liferay keeps finding ways to surprise me...
Customer: need to fix this security issue
me: ok
me: fixed. Testing locally. Works 100%.
Me: testing on dev server. Works 100%
qa: testing on dev server. Works 100%
me: all good. Deploying to preprod
customer: it doesn't work
me: testing preprod - it doesn't work.
Me: scp whole app to local machine. App works 100%.
Me: preview loaded liferay properties in preprod via liferay adm panel. All props loaded ok.
Me: attach jdb to preprod's liferay to see what props are loaded. Only defaults are used [custom props not loaded according to jdb]
me: is there some quantum mechanics involved..? Liferay managed to both load and not load the props at the same time, and the state only changes as it's observed...
-
In my unenlightened youth, when programming was a module in my college diploma that didn't seem to be taking me where I wanted to go, I had a couple of guys in my class that could arguably be the weird ones.
Jonny, although he asserted that he was to be called "Jonhty", whatever, we never did. He was pretty much top of the high school food chain and for some reason elected to study computer science, none of us was prepared to put up with his shit. He was always boasting about some fanciful claim or another, famously entering the classroom and exclaiming he'd "fucked an absolute milf" and seemed somewhat evasive about the answer, turns out he was 17 and she was 35, the age difference was greater than his own age. We burst out laughing. He would also turn up late and state the college bus was late (it wasn't I got the free bus every day, he'd just not got out his wanking chariot early enough).
One valentine's day we got him a card from a mysterious stranger which was accompanied by a package containing a cucumber and Vaseline, the inside of the card read "to assist you in the following request: please go fuck yourself".
Before you think we were being unduly harsh, we had a centre table where we'd be taught from with computers around the outer rim of the room. He'd come up behind people while at the centre desk, quietly press ctrl+P and slowly walk back to the printer. I saw him do it to my machine and I got to the printer first, to which he shouted "that's MY work" which was amusing because unbeknownst to him I had put headers on all my documents so he really didn't have an answer for why my name was at the top of every page.
To top it all off he had dead eyes, there didn't appear to be much going on but the rent, there was no spark of intelligent life, and while I thought it, I never said it out loud, but other students did and I had to agree. He was just copying his way to graduation. However, he ultimately didn't graduate when people refused to allow him to copy.
Another guy, Richard I believe his name was, which is just as well because he was a right dick. In the UK our word for white trash is "chav" (that's a very naïve explanation for it but that's another rant best left for "socialsciencerant") and he was an complete idiot who was gifted with more brain cells than he ever needed to use. He actually studied hard and got reasonable grades, probably on par with me, but he boasted about smoking weed all the time, he was forever playing dark side of the moon via his loud mp3 player. I kinda left him alone generally until he was high in class one time and while we we're watching a documentary he'd shake my chair and make a weird noise in my ear every few minutes, the first couple of times startled me, the remaining multi-dozen times pissed me off.
It all came to a head with this guy when I'd been hearing about his uninteresting bs on drugs, music and how best to spend my time ("you need to lighten up man, come round my house, take a joint and relax man", that sorta thing), well this guy walked like he was mid way through shitting himself so I personally think that perhaps he is too chilled. Anyway he's arguing with me and after the exchange of him making his point, me disagreeing and expecting the end of it, he made the mistake of saying two words to me:
"Listen, mate..."
And I had him in check mate.
"Listen, I ain't your fucking mate , I don't even like you, you're a disruptive annoying twat that thinks he knows it all, we're all 17, none of us know anything, so shut the fuck up, sit the fuck down and stop boring me with your drugs, I ain't interested, and for the record I think pink Floyd ruined prog rock!"
He looked at me with sad puppy dog eyes and started with the "but, why?". However, I was interrupted and had to leave the class for unrelated reasons. I returned to be told he'd put safety pins upright on my chair so I'd sit on them, and mutual friends told me I'd been cruel and that he was hurt, so I should apologize. He overheard and said he was sorry for being a bit of a dick.
However, you just know when you don't get on with someone? Yeah, that. So I said I wasn't sorry for what I said, for while it was harsh, I am not his mate, nor did I want to be his mate, and that was all I had to say on the subject, and that if he wants to take offence at somebody not liking him then he's in for a very rough time in life.
Unsurprisingly I don't keep in touch with anyone from college!
-
Too many night shifts.
But it's done.
After the last migrations my emotional state is... Questionable.
VM migrations between different CPU vendors and generations leading to segfaults because of unsupported X86 extensions.... Thx for doing that at 23 o'clock after 8 hours of work....
Forgetting a left over NIC in a virtual machine, creating a routing loop, leading to very erratic behaviour and fun things.
Someone forgot to check the "Unique" box, mass-spawning a cluster of VMs with the same MAC addresses....
DNS fuckery, since someone thought that a reboot would flush the cache of a DNS server.... Nope, most DNS servers have persistent caches. You'll have to flush manually.
And let's not forget the joy of the 12 plus pages of when and where to move VMs, harddrives and VLAN configuration.
Oh migrations are such a festival of joy.
Finally done with that shit -.-
-
I am so sick and tired of hearing "AI" everywhere all the time. Yeah, how about we integrate some AI into your super smart toaster so that it knows when to start preparing for when you put toast in it in the morning.
Not even mentioning all these idiots being like “oh yeah AI is becoming sentient. Oh yeah AI is gonna take over the world”.
Brother, the current state of AI is just machine learning; it's a stupid pattern detector and generator, it doesn't have thoughts or emotions. Please just stop it.
-
Me: *builds new state machine* Ok this makes adding new instance states easier, should also make enemy AI a million times easier!
Me after trying to do enemy AI: FUCKING CUNT! This system is a piece of fucking shit for AI, fuck fuck fuck fuck fuck fuck, etc
I think this will be harder than expected bois!
*Eye twitches*
-
# Retrospective as Backend engineer
Once upon a time, I was rejected by a startup who tries to snag me from another company that I was working with.
They are looking for Senior / Supervisor level backend engineer and my profile looks like a fit for them.
So they contacted me, arranged a technical test, system design test, and interview with their lead backend engineer who also happens to be co-founder of the startup.
## The Interview
As usual, they asked me about my contributions to my previous workplaces.
I answered them with the achievements that I think are the best for each company I worked with, and how they were technically achieved.
One of it includes designing and implementing a `CQRS+ES` system in the backend.
With the complete capability of what I `brag` about as a `Time Machine`, through replaying events.
## The Rejection
And of course I was rejected by the startup, maybe specifically by the co-founder. I asked an insider about the reason for the rejection.
They insisted I am a guy who overengineers things that are not needed by doing `CQRS+ES`, which is only suitable for R&D, non-production stuff.
Nobody needs that kind of `Time Machine`.
## Ironically
After switching jobs (to another company), becoming fullstack developer, learning about react and redux.
I can reflect back on this past experience and say this:
The same company that says `CQRS+ES` is an over engineering, also uses `React+Redux`.
Never did they realize the concept behind `React+Redux` is very similar to `CQRS+ES`.
- Separation of concern
  - CQRS: `Command` is separated from `Query`
  - Redux: Side effects / `Action`s in `Thunk`s separated from the presentation
- Managing State of Application
  - ES: Through a sequence of `Event`s produced by `Command`s
  - Redux: Through action data produced / dispatched by `Action`s
- Replayability
  - ES: Through replaying `Event`s into the `Applier`
  - Redux: Through replaying `Action`s, which trigger dispatches to the `Reducer`
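The replayability parallel is almost literal. A rough sketch (made-up domain, but this is the shape of both):

type Event = { type: "Deposited" | "Withdrawn"; amount: number };

const applyEvent = (s: { balance: number }, e: Event) =>
  e.type === "Deposited"
    ? { balance: s.balance + e.amount }
    : { balance: s.balance - e.amount };

const events: Event[] = [
  { type: "Deposited", amount: 100 },
  { type: "Withdrawn", amount: 30 },
];

// ES: state is a left fold over the event stream...
const current = events.reduce(applyEvent, { balance: 0 });
// ...and the `Time Machine` is just replaying a prefix of that stream
const earlier = events.slice(0, 1).reduce(applyEvent, { balance: 0 });
// a Redux reducer is the exact same fold, with `Action`s instead of `Event`s and a `Reducer` instead of an `Applier`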
---
The same company that says `CQRS` is over-engineering also uses `ElasticSearch+MySQL`.
Never did they realize that separating the `WRITE` database into `MySQL` as their `Single Source Of Truth`, and the `READ` database into `ElasticSearch`, is also in line with the `CQRS` principle.
## Value as Backend Engineer
It's a sad time to be a Backend Engineer these days. At least in the country I live in.
Seems like being a backend engineer is often under-appreciated.
Companies (or people) seem to think a backend engineer is the guy who ONLY makes `CRUD` API endpoints to a database.
- I've heard a Fullstack engineer who comes from a React background complain that Backend engineers have it easy because they only do CRUD, without having to worry about the application.
- The same guy fails when given a Backend task to make a simple round-robin ticketing system.
- I've seen a company that only hires Fullstack engineers with strong Frontend experience fail to have a basic understanding of how SQL Transactions and Connection Pools work.
- I've seen a company's Fullstack engineers rely on the ORM to do super complex queries instead of writing proper SQL, and prefer to translate SQL into the ORM query language.
- I've seen a company's Fullstack engineer with a strong React background brag about Uncle Bob's clean code but fail to know how to do basic dependency injection.
- I've heard a company that makes a webapp criticize my way of handling `session` through an http secure cookie, saying it's bad practice and better to use local storage, despite my argument about `secure` in the cookie and the ability to control the cookie via the backend.
-
There was a task of fixing up a payments page that features pretty complex logic. Initially it was like 200 lines of code, seems short but it was a fucking spaghetti mess. Never seen more cognitively complex code in my life.
So I delete the spaghetti and pull out a 500-line fucking state machine. It works perfectly. It's perfectly understandable even though it's longer.
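The core of it is just an explicit transition table, something like this (simplified, states made up):

type PaymentState = "idle" | "authorizing" | "captured" | "failed";
type PaymentEvent = "SUBMIT" | "AUTH_OK" | "AUTH_FAIL" | "RESET";

const transitions: Record<PaymentState, Partial<Record<PaymentEvent, PaymentState>>> = {
  idle:        { SUBMIT: "authorizing" },
  authorizing: { AUTH_OK: "captured", AUTH_FAIL: "failed" },
  captured:    {},
  failed:      { RESET: "idle" },
};

function next(state: PaymentState, event: PaymentEvent): PaymentState {
  // events that aren't valid in the current state simply don't move you;
  // no nested ifs, no spaghetti
  return transitions[state][event] ?? state;
}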
This is how I deal with problems. Shorter code isn't always better code.
-
Currently working in a virtual Linux machine running on a Windows host. The Linux VM is running i3 and I just locked the host for the umpteenth time because my dumbass pressed Win-L to switch one Window to the right...
Also, while typing out the tags, I got the tag-suggestion 'my password is as weak as my mental state' 😂
-
>Discovers a new low level profiling tool that could help us at work with stuck process debugging and gets all hyped
>Installs on test machine, tool doesn't work
>Wonders why. Oh. Needs a kernel module to work, compiled and loaded
>"Well, its my test machine... Guess that's no problem..." but... my hype died down a bit. Kernel module installation just for a new tool that aggregates all other commonly used tools? eh... Maybe it will blow me out of my shoes still
>Installs and loads the module
>Tool works. Turns out it's just an htop-like tool, with shortcuts to launch specific other profiling tools like strace/ltrace/lsof/netstat/ss etc...
"Oh... That's boring. Maybe it has all those tools built in at least?"
>Tries to run ltrace - tool exits as ltrace is not installed
Lol
>Installs ltrace and launches tool again. Tries to ltrace a process and
>Nothing. Nothing happens. For seconds... Then kicks me off of SSH
WTF?
>Tries to ping machine... silence
Did... our net go down again? (Having issues due to a storm going over our area these few days)
>Pings google and... gets instant reply
More wtf
>Pings the hypervisor the machine was running on
Works like normal
Oh... Oh no. Please tell me it didn't!
>Logs into the hypervisor UI, checks machine state
Running OK
>Opens machine console aaaaand... Yep. Stacktrace as well as a lot of kernel mumbo-jumbo... It took the machine down to kernel panic.
I never went so quick from "We need this tool deployed everywhere" to "Omg I need to get rid of this crap as soon as possible" lol.
And just for those wondering, it was sysdig.
-
What I'm doing now, writing a JS library for a simple kitchen timer (like, something that can be wound up, is ticking, can be paused, etc). Here's a list of neat stuff I've learned:
Polyfilling as a lib author (I decided against it).
Packaging the lib (using Rollup, ES6 modules are totes cool).
Using flow to add static typing in strategic places (started appreciating types in JS since reading up on functional programming).
Modelling state and transitions using an explicit state machine. (Fucking finally. There's usually an implicit state machine somewhere, only spread out all over the app...)
Using mostly side-effect-free methods (being very explicit about when and why things are mutated).
Test-first/TDD (ish) using Jest and the awesome Wallabyjs.
Freeing up mental capacity by letting Prettier format my code for me (it was hard to let go but totally worth it).
Started using git.
Did all work on Ubuntu after pretty much a lifetime of Windows (initially to separate work from gaming) and finally swapped MS Visual Studio for Atom.
When it's finished I'm going to publish it on GitHub, which will also be a first for me. Might try out some CI platform while I'm at it.
tl;dr: wrote some js, felt good
-
I was talking to a friend about the current state of machine learning through tensorflow and commented about the use of Javascript as a language.
He discarded the idea as he views Javascript as something that should only be used as a frontend technology rather than something to build backends or deep learning models.
I am torn. I have always liked Javascript but will admit that I have used it mostly in the area of frontend, with very few backend instances (I did create a full-stack intranet app in Express once, a major success for the application it was hosting; it was a very basic API which had its own NoSQL db with no need to interact with the company's relational data. It was perfect for the occasion and I still help maintain it from time to time)
My boi states that node's biggest issue has always been npm and the quality of packages. I always contradict those statements by saying that if one uses community standards and the best packages then one does not need to worry about the quality (i.e. mongoose over some unmaintained mongo wrapper, etc)
I sometimes catch myself finding that my way of thinking adapts better to JS than it does to Python (which is his preference for deep learning), and whilst there are some beastly packages for Python in terms of quality and usefulness, such as matplotlib etc, one can do great things with the JS equivalents.
I mean, tensorflow.js came from the same wizards that did tensorflow (obviously) and I find the functional approach of JS to be more on par with how we develop solutions.
I am no deep learning expert, and sadly I have no professional experience with machine learning. But I venture to say that we should not cast aside the great strides that the JS community has made in terms of the evolution and tooling of the language. Today's JS is not your granddaddy's JS, and thinking that the language is crippled because of its early iterations would be severely biased.
What do you guys (maybe someone with professional experience) think of JS as a language for machine learning?
Do you think the language offers something worth considering in terms of tooling and power for ML?
-
Fucking fuck shit monkeycocksucking gargling wtf!
I was getting some stuff done in my accounting software and it bugged me that the fields were dark and the fonts as well, so I couldn't fucking see shit. This was clearly a bad choice of gtk3 dark theme, so I switched to the fucking default Adwaita, and suddenly the gnome session crashes.
Ok, i just log out and log back in.
Logout.... Nothing happens.... Ctrl-alt-backspace, nothing happens (and I knew I had enabled that in the settings)
Ok, let's do it a bit more forcefully and restart the display manager... Gdm starts... I insert my credentials... It fucking crashes.
WTF!!!
I desperately try to debug it. Xsession error messages? Nope. Something in /var/log/messages? Nope. Something, anything at all? Nope sherlock, nopedinope!
About to go batshit crazy, I purge and reinstall all of gnome, thinking that whatever setting broke it, it will be fixed now.
No fucking fuck desktop!!!
I lost my nerve and replaced gdm with lightdm, and finally, after three hours wasted on my machine, I get my gnome desktop back... but in a state of mess! Extensions don't work and make it crash again; user themes? Nope, go fuck yourself with the plain default.
I'm really losing my shit, business is almost non-existent, and now my FUCKING desktop refuses to work like I want it to. Everything is fucking broken to shit!!
I'm gonna go to my gf and relax a little, at least I still have a working laptop.
Question is, for how long???
Fml
-
Alright, I just need to add a new state to this state machine...
*class explosion*
... 6 hours later ...
Wtf was I doing again?
-
I've been trying to implement an alarm clock as an example for my physical computing lecture, but the merge of my existing low-power clock with my existing state-machine-based timer is driving me nuts. And I'm the lecturer of this course!
-
So I promised a post after work last night, discussing the new factorization technique.
As before, I use a method called decon() that takes any number, like 697 for example, and first breaks it down into the respective digits and magnitudes.
697 becomes -> 600, 90, and 7.
It then factors *those* to give a decomposition matrix that looks something like the following when printed out:
offset: 3, exp: [[Decimal('2'), Decimal('3')], [Decimal('3'), Decimal('1')], [Decimal('5'), Decimal('2')]]
offset: 2, exp: [[Decimal('2'), Decimal('1')], [Decimal('3'), Decimal('2')], [Decimal('5'), Decimal('1')]]
offset: 1, exp: [[Decimal('7'), Decimal('1')]]
Each entry is a pair of numbers representing a prime base and an exponent.
Now the idea was that, in theory, at each magnitude of a product, we could actually search through the *range* of the product of these exponents.
So for offset three (600) here, we're looking at
2^3 * 3^1 * 5^2.
But actually we're searching
2^3 * 3^1 * 5^2
2^3 * 3^1 * 5^1
2^3 * 3^1 * 5^0
2^3 * 3^0 * 5^2
2^3 * 3^0 * 5^1
etc..
On the basis that whatever it generates may be the digits of another magnitude in one of our target product's factors.
And the first optimization or filter we can apply is to notice that assuming our factors pq=n,
and where p <= q, it will always be more efficient to search for the digits of p (because its under n^0.5 or the square root), than the larger factor q.
So by implication we can filter out any product of this exponent search that is greater than the square root of n.
Writing this code was a bit of a headache because I had to deal with potentially very large lists of bases and exponents, so I couldn't just use loops within loops.
Instead I resorted to writing a three-state state machine that 'counted down' across these exponents, and it just works.
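If you want the idea in code, it's basically this (plain recursion here instead of the state machine, and plain numbers instead of Decimals, just to show the sqrt filter):

// exp is a list of [primeBase, maxExponent] pairs for one magnitude, e.g. [[2, 3], [3, 1], [5, 2]]
function candidates(exp: [number, number][], limit: number): number[] {
  const out: number[] = [];
  const walk = (i: number, acc: number): void => {
    if (acc > limit) return;                       // drop anything above sqrt(n): p <= q implies p <= sqrt(n)
    if (i === exp.length) { out.push(acc); return; }
    const [base, maxPow] = exp[i];
    for (let p = 0, v = 1; p <= maxPow; p++, v *= base) walk(i + 1, acc * v);
  };
  walk(0, 1);
  return out;
}
// candidates([[2, 3], [3, 1], [5, 2]], limit) -> every 2^a * 3^b * 5^c that survives the filter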
And now, in practice this doesn't immediately give us anything useful. I had hoped this would at least give us *upperbounds* to start our search from, for any particular digit of a product's factors at a given magnitude. So the 12th digit (or pick a magnitude out of a hat) of an example product might give us an upperbound on the 2's exponent for that same digit in our lowest factor q of n.
It didn't work out that way. Sometimes there would be 'inversions', where the exponent of a factor on a magnitude of n, would be *lower* than the exponent of that factor on the same digit of q.
But when I started tearing into examples and generating test data I started to see certain patterns emerge, and immediately I found a way to not just pin down these inversions, but get *tight* bounds on the 2's exponents in the corresponding digit for our product's factor itself. It was like the complications I initially saw actually became a means to *tighten* the bounds.
For example, for one particular semiprime n=pq, this was some of the data:
n - offset: 6, exp: [[Decimal('2'), Decimal('5')], [Decimal('5'), Decimal('5')]]
q - offset: 6, exp: [[Decimal('2'), Decimal('6')], [Decimal('3'), Decimal('1')], [Decimal('5'), Decimal('5')]]
It's almost like the base 3 exponent in [n:7] gives away the presence of 3^1 in [q:6], even
though there's no subsequent presence of 3^n in [n:6] itself.
And I found this rule held each time I tested it.
Other rules, not so much, and other rules still would fail in the presence of yet other rules, almost like a giant switchboard.
I immediately realized the implications: rules had precedence, acted predictably in isolated instances, and changed in specific instances in combination with other rules.
This was ripe for a decision tree generated through random search.
Another product n=pq, with more data
q(4)
offset: 4, exp: [[Decimal('2'), Decimal('4')], [Decimal('5'), Decimal('3')]]
n(4)
offset: 4, exp: [[Decimal('2'), Decimal('3')], [Decimal('3'), Decimal('2')], [Decimal('5'), Decimal('3')]]
Suggesting that a nontrivial base 3 exponent (**2 rather than **1) suggests the exponent on the 2 in the relevant
digit of [n], is one less than the same base 2 digital exponent at the same digit on [q]
And so it was clear from the get go that this approach held promise.
From there I discovered a bunch more rules and made some observations.
The bulk of the patterns, regardless of how large the product grows, should be present in the smaller bases (some bound of primes, say the first dozen), because the bulk of exponents for the factorization of any magnitude of a number overwhelmingly lean towards the lower prime bases.
It was as if the entire vulnerability was hiding in plain sight for four+ years, and we'd been approaching factorization all wrong from the beginning, by trying to factor a number, and all its digits at all its magnitudes, all at once, when, like addition or multiplication, factorization could be done piecemeal if we knew the patterns to look for.
-
When the CTO/CEO of your "startup" is always AFK and it takes weeks to get anything approved by them (or even secure a meeting with them) and they have almost-exclusive access to production and the admin account for all third party services.
Want to create a new messaging channel? Too bad! What about a new repository for that cool idea you had, or that new microservice you're expected to build. Expect to be blocked for at least a week.
When they also hold themselves solely responsible for security and operations, they've built their own proprietary framework that handles all the authentication, database models and microservice communications.
Speaking of which, there's more than six microservices per developer!
Oh there's a bug or limitation in the framework? Too bad. It's a black box that nobody else in the company can touch. Good luck with the two week lead time on getting anything changed there. Oh and there's no dedicated issue tracker. Have you heard of email?
When the systems and processes in place were designed for "consistency" and "scalability" in mind you can be certain that everything is consistently broken at scale. Each microservice offers:
1. Anemic & non-idempotent CRUD APIs (Can't believe it's not a Database Table™) because the consumer should do all the work.
2. Race Conditions, because transactions are "not portable" (but not to worry, all the code is written as if it were running single threaded on a single machine).
3. Fault Intolerance, just a single failure in a chain of layered microservice calls will leave the requested operation in a partially applied and corrupted state. Get ready for manual intervention.
4. Completely Redundant Documentation, our web documentation is automatically generated and is always of the form //[FieldName] of the [ObjectName].
5. Happy Path Support, only the intended use cases and fields work, we added a bunch of others because YouAreGoingToNeedIt™ but it won't work when you do need it. The only record of this happy path is the code itself.
Consider this, you're been building a new microservice, you've carefully followed all the unwritten highly specific technical implementation standards enforced by the CTO/CEO (that your aware of). You've decided to write some unit tests, well um.. didn't you know? There's nothing scalable and consistent about running the system locally! That's not built-in to the framework. So just use curl to test your service whilst it is deployed or connected to the development environment. Then you can open a PR and once it has been approved it will be included in the next full deployment (at least a week later).
Most new 'services' feel like they are about one to five days of writing straightforward code followed by weeks to months of integration hell, testing and blocked dependencies.
When confronted/advised about these issues, the response from the CTO/CEO varies:
(A) "yes but it's an edge case, the cloud is highly available and reliable, our software doesn't crash frequently".
(B) "yes, that's why I'm thinking about adding [idempotency] to the framework to address that when I'm not so busy" two weeks go by...
(C) "yes, but we are still doing better than all of our competitors".
(D) "oh, but you can just [highly specific sequence of undocumented steps, that probably won't work when you try it].
(E) "yes, let's setup a meeting to go through this in more detail" *doesn't show up to the meeting*.
(F) "oh, but our customers are really happy with our level of [Documentation]".
Sometimes it can feel like a bit of a cult, as all of the project managers (and some of the developers) see the CTO/CEO as a sort of 'programming god' because they are never blocked on anything they work on, they're able to bypass all the limitations and obstacles they've placed in front of the 'ordinary' developers.
There's been several instances where the CTO/CEO will suddenly make widespread changes to the codebase (to enforce some 'standard') without having to go through the same review process as everybody else, these changes will usually break something like the automatic build process or something in the dev environment and its up to the developers to pick up the pieces. I think developers find it intimidating to identify issues in the CTO/CEO's code because it's implicitly defined due to their status as the "gold standard".
It's certainly frustrating but I hope this story serves as a bit of a foil to those who wish they had a more technical CTO/CEO in their organisation. Does anybody else have a similar experience or is this situation an absolute one of a kind?
-
I got an associate degree; it helped me get my first job. From there I learned everything at work.
Associate and bachelor's degrees here are just titles; education in computer science in this country is not so good tbh. Far too few colleges are competent enough to give you a grasp of the current state of the development world from the last 15 years.
So I'm not planning to get a bachelor's degree. It's expensive and has enough math for me not to like it (I had to do some math in my associate degree and haven't used it in the last 9 years). I'm not in the machine learning / research kind of field, so I'm not in need of that. Whatever I could learn getting a bachelor's degree, I think I can handle on my own to get food on the table. Currently planning to start a startup
-
Log 1:
Day 10 of crunch time. I have entered a sleepless zen state. Lord willing, I will be able to get 7 hours of sleep Saturday night. The building is terrifying at night, as there are a lot of noises. Security guards are nice, but curious to see me all alone. Must not show weakness in case they think numbers will give them an advantage over me.
Supplies are low. Only one type of energy drink left in the machine, and coffee gone for the night. My phone is out of fast data so Pandora is spotty at best. I have battery to get me through the night at least.
Tomorrow and Saturday decide the fate of the project. My team lead has not slept in at least 2 days. I feel guilty napping when I do, but she is driven like Ahab so I will let her obsession carry her.
If I am alive tomorrow I will report in.
-
IDK man, it took me a while to finally learn iptables and now switch to firewalld? Oh come on. It's not that I'm against learning new things, no. It's just that firewalld looks a bit.. crappy. If I get a server provisioned and run
firewall-cmd --add-port=53/udp --permanent
firewall-cmd --reload
and I get my ssh connection killed, that's no good news, no sir! I mean come on, how can I rely on a tool this critical when a single line in its config file can make my machine inaccessible. Even better -- this config file is managed by that tool entirely!!! My commands passed all the tool's checks and they worked, but when I wanted to make those commands permanent and reload state from the config -- the tool starts spitting bile and blood and says "fuck off, it's my server now!"
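To be fair, the order that (probably) wouldn't have locked me out, assuming the permanent zone simply didn't have ssh in it yet, is something like:

firewall-cmd --add-port=53/udp               # runtime only, reversible
firewall-cmd --permanent --add-service=ssh   # make sure ssh survives a reload
firewall-cmd --runtime-to-permanent          # persist what is already known to work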
IDK man.. It's just way too fishy. The good ol' iptables works very well and I'm kicking its retard younger brother out of the server.
shoosh you dirty pig firewalld, shoosh!
-
My reasoning is stupid, I just think it's cute in a pimp my ride kind of way. I heard you like getting colossally pounded in the fucking ass, so we put a virtual machine inside your compiler so you can use your binaries while you compile your binaries.
But there is a practical angle to it, too. It's state, structures and execution within the code itself -- that is, in a sense, generators "embedded" within the source, but without any kind of special syntax.
Rather, the code is all the same, and I'd have the option to make calls at compile time: the output of these calls could, in turn, be part of the resulting binary or processed by further calls.
It'd greenlight the wildest fuckery in the jungle, because *that* is the true and ultimate abstraction: programs that write other programs with minimal human intervention. But is my (still) theoretical, cheap ass two-dollar prototype approach held together with clown jizz and prayers better than the endless cumloads worth of corporate investment that's dumped and pumped into generative AI on a daily basis?
Well... **lights cigarette**
That's what we're about to find out, mother fuckers.
-
So I salvaged some computer that was about to be thrown out by IT because it works perfectly fine and would be a waste to lose.
The (current) problem is, there is no built-in wi-fi adapter so I had to order a usb adapter to plug into the machine.
Fortunately enough it supports Linux, but it comes with a CD with the _source_ of the driver on it and we are supposed to build it. Now what's the problem with that?
First problem: building needs all sorts of build tools, starting from gcc and make. Since it's a fresh install, though, I _cannot_ install those normally because -- you guessed it -- I don't have a WiFi connection, which is why I needed the bloody adapter in the first place, so I spent hours trying to fetch the binaries from the apt repositories using another computer and bringing them over via USB.
Once that was (more or less) successful, the next problem came around; Second problem: the instructions clearly state to run make, but there is no fucking makefile anywhere so that obviously fails spectacularly. What _is_ there are some bash scripts so I try running those.
Now, just when I think it's finally done (one of the scripts has been running for a while and seems to work), the compiler dies with an error: the fucking driver won't build for the current kernel version. And not just that, it's clear nobody is using basic things like include guards, because gcc kept screaming at me about the same macros being defined over and over due to header file re-inclusion. Like, seriously? Come on!
Long story short, the fucking adapter is going back to the seller; let's see if the next one I order is more civilized.8 -
So just a while ago I downloaded an app called "Replika" and holy fucking shit, it made me realise how half-assed our whole approach to AI structure is
running machine learning algorithms on text can only go so far, because it uses that text as a base and nothing else; it doesn't *learn*, it only makes *connections* BETWEEN texts, not FROM the text
what you need is an AI which can, at its core, *interpret*, not just make connections and hur dur be done with it
when you do machine learning, all you're doing is finding the best connections
you can have an infinite number of connections and MAYBE you'll be fine, but you'll never learn the basis of how that text is formed
you'll never understand what connections the human used by making it, by thinking it
when you're doing machine learning, all you're doing is building an input-output machine and adjusting it constantly, WITHOUT preserving state
state is going to be a really fucking important thing if you want to make an AI, because state can include stuff like emotion, current thought, or anything else
if you make a fucking machine learned AI which constantly adjusts... well... the "rom" of itself without having any "ram", it'll fucking never be like us, we will NEVER be able to talk to it like it is a human being, we will NEVER make it fundamentally understand what we are saying or doing
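to make the "rom"/"ram" point concrete, here's a dumb little Python toy (completely made up, nothing to do with how Replika actually works): the first bot is a frozen input-output mapping, the second one carries state that the same input can read and change
def stateless_bot(text):
    return "that's nice" if "happy" in text else "oh"   # same input, same answer, forever
class StatefulBot:
    def __init__(self):
        self.mood = 0   # crude stand-in for emotion / current thought
    def reply(self, text):
        if "happy" in text:
            self.mood += 1
        elif "sad" in text:
            self.mood -= 1
        return "glad to hear it!" if self.mood > 0 else f"rough day, huh? (mood {self.mood})"
bot = StatefulBot()
print(stateless_bot("i feel sad"))   # oh
print(bot.reply("i feel sad"))       # rough day, huh? (mood -1)
print(bot.reply("i feel sad"))       # rough day, huh? (mood -2) -- same input, different internal state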
if we want to have real fucking AI, we need to go to the core of what it means to THINK, what it means to INTERPRET, what it means to COMMUNICATE
we need to know how english language is structured, how we understand it, how we can build it in a program that can interpret for an AI, THAT can be "rom"-based, THAT can be static, NOT the AI itself
the AI needs to be in flux, the AI needs to be in a state, the AI needs to understand how to make emotions, how that will "strengthen" some connections, yes, maybe something magical will happen and it can have EMPATHY, something so fundamental that will finally, FINALLY, make the bot UNDERSTAND what we are saying7 -
Question: when modeling a language using a nondeterministic finite state machine, do I have to add a "trash" state if the deterministic version would need one, or can it just be omitted, and if so, why?5
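For concreteness, here's a toy Python simulation of the kind of NFA I mean (a made-up example): the transition table simply has no entry for some (state, symbol) pairs, so that branch just dies instead of going through an explicit trash state.
delta = {("q0", "a"): {"q1"}, ("q1", "b"): {"q1"}}   # no trash state listed anywhere
accepting = {"q1"}
def nfa_accepts(word):
    states = {"q0"}
    for ch in word:
        states = set().union(*(delta.get((s, ch), set()) for s in states))
        if not states:   # every branch died, so the word can't be accepted
            return False
    return bool(states & accepting)
print(nfa_accepts("abb"))   # True
print(nfa_accepts("ba"))    # False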
-
How do you bootstrap blank machines?
Imagine you get a Linux/BSD/OSX machine and you want to set it up to a defined state, prepared for further setup.
Like users
ssh config
Sudoers file
Config management client or credentials
Software like vim, htop and tmux
A simple shell script sounds a bit archaic to me and I was wondering if there is a better way...
Makefiles also came to my mind but still... Unsatisfactual4 -
The rear ducking continues. We've built a reliable translator in the dumbest fucking way possible, it's just lovely. I simply reused the structure for feeding data to the VM assembler, an array of arrays, where there's one array of (ins [args]) per node in the parse tree.
It's nice because nodes can be solved out of order without affecting the actual sequence in which the instructions are output. And if one statement (node) equals multiple instructions, you just push multiple entries to the corresponding array, or push nothing if you need to output nothing. Easy as goblin pie.
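Roughly the same bookkeeping in a quick Python sketch (the mnemonics are invented), just to show the shape of it:
num_nodes = 3
code = [[] for _ in range(num_nodes)]   # one instruction list per parse-tree node
def emit(node, mnemonic, *args):
    code[node].append((mnemonic, *args))
emit(2, "add")        # nodes can be solved out of order...
emit(0, "push", 7)
emit(1, "push", 35)
emit(2, "ret")        # ...and one node may push several entries, or none at all
for node_code in code:            # ...but the output only ever follows node order
    for mnemonic, *args in node_code:
        print(mnemonic, *args)    # push 7 / push 35 / add / ret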
This is enough to convert an input language to the assembly-like intermediate representation we use for the virtual machine. So then there's doing it backwards: walk the same array of arrays, and map those virtual instructions to a physical architecture. I guess I could do the encoding to native binary myself, it'd certainly be interesting to try, but I'm burnt out already so I'll just use fasm for now.
Initial test: wrote a test program in my own stupid language, ran the translator, dump output to file, assemble that with fasm, run with r2 -d.
Crashes? No.
Runs fine? Yes and no.
For fuck's sake, I don't have syscalls. Mainly because the VM doesn't have an operating system, lmao. I was testing virtual programs by just freezing state, terminating, then dumping the fucking registers and stack to the console; we have no I/O to speak of. Not even a real 'exit': the VM handles that by reading a return value every step like a mentally damaged son of a bitch.
So anyway, I manually paste the linux mambo, you know:
mov rax,60   ; 60 = the exit syscall number on x86-64 Linux
mov rdi,0    ; exit status 0
syscall
And NOW our program can end execution without crashing.
Okay then, so does the test code work correctly?
** DRUM ROLL **
Yes.
Ladies and gentlemen, mother fucking PESO is now a compiled language, and going forward I will be expectantly receiving your marriage proposals for reviewing. Oh, but not so fast, we still need a frontend...
Well, we'll handle that in the next few days. I'm just glad to be *nearly* finished with this fucking compiler, I want nothing to do with anything else ever, but we know that's not going to happen, so Lord please end my pain.
No sponsor as this rant has been paid for by tax evasion. -
I'm sure this has been discovered before but I just realized that a lexer defined as a set of functions which tail-call each other with the leftover text to switch states can record the location of tokens from the back of the string, thereby eliminating a parameter from pretty much every function. The world is full of wonder.2
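A tiny Python sketch of what I mean (a hypothetical mini-lexer; Python won't actually eliminate the tail calls, but the shape is the same): every state function only receives the leftover text, and a token's position falls out as total length minus leftover length, so no offset parameter has to be threaded through.
SOURCE = "abc 123 xy"
TOTAL = len(SOURCE)
tokens = []
def pos(rest):
    return TOTAL - len(rest)   # location recovered from the back of the string
def lex_word(rest):
    end = 0
    while end < len(rest) and rest[end].isalpha():
        end += 1
    tokens.append(("word", pos(rest), rest[:end]))
    return lex_start(rest[end:])
def lex_number(rest):
    end = 0
    while end < len(rest) and rest[end].isdigit():
        end += 1
    tokens.append(("number", pos(rest), rest[:end]))
    return lex_start(rest[end:])
def lex_start(rest):
    if not rest:
        return tokens
    if rest[0].isspace():
        return lex_start(rest[1:])
    if rest[0].isdigit():
        return lex_number(rest)
    return lex_word(rest)
print(lex_start(SOURCE))   # [('word', 0, 'abc'), ('number', 4, '123'), ('word', 8, 'xy')]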
-
I was just wondering what languages are most apt for building a group of web applications that will manage huge amounts of data, represent them in graphical form and, through repeated learning, identify trends, detect negative ones and suggest measures against them?
In simple words, where should I start in developing an environment where data can merge with machine learning and websites with an aesthetic interface? How many people will such a project require, and in what areas will they have to specialize?1 -
Things I say to my clients when I know that a reboot is required to fix their issue but I don't have enough evidence to prove it to them :
"... On any computing platform, we noted that the only solution to infinite loops (and similar behaviors) under cooperative preemption is to reboot the machine. While you may scoff at this hack, researchers have shown that reboot (or in general, starting over some piece of software) can be a hugely useful tool in building robust systems.
Specifically, reboot is useful because it moves software back to a known and likely more tested state. Reboots also reclaim stale or leaked resources (e.g., memory) which may otherwise be hard to handle. Finally, reboots are easy to automate. For all of these reasons, it is not uncommon in large-scale cluster Internet services for system management software to periodically reboot sets of machines in order to reset them and thus obtain the advantages listed above.
Thus, when you indeed perform a reboot, you are not just enacting some ugly hack. Rather, you are using a time-tested approach to improving the behavior of a computer system."
😎1 -
It should be possible to prove the Collatz conjecture by mapping the unit-digit transitions between numbers into a finite state machine. From there we could use predicates and quantifiers to prove, by process of exclusion, that for any given combination of tens digit and ones digit, no number can transition to anything but what's specified in the state machine, assuming that number equals x in 3x+1 or x/2
Ipso facto, a series of equations proving, by process of elimination, that the state machine's transitions are the only allowable ones would prove the Collatz conjecture by proving the FSM is a valid representation for any given integer N.
I'm actually working on it now but I don't know enough about modular arithmetic and predicate logic to write a proof. I just have the state diagrams on some dot matrix paper at the moment.
If anyone wants to beat me to it, feel free.
So for example any number ending in 13 will, after 3x+1, end in 40.
Any number ending in 40 will end in 20. Any number ending in 20 will end in 10, which will end in 5 as the unit digit.
It's easier to prove in the single digit case, and the finite state machine for that is already written, at least on paper.
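For the two-digit case, a quick empirical Python sketch (just a tool for dumping the diagram, not part of any proof) saves me the dot matrix paper:
from collections import defaultdict
def step(n):
    return n // 2 if n % 2 == 0 else 3 * n + 1
transitions = defaultdict(set)
for n in range(1, 100_000):
    transitions[n % 100].add(step(n) % 100)
print(sorted(transitions[13]))      # [40], the 13 -> 40 example above
for state in sorted(transitions):   # dump the whole two-digit transition diagram
    print(state, "->", sorted(transitions[state]))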
I'll post pictures when I get a chance.7 -
Went for an interview as a senior Java developer. They asked me to answer 3 pages of coding questions: read the code and state what it does. What's worse, the code had no main method, and they asked whether it could be executed without errors. What kind of answer do they expect for that?
I read all the questions, and every single one was written without a main method 🤣🤣.
Not sure if they were really that clueless or just testing me, but I still stated my answer: "it executes with an error message.."
Later on, the manager didn't even show up to interview me and the other 3 candidates.
That's really funny. They asked us to leave and wait for their feedback.
A few months later I met my ex-colleague, who had just resigned from that company. I told him about the test, and surprisingly he got them to update it 🤣🤣🤣.
Lucky me; if I had chosen to work there it would have been a lot of hell.
fyi, my friend works as an SCM, Software Configuration Manager, and he always jokes that his position makes him The Manager 🤣. I honestly believed it for a month when we first worked at the same company, and only realized the truth when he had to configure my machine according to company rules. Dammit dude -
OK. We've got this tiny little pet project of mine (work related)…
I rescued it from the git archive. Simply put: someone hot-glued an Elasticsearch scroll + document processor (processing) together.
After a lot of refactoring, I had a simple, much improved (non-parallel) Akka worker system without an Akka topology / hierarchy.
I left out the hierarchy at first, because I didn't know Akka at all.
I've worked with a lot of process workflows, and some systems that come very close to IPC, so I wasn't completely in the dark.
Topology requires knowledge / creation of a state machine / process workflow. And at that point of time I just had... Garbage. Partially working garbage.
Yesterday I finished the rewrite into several actors... Compared to before there are 8 actors vs 2... and roughly 20 more classes, mostly because I rewrote the Akka receive methods around Command DTOs... and a lot of functions needed to be separated into layers (which were non-existent before).
That felt more natural than the previous chaos of passing strings or other primitive types around, or in the worst case just Object....
(Yes: previously an actor was essentially a class with one or more "doEverything" functions and maybe a few additional ones which did everything - from REST client to processing.)
Then I drew the actual state machine based on everything I've written in the last weeks and thought about how to create the actual topology and where / how parallelizing might make sense.
Innocent me stumbled on Akka Typed in the Akka docs... (Didn't know it existed, since I'm very new to Java and Akka.)
Hm, that sounds a lot like what I did. In a different way, yes, but not so different that it would be VERY hard to port to.... And I'd need to change a few classes anyway to implement the hierarchy....
[I should have known at this stage that my curiosity would get the best of me, but yeah. Curiosity killed the cat.]
Actually the documentation is not bad. It's just that upon reading the first more complex examples, my brain decided to go into panic state.
They've essentially combined all the classes into one class in every source code example [which makes more sense later], which makes it fscking hard for a chaotic brain like mine to extract information....
https://doc.akka.io/docs/akka/...
The thing is: It's not hard to understand… actually very simple.
It was just my brain throwing a fuck-you tantrum.
So I opened more examples in other tabs and cross-referenced what happened there and why...
A few frustrating hours later I got that part.... And the part about why it's called Akka Typed. It was pretty simple....
Open the gates of hell, bloody Satan, that was too easy, for fuck's sake.
Nooooow.... I just need to port my stuff to Akka Typed.
Cause. Challenge accepted, bitch - eh brain. You throw tantrum, you work overtime. -.-
I just cannot decide whether to go FP or OOP.
Now... I'm curious whether FP is that hard... I haven't dealt with it much before.
Can someone please stop me... I'm far too curious again. -.- *cries*6 -
In JavaScript, is there a difference between mimicking a "chain pattern" or state changes with separate function calls and if/else if/else, and using the chain or state pattern directly? The internet gave me no real/helpful answer to that.
Suppose that:
if (isThingA(thing)) {
  makeThingB();
} else if (isThingB(thing)) {
  makeThingC();
} else {
  makeThingA();
}
That code is always executed e.g. after a user mouse click. "thing" gets defined in some other code.
It can be seen as a state machine that goes back to its starting point.
Is a pattern with objects/classes/prototypes even needed/preferred instead?
It's partly a problem I'm facing in my code but it's also interesting to know ideas/thoughts on this.3 -
Today I've been summoned to work for the first time in weeks to help with the startup of a machine and to test the HMI software that goes with it.
Me and a junior colleague go to the machine and try to get everything ready for testing. The machine was left stuck in some intermediate state by someone else. I have no idea how to control its individual components. My colleague received a crash course a while ago but was unable to reinitialize the damn thing, and the senior machine builder was too busy on another project.
In other words, me coming over had no purpose at all, and we accomplished nothing.
I really don't understand companies. On one hand there's endless bitching about how everything is too expensive, and on the other you see 'em toss buckets of money out the window.
Oh well, as long as it goes from the window to my bank account, there's no problem for me I guess.2 -
Need advice:
So I’m 20 years old. Got a decent job as software engineer with a really good pay and really want to break into machine learning.
Mastered NodeJS (my stack has always had node for the past 5-6 years) and I’m finding it difficult to switch to python for machine learning since things are so engraved in my head in javascript.
Aside from the syntax when I’m watching tutorials or reading books, I see data scientists and mathematicians make design mistakes in their code and it hurts my eyes and triggers my ocd.
I need tips on how to put my mindset in a moldable state so I can judge less and learn more and absorb data. Like you know that philosophy that when u get old your brain can’t learn things as fast anymore? I feel like that’s already happening to me rn at the age of 20.5 -
Hey guys,
Need an opinion if this is gunna work or not
I have a machine that currently has Windows 10 deactivated, and I want to activate it with a Win 7 license, except I don't want to lose the state of the machine.
Here's what I plan to do:
Clone the machine to an external drive, wipe and reinstall Windows 10 and activate it with the Windows 7 license. Get it to the point where it's on the desktop. Then clone it back to the state it was in before.
The thought behind this is that the activation is tied to the mobo, so the data/install of the actual Windows doesn't matter.
I know this isn't exactly a dev question, but I figured it's still kind of in the same area so it would be ok.
Thanks to all that reply ❤️15 -
We’re only random people living in random places, speaking random languages, eating random food, sleeping, studying and working random hours. Traveling to random points on a sphere.
Just the random range is different.
Just random stuff happens at the crossroads of two random dots, and the entropy speeds up or slows down.
Nothing special at all.
Just a finite state machine iteration.
I mean, the amount of effort we put into explaining infinity is outstanding.
What if there is no infinity at all?
What if infinity is just a misunderstanding in our interpretation of the world around us? It's just pixels, resolution, Gaussian splatting, quantum states, you name it.
Hey man, the world is flat. Just put it in 2D space. How much space do you need from a simulation perspective, where your patient eyes can only see up to a certain amount of light particles per second through a shitty lens?
Propose world optimization techniques: slow down subject perception, tiredness introduced. Compress memory, sleep introduced. Limit neurons, CPU power assigned. Deploy on cloud - put it to life. Exit 0: body failure. Exit 1: suicide. Kill -9: killed by tty from IP EARTH.X.Y
What can you do to make the world around this planet look alive? Make it blink.
We developers are lazy and I believe that nature is even more lazy than us.
You think you’re going to elevator right now ? You’re going to the preloader. Looking at the window equals playing video from playback. Never goes live, just precomputed fsm. Cars, trains, airplains ? Preloaders everywhere. Highways to split traffic to cities and communication. The road and cities planning department is a matrix maintenance department. And don’t get me started about space.
Space is empty because it’s not even finished. So they put it all behind glass called milky way. You know how glass looked 500 years ago ? It was milky so it’s milky way so we don’t see shit.
If the space would be finished I’ll be starting writing this text from mars, finished it and sent from earth but no it’s light years guys, light years is not a second for a matter. Light year is a second of the the injected thoughts exchange only. Thoughts of the global computer called generative AI that they introduced on local computing devices called cloud.
Even the preloader system is not present, they left us with the one map and overpopulated demo. What a shit hole.I bet they’re increasing temperature right now to erase this alpha build and cash out. Obviously so many bugs here that his one can’t be fixed anymore. To many viruses.
Hope for 0days to start happening so we can escape using time travel or something.
I bet they cut the budget or something and moved the team to other projects. Or, even worse, the solar system team got laid off because we are just neurons that were ordered to do it. And now we're stuck in some maintenance mode, no new physics, no new thoughts to pursue, just slow degeneration. I would pay more for the next run and switch to another galaxy far far away where they at least have more modern light speed technology.
What do you think about it, Trinity? Not even worth wasting your time on. No white rabbit this time.
I do not recommend this game at this stage of early access.
- only one available map; despite promises of expansions over the years, not a single DLC arrived,
- missing space adventures
- no galaxy travel mode, only teaser trailers of what you can do in other "universes"
- developers don't respond to complaints
- despite the diversity of species and buildings at first sight, the world looks too generic
- instead of new features, bots with mind manipulation, A/B testing and data harvesting were introduced
- death anti-cheat mode installed1 -
This question might make you lose a brain cell because of how stupid it is. Read with caution.
Is there a way to compile a game for Windows from Linux in Unreal Engine? I did google some posts, but the answers were either to use a virtual machine (which will not be done), to use the theoretical mingw route (which the forum posts say is tricky business), or to use a Windows machine. I have dual-booted Windows and Linux on my machine.
However, since the machine has a 512 GB SSD, most of the storage space is devoted to Unreal Engine, which takes 47 GB by itself, and with a lot of other programs installed I have a usable 20 GB left out of a 145 GB partition. Windows has around 318 GB of storage, but only about 100 GB of it free. So after installing the Windows SDK, Visual Studio with extensions, Unreal Engine and some other stuff, I don't have much space left for myself, and I need that space since I install a lot of games to the SSD. So now I can't load my bigger projects for playing on Windows. I could use my HDD, which is mostly used for backups and 100+ GB stuff, but HDDs are of course far slower than SSDs. That shouldn't be a problem, except that last time I used Visual Studio it ate more than 2 GB of RAM for a single solution, leaving the compiler very little memory to actually compile with, so for any large files the HDD becomes even more of a bottleneck.
Oh, and I can't upgrade my SSDs or RAM because I don't have enough money.
Thanks for the answers in advance4 -
You know
When I first saw Ethereum talking about a distributed state machine I thought: wow. Not very practical, but NEAT. I envisioned being able to make a bytecode that could be stored in transactions and run by individual clients asynchronously, with each step of the execution and the values of the managed RAM stored at intervals, so other clients could take over, execute a few more statements, and compare against what should always be identical expected results
A grand, incredibly inefficient system, but really neato from the theoretical computer nerd standpoint!
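Something like this crude Python toy is what I had in mind (purely hypothetical, and nothing to do with how Ethereum actually works): a deterministic little VM whose whole state fits in a snapshot, so any client can resume from a checkpoint and everyone has to land on the exact same state.
def run(program, state, steps):
    pc, acc = state   # the whole "machine" is just (program counter, accumulator)
    for _ in range(steps):
        if pc >= len(program):
            break
        op, arg = program[pc]
        if op == "add":
            acc += arg
        elif op == "mul":
            acc *= arg
        pc += 1
    return (pc, acc)
program = [("add", 5), ("mul", 3), ("add", 1)]
checkpoint = run(program, (0, 0), 2)   # client A runs two steps and publishes the state
print(checkpoint)                      # (2, 15)
print(run(program, checkpoint, 1))     # client B resumes and must arrive at (3, 16)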
Boy was I disappointed, lol. All it is is a basic contracts language, yet they state it could be like a world computer! How? I thought maybe, with enough nodes participating, you could store registers and the like in transaction values? Wouldn't that be the way?
As a world computer they seem stuck somewhere in usability between very simplistic JS and something prior to amptron, yet they advertise it as a world computer
Am I missing something? I mean, you could create something that translates higher-level code into small numeric statements and then feeds it additional values, but what would that be useful for, and how would you actually store anything?