Search - "script"
-
Based loosely on the popular "git" command, I am happy to announce my new product, "hit"!
Essentially, hit hooks into "git blame" and automatically slaps the shit out of whoever wrote this garbage.
It uses SOHTTP (Slap Over HTTP) to deliver a nice firm wallop to any subpar script kiddie that had the audacity to come up with this bullshit.
Careful, the user is not immune to the effects10 -
In the beginning of time, when The Company was small and The Data could fit in some fucking excel sheets, Those Who Came Before implemented some java tool to issue invoices, notify customers and clear received payments.
Then came the Time Of The Great Expanse, when The Company grew to unthinkable levels. Headcount increased with each passing day, and The Data shows that everything was going great!
But when the future seemed bright, came The Stall-Out. The days when The Company could not expand as fast as it did before. And Those Who Came Before left, abandoning their Undocumented Java Tool to its fate.
Those who came after knew nothing of the inner workings of the Undocumented Java Tool. They knew only that the magical Jar would take a couple fucking excel spreadsheets and spit out reports and send emails like magic.
And those were The Dark Days.
In the darkness, The Data grew to be a monster. Soon a fucking excel spreadsheet could not hold The Data any longer. Those Who Came After, fearing the wrath of The Undocumented Java Tool, dared not mess with its code. Instead, they fucking cut away the lowest volume transactions from the fucking input spreadsheet, and left the company to report the unbilled invoices as "surprise losses". Fucking script kiddies, were Those Who Came After.
Then, at The Darkest of Days (literally, Dec 21st), marched into the project The Six Witchers, who fear not the Demon of Refactoring.
This story is still unfolding. Will The Six Witchers manage to unravel the mysteries of The Undocumented Java Tool? Will they be able to reverse engineer the fucking black box, and scale its magic into a modern application?
Will they decrease revenue forecasting error by at least 2% in a single strike?
Only the future will tell.16 -
What do you call it when you type a script? A TypeScript. Then you spill coffee on it...
now it's a JavaScript.4 -
Worst hack/attack I had to deal with?
Worst, or funniest. A partnership with a Canadian company got turned upside down and our company decided to 'part ways' by simply not returning his phone calls/emails, etc. A big 'jerk move' IMO, but all I was responsible for was a web portal into our system (submitting orders, inventory, etc).
After the separation, I removed the login permissions, but the ex-partner system was set up to 'ping' our site for various updates and we were logging the failed login attempts, maybe 5 a day or so. Our network admin got tired of seeing that error in his logs and reached out to the VP (responsible for the 'break up') and requested he tell the partner their system is still trying to log in and to stop it. A couple of days later, we were getting random 300, 500, 1000 failed login attempts (causing automated emails to notify that there was a problem). The partner knew that we were likely getting alerted, and kept up the barrage. When alerts get high enough, they are sent to the IT-VP, which gets a whole bunch of people involved.
VP-Marketing: "Why are you allowing them into our system?! Cut them off, NOW!"
Me: "I'm not letting them in, I'm stopping them, hence the login error."
VP-Marketing: "That jackass said he will keep trying to get into our system unless we pay him $10,000. Just turn those machines off!"
VP-IT : "We can't. They serve our other international partners."
<slams hand on table>
VP-Marketing: "I don't fucking believe this! How the fuck did you let this happen!?"
VP-IT: "Yes, you shouldn't have allowed the partner into our system to begin with. What are you going to do to fix this situation?"
Me: "Um, we've been testing for months and already went live some time ago. I didn't know you defaulted on the contract until last week. 'Jake' is likely running a script. He'll get bored of doing that and in a couple of weeks, he'll stop. I say let's ignore him. This is really a network problem, not a coding problem."
IT-MGR: "Now..now...lets not make excuses and point fingers. It's time to fix your code."
IT-VP: "I agree. We're not going to let anyone blackmail us. Make it happen."
So I figure out the partner's IP address, and hard-code the value in my service so it doesn't log the login failure (if IP = '10.50.etc and so on' major hack job). That worked for a couple of days, then (I suspect) the ISP re-assigned a new IP and the errors started up again.
After a few angry emails from the 'powers-that-be', our network admin stops by my desk.
D: "Dude, I'm sorry, I've been so busy. I just heard and I wished they had told me what was going on. I'm going to block his entire domain and send a request to the ISP to shut him down. This was my problem to fix, you should have never been involved."
After 'D' worked his mojo, the errors stopped.
Month later, 'D' gave me an update. He was still logging the traffic from the partner's system (the ISP wanted extensive logs to prove the customer was abusing their service) and like magic one day, it all stopped. ~2 weeks after the 'break up'.8 -
Early in my database developer career, I started a new job at a mortgage company. I was poking around the code, just getting familiar with things.
One script identified properties in certain states/territories for special handling:
AND STATECD IN ('PR', 'HI', 'AL')
I thought “Puerto Rico, Hawaii, and ALABAMA?? One of these things is not like the other…”
Yeah, that was supposed to be Alaska (AK)… But they’ve been special-handling properties in Alabama for years. -
Reading 3GPP standards is so painful. I download a document to read it. A month or so later, an updated version is out. I need to download it again. It's so annoying to keep checking. I guess I'll have to write yet another script for yet another fucking problem.
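For what it's worth, the version-watcher script could be as dumb as a HEAD request compared against the last ETag/Last-Modified seen. A minimal sketch; the URL and state file below are placeholders, not real 3GPP paths:
~~~
# Rough sketch of a "did the spec change?" check. SPEC_URL and STATE_FILE
# are placeholders -- point them at the actual document you care about.
import json
import requests  # assumes the requests package is installed

SPEC_URL = "https://example.org/3gpp/ts_38_331.zip"  # hypothetical location
STATE_FILE = "spec_state.json"

def load_state():
    try:
        with open(STATE_FILE) as f:
            return json.load(f)
    except FileNotFoundError:
        return {}

def check_for_update():
    state = load_state()
    resp = requests.head(SPEC_URL, timeout=30)
    resp.raise_for_status()
    marker = resp.headers.get("ETag") or resp.headers.get("Last-Modified")
    if marker and marker != state.get("marker"):
        print("Spec changed, re-download it")
        with open(STATE_FILE, "w") as f:
            json.dump({"marker": marker}, f)
    else:
        print("Still the same version")

if __name__ == "__main__":
    check_for_update()
~~~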
The documents themselves are not that good either. My work would be a lot easier if there were a web interface for the standards.3 -
Most kids just want to code. So they see "Computer Science" and think "How to be a hacker in 6 weeks". Then they face some super simple algebra and freak out, eventually flunking out with the excuse that "uni only presents overtly theoretical shit nobody ever uses in real life".
They could hardly be more wrong, of course. Ignore calculus and complexity theory and you will max out on efficiency soon enough. Skip operating systems, compilers and language theory and you can only ever aspire to be a script kiddie.
You can't become a "data scientist" without statistics. And you can never grow to be even a mediocre one without solid basic research and physics training.
Hack, I've optimized literal millions of dollars out of cloud expenses by choosing the best processors for my stack, and weeks later got myself schooled (on devRant, of all places!) over my ignorance of their inner workings. And I have a MSc degree. Learning never stops.
So, to improve the CS experience in uni? Tear down students' expectations, and boil out the "I just wanna code!" kiddies to boot camps. Some of them will be back to learn the science. The rest will peak at age 33.17 -
it finally happened, PM/PO/SM is now also dev シ
each standup he now also gives an update about his work in progress, with his diligently designed subtasks, perfectly sticking to the daily script, like a good well-behaved employee boi. it's weirdly cute
devs can't await to review his first PR. feeling the strong urge to spank his ass for each minor coding style issue or poorly-named variable...16 -
Years ago I used to work at a guvmant site. They had really strict security rules for internet use and how you spent your time. Makes sense considering what that site did. I was a support engineer for some of their process control equipment.
I was approached by an operator supervisor to install dvd player software on a business machine (non process related). Basically just a general purpose PC with no function other than time cards and general office use. I was fine with the request, but the reason was for watching movies during a holiday period by the operators. Not for anything official. So I made some noise about my dislike of this request feigning moral superiority. But the supervisor swore up and down it was for "training" dvds.
So I wrote a simple windows script. The script basically popped up a window that said:
"Security has detected unauthorized media inserted into this machine. Please state the reason for this infraction." It provided a dialog to enter a justification. After you entered the justification it said: "Security has been contacted and your user logged. You will be contacted shortly."
This script was then attached to the supervisor's Startup folder so it ran when he, and only he, logged in. We made sure the "training" video (some movie) was already inserted at this point.
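The original was a Windows script; purely to illustrate the flow (warning, fake justification prompt, fake "security contacted" notice), a rough Python/tkinter re-creation might look like this. The dialog wording comes from the story above; everything else is made up:
~~~
# Not the original Windows script -- a rough tkinter re-creation of the prank flow.
import tkinter as tk
from tkinter import messagebox, simpledialog

def main():
    root = tk.Tk()
    root.withdraw()  # no main window, just dialogs
    messagebox.showwarning(
        "Security Notice",
        "Security has detected unauthorized media inserted into this machine.",
    )
    simpledialog.askstring(
        "Security Notice",
        "Please state the reason for this infraction:",
        parent=root,
    )
    messagebox.showinfo(
        "Security Notice",
        "Security has been contacted and your user logged.\n"
        "You will be contacted shortly.",
    )

if __name__ == "__main__":
    main()
~~~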
He logged in. He just about shit his pants when reading this. He promptly logged off and left the building to walk somewhere else in the site. We called him and let him know it was a gag. His response: That son of a bitch Demolishun!2 -
So for my programming class, we had to make a game using Scratch. No problem, I said. Scratch is easy stuff. Just drag and drop blocks. Like legos. Legos that actually do shit. Cool.
So my game is about a dog underneath a plinko set, dodging balls that come down the plinko thing. Easy enough. I figured I would spice things up a bit. My teacher has to go through 20 of these games, I figured I'd make mine interesting. I add a little heart system.
Now for those of you who don't know Scratch, or don't care enough to look it up, all of Scratch's codes are within the sprite themselves. They can communicate with other sprites with a thing called broadcasting. When other sprites receive a broadcast, it can activate a script. yeah, cool.
So I had a script on the dog, that broadcasts a message to the heart system to remove a heart when the dog is hit. So to keep things short, I call the broadcast "Dog's hit."
For anyone who knows programming, computers have no clue what an apostrophe or a space is. They can't read it unless you have it all letters, maybe a semicolon. So, I removed the space and apostrophe, with my innocent 17 year-old mind not realizing this makes it "Dogshit."
Game's finished. Finally. Due date comes in, I submit it all proud and everything. I just created the best dog-plinko simulator of all time. Later that day, I show it to my friend, who then points out the typo.
At this point, my teacher already graded it. I went down to see him after school, and he must've known why I went down as soon as I walked in the door, and just cracked up. He told me it was fine, and not to do it again.
I left red.4 -
"So Alecx, how did you solve the issues with the data provided to you by hr for <X> application?"
Said the VP of my institution in charge of my department.
"It was complex, sir. I could not figure out much of the general ideas of the data schema since it came from a bunch of people not trained in I.T. (HR), so I had to run some experiments on the data to find the relationships within it. That surfaced about 4 different relations in the data; the program determined them for me based on the most common type of data, the model deemed it a "user", and from that I just extracted the information that I needed and generated the tables through Golang's gorm"
VP nodding and listening intently...."how did you make those relationships?" me "I started a simple pattern recognition module through supervised mach..." VP: Machine learning, that sounds like A.I
Me: "Yes sir, it was, but the problem was fairly easy for the schema to determ.." VP: A.I, at our institution, back in my day it was a dream to have such technology, you are the director of web tech, what is it to you to know of this?"
Me: "I just like to experiment with new stuff, it was the easiest route to determine these things, I just felt that I should use it if I can"
VP: "This is amazing, I'll go by your office later"
Dude speaks wonders of me. The idea was simple: read through the CSV that was provided to me, have the parsing done in a notebook, make it determine the relationships in the data and spout out a bunch of JSON that I could use. Hook it up to a simple gorm golang script and generate the tables for that. Much simpler than the bullshit that we have in php. I used this to create a new database since the previous application had issues. The app will still have a php frontend and backend, but now I don't leave the parsing of the data to php, which, quite frankly, php sucks at imho. The Python codebase will then create the json files through the predictive modeling (98% accuracy) and then the go program will populate the db for me.
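The model/relationship-detection part is the secret sauce and isn't reproduced here; the CSV-to-JSON glue step on its own is roughly this (file names made up):
~~~
# Minimal sketch of the CSV -> JSON step only (the relationship detection /
# supervised model is the interesting bit and isn't reproduced here).
import csv
import json

def csv_to_json(csv_path: str, json_path: str) -> None:
    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))  # one dict per HR record
    with open(json_path, "w", encoding="utf-8") as f:
        json.dump(rows, f, indent=2)

if __name__ == "__main__":
    csv_to_json("hr_export.csv", "users.json")
~~~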
There are also some node scripts that help test the data since the data is json.
All in all a good day of work. The VP seems scared since he knows no one on this side of town knows about this kind of tech. Me? I am just happy I get to experiment. Y'all should have seen his face when I showed him a rather large app written in Clojure, the man just went 0.0 when he saw Lisp code.
I think I scare him.12 -
So, first time ranting, sorry if I mess anything up.
When I first started my current job and got introduced to the system we were coding in, something seemed a little fishy to me. Didn't like the system anyway, but at least the language is a compiler language, so it runs quite quickly, right?
In theory, yeah. If the lead dev liked the IDE that came with it. But he has to REALLY fucking hate it, because rather than using it, he codes in plaintext. No syntax highlighting, no auto-indent, nothing. And he's built the entire damn system around doing that. Sadly the compiler is only integrated into the IDE, so what do we do there? Copy the code from the plaintext file to the IDE to compile it there? No no, why would you. The language has a function you can use to compile some code at runtime.
And so he does. Every. Single. Fucking. Script. There's a single main script that runs and finds the correct textfile to then runtime-compile and execute. So we effectively made a compiler language into a massively unoptimized interpreter lang.
I even mentioned that this might be a problem, but I was completely dismissed, so at that point it's not my problem anymore and I have since switched to a different system anyway.
Couple weeks later I heard the same guy complaining that the scripts were running almost the whole night so we'd probably need some better hardware or something.
Well if only there was a really obvious solution that would improve the performance by probably about a factor of 20 or so...13 -
Make a script which can tell you what is happening on your server from your mobile device: basically, server analytics11
-
Everyone and their dog is making a game, so why can't I?
1. open world (check)
2. taking inspiration from metro and fallout (check)
3. on a map roughly the size of the u.s. (check)
So I thought what I'd do is pretend to be one of those deaf mutes. While also pretending to be a programmer. Sometimes you make believe so hard that it comes true apparently.
For the main map I thought I'd automate laying down the base map before hand-tweaking it. It's been a bit of a slog. Roughly 1 pixel per mile (okay, 1973 by 1067). The u.s. is 3.1 million square miles; this would work out to 2.1 million square miles instead. Eh.
Wrote the script to filter out all the ocean pixels, based on the elevation map, and output the difference. Still had to edit around the shoreline but it sped things up a lot. Just attached the elevation map, because the actual one is an ugly cluster of death magenta to represent the ocean.
Consequence of filtering is, the shoreline is messy and not entirely representative of the u.s.
The preprocessing step also added a lot of in-land 'lakes' that don't exist in some areas, like death valley. Already expected that.
But the plus side is I now have map layers for both elevation and ecology biomes. Aligning them close enough so that the heightmap wasn't displaced, and didn't cut off the shoreline in the ecology layer (at export), was a royal pain, and as super finicky. But thankfully thats done.
Next step is to go through the ecology map, copy each key color, and write down the biome id, courtesy of the 2017 ecoregions project.
From there, I write down the primary landscape features (water, plants, trees, terrain roughness, etc), anything easy to convey.
Main thing I'm interested in is tree types, because those, as tiles, convey a lot more information about the hex terrain than anything else.
Once the biomes are marked, and the tree types are written, the next step is to assign a tile to each tree type, and each density level of mountains (flat, hills, mountains, snowcapped peaks, etc).
The reference ids, colors, and numbers on the map will simplify the process.
After that, I'll write an exporter with python, and dump to csv or another format.
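A minimal sketch of that exporter step (key color to tile id, dumped as a CSV grid) could look like the following; the color table is a made-up stand-in for the real ecoregion key:
~~~
# Sketch of the "exporter": read the ecology layer, map each pixel's key
# color to a biome/tile id, and dump a CSV grid. The color table is a
# placeholder for the real ecoregion key.
import csv
from PIL import Image  # assumes Pillow is installed

COLOR_TO_ID = {
    (34, 139, 34): 1,    # e.g. conifer forest
    (210, 180, 140): 2,  # e.g. desert/scrub
    (65, 105, 225): 0,   # water
}

def export_tiles(png_path: str, csv_path: str) -> None:
    img = Image.open(png_path).convert("RGB")
    w, h = img.size
    px = img.load()
    with open(csv_path, "w", newline="") as f:
        writer = csv.writer(f)
        for y in range(h):
            # unknown colors fall back to -1 so they're easy to spot later
            writer.writerow([COLOR_TO_ID.get(px[x, y], -1) for x in range(w)])

if __name__ == "__main__":
    export_tiles("ecology_layer.png", "tiles.csv")
~~~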
Next steps are laying out the instances in the level editor, that'll act as the tiles in question.
Theres a few naive approaches:
Spawn all the relevant instances at startup, and load the corresponding tiles.
Or setup chunks of instances, enough to cover the camera, and a buffer surrounding the camera. As the camera moves, reconfigure the instances to match the streamed in tile data.
Instances here make sense, because if theres any simulation going on (and I'd like there to be), they can detect in event code, when they are in the invisible buffer around the camera but not yet visible, and be activated by the camera, or deactive themselves after leaving the camera and buffer's area.
The alternative is to let a global controller stream the data in, as a series of tile IDs, corresponding to the various tile sprites, and code global interaction like tile picking into a single event, which seems unwieldy and not at all manageable. I can see it turning into a giant switch case already.
So instances it is.
Actually, if I do 16^2 pixel chunks, it only works out to 124x68 chunks in all. A few thousand, mostly inactive chunks is pretty trivial, and simplifies spawning and serializing/deserializing.
All of this doesn't account for
* putting lakes back in that aren't present
* lots of islands and parts of shores that would typically have bays and parts that jut out, need reworked.
* great lakes need refinement and corrections
* elevation key map too blocky. Need a higher resolution one while reducing color count
This can be solved by introducing some noise into the elevations, varying say, within one standard div.
* mountains will still require refinement to individual state geography. Thats for later on
* shoreline is too smooth, and needs to be less straight-line and less blocky. less corners.
* rivers need added, not just large ones but smaller ones too
* available tree assets need to be matched, as best and fully as possible, to types of trees represented in biome data, so that even if I don't have an exact match, I can still place *something* thats native or looks close enough to what you would expect in a given biome.
Ponderosa pines vs white pines for example.
This also doesn't account for 1. major and minor roads, 2. artificial and natural attractions, 3. other major features people in any given state are familiar with. 4. named places, 5. infrastructure, 6. cities and buildings and towns.
Also I'm pretty sure I cut off part of florida.
Woops, sorry everglades.
Guess I'll just make it a death-zone from nuclear fallout.
Take that gators!5 -
I hate web dev because I know enough to know how it "should" be done, know that it's not done that way, but also don't know enough to do it that way quickly.
Or why I spent the last 2 days scripting deployment of a website that costs $9 a year to run, when I couldn't find a deployment script anywhere else.3 -
My God is map development insane. I had no idea.
For starters did you know there are a hundred different satellite map providers?
Just kidding, it's more than that.
Second there appears to be tens of thousands of people whose *entire* job is either analyzing map data, or making maps.
Hell this must be some people's whole *existence*. I am humbled.
I just got done grabbing basic land cover data for a neoscav style game spanning the u.s., when I came across the MRLC land cover data set.
One file was 17GB in size.
Worked out to 1px = 30 meters in their data set. I just need it at a one mile resolution, so I need it in 54px chunks, which I'll have to average, or find medians on, or do some sort of reduction.
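For the reduction itself, a sketch along these lines would do; since land-cover codes are categorical, the most common value per 54x54 block probably makes more sense than a mean (paths and array shapes are placeholders):
~~~
# Sketch of the 30 m -> ~1 mile reduction: take the most common class code in
# each 54x54 block. Paths/shapes are placeholders.
import numpy as np

def block_reduce_mode(grid: np.ndarray, block: int = 54) -> np.ndarray:
    h, w = grid.shape
    h, w = h - h % block, w - w % block            # drop the ragged edge
    grid = grid[:h, :w]
    out = np.empty((h // block, w // block), dtype=grid.dtype)
    for by in range(h // block):
        for bx in range(w // block):
            tile = grid[by*block:(by+1)*block, bx*block:(bx+1)*block]
            vals, counts = np.unique(tile, return_counts=True)
            out[by, bx] = vals[np.argmax(counts)]  # most frequent class code
    return out

# usage: coarse = block_reduce_mode(np.load("landcover_chunk.npy"))
~~~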
Ecoregions.appspot.com actually has a pretty good data set but that's still manual. I ran it through gale and theres actually imperceptible thin line borders that share a separate *shade* of their region colors with the region itself, so I ran it through a mosaic effect, to remove the vast bulk of extraneous border colors, but I'll still have to hand remove the oceans if I go with image sources.
It's not that I havent done things involved like that before, naturally I'm insane. It's just involved.
The reason for editing out the oceans is because the oceans contain a metric boatload of shades of blue.
If I'm converting pixels to tiles, I have to break it down to one color per tile.
With the oceans, the boundary between the ocean and shore (not to mention depth information on the continental shelf) ends up sharing colors when I do a palette reduction, so that's a no-go. Of course I could build the palette by hand, from sampling the map, and then just measure the distance of each sampled rgb color to that of every color in the palette, to see what color it primarily belongs to, but as it stands ecoregions' coloring of the regions has some of them *really close* in rgb value as it is.
Now what I also could do is write a script to parse the shape files, construct polygons in sdl or love2d, and save it to a surface with simplified colors, and output that to bmp.
It's perfectly doable, but technically I'm on savings and supposed to be calling companies right now to see if I can get hired instead of being a bum :P20 -
[CMS Of Doom™]
Ah, yes, their built-in bullshit newsletter module just sent the n-th user n emails. Wonderful considering n=368.
The culprit? Better don't ask...
OK, anyway: So the mailer is running as a CRONjob, but nah, not as a console script call but by a public HTTP GET URL call, fucking obviously (it's the CMS Of Doom for a reason).
So these fucking imbeciles "implemented" an ob_start() callback where HTML links are - for whatever fucking reason - modified by some regex (obviously everybody knows parsing HTML by Regex is trivial). In this case the link was somehow modified to recall the mailer Cronjob...
This must have upset the ongoing mailing process, thus spamming mails. Whyyyy
And I've thought I've seen it all after 6 months in this legacy hell...
This is why you don't run a company consisting of only beginners in PHP (including their "CEO")! -
Trigger Warning (2 of them):
Minecraft is the only redeeming piece of software ever written in Java script.7 -
Do not offer anyone to help them with their scripts, ever.
I had to do something as there were things like "cd $DIR; rm *". No checks if the folder got changed, no quotes to prevent breaking on spaces. A problem waiting to happen. And it did. We don't know what the script deleted in the wrong folder to this day.
The scripts have no functions, some files have over 50% duplicate code. I was an idiot and thought running it through shellcheck and doing basic prevention of them shooting their own foot would be enough.
And there is no way to convince the guy to start writing the code properly. Should have kept my mouth shut.2 -
You know the PHP legacy code base is complete garbage when it requires a script memory limit of 1.5GB.9
-
Normal colleagues pick the right slack channel and post a message there.
My colleague picks a random channel that has no relation to dev at all and posts a question/request there.
He must be using a random script to pick the next channel name :D -
Wasted a day as Shitlock Holmes with the build chain.
It would not reproduce the firmware hexfile that had been checked in. Reverse engineering that along with the mapfile to find the cause, it was a const string that was guarded by an ifdef from another file that was auto-generated as a prebuild step via a script that fetched some version control info.
Or, it would have been, if the installation instructions had been correct and someone had documented that no spaces are allowed in the absolute path name of the project. Otherwise, that shit just failed silently.
I then had to reverse engineer the intended workflow from the commit history in the version control to figure out that the last dev obviously hadn't quite understood the project specific workflow and how the version control interacts with these build scripts.
At least, I finally did get a matching hexfile.1 -
Betty: Opens slack chat with Bob, Tony and me to ask me to fix some data for a client who messed up the setup. (Don't worry, it just means building a script that takes 3 hours to complete and that I must supervise.)
Betty: Opens slack chat with Ron, Tim and me to ask me to force the system I made to ignore protocol because someone else’s fuck up made it so she didn’t get the output she expected.
Betty: proceeds to ask for status updates constantly on both chats. She also disguises them as her asking what she can do to “get it across faster” knowing there’s jack shit I or anyone can do to make it go “faster”.
Also Betty vomits BS about my micro service being unstable in front of managers even though it is its correctness that brought to light a bug fucking up thousands of records silently.
Go fuck yourself Betty ☺️ and fuck the client5 -
Sharing a first look at a prototype Web Components library I am working on for "fun"
TL;DR left side is pivot (grouped) table, right side is declarative code for it (Everything except the custom formatting is done declaratively, but has the option to be imperative as well).
====
TL;DR (Too long, did read):
I'm challenging myself to be creative with the cool new things that browsers offer us. Lani so far has a focus on extreme extensibility, abstraction from dependencies, and optional declarative style.
It's also going to be a micro CSS framework, but that's taking the back-seat.
I wanted to highlight my design here with this table, and the code that is written to produce this result.
First, you can see that the <lani-table> element is reading template, data, and layout information from its child elements. Besides the custom highlighting code (Yellow background in the "Tags" column, and green gradient in the "Score" column), everything can be done without opening even a single script tag.
The <lani-data-source> element is rather special. It's an abstraction of any data source, and you, as a developer can add custom data sources and hook up the handlers to your whim (the element itself uses the "type" attribute to choose a handler. In this case, the handler is "download" which simply sends a fetch request to the server once and downloads the result to memory).
Templates are stored in an html file, not string literals (Which I think really fucks the code) and loaded async, then cached into an object (so that the network tab doesn't get crowded, even if we can count on the HTTP cache). This also has the benefit of allowing me to parse the HTML templates once and then caching the parsed result in memory, so templates are never re-parsed from string no matter how many custom elements are created.
Everything is "compiled" into a single, minified .js file that you include on your page.
I know it's nothing extraordinary, but for something that doesn't need to be compiled, transpiled, packaged, shipped, and kissed goodnight, I think it's a really nice design and I hope to continue work on it and improve it over time1 -
German public service digitization. Websites celebrating the new "digital functionality" of the federal ID card, but if you need to renew the actual card, you have to visit a public administration center in person. There is no way to prove your existing valid ID in a zoom meeting, although that's a de-facto standard accepted even when opening a bank account. Plus they have all of my data, so they should know I have a valid ID, and they could just send the new one to my postal address.
So I have to appear in person at their offices, so I need an appointment, but in times of the covid pandemic, appointments are rare and only offered on a day-to-day basis in my hometown, which is why I have to visit their online appointment web app at 7 a.m. in the morning to grab one of the few appointments when they are released.
Don't tempt me to write a script that squats all the other appointment slots to resell at the highest prices...
The situation reminds me of the times when it was even harder to get a vaccination against covid, and the media kept reporting about the minority that refused to get vaxxed, so they didn't have to admit there wasn't enough vaccine anyway.
This rant is not about politics, it's about the failure of bureaucracy, but if it was about politics, I would just quote Rezo that it shows who had governed this state for sixteen years.
When I rant about German internet connectivity, people usually reply that the web is much better in Taipeh, Bangalore or Guadalajara, so I can still have some hope that it's not all of the world that's totally lost.
So give me some hope, folks.6 -
me: builds a python-script to transport data in .json-format into a config-file written in .xml for a coworker
my boss: "I am glad you have earned yourself a reputation as the 'programmer' in our team" -
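For illustration, that kind of glue script can be tiny; a minimal sketch, assuming a flat JSON object and a made-up XML config schema:
~~~
# A minimal sketch of that kind of glue script: flat JSON in, XML config out.
# The key names and config schema here are made up.
import json
import xml.etree.ElementTree as ET

def json_to_xml_config(json_path: str, xml_path: str) -> None:
    with open(json_path) as f:
        data = json.load(f)            # expects a flat {key: value} object
    root = ET.Element("config")
    for key, value in data.items():
        ET.SubElement(root, "setting", name=key).text = str(value)
    ET.ElementTree(root).write(xml_path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    json_to_xml_config("input.json", "config.xml")
~~~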
Up all damn night making the script work.
Wrote a non-sieve prime generator.
Thing kept outputting one or two numbers that weren't prime, related to something called carmichael numbers.
In any case, I got it to work, god damn was it a slog though.
Generates next and previous primes pretty reliably regardless of the size of the number
(haven't gone over 31 bit because I haven't had a chance to implement decimal for this).
Don't know if the sieve is the only reliable way to do it. This seems to do it without a hitch, and doesn't seem to use a lot of memory. Don't have to constantly return to a lookup table of small factors or their multiple either.
Technically it generates the primes out of the integers, and not the other way around.
It takes 0.01-0.02 seconds per prime up to around the 100 million mark, and then it gets into the 0.15-1 second range per generation.
At around primes of a couple billion, its averaging about 1 second per bit to calculate 1. whether the number is prime or not, 2. what the next or last immediate prime is. Although I'm sure theres some optimization or improvement here.
Seems reliable but obviously I don't have the resources to check it beyond the first 20k primes I confirmed.
From what I can see it didn't drop any primes, and it didn't include any errant non-primes.
Codes here:
https://pastebin.com/raw/57j3mHsN
Your gotos should be nextPrime(), lastPrime(), isPrime, genPrimes(up to but not including some N), and genNPrimes(), which generates x amount of primes for you.
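For cross-checking small values, a plain trial-division version of part of that interface is enough. This is explicitly not the pastebin's method, just a reference implementation of the same function names:
~~~
# Not the pastebin's method -- a plain trial-division reference implementation
# of the same interface, handy for sanity-checking small results.
def isPrime(n: int) -> bool:
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def nextPrime(n: int) -> int:
    n += 1
    while not isPrime(n):
        n += 1
    return n

def lastPrime(n: int) -> int:
    n -= 1
    while n >= 2 and not isPrime(n):
        n -= 1
    return n

def genPrimes(limit: int):
    # primes up to but not including limit
    return [p for p in range(2, limit) if isPrime(p)]
~~~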
Speed limit definitely seems to top out at 1 second per bit for a prime once the code is in the billions, but I don't know if thats the ceiling, again, because decimal needs implemented.
I think the core method, in calcY (terrible name, I know) could probably be optimized in some clever way if its given an adjacent prime, and what parameters were used. Theres probably some pattern I'm not seeing, but eh.
I'm also wondering if I can't use those fancy aberrations, 'carmichael numbers' or whatever the hell they are, to calculate some sort of offset, and by doing so, figure out a given primes index.
And all my brain says is "sleep"
But family wants me to hang out, and I have to go talk a manager at home depot into an interview, because wanting to program for a living, and actually getting someone to give you the time of day are two different things.1 -
Looking for someone to test a new factorization script I wrote.
https://pastebin.com/Td2XTKe6
Tested against a set of products from all primes under 1000. Worked even on numbers up to 87954921289
Worked for about 66% of the products tested.
Obviously I'm cheating a little bit because I'm checking for four conditions n%a == 0, n%a == 1, n%b == 0, and n%b == 1
It appears it is possible to generate the series from just the product, and then factor each result. The last factor in each set of factors becomes x, and we do p%x and check for zero.
if it works, we've found our answer.
Kind of wonky but basically what its doing is taking p, tacking on a 0 to the right, and then tacking p to the right of *that*.
So if you had a product like
314
The starting number we look at is
3140314
The middle digit becomes i, and the unit digit becomes j.
Don't know why it works more often than not, and don't know if it would really be any faster.
Just think it's cool.9 -
Now I'm a bit impressed by the auto-complete in VS2022.
The full text in grey is the auto-complete proposition.
Back story :
I have a table where datetime is stored as nvarchar(max).
I'm trying to convert that shit into a proper datetime2 column
But there are dates in ISO format, some in MM/dd/yyyy, some in dd/MM/yyyy, and some with hour/minute parts.
So I'm making a little script to clean all that up.
Ofc, not a perfect result; e.g. 01/02/2022 will be considered as dd/MM/yyyy (98% of values are). But still cleaner than before1 -
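In Python terms, the cleanup logic described above boils down to trying the known formats in a fixed order, which is exactly where the 01/02/2022 ambiguity comes from. A sketch, with the format list as an assumption:
~~~
# Sketch of the mixed-format cleanup: try known formats in a fixed order.
# The ordering is what decides ambiguous values like 01/02/2022.
from datetime import datetime

FORMATS = [
    "%Y-%m-%dT%H:%M:%S",  # ISO with time
    "%Y-%m-%d",           # ISO date only
    "%d/%m/%Y %H:%M",     # dd/MM/yyyy with time
    "%d/%m/%Y",           # dd/MM/yyyy (tried before MM/dd, per the 98% guess)
    "%m/%d/%Y",           # MM/dd/yyyy
]

def parse_messy_date(value: str):
    value = value.strip()
    for fmt in FORMATS:
        try:
            return datetime.strptime(value, fmt)
        except ValueError:
            continue
    return None  # leave unparseable rows for manual review

# usage: parse_messy_date("01/02/2022") -> Feb 1st, because dd/MM wins by ordering
~~~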
I’m so sorry if this isn’t the place for questions. I’m terrified of stack overflow and have been searching for a week for a solution and can’t find one. This is for React.js people.
I was tasked to create a webpage with react. The limitation is, they did not wanna adopt the node.js dependency. I said ok, I’ll figure it out. You can inject react, material UI, and babel with script tags in HTML, then put ur lil components in it. I did that and it works beautifully.
However, now I have to write tests for this. I think it’s actually impossible without a way to render React, so I have to use the browser, or node, right? I convinced my boss to allow me to use a node.js container just for testing, which I thought would make my life easier.
I don’t know how to render this thing with node. It’s just an HTML file that pulls react via script tags, and idk how to serve html with node. Additionally, none of the React testing libraries seem to support testing a system that wasn’t designed to be served with node, at least not easily. My gut tells me that the complication with how things are imported contributes at least a little to this (dependencies pulled via script tags in the HTML file and made available to react through global const variables).
I could be wrong about any of this — im fairly new. But how tf do I go about testing these react components? For reference, if you go to Reacts docs, there’s a section called “add react to a page in one minute” that’s pretty much what I did.20 -
Stupid question
I wrote a script to scan a site for this tv I want to buy. At first, I just sent a get request to the site every 30min. After a while, I realized that the TV keeps selling out because it gets restocked and then sold out within 5 minutes.
Can I send a get request to this site every 5 minutes, without causing harm to the site? I really don't want to mess this guy's site up lol8 -
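For the stock checker above: a single GET every few minutes is negligible load for any normal site; being polite mostly means a timeout, an honest user agent, and backing off on errors. A sketch, with the URL and the "sold out" check as placeholders:
~~~
# Polite polling sketch for the TV stock check. URL, user agent contact and
# the "sold out" test are placeholders.
import time
import requests

URL = "https://example-shop.tld/tv-product-page"   # hypothetical
HEADERS = {"User-Agent": "stock-checker (personal script, contact: me@example.com)"}

def in_stock(html: str) -> bool:
    return "sold out" not in html.lower()          # placeholder check

while True:
    try:
        resp = requests.get(URL, headers=HEADERS, timeout=30)
        resp.raise_for_status()
        if in_stock(resp.text):
            print("Looks in stock -- go buy it!")
    except requests.RequestException as exc:
        print("request failed, backing off:", exc)
        time.sleep(600)                            # back off harder on errors
    time.sleep(300)                                # 5 minutes between checks
~~~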
Do I have just a bad version of ecma script or is this some stupid shit in JS in general? I want each sub array to be a separate entity, not that same one for all. I assume the fill is just putting the same list in all of them? I honestly don't care I guess, replacing a sublist is fine too, rather than editing each element separately. Saves ram in the long run.
let arr = Array(5).fill(Array(1,2))
console.log(arr)
arr[0][0] = 3
console.log(arr)
[[1,2],[1,2],[1,2],[1,2],[1,2]]
[[3,2],[3,2],[3,2],[3,2],[3,2]]
Congratulations, you are my dev duck today.19 -
Fun Fact: It's physically impossible to run a .bat script on a remote Windows machine as admin from a Visual Studio post-build event.6
-
Heres a fairly useless but interesting tidbit:
if i = n, where n = floor(a/2) and a is the lowest non-trivial factor of a two-factor product p,
then
r = (abs(((((p)-(9**i)-9)+1))-((((9**i)-(p)-9)-2)))-p+1+1)
and r%a will (almost*) always return 0.
Thats not really the interesting bit though. The interesting bit is the result of r will always be some product with a *larger* factor tree that includes the factor A of p, but not p's other larger factor, B.
So, useless from what I can see. But its an interesting function on its own, simply because of what it does.
I wrote a script to test it. For all two-factor products of the first 1000 primes, (with no repeating combinations, so if we calculated say, 23*31, we skip 31*23), only 3262 products failed this little formula, out of half a million.
All others reliably returned 0 for the following..
~~~
i = floor(a/2)
r = (abs(((((p)-(9**i)-9)+1))-((((9**i)-(p)-9)-2)))-p+1+1)
r%a
~~~
The distribution of failures was *very* early on in the set of factors, and once fixed at the value of 3262, stopped increasing for the rest of the run.
I didn't calculate if some primes were more likely to cause a product to fail or not. Nor the factor trees, nor if the factor trees had any factors in common between products, or anything of that nature.
All in all I count this as a worthwhile experiment.
If you want to run the code yourself, I posted it to pastebin here:
https://pastebin.com/Q4LFKBjB
edit:
Tried wolfram alpha just to see what it says, but apparently not much. Wish it could tell me more.41 -
Spent a couple of weeks on writing a cronjob which updates a certain value in the application config, and spent the last few months testing it in different environments to make sure it does not fail in production. Ran the deployment script, and the damn cronjob fails because of the ssl certificate on production. fuck me
-
Browser automation is a PITA. I’m going on my fourth side mission with this crap and I honestly still look like a newbie. I’ve tried Java Selenium with Chrome, Excel VBA with IE9, Vanilla JS in the browser console, and tonight I’m thinking to concoct some kind of hybrid CDP & Selenium approach in Chrome. Never used CDP before, not even sure where to start but I heard it sucks like anything else unless you get some extra libraries and plugins and stuff.
It doesn’t help that I can’t get just anything I want from our IT Department. It would be another PITA to ask for puppeteer. If puppeteer is totally legit please let me know.
Selenium sucks. The buttons don’t click, the waits don’t wait. Its unusable. Iframes are annoying as all hell but I can deal with that. HTML Tables suck too. It doesn’t help I have to restart my whole java program and whole Chrome every time an element doesn’t get picked correctly. Scripting one single element can take all fucking night.
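The rant is about Java Selenium, but for reference, the explicit-wait pattern that tends to actually behave looks like this in the Python flavour; the page URL and locator are placeholders, and it assumes a Selenium version recent enough to fetch its own chromedriver:
~~~
# Explicit wait sketch: wait for clickability, not just presence.
# URL and locator are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()                  # assumes a usable chromedriver
driver.get("https://example.org/some-form")  # hypothetical page

wait = WebDriverWait(driver, 20)
button = wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, "#submit-btn")))
button.click()

# If a regular .click() still misses (overlays, animations), a JS click is the
# usual blunt fallback:
driver.execute_script("arguments[0].click();", button)

driver.quit()
~~~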
Chrome dev tools what the fuck. Why the fuck is the DOM explorer in the same window as the web page I’m working on?? I can’t undock it. Am I supposed to use a fucking TV screen to work with this bastard?? If I use the remote chrome tools on port 9225 or whatever - It Still Renders The Whole Fucking Page Alongside The Console. Get Out Of My Way!!! The nested HTML CODE IS ONE CHARACTER WIDE ALL THE TIME. I can’t for the life of me figure out what the fuck I’m looking at. Haven’t you people ever heard of A HORIZONTAL SCROLL BAR at least.
Fuck I tried using getElementById, and the Xpath thing and its not all that great seeing I have seemingly 1000s of nested Divs all over the god damned place oftentimes containing a single element. I’m finally on chrome now should I learn Jquery now? I mean seriously wtf.
I use this one no code tool for dev it has web automation built in. As you can imagine its just as broken as anything else!! I have 10 screens to navigate it gets stuck on the second screen all the damn time. Fuck I love clicking the buttons when my script misses and playing catch up with it.
So as a work around to Selenium not waiting even 1 millisecond when I use explicit wait or implicit wait or fluent wait, I’m guessing maybe I can attach both Chrome Dev Tools Protocol (CDP as ive called it earlier) and selenium to the same browser and maybe I can use CDP to perform a Wait with any degree of success. Selenium will do nothing more than execute vanilla javascript Element.click(); This is the only way I know to even ACTUALLY use selenium beyond the simplest html documents possible. Hell I guess CDP can execute js idk.
I can’t get the new selenium that has CDP but I do have some buggy ass selenium from a few years back. Yeah, I remember reading there was a pretty impactful regression defect in the version I have. Maybe I’m being gaslighted by some shit copy of selenium?
The worst part is that I do seem to be having issues that the rest of the internet's devs do not seem to be having. People act like browser automation is totally viable and pretty OK. How in the fuck hell is my Selenium Test Suite going to be more reliable than my application under test?!!?? I'll have more fucking bugs in my test suite than in my application. Today, I have less than half a test script and, I. already. fucking. do.
I am still SUPER PISSED at the months of 12 hour days (always 8 hours spent on normal sprint work btw only 4 to automation) I spent trying to automate our regression tests. I got NOWHERE.
I did learn a lot about HTML and JS though like I’m not that mad…but I’m just trying to emphasize my achievement on my task was zero.
The buttons don’t click. There are so many divs and I swear you sometimes need to select a div somewhere in the middle sometimes to get it working. The waits don’t wait. XHR requests are invisible. Java crashes 100 times before I find an xpath and thread.sleep() combo that works. I have no failure modes to use — Sometimes I click the same element 20x in a script because I have no way to know if it clicked the first time! Sometimes you gotta scroll the page to make the click work. So many click methods all broken. So many wait methods all broken. Its not just the elements don’t click! There are so many ways to click that almost work but surely they all fail the same in the end. ok at this point I’m just repeating myself…
there yet even more issues that I can’t remember…and will soon remember as I journey into this project yet again…
thanks for reading I hope I entertained and would love to hear your experience!7 -
So I made a couple slight modifications to the formula in the previous post and got some pretty cool results.
The original post is here:
https://devrant.com/rants/5632235/...
The default transformation from p, to the new product (call it p2) leads to *very* large products (even for products of the first 100 primes).
Take for example
a = 6229, b = 10477, p = a*b = 65261233
While the new product the formula generates, has a factor tree that contains our factor (a), the product is huge.
How huge?
6489397687944607231601420206388875594346703505936926682969449167115933666916914363806993605...
So huge that I put the whole number in a pastebin here:
https://pastebin.com/1bC5kqGH
Now, that number DOES contain our example factor 6229. I demonstrated that in the prior post.
But first, it's huge, 2972 digits long, and second, many of its factors are huge too.
Right from the get go I had a hunch, and did (p2 mod p) and the result was surprisingly small, much closer to the original product. Then just to see what happens I subtracted this result from the original product.
The modification looks like this:
(p-(((abs(((((p)-(9**i)-9)+1))-((((9**i)-(p)-9)-2)))-p+1)-p)%p))
The result is '49856916'
Thats within the ballpark of our original product.
And then I factored it.
1, 2, 3, 4, 6, 12, 23, 29, 46, 58, 69, 87, 92, 116, 138, 174, 276, 348, 667, 1334, 2001, 2668, 4002, 6229, 8004, 12458, 18687, 24916, 37374, 74748, 143267, 180641, 286534, 361282, 429801, 541923, 573068, 722564, 859602, 1083846, 1719204, 2167692, 4154743, 8309486, 12464229, 16618972, 24928458, 49856916
Well damn. It's not a-smooth or b-smooth (where 'smoothness' is defined as 'all factors are beneath some number n')
but this is far more approachable than just factoring the original product.
It still requires a value of i equal to
i = floor(a/2)
But the results are actually factorable now if this works for other products.
I rewrote the script and tested on a couple million products and added decimal support, and I'm happy to report it works.
Script is posted here if you want to test it yourself:
https://pastebin.com/RNu1iiQ8
What I'll do next is probably add some basic factorization of trivial primes
(say the first 100), and then figure out the average number of factors in each derived product.
I'm also still working on getting to values of i < a/2, but only having sporadic success.
It also means *very* large numbers (either a subset of them or universally) with *lots* of factors may be reducible to unique products with just two non-trivial factors, but thats a big question mark for now.
@scor if you want to take a look.5 -
A year ago I built my first todo, not from a tutorial, but using basic libraries and nw.js, and doing basic dom manipulations.
It had drag n drop, icons, and basic saving and loading. And I was satisfied.
Since then I've been working odd jobs.
And today I've decided to stretch out a bit, and build a basic airtable clone, because I think I can.
And also because I hate anything without an offline option.
First thing I realized was I wasn't about to duplicate all the features of a spreadsheet from scratch. I'd need a base to work from.
I spent about an hour looking.
Core features needed would be trivial serialization or saving/loading.
Proper event support for when a cell, row, or column changed, or was selected. Necessary for triggering validation and serialization/saving.
Custom column types.
Embedding html in cells.
Reorderable columns
Optional but nice to have:
Changeable column width and row height.
Drag and drop on rows and columns.
Right click menu support out of the box.
After that hour I had a few I wanted to test.
And started looking at frameworks to support the SPA aspects.
Both mithril and riot have minimal router support. But theres also a ton of other lightweight frameworks and libraries worthy of prototyping in: solid, marko, svelte, etc.
I didn't want to futz with lots of overhead, babeling/gulping/grunting/webpacking or any complex configuration-over-convention.
Didn't care for dom vs shadow dom. Its a prototype not a startup.
And I didn't care to do it the "right way". The learning curve here was the antithesis of experimenting. I was trying to get away from plugins, configuration-over-convention, astronaut architecture, monolithic frameworks, the works.
Could I import the library without five dozen dependencies and learning four different tools before getting to hello world?
"But if you know IJK then its quick to get started!", except I don't, so it won't. I didn't want that.
Could I get cheap component-oriented designs?
Was I managing complex state embedded in a monolith that took over the entire layout and conventions of my code, like the world balanced on the back of a turtle?
Did it obscure the dom and state, and the standard way of doing things or *compliment* those?
As for validation, theres a number of vanilla libraries, one of which treats validation similar to unit testing, which seems kinda novel.
For presentation and backend I could do NW.JS, which would remove some of the complications, by putting everything in one script. Or if I wanted to make it a web backend, and avoid writing it in something that ran like a potato strapped to a nuclear rocket (visual studio), I could skip TS and go with python and quart, an async variation of flask.
This has the advantage of using something thats *not* JS, namely python, for interacting with a proper database, and would allow self-hosting or putting it online so people can share data and access in real time with others.
And because I'm horrible, and do things the wrong way for convenience, I could use tailwind.
Because it pisses people off.
How easy (or hard) would it be to recreate a basic functional clone of the core of airtable?
I don't know, but I have feeling I'm going to find out!1 -
Okay okay.
Zero Touch Provisioning and cisco devices is a joke.
You buy serveral devices for thousands of dollars and want to provision on startup.
And this shitty thing just tells me that it is not possible to start ZTP on its management port. Oh my god.
And you can't even provide a plain config file. No, it must be a python script that will be executed on the router.
This is hilarious2 -
I recently refactored a form with complex client side interactivity for one of my clients, replacing jquery with vuejs in the process, and I'm absolutely baffled by how much easier it is to reason about everything when you think of the UI as a function of the state. Only devs who have done both imperative and declarative DOM manipulation at some point in their life can understand the joy of doing this. And all of this can be done with just a simple script tag without having to bring in the complex build process that has plagued the Javascript ecosystem.
-
I need to check 100k tables in the database so I wrote a script... It runs fine for 5 minutes; then suddenly I get an error out of nowhere: getaddrinfo ENOTFOUND influxdb-dev.cloud.etc.com. What the helly hell? When I restart my script, the host name is found fine again; after 5 more minutes it crashes again with the same error. I cannot think of a good reason why this happens. Why would the host disappear? RIP6
-
Linux users needing to run literally ANY script again:
⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️...4 -
At my university we have different projects, but I don't have any personal projects on my GitHub. Not a single script, not even a project from scratch; I only report bugs but haven't made any contributions. I am subscribed to different mailing lists (debian, php, perl) for help.2
-
I truly believe that a back end needs to know about front end as much as a front end needs to know about back end.
And a more serious belief is that project managers / owners need to know about both. Thank you.3 -
Angular cli was installed globally with some "more up-to-date" version and locally for a project with a slightly older version. On a local machine. No problem.
The same thing on a VM: nope, module not found error. node trying to run a node_modules install script from within windows directory, in which nothing node-related exists ... ?? -
TIL RVM and I are on different assumptions. I'm talking about RVM allowing unbound variables in its scripts.
I don't because I literally have run "rm -rf /" on my Mac because of an unbound variable in the past. So, when I write a shell script, the second line is always "set -eu."
And because RVM allows unbound variables, this line crashes RVM.
Then for some stupidity on my part, I looked into GitHub for its codebase first to get even more clueless about the issue before finally googling to see if anyone had experienced the same problem 🤦1 -
Moengage is one of the worst pieces of analytics software I have ever worked with...
Integrating it into a react website is a pain in the ass: they don't have an npm package, you need to add a script tag to the html file.
It also has a weird bug where the service worker they mention in the documentation doesn't work when the debug logs are off.
Aaaargh. Now I have to make a service worker handler to import this service worker and see if it works... -
I loathe manual regression testing. So much so, that today I made a quick bash script to move my mouse every minute to make it look like I'm online and doing stuff while I watch Twitch and YouTube videos.
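The original was a bash one-liner; a rough Python equivalent, assuming pyautogui is installed, is about this much:
~~~
# Rough Python equivalent of the mouse-jiggler (assumes pyautogui is installed).
# Nudges the cursor one pixel and back every minute so the status stays green.
import time
import pyautogui

pyautogui.FAILSAFE = False  # don't abort if the pointer ends up in a corner

while True:
    pyautogui.moveRel(1, 0)   # nudge right...
    pyautogui.moveRel(-1, 0)  # ...and back, so nothing visibly moves
    time.sleep(60)
~~~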
The worst part is that we have Cypress to automate this and my company puts more value in pushing out features instead of automating all this unnecessary manual testing. Soooo I'm just not gonna participate because there's no way for them to know that I'm *not* testing.1 -
Hi everyone
I have a python script that continuously collects data for me. I want to be able to display that data on a node js server. How should I go about this? I was thinking of maybe having the python script send get requests to the server but I feel that is not the right answer. Let me know if u guys need more info, thanks!5
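One common pattern is to flip the direction: have the Python collector push each batch to an HTTP endpoint your Node server exposes, and let Node just store and render the latest data. A sketch of the Python side, with the endpoint name made up:
~~~
# Push each batch from the Python collector to a (hypothetical) Node endpoint.
import time
import requests

ENDPOINT = "http://localhost:3000/api/data"   # made-up Express route

def collect() -> dict:
    # stand-in for whatever the script actually measures
    return {"timestamp": time.time(), "value": 42}

while True:
    try:
        requests.post(ENDPOINT, json=collect(), timeout=10)
    except requests.RequestException as exc:
        print("push failed, will retry next cycle:", exc)
    time.sleep(30)
~~~
Alternatives with the same shape: write to a shared database or file that the Node server reads, or use websockets if the page needs live updates without polling.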