Search - "registers"
-
I guess I can do one of these a day or so. I've collected some novelties over the years.
First up is a Curta mechanical calculator. Before electronic calculators became a thing, these were the best portable calculators in the world. Notably, they were the calculator of choice in rally car sports.
They work by a series of helical gears that act as registers. A series of internal gears and value assignment switches apply an adjustable number of incrementations to those gears, multiplying gears and the tracking gears, once per "grind." The result is output as a number on top of the device. The "clear register" function is lifting the top ring, which releases the reverse lockout on the gears and a clockwise turn on the ring then resets them to their zero state.
They were designed by Curt Herzstark, partly before WWII and partly while he was imprisoned in a Nazi concentration camp. He had filed a patent for it in 1938, shortly before his family's manufactory was turned into a weapons factory. During his imprisonment, in addition to nearly starving to death, he completed his plans for manufacturing the calculator.
It had fun names like the "pepper grinder" and the "math grenade."
-
For those who were wondering, since my last post had some people who seemed interested, here's my progress on my 8-bit computer. I've got the clock on the left-hand side, and the two 8-bit registers on the right-hand side. (The power strips in the middle are going to be used as a bus)
-
-Registers on a site to use the product
-Opens email
"Our weekly news digest"
"Our daily news digest"
"You haven't used our products for 15 minutes. We miss you"
"Would you recommend our products to a friend"
"If you like this, you'll LOVE this"
"Here's a promo code for something you don't need"
"You've unsubscribed. Was it really you?"
"You've unsubscribed. Was it really you?"
"You've unsubscribed. Was it really you?"
"You've unsubscribed. Was it really you?"
FUCK OFF YOU ANNOYING CUNTS
-
Me: Well, it's time to make a new app!
* opens up VS Code *
* opens folder selection dialog *
* creates a new folder called "notes app" *
* yarn inits that folder *
* installs react and react-dom *
* installs webpack, webpack-cli, babel-core, babel-loader, babel-preset-env, babel-preset-react, style-loader, css-loader, file-loader, html-webpack-plugin and clean-webpack-plugin as a dev dependency (install is pending) *
* copies a webpack config from some other project *
* creates a babelrc file *
* copies a yarn script called "build:dev" which would launch webpack *
* dev dependencies installed *
* tries to save *
* vscode doesn't save because files differ *
* tries to copy dev dependencies *
* fail *
* tries again *
* saves *
* writes bare-bones index.jsx *
* yarn build:dev *
* opens build/index.html in firefox *
* gets satisfaction *
* writes bare-bones App.jsx which is a react component but it's an entire app *
* yarn build:dev *
* opens build/index.html in firefox *
* gets satisfaction *
-- trim --
* walks out of his room to his mom's room where his SBC is located *
* grandma plays solitaire on laptop *
* i ask grandma for a laptop *
* grandma gives me laptop *
* glues all components into App.jsx *
* yarn start:dev (magic of webpack-dev-server) *
* opens localhost:8080 in firefox *
* searches how to update a component prop *
* nothing found *
* registers on devrant and verifies his email *
* writes this rant *
-
@JoshBent suggested that I'd make a blog about security.
Nice idea, fair enough!
*registers domain at provider with discounts at the moment*
*tries to find whois protection option*
"You can add WHOIS protection to your account as an upgrade"
*requests authorization token*
*logs into usual domain name provider account*
*transfers domain name*
*anonymizes WHOIS details within two seconds*
I could've stayed and asked them about the cost etc, but the fact that they even HAVE a price for protecting WHOIS data is a no-go for me.
Fuck domain name resellers which ask money for protecting one's WHOIS information (where possible).
-
*registers for an account on RaidForums*
> Sorry mate, we only accept the following email carriers: gmail.*, googlemail.*, hotmail.*, hotmail.*.*, yahoo.*, yahoo.*.*, ymail.*, live.*.*, live.*, outlook.*, outlook.*.*, protonmail.*, riseup.net, aol.com, gmx.de, raid.lol, msn.com, cox.net, mail.ru, att.net, bellsouth.net, laposte.net, rambler.ru, sky.com, mail.com, pm.me, shaw.ca, charter.ca, facebook.com, terra.com.br, libero.it, web.de, free.fr, orange.fr, wanadoo.fr, rediffmail.com, comcast.net, yandex.ru, uol.com.br, bol.com.br, sfr.fr, verizon.net
Now what if some dickhead somewhere wants to use his own domain to be able to reroute any spam from some forum dickheads to /dev/null, hmm? YOU FUCKING WANKERS, LET ME FUCKING USE MY OWN DOMAIN ALREADY YOU TWATS!!!
-
Not laughing.
Not cursing.
Both for interviewing and being interviewed.
Some interviews could have been taken straight from a Mexican telenovela.......
"Yeah, I worked for a year in the Walmart IT administration."
"Ok, what did you do?"
"Oh I had the high responsibility of taking care of swapping printer cartridges, programming the registers, stuff like that..."
"You apply for a senior database management role, you're aware of that?"
"Yeah. I took a bootcamp for 3 months in the evening after work. I'm up for the job and expect a payment of <lol, even having a stroke while writing a payment check that number will never happen>".
I made that up - but we had these cases... The story is just rewritten and mixed up for obvious reasons.
When I'm being interviewed, the same thing can happen by the way, too.
IMHO an interview is made not only for the company, but for me as an employee, too. I don't sugar-coat it. I want to know what type of shit I'm getting into and how much I'm drowning in it.
Some "types" of interviewers react kinda funny when I start roasting them with questions...
For example, the authoritarian type usually reacts with disrespect. How dare u piss on my front lawn.... kind of reaction. Which makes it hard not to laugh, because who wants to work for someone who throws a temper tantrum during an interview? Even harder when the same guy promised you heaven before (the flowery kind of bullshit, like everything's peaceful and fine and the team's great and they have such a great leadership...)
Even worse is the patsy.
When you're sitting in an interview and the only answers you get are:
- Sorry, I don't know.
- I'm not allowed to ....
- Not in my area of expertise....
All just nice ways of saying: I will say nothing cause then I'd need to take some responsibility.
:)
The most Mexican telenovela stuff though, when being interviewed, is when I manage to divide a team of interviewers and it starts to become a "Judge Judy" or similar freaked-out justice show...
A: "No, our team doesn't work that way".
B: "But you will in the short future, WE committed to it".
C: "Not that I'm aware of".
And me, an obvious sinner and person who enjoys entertainment and schadenfreude, just keeps adding kerosene to the fire.
"So, it seems like the team of A has its own rules which do not apply to B and C, do they also have greater funding?".
Oh, it's just so much fun to spur on a good blood bath.
-
Oh oh!!! Today is the 8-bit register overflow day.... Happy 8-bit register overflow day to all 16-bit registers..
-
Just thought I'd share my current project: Taking an old ISA sound card I got off eBay and wiring it up to an Arduino to control its OPL3 synth from a MIDI keyboard. I have it mostly working now.
No intention to play audio samples, so I've not bothered with any of the DMA stuff - just MIDI (MPU-401 UART) and OPL3.
It has involved learning the pinout of the ISA bus connectors, figuring out which ones are actually used for this card, ignoring the standards a little (hello, amplifier chip that is wired up to the +12V line but which still happily works at +5V...)
Most of the wires going to it are for each bit of the 16-bit address and 8-bit data. Using a couple of shift registers for the address, and a universal shift register for the data. Wrote some fairly primitive ISA bus read/write code, but it was really slow. Eventually found out about SPI and re-wrote the code to use that and it became very fast. Had trouble with some timings, fixed those.
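(If it helps anyone picture it, the address path boils down to something roughly like this. This is a simplified sketch, not the actual project code; the pin number and the spi_transfer()/gpio_write() helpers are made-up stand-ins for the real SPI/GPIO calls.)

    #include <stdint.h>

    /* hypothetical helpers - stand-ins for the real SPI/GPIO calls */
    extern void spi_transfer(uint8_t byte);   /* clock one byte out over SPI */
    extern void gpio_write(int pin, int level);

    #define PIN_ADDR_LATCH 7   /* latch pin of the two daisy-chained address shift registers */

    /* Push a 16-bit ISA address into the external shift registers.
       Two 8-bit shift registers are daisy-chained, so shift the high byte
       first, then the low byte, then pulse the latch so both outputs
       update at once. */
    static void isa_set_address(uint16_t addr)
    {
        gpio_write(PIN_ADDR_LATCH, 0);
        spi_transfer((uint8_t)(addr >> 8));   /* high byte ends up in the far register */
        spi_transfer((uint8_t)(addr & 0xFF)); /* low byte stays in the near register   */
        gpio_write(PIN_ADDR_LATCH, 1);        /* rising edge latches both bytes        */
    }

Bit-banging each address bit one GPIO write at a time is what made the first version so slow; letting the SPI peripheral clock the bits out is the whole speedup.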
The card is an ISA Plug and Play card, meaning before I could use it I had to tell it what resources to use. Linux driver code and some reverse-engineering of the official Windows/DOS drivers got me past this stage.
Wired up IRQ 5 to an Arduino interrupt to deal with incoming MIDI data, with a routine that buffers it. Ran into trouble with the interrupt happening during I/O and needing to do some I/O inside the handler and had to set a flag to decide whether to disable/re-enable interrupts during I/O.
It looks like total chaos, but the various wires going across the breadboard are mainly to make it easier to deal with the 16-bit address and 8-bit data lines. The LEDs were initially used to check what addresses/data were being sent, but now only one of them is connected and indicates when the interrupt handler is executing.
There's still a lot to do after that though - MIDI and OPL3 are two completely different things so I had to write some code to manage the different "channels" of the OPL3 chip. I have it playing multiple notes at the same time but need to make it able to control the various settings over MIDI. Eventually I might add some physical controls to it and get a PCB made.
The fun part is, I only vaguely know what I'm doing with the electronics side of this. I didn't know what a "shift register" was before this project, nor anything about the workings of the ISA bus. I knew a bit about MIDI (both the protocol and generally how the MPU-401 UART works) along with the operation of a sound card from a driver/software perspective, but everything else is pretty new to me.
As a useful little extra, I made some "fake" components that I can build the software against on a PC, to run some tests before uploading it to the Arduino (mostly just prints out the addresses it is going to try and write to).
-
First dev job: port Unix on Transputer, a (now defunct) bizarre processor with no stack, no registers and no compiler. That was fun! And that was in 1991 😎
-
Most successful project... What is success?
My first computer at 8 years old was a Commodore 64. There was no internet yet, so I used the manual to learn about BASIC and assembly, sound and sprite registers, and created a pretty elaborate RPG. Mostly text, some sprite art, soldered some EEPROM cartridges, optimized the code. Spent almost a year on it. An enthusiast magazine picked up on it, revised, QA'ed & published the game, sold a little over 10k copies. I got ƒ0.25 per sale, and I was completely overwhelmed by how much candy one could buy for ƒ2500 ($2k corrected for inflation).
More recent:
I was employee #3 at my current company, started when it was worth nothing and the website redirected to a set of Google Forms containing all the logic. I wrote a large part of the first, monolithic backend.
Now there's teams in a dozen countries, and an estimated revenue of a quarter billion.
So obviously my current "project" is more successful.
Still, my current job sucks, the company turned into a desolate passion-free wasteland full of soulless fake hipster zombies and managers who seem to derive sexual pleasure from holding extremely ineffective meetings, endlessly rubbing their calendars together in their bureaucratic orgy of ineptitude.
So, I'm more proud of my C64 game.
-
SOO.
I work at a grocery store, right. Cashier and all, ya know, livin' the dream. And whoever manages our product database needs to get their stuff together. We managed to confirm the DB isn't the same across the registers. So now I have a bunch of stupid pictures of barcodes on my phone so I can file error reports for each and every single item that doesn't ring up. I know, not really dev related. But a dev somewhere is slacking.
-
Made this project "Come Fix Me" in a 24hr hackathon. Won the most innovative solution.
An android application for citizens(users) which allows them to register issues on potholes in their area.
Web for report management
Usage Flow:
User clicks a photo of the pothole and registers a new issue.
The photo gets uploaded on the firebase database along with other information like GPS co-ordinates.
The image is downloaded in the server and served in the pothole detection script.
If a pothole is detected, an estimated area is calculated; if no pothole is detected, the user's issue gets rejected.
After successful detection, details are uploaded on the web for the administrator, and these issues are forwarded to govt. officials.
Once the officials claim that they have fixed the pothole, the user gets a notification and they can close their issue if the pothole is fixed.
Demonstration:
https://youtu.be/cN9kijExwyI
Github Link:
https://github.com/globefire/...
-
We're currently designing the ALU of a CPU at the university. After that we will design the registers and combine them all together.
It is like the awakening of a child inside of me. I have tried to understand how computers work down to the very details.
That was too complex for me, but I believed that I would understand it if I started from somewhere in that field. That led me to learn HTML and CSS when I was 6 to 8 (idk, it has been too many years).
I'm really indescribably fascinated, motivated and happy.
-
Interviewer = I, Me = M
I: What is your project all about?
M: It is about reading data from memory of a program and transfer it to output register via a dedicated bus attached inside CPU and then projecting data of registers onto LCD crystals of display.
I: Can you show the working of your project?
M: Runs "hello world" program
Me - 1, Interviewer - Slap on my cheeks with shoe in one hand.
-
I thought to myself, "I need some company in my life"
1. Registers on an app to meet some people
2. Fills in profile
3. Falls asleep from boredom.
*ting*
Notification from app: "we see you are in the engineering field, want to have a look at the opportunities we have at our company?"
.. I look for company and get job offers. Nice
-
trying to do anything on the PS2 is almost fucking impossible
i imagine a board meeting where they were designing the hardware
"how can we make this insanely hard to use?"
"let's make decentralized partition definitions, allow fragmenting of entire partitions, and require all partitions to be rounded to 4MB. If you delete a partition, don't wipe the partition out, just rename it to "_empty" and the system will do it for you, except it actually won't because fuck you"
"let's require 1-bit serial registers to be used for memory card access and make sure you can't take more than 8 CPU cycles to push each bit or it'll trash the memory card"
"let's make the network module run on a 3-bit serial register and when initialized it halves the available memory but only after 8 seconds of activity"
"let's require the system to load feature modules called "IOPs" and require the software to declare which of the 256 possible slots it wants to use (max of 8 IOPs) then insert stubs into those. Any other IOP you call will hang the system and probably corrupt the HDD. You also have to overwrite the stubbed IOPs with your own but only if you can have the stubs chainload the other IOPs on top of themselves"
"let's require you to write to the controller registers to update them, but you have to write the other controller's last-polled state or the controller IOP will hang"
of course this couldn't make sense, it's
s s s s
o o o o
n n n n
y y y y
-
Twenty years.
For twenty years I've used vim almost exclusively, and only now have I learned about buffers and registers. It feels like wasted years, but also it feels like a gift.
-
Haven't used it since and hopefully never will again, but understanding recursion and keyboard input in Assembly (uni project)
After a long (4 days) sleepover with my friends, with 14 hours a day of slamming our heads against abstract registers, we could finally program the factorial and take floating numbers as input and output them on the screen. It was nothing but pain, but the moment we got it, the sky had opened before us :D
Never again
-
FREE .design domains! 😁
Porkbun is giving away one free .design domain to each customer, after verifying it's legit I had to share - I got a 3 letter domain!
Link: https://goo.gl/Nwx8fW
I always have need for an extra domain or two and while .design is a bit long/specific it is quite new, so there are lots of short names available (I registered jhb.design - jhb is my city). If you have a sense of humour there are plenty of wordplay options too (buttugly.design, thatsanice.design).
On top of that they are offering $20 per referral for anyone who registers a .design name for free. This is so good it sounds like a scam, so to test it I made the above link an affiliate link in the hope of free beer.
Let me know if you managed to register anything good!
-
Ye, so after studying for an eternity and doing some odd jobs here and there, all I can show for it are the following traits:
* Super knowledgeable in arm/Intel assembly language
* C-Veteran with knowledge of some sick and nasty C-hacks/tricks which would even sour the mood of your grandma
* Acquired disdain of any and all scripting languages (how dare you write something in one line for which I need a whole library!)
* All-in-all low-level programmer type of guy (gimme those juicy registers to write into!)
After completing the mandatory part of my computer science studies, all I did was immerse myself into low-level stuff. Even started to hold lectures and all.
Now I'm at the cusp of being let free into the open market.
The thing is: I'm pretty sure that no company is really interested in my knowledge, as no one really writes assembly anymore.
Sure, embedded programming is still a thing, but even that is becoming increasingly more abstract, with God knows how many layers of software between the hardware and the dev, just to hide all the scary bits underneath.
So, are there people in here who're actually exposed to assembly or any hands-on hardware-programming?
Like, on a "which bit in which register/addr do I need to set" - kind of way.
And if so, what would you say someone like me should lookout for in a company to match my interest to theirs?
Or is it just a pipe dream, so I'd need to brace myself to a mundane software engineer career where I have to process a ticket at a time?
(Just to give a reference: even the most hardware-inclined companies I found "near" me are developing UIs with HTML5 to be used in some such environment ....)
-
WTF IS WRONG WITH ASSEMBLY LANGUAGE?!
I was just modifying an existing program for adding a sequence of numbers from the data section and through console input. I studied the code and started modifying it one step at a time. I needed to modify it into a multiplication program. So I started by changing the ADD instructions, replaced the result and buffer registers with bigger ones and thought I had completed it. WELL GUESS WHAT? SHIT JUST GIVES ME A SEGMENTATION FAULT! NOW I HAVE TO REDO THE WHOLE THING! WHY DOESN'T IT TELL ME WHICH LINE OF THE CODE I FUCKED UP AT?! STUPID NASM COMPILER.
-
Guess I just semi-wrecked my old testbed Android phone.
Don't you just love it when you touch the screen in the middle and it registers as a touch in the top right corner? I sure don't.
-
Update came for the CPU thingies. Seems to be fixed, because I can't access the CPU registers like before. But damn son, it slows the PC down, fuck.
Thinking about switching to a Ryzen 3.
-
GOD DAMN IT COLLEGE YOU DID IT AGAIN. for real college can go suck Satan's 50 inch red cock for all I care.
A professor asked me to design a processor and I'll get a bonus. I said okay cool nothing hard.
oh but it has to be in verilog.
okay cool.
oh and it has to be on this fucking ancient useless piece of shit called xilinx that the fucking college provides to you only via a fucking 50 gigabyte virtual machine.
sigh. okay..... challenge accepted.
It fucking crashes every 2 minutes. And after 3 days of no sleep, I finally finished the ALU, control unit, 4K memory, 8 registers and the buses.......... BUT THEN THE ENTIRE VIRTUAL MACHINE CRASHED AND LOST ALL PROGRESS...... fml.
and the professor only gave me the bonus for the ALU. sigh. fuck college.
-
In my latest installment of "Swift, WTF?", we look at the "if" conditional in terms of the Swift convention of:
if let x = y { /* ... */ }
so what this does :
1. declares x in the scope of the braced code
2. sets x to y (an ahem, "optional")
3. decides if x is not *nil*, then executes the braced code.
This is very similar in both the visual and the operation to the C code of:
if (int x = y) { /* ... */ }
1. declares x in the scope of the braced code
2. sets x to the value of y
3. if x is not zero, then executes the braced code
which is considered *exceptionally* poor style.
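(Strictly speaking, declaring inside the condition is a C++ism; plain C, at least the C most of us write, makes you do the assignment-as-condition version, which is exactly the antipattern reviewers flag. A throwaway sketch of what I mean, with a made-up get_value():)

    #include <stdio.h>

    static int get_value(void) { return 42; }  /* made-up source of a value */

    int main(void) {
        int x;
        /* assignment used as a condition: it "works", but hides intent and
           only reads as false when the assigned value happens to be 0 */
        if ((x = get_value())) {
            printf("got %d\n", x);
        }
        return 0;
    }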
Neither the C nor the Swift construct result in a legitimate boolean value of "true" or "false", although C comes closer than Swift.
In the Swift case the *imaginary* "nil" value has to be interpreted as "false" and thus there must be extra code for the conditional to check on whatever constitutes the **actual** value of nil in Swift and then set the condition to "false".
(remember boys and girls, "optionals" are not real, they are an imaginary language construct of Swift and have no legitimate counterpart in the CPU operations with memory and registers)
At least in the case of C, if the value of x is zero or NULL (which is 0) then it is technically a "false" which in C is 0. Regardless, it is really poor programming and anyone doing that on my team gets an ear full.
But in Swift this obfuscation of code is common and condoned! Well, why not put more of the program in the condition of the if? In fact, stuff the whole thing in there.. why not? 🙄
This just reinforces my opinion that Swift is not a bird but the stuff that comes out of the underside of the bird. 🐦💩
-
Fucken clock stretching.... seriously... if you're gonna clock stretch me for any more than 3ms (which is still pushing it) just fucken NACK me so I know you're not ready, god damn.... it's fucken i2c.... stop with the application-level i2c bullshit... I just want to read the fucken registers, stop abstracting it, put it on a god damn DMA..
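(For anyone who hasn't hit this: after the master releases SCL, a slave can hold it low to stall you, and all you can do is poll with a timeout. A rough bit-banged sketch, with scl_release()/scl_read()/micros() as made-up helpers and the same 3 ms cap:)

    #include <stdbool.h>
    #include <stdint.h>

    /* made-up low-level helpers for a bit-banged I2C master */
    extern void     scl_release(void);  /* let SCL float high (open drain) */
    extern bool     scl_read(void);     /* sample the SCL line             */
    extern uint32_t micros(void);       /* microsecond tick                */

    #define STRETCH_TIMEOUT_US 3000     /* tolerate at most 3 ms of stretching */

    /* Returns true once the slave has actually let SCL go high,
       false if it stretched the clock past the timeout. */
    static bool wait_scl_high(void)
    {
        uint32_t start = micros();
        scl_release();
        while (!scl_read()) {                      /* slave still holding SCL low */
            if ((uint32_t)(micros() - start) > STRETCH_TIMEOUT_US)
                return false;                      /* give up, treat it as an error/NACK */
        }
        return true;
    }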
clock stretching is why we can't have nice things lol
-
I am learning exploit development on Windows and I have a problem with it, when I analyze the registers ESP and EIP.
I am able to overwrite both ESP and EIP.
The problem is that I can not make use of "mona.py". "Mona.py" keeps showing me that there are no pointers and no OS DLLs, whereas that is not true.
Immunity Debugger is working completely fine.
I need "mona.py" to find pointers to ESP, but it says there is none.4 -
I have a web app that is currently running on a production server with no issues, but at the same time, it isn't working on my machine which I used to write, test, and deploy the app. The thing is, I haven't touched the code for a full month.
Now, I know this has to be logical and that there must be a reasonable explanation for all of this that I do not know yet.
However, and out of frustration, my mind wants to believe that there's some sorcery involved here or that a cosmic ray has actually penetrated the machine and messed with its registers.
Damn the cosmic ray!
-
Erm, not sure if this qualifies. Not so long ago I was tasked with having to read device memory at a very high address in a 32-bit linux process (kernel is 64). The 32-bit mmap is unfortunately limited to the range of protected-mode PAE, so it just wouldn't reach that high. So! I wrote my own syscall in assembly that would switch to long mode first so I could use long registers, and then I got my page and switched back :)
In retrospect not a big deal, but it made me really happy for the rest of the day when I saw that address in pmap :)
-
The best motivational comment
I posted a rant in which I mentioned that "few" developers who don't want other to progress and are present to show off at every platform....
Got a comment, which I want to share...
Thanks to @MrCush
Ya, most of them tend to stalk the stack overflow and Arch Linux communities. On stack overflow they tend to refresh their browser nonstop to see who their next victim is on a new question and then spend an abnormal amount of time searching the site for a similar question and then downvote you and report as a duplicate. “Umm ya, the question you linked is similar to mine. I found that one as well but unfortunately it wasn’t in the same environment with the same conditions that I raised and didn’t help me. Oh btw, he posted that back in 2002 and HEY LOOK, he got reported for a duplicate as well. Seems like you reported him as well.”
The issues of arrogance and unhelpfulness on that site are so vast that nobody new who registers can get enough points to be allowed to answer someone else's question, so you never get any new blood.
Arch Linux “elites” like to answer your question with a link that you’ve already been to as they always link the same site. “Dude! There’s a wiki for a fucking reason. Did you read this page?”
Yes I did read that page and it was helpful to a degree but since I’m absolutely new to Arch, a lot of the information on the wiki is a bit too descriptive and over my head. Not to mention every paragraph links you to another wiki page which then links you to another and so on that I have no idea where I left off....
“Dude! If you don’t understand everything on the wiki then you shouldn’t be using Arch Linux man! Gtfo scrub.”
Took me a long time to get comfortable with Arch because of these assholes. You got to start somewhere and doing is the best way to learn.
Reading the wiki on how to install Arch now seems so simple to me because I know what to ignore and what is required but back when I first started it was absolutely confusing.
-
This was true with the ancient IBM 6150/6151!
When you looked at the registers of the CPU with their debugger, all uninitialized registers had the hex code 0xdeadbeef!
-
Installing COSU devices. Need to set up 200 Androids. I boot up number 43 and it's set to Chinese language. I switch to English. It registers with the network... WTF, there is a SIM card inside. =D And it was in a sealed package.
Now I am a proud owner of some poor bastard's China Unicom WO SIM card. =D
-
https://simulator.io/board
Lets you place clocks, full and half adders, D latches, RS and JK flip-flops, shift registers, demultiplexers, multiplexers, and decoders, as well as all the standard gates. It also has buttons, switches, and individual LEDs.
Pretty close to what I would make myself.
-
So this is kind of an odd scenario, but bear with me. My client has been issued a JWT token. After having received and stored it, I completely reset the database, and so also emptied the users table. Note that we're using MySQL with an auto-incrementing ID, whose counter has been reset. The user ID is stored in the JWT, so now the JWT isn't referencing an existing user anymore, and the client will get a 401. The problem arises when a new user registers and is inserted into the new database. That user will get ID 1, and so the old token for the other user will suddenly be valid for another user. I know it's an odd case, but is this a flaw in JWT? I guess an easy solution would be to use random IDs, but I'm still wondering.
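(One way to reason about it: the token isn't "wrong", it just carries a claim, uid=1, that now points at a different row. A common mitigation is to also compare the token's issued-at against when the user row was created; a rough sketch of the idea, with all the type and field names made up:)

    #include <stdbool.h>
    #include <stdint.h>

    /* hypothetical decoded-JWT claims and user row */
    struct jwt_claims { uint64_t uid; int64_t iat; /* issued-at, unix seconds */ };
    struct user_row   { uint64_t id;  int64_t created_at; };

    /* Accept the token only if it was issued AFTER this user row came into
       existence; a token minted for a deleted user with the same recycled ID
       will predate the new row and gets rejected. */
    static bool token_matches_user(const struct jwt_claims *t, const struct user_row *u)
    {
        if (t->uid != u->id)        return false;
        if (t->iat < u->created_at) return false;  /* token older than the account */
        return true;
    }

Random/UUID primary keys sidestep it too, which is basically the "easy solution" above.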
-
I'm sitting there hammering out some code, butchering some real problems, when I suddenly realise I'm surrounded. I look around and yes, it's the bloody committee.
The committee is what I call the rest of the department and it is dominated by the old guard which comprises of the programmers that have been around for longer.
None of the old guard can program particularly well but because they had been around the longest they'd all grown senior. The committee had free reign but anyone else doing anything differently has to get approval from the committee.
The only way to code otherwise was to copy and paste existing code then to primarily rename things. If anyone did anything that hadn't been seen before then it would have to be approved by the committee. Individual action was not permitted unless you were old guard.
I swept my headphones away expecting it to be something unimportant. It was.
First things first, they announce: we're going to add extraneous commas to the last element of all possible comma-separated lists, including parameters, or so they say. So I ask: but why?
Because the language now supports it. They added support for it so it must be the right way, someone proclaimed. Does it? I didn't realise we were waiting for it. Why do we want it though?
Didn't you hear? It's all over the blogosphere. It massively improves merge requests. But how, I ask?
Five minutes later I grow tired of the chin stroking, elbow harnessing, slanted gazes into the yonder and occasionally hearing "maybe it's because", and ask if they mean that when you, for example, add an element, the last element registers as changed from adding a comma. Turns out that's all it is.
How often do we see that tiny distraction and isn't it pointless to make the code ugly just for a tiny transient reduction in diff noise I ask. Everyone's stumped. This went on and on and got worse and worse. But it makes moving things around easy half of them say in unison like the bunch of slobs that they are. I mean really. It doesn't make expanding and contracting statements from multiline to single line easy and it's such a stupid thing. Is that all they do all day? Move multi-line method parameters up and down all day? If their coding conventions weren't totally whack they wouldn't have so many multiline method prototypes with stupid amounts of parameters with stupidly long types and names. They all use the same smart IDE which can also surely handle fixing the last comma and why is that even a concern given all the other outrageously verbose and excessive conventions for readability?
But you know what, who cares, fine, whatever. Let's put commas all over the shop and then we can all go to the pub and woo the ladies with how cool and trendy we are, up to date with all the latest trends and fashions, then we go home with ten babes hanging off each arm and get so laid we have to take a sick day the following day to go to the STD clinic. Make way, for we are conformists.
But then someone had to do it. They had to bring up PSR. Yes, another braindead committee that produces stupid decisions. Should brackets be same line or next line? I know, let's do both, they decided. Now we have to do PSR and aren't allowed to use sensible conventions.
But why, I ask, after explaining it's actually quite useful as a set of documents we can plagiarise as a starting point and then modify. But no, we have to do exactly what PSR says.
Through all of this I don't really care because I long ago just made my own code generators or transpilers that work two ways and switch things between my shit and their shit but share my wisdom anyway because I'm a greedy scumbag like that.
Where the shit really hit the fan is when I pointed out that the PSR style guide doesn't answer all questions nor cover all cases, so what do we do then? If it's not in PSR? Then we're fucked.
-
Can anyone help me with this theory about microprocessor, cpu and computers in general?
( I used to love programming when during school days when it was just basic searching/sorting and oop. Even in college , when it advanced to language details , compilers and data structures, i was fine. But subjects like coa and microprocessors, which kind of explains the working of hardware behind the brain that is a computer is so difficult to understand for me 😭😭😭)
How does a computer work? All I knew was that when a bulb gets connected to a battery via wires, some metal inside it starts glowing and we see light. No magic involved till now.
Then came the von Neumann architecture which says a computer consists of 4 things: I/O devices, system bus, memory and CPU. I/O and memory interact with the system bus, which is controlled by the CPU. Thus the CPU controls everything and that's how a computer works.
Wait, what?
Let's take an easy example of a calc. I pressed 1+2= on the keyboard, it showed me '1+2=' and then '3'. How the hell did that happen?
Then some video told me this: every key on your keyboard is connected to a multiplexer which gives a special "code" to the processor regarding the key press.
The "control unit" of cpu commands the ram to store every character until '=' is pressed (which is a kind of interrupt telling the cpu to start processing) . RAM is simply a bunch of storage circuits (which can store some 1s) along with another bunch of circuits which can retrieve these data.
Up till now, the control unit knows that memory has (for eg):
Value 1 stored as 0001 at some address 34A
Value + stored as 11001101 at some address 34B
Value 2 stored as 0010 at some Address 23B
On receiving the code for the '=' press, the "control unit" commands the "ALU" unit of the CPU to fetch data from memory, understand it and calculate the result (i.e. the "fetch, decode and execute" cycle).
The ALU fetches the "codes" from the memory, which translates to ADD 34A,23B i.e. add the data stored at addresses 34A, 23B. The ALU retrieves the values present at the given addresses, passes them through its adder circuit and puts the result at some new address 21H.
The control unit then fetches this result from the new address and, via the system buses, sends this new value to the display's memory loaded at some memory port 4044.
The display picks it up and instantly shows it.
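(To make the fetch-decode-execute part above concrete, here's a toy version in C. Everything here is invented for illustration: a tiny memory, two registers and three made-up opcodes.)

    #include <stdint.h>
    #include <stdio.h>

    enum { OP_LOAD = 1, OP_ADD = 2, OP_HALT = 3 };

    int main(void) {
        /* tiny "RAM": program first, data afterwards                      */
        /* LOAD r0,[8]   LOAD r1,[9]   ADD r0,r1   HALT   ...   1   2      */
        uint8_t mem[16] = { OP_LOAD,0,8, OP_LOAD,1,9, OP_ADD, OP_HALT, 1, 2 };
        uint8_t reg[2] = {0, 0};   /* two general-purpose registers */
        uint8_t pc = 0;            /* program counter               */

        for (;;) {
            uint8_t op = mem[pc];                        /* FETCH   */
            switch (op) {                                /* DECODE  */
            case OP_LOAD:                                /* EXECUTE */
                reg[mem[pc + 1]] = mem[mem[pc + 2]];
                pc += 3;
                break;
            case OP_ADD:             /* this is the ALU's adder at work */
                reg[0] = reg[0] + reg[1];
                pc += 1;
                break;
            case OP_HALT:
                printf("result: %d\n", reg[0]);          /* "send to display" */
                return 0;
            }
        }
    }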
My problems:
1. Is this all correct? Does it only happen like this?
2. Please expand this more.
How do the system bus, ALU and CPU work?
What are the registers, accumulators, flip-flops in the memory?
What are the machine cycles?
What are instruction cycles, opcodes, instruction codes?
Where does assembly language come in?
How does the CPU manipulate memory?
This data bus, control bus, what are they?
I have come across so many weird words I don't understand: DMA, interrupts, memory-mapped I/O devices, etc. Somebody please explain.
Ps: I'm learning about the fucking 8085 microprocessor in class and I can't even relate to basic computer architecture. I had flunked the COA paper, and I now realise why, coz it's so confusing. :'''(
-
First post on devRant... Aaaaand it's university hw... I can't wrap my head around this...
So, the problem is: I have to implement writing and printing 64 bit decimal integers (negative and positive with 2s complement) in NASM Assembly. There are no input parameters, and the result should be in EDX:EAX. The use of 64 bit registers is prohibited.
There is a library which I can use: mio.inc
It has these functions:
- mio_writechar (writes the character which corresponds to the ASCII code stored in AL to console)
- mio_readchar (reads an ASCII character from console to AL)
It also has to manage overflow and backspace. An input can be considered valid or invalid only after the user hits Enter... It's actually a lot of work, and it's just the first exercise out of 10... 😭
The problem is actually just the input - printing should be easy, once I have valid data...
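(For what it's worth, the input side usually boils down to "result = result*10 + digit" done across the two halves. Here's the idea sketched in C with the 32-bit halves kept explicit, mirroring what MUL's EDX:EAX result plus ADD/ADC do; this is just an illustration, not the assignment's solution:)

    #include <stdint.h>

    /* value = value*10 + digit, with the 64-bit value kept as two 32-bit
       halves (hi:lo). Returns 0 if the result no longer fits in 64 bits. */
    static int push_digit(uint32_t *hi, uint32_t *lo, uint32_t digit)
    {
        uint64_t p_lo = (uint64_t)(*lo) * 10u;               /* MUL: 32x32 -> 64     */
        uint64_t p_hi = (uint64_t)(*hi) * 10u
                      + (p_lo >> 32)                          /* carry into high word */
                      + (((p_lo & 0xFFFFFFFFu) + digit) >> 32); /* carry from the ADD */
        if (p_hi > 0xFFFFFFFFu)
            return 0;                                         /* overflowed 64 bits   */
        *lo = (uint32_t)(p_lo + digit);                       /* low 32 bits          */
        *hi = (uint32_t)p_hi;                                 /* high 32 bits         */
        return 1;
    }

Negative input then just means remembering the sign and negating at the end (two's complement: NOT both halves, add 1 with carry).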
Please help me!
-
Writing simple driver for AT24C256 eeprom on pico (RP2040)
It turned out it was an FT24C256A, which should follow the same protocol.
After literally over a month of coming back to it, getting stuck again, rewriting things (including some functions of pico-sdk), I almost gave up and started just yolo trying random shit.
After all, the documentation on addressing the chip fucking misled me -_- (1st bit is the r/w flag and bits 2-7 are the address, counted from MSB->LSB)
I made it work yesterday.
In the meantime I've rewritten the Wire library, modified someone else's rewrite, extended the SDK to allow getting the I2C registers, tried to use TinyGo just to learn it doesn't support I2C slave mode, resoldered the entire thing a few times, measured connections a few too many times, etc.
Frustrated, I doubted I would ever manage to finish putting this project together, because it looked like I'm just too noob.
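(For reference, the usual random-read dance for a 24C256-class EEPROM ends up looking roughly like this. This is only a sketch assuming the pico-sdk blocking I2C helpers, a device address of 0x50 with the address pins tied low, and no error handling:)

    #include "hardware/i2c.h"
    #include "pico/stdlib.h"

    #define EEPROM_ADDR 0x50   /* 1010 A2 A1 A0 - assumes all address pins grounded */

    /* Read `len` bytes starting at 16-bit memory address `mem_addr`:
       write the two address bytes with no STOP, then issue the read. */
    static void eeprom_read(i2c_inst_t *i2c, uint16_t mem_addr, uint8_t *dst, size_t len)
    {
        uint8_t addr_buf[2] = { (uint8_t)(mem_addr >> 8), (uint8_t)(mem_addr & 0xFF) };
        i2c_write_blocking(i2c, EEPROM_ADDR, addr_buf, 2, true);  /* true = no STOP, repeated start next */
        i2c_read_blocking(i2c, EEPROM_ADDR, dst, len, false);
    }
-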
YGGG IM SO CLOSE I CAN ALMOST TASTE IT.
Register allocation pretty much done: you can still juggle registers manually if you want, but you don't have to -- declaring a variable and using it as operand instead of a register is implicitly telling the compiler to handle it for you.
What's more, spilling to stack is done automatically, keeping track of whether a value is or isn't required, so it's only done when absolutely necessary. And variables are handled differently depending on whether they are input, output, or both, so we can eliminate making redundant copies in some cases.
It's a thing of beauty, defenestrating the difficult aspects of assembly, while still writing pure assembly... well, for the most part. There's some C-like sugar that's just too convenient for me not to include.
(x,y)=*F arg0,argN. This piece of shit is the distillation of my very profound meditations on fuckerous thoughtlessness, so let me break it down:
- (x,y)=; fuck you in the ass I can return as many values as I want. You don't need the parens if there's only a single return.
- *F args; some may have thought I was dereferencing a pointer but I'm calling F and passing it arguments; the asterisk indicates I want to jump to a symbol rather than read its address or the value stored at it.
To the virtual machine, this is three instructions:
- bind x,y; overwrite these values with Fs output.
- pass arg0,argN; setup the damn parameters.
- call F; you know this one, so perform the deed.
Everything else is generated; these are macro-instructions with some logic attached to them, and theres a step in the compilation dedicated to walking the stupid program for the seventh fucking time that handles the expansion and optimization.
So whats left? Ah shit, classes. Disinfect and open wide mother fucker we're doing OOP without a condom.
Now, obviously, we have to sanitize a lot of what OOP stands for. In general, you can consider every textbook shit, so much so that wiping your ass with their pages would defeat the point of wiping your ass.
Lets say, for simplicity, that every program is a data transform (see: computation) broken down into a multitude of classes that represent the layout and quantity of memory required at different steps, plus the operations performed on said memory.
That is most if not all of the paradigm's merit right there. Everything else that I thought to have found use for was in the end nothing but deranged ways of deriving one thing from another. Telling you I want the size of this worth of space is such an act, and is indeed useful; telling you I want to utilize this as base for that when this itself cannot be directly used is theoretically a poorly worded and overly verbose bitch slap.
Plainly, fucktoys and abstract classes are a mistake, autocorrect these fucking misspelled testicle sax.
None of the remaining deeper lore, or rather sleazy fanfiction, that forms the larger cannon of object oriented as taught by my colleagues makes sufficient sense at this level for me to even consider dumping a steaming fat shit down it's execrable throat, and so I will spare you bearing witness to the inevitable forced coprophagia.
This is what we're left with: structures and procedures. Easy as gobblin pie.
Any F taking pointer-to-struc as its first argument that is declared within the same namespace can be fetched by an instance of the structure in question. The sugar: x ->* F arg0,argN
Where ->* stands for failed abortion. No, the arrow by itself means fetch me a symbol; the asterisk wants to jump there. So fetch and do. We make it work for all symbols just to be dicks about it.
Anyway, invoking anything like this passes the caller to the callee. If you use the name of the struc rather than a pointer, you get it as a string. Because fuck you, I like Perl.
What else is there to discuss? My mind seems blank, but it is truly blank.
Allocating multitudes of structures, with same or different types, should be done in one go whenever possible. I know I want to do this, and I know whichever way we settle for has to be intuitive, else this entire project has failed.
So my version of new always takes an argument, dont you just love slurping diarrhea. If zero it means call malloc for this one, else it's an address where this instance is to be stored.
What's the big idea? Only the topmost instance in any given hierarchy will trigger an allocation. My compiler could easily perform this analysis because I am unemployed.
So where do you want it, on the stack or on the heap, you want to reutilize any piece of ass, where buttocks stands for some adequately sized space in memory -- entirely within the realm of possibility. Furthermore, evicting shit you don't need and replacing it with something else.
Let me tell you, I will give your every object an allocator if you give the chance. I will -- nevermind. This is not for your orifices, porridges, oranges, morpheousness.
Walruses.
-
I know that DI (dependency injection) is probably just another good pattern out there like many others, but dear lord have I been burned on it with Acumatica. Acumatica just loves having friggen magic crap everywhere with no damn explanation (*may be in a blog post somewhere, but that's no replacement for good documentation).
I believe they use Autofac in C# on an ASP.NET server. They love to utilize reflection and injection, and in turn the server takes multiple minutes to start up whilst it dynamically registers everything, as well as on any individual pages.
Development is a pain in the ass on this damn system.
I’m constantly having to dive into the damn code using dotpeek to understand what the fuck they are doing and it’s often friggen stupid shit. They like to reinvent the wheel a fair bit.1 -
Programming embedded systems from scratch. All hardware, memory, timers, peripherals, etc, must be set up correctly at startup, and if you set even one single bit incorrectly in any of the sometimes hundreds of 32- or 64-bit configuration registers, you are screwed. There is often no terminal that prints error messages to help you, but if you are lucky you have an (often very expensive) hardware in-circuit debugger to step through the startup code.
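(The flavour of it, for anyone who hasn't done bare-metal work: peripheral config is just volatile writes to magic addresses, and one wrong bit silently breaks the boot. A made-up example; the address and bit positions are invented, not from any real chip:)

    #include <stdint.h>

    /* invented peripheral: a clock-control register at a fixed address */
    #define CLK_CTRL    (*(volatile uint32_t *)0x40021000u)
    #define CLK_PLL_EN  (1u << 24)   /* enable the PLL                              */
    #define CLK_PLL_RDY (1u << 25)   /* hardware sets this when the PLL has locked  */

    static void clock_init(void)
    {
        CLK_CTRL |= CLK_PLL_EN;              /* set exactly the bit we mean to set   */
        while ((CLK_CTRL & CLK_PLL_RDY) == 0)
            ;                                /* spin until the hardware says "ready" */
        /* get any of the bits above wrong and there is no error message --
           the chip just never leaves this loop, or runs at the wrong speed */
    }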
-
The rear ducking continues. We've built a reliable translator in the dumbest fucking way possible, it's just lovely. I simply reused the structure for feeding data to the VM assembler, an array of arrays, where there's one array of (ins [args]) per node in the parse tree.
It's nice because nodes can be solved out of order without affecting the actual sequence in which the instructions are output. And if one statement (node) equals multiple instructions, you just push multiple entries to the corresponding array, or push nothing if you need to output nothing. Easy as goblin pie.
This is enough to convert an input language to the assembly-like intermediate representation we use for the virtual machine. So then there's doing it backwards: walk the same array of arrays, and map those virtual instructions to a physical architechture. I guess I could do the encoding to native binary myself, it'd certainly be interesting to try, but I'm burnt-out already so I'll just use fasm for now.
Initial test: wrote a test program in my own stupid language, ran the translator, dump output to file, assemble that with fasm, run with r2 -d.
Crashes? No.
Runs fine? Yes and no.
For fuck's sake, I don't have syscalls. Mainly because the VM doesn't have an operating system, lmao. I was testing virtual programs by just freezing state, terminating, then dumping the fucking registers and stack to the console, we have no I/O to speak of. Not even a real 'exit', VM handles that by reading a return value every step like a mentally damaged son of a bitch.
So anyway, I manually paste the linux mambo, you know:
mov rax,60
mov rdi,0
syscall
And NOW our program can end execution without crashing.
Okay then, so does the test code work correctly?
** DRUM ROLL **
Yes.
Ladies and gentlemen, mother fucking PESO is now a compiled language, and going forward I will be expectantly receiving your marriage proposals for reviewing. Oh, but not so fast, we still need a frontend...
Well, we'll handle that in the next few days. I'm just glad to be *nearly* finished with this fucking compiler, I want nothing to do with anything else ever, but we know that's not going to happen, so Lord please end my pain.
No sponsor as this rant has been paid for by tax evasion.
-
dammit TI, why must your torture be limitless
> eZ80 has awesome DMA-like instruction that copies byte chunks based on registers and it's nigh-instant to copy 64k it's great
> TI has the opcode disabled outside a 4-byte chunk erroneously unincluded from all blacklists and access regulation
> can't bankswitch and keep registers, and can't write to anywhere but those 4 bytes in that bank
> no reusable code in target bank that i can use via mid-func bankswitch
-
There was this question I came up with that was very good at inducing hallucinations on what at the time I thought was a *lobotomized* LLM.
I can't recall the exact wording right now, but in essence you asked it to perform OpenGL batched draw calls in straight x86_64 assembly. It would begin writing seemingly correct code, quickly run out of registers, and then immediately start making up register names instead of moving data to memory.
You may say: big deal, it has nowhere to pull from to answer such an arcane fucking riddle, so of course it's going to bullshit you. That's not the point. The point is it cannot realize that it's running out of registers, and more importantly, that it makes up a multitude of register names which _will_ degrade the context due to the introduction of absolute fabrications, leading to the error propagating further even if you clearly point out the obvious mistake.
Basically, my thought process went as follows: if it breaks at something fundamental, then it __will__ most certainly break in every other situation, in either subtle or overt ways.
Which begged the question: is it a trait of _this_ model in particular, or is it applicable to LLMs in general?
I felt I was on to something, but I couldn't be sure because, again, I was under the impression that the model on which I tested this was too old and stupid so as to consider these results significant proof of anything; AI is certainly not my field, so I had to entertain the idea that I could be wrong, albeit I did so begrudgingly -- for obvious reasons, I want at least "plausible based on my observations" rather than just "I can feel it in my balls".
So, as time went on, I made similar tests on other models whenever I got a chance to do so, and full disclosure, I spent no money on this so you may utilize that fact in your doomed attempt to disprove me lmao. Anyway, it's been a long enough while, I think, and I have a feeling you folks can guess the final answer already:
(**SLIGHTLY OMINOUS DRUM ROLL**)
The "lobotomy" in question was merely a low cap on context tokens (~4000), which I never went over in the first place; newer/"more advanced" models don't fare any better, and I have been _very_ lenient in what I consider a passable answer.
So that's that, is what I'm starting to think: I was right all along, and went through the burdensome hurdle of sincerely questioning the immaculate intuition of my balls entirely for naught -- learn from this mistake and never question your own mystical seniority. Just kidding, but not really.
The problem with the force of belief is it can work both ways, by which I mean, belief that I could be wrong is the reason I bothered looking further into it, whereas belief to the contrary very much compels me to dismiss doubt entirely. I don't need that, I need certainty, dammit. And though I cannot in good faith say that I am _certain_, "sufficiently convinced" will have to do for the time being.
TL;DR I don't know, but the more I see, the shittier it seems.
-
Hey ranters!
Okay so uh, do you know about LFSRs (Linear Feedback Shift Registers)?
You may wish to tell me, else burn me to a toast xD
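(If not: it's just a shift register whose input bit is the XOR of a few "tap" positions, which gives you a cheap pseudo-random sequence. The textbook 16-bit Fibonacci LFSR in C, taps 16/14/13/11:)

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        uint16_t lfsr = 0xACE1u;   /* any non-zero seed */
        for (int i = 0; i < 10; i++) {
            /* feedback = XOR of bits 0, 2, 3, 5 (taps 16, 14, 13, 11) */
            uint16_t bit = ((lfsr >> 0) ^ (lfsr >> 2) ^ (lfsr >> 3) ^ (lfsr >> 5)) & 1u;
            lfsr = (uint16_t)((lfsr >> 1) | (bit << 15));   /* shift right, feed the bit in at the top */
            printf("%04x\n", lfsr);
        }
        return 0;
    }
-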
I'm in a big fat fucking stinking rut, as in progress on this project has absolutely stagnated.
Gonna rubber face your duck now **UNZIPS** except I don't have zippers, as joggers are the one true way; fake Adidas til I fucking drop.
Brain damage aside, I understand both how I've laid out the data and what I'm supposed to do with it. We have a virtual machine, an array of instructions and arguments for a given process within it, and we need to walk this array and map values to registers.
We also need to spill values inside registers to stack, IF they are required at a further point within that block. This also isn't terribly complex. We simply look forward in the array and see if the value is an argument to any instruction that *needs* this value to be loaded (ie, within a register).
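(In other words, a dumb forward scan decides "is this value still live?". Sketched out below, with the instruction layout entirely made up since it's not the real IR:)

    #include <stdbool.h>
    #include <stddef.h>

    /* made-up IR shape: each instruction names up to 3 value ids as arguments */
    struct ins { int op; int args[3]; };

    /* Is `value_id` used as an argument anywhere after position `pos`?
       If yes, whatever register holds it must be spilled before we clobber it. */
    static bool needed_later(const struct ins *block, size_t len, size_t pos, int value_id)
    {
        for (size_t i = pos + 1; i < len; i++)
            for (int a = 0; a < 3; a++)
                if (block[i].args[a] == value_id)
                    return true;
        return false;
    }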
So this implies multiple iterations; we need to better understand how one particular value is used throughout an F before we can make a final decision on how many registers and stack space are actually needed for the whole block.
Here's where it gets tricky. If there's a call, we need to be certain that the symbol being invoked has already been fully processed. Besides the obvious fact that recursion fucks me up, there's another matter: say a private method gets invoked by another private method. We can take advantage of this, by which I mean, sacrilege incoming so put on this toga.
Looking at the output for C compilers, it would seem this is not done in practice, I would assume because it's a pain in the ass. But when you have the guarantee that F will only be called internally, as that's what "private" means, there's two ways it can go:
0. It's well below the 13-20 cycle threshold, so you inline the fucker. No suprises there.
1. It's a more involved affaire, and invoked in more than one place, so you don't inline it. Codesize matters.
Recursion and [1] are the big deal things holding me back. Not because it's too hard, like I said this is kindergarten level abstraction. I'm just slow and fanatical, which is how I prefer to spell "constant obsessive paranoid delusions". I can see the potential optimization I can pull here, so I'm stuck trying to figure it out.
Idea would be, handling the register allocation and stack spill for an internal-internal (or deep internal; what we like to call a "guts" method) in synchronization with the *calling* processes. This is, fundamentally, violating all conventions -- but so under the hood no one will notice.
Let me give you an example. If we were to pass some value to a function, expecting to mutate it and get a different value back, in a lot of cases it'd be stupid to make an implicit copy by using two registers, one for input and another for the output. Dude, it's one cycle. Multiply it by a million, say sixty times per second, for every time you __needlessly__ make a copy of a value that we've already stated is mutable.
Clearly unacceptable. This is, in the strictest sense, everywhere in every single codebase. Premature micro optimization is the root of all goodness, God is great and praiseworthy. So how do we go about it?
Answer is I know and I don't know. By which I mean to say, this very thing I've done by hand. Assembly is fun. Now the issue is teaching a calculator how to do it. Not so fun.
There is a dependency chain between processes, as I believe I've kind of alluded to. I'm trying to make decisions on the side of the caller depending on the details of the callee, which is why recursion is rawdogging my soul. This is the same situation, it's inverting the direction of one or more links in the dependency chain, which makes no fucking sense.
And yet it does.
Brain, explain yourself.
How do *you* handle this without crashing?
Brain?
<<ME STEWPED; BEEP-BOOP>>
Alright then, that was a useless attempt at fuckery. Let's have a nap then, maybe it'll come to me in the morning. That's what I've been saying to myself for almost a month now.
Perhaps it is a hardcoded fuk.
-
Brrruh "mov" only works when both registers have the same size.
Could've told me this beforehand, dude.
-
Why were those f***ers at Freescale unable to keep the architecture the same across their product lines? Why does the KL25Z have a completely different architecture than the KL28Z? Registers are renamed randomly, peripherals removed and added with different architecture... That naming is just a happy coincidence, not the same line in their products...
This porting is gonna be fun.
FML.
-
Damn, lots of you knew this shit before coming of age.
I didn't code a single line until I went to college.
I tried to, but it was just too fucking complicated and I didn't understand a thing. Tried to grasp how to use some tools like Unity or an Adventure Maker of sorts and something called Flix for Flash games. Didn't understand shit.
I decided to study systems engineering due to a career aptitude test I took, hoping somehow that way I could learn something.
First thing I was taught was bash.
When I realised I already knew enough to code a whole text adventure from scratch with such a simple language I felt really hyped.
Always loved text and graphic adventures.
Afterwards I was taught the Z80 assembly language and how CPU registers worked and it blew my fucking mind.
That was the first half-year.
Then I was taught C. And boy was it hard. Didn't get how memory was being handled until the very end.
I happened to be one of the few passing a stupidly complicated semifinal test with triple indirection pointers.
That felt goood.
Learning other languages afterwards was a piece of cake. C#, Java, X86 assembly, C++...
It was a hard door to open. Fucking heavy. But now nothing seems like black magic anymore and boy, isn't that something to be proud of! :D
-
How should I go about learning C++ and the inner workings of a computer such as registers, the stack, etc?
-
Spring Boot Admin is psychotic. Every time a client application registers with it, it barfs out random shit like: 1ear321fs31sfq.
-
Why... why do they have to be like that?
https://github.com/micro/micro/... was reported 11 days ago. I have this issue with the dashboard inside Docker that registers no services nor clients, a shame because this enables testing, and that comes in handy especially if you have never ever done micro-services.
Despite linking to a minimal example that reproduces the issue I have in my project, I'm not getting any support from the developers of Go Micro other than "use the latest Docker image, it shouldn't panic". Sadly others give it a try too, but their directions won't fix the problem.
So this makes me wonder: after 11 days and a minimal reproducible example provided from day one, why has no developer offered any hint of what I'm doing wrong? They know their software, it should be easier for them to spot why the bloody dashboard is not working as it should.
-
Question directed to devs who know a bit about setting up middle-sized architectures.
Prestory: Joined the development of a middle-sized online game. Figured they had created a monolith over the last 6 years, up to a point where nothing works properly and nothing can be changed without wrecking the whole system. Figured a monolithic approach isn't such a great idea.
Current situation: In a different online game development team of the same scale; the game itself is working but the team is struggling with architecture.
My job is to come up with an approach for how to set up the masterserver/matchmaking/database etc. Reading through various articles about common principles (SOLID etc.), I figured that a microservice + event-/servicebus architecture may work for that kind of project.
The idea would be to have a global interface in which microservices can be hooked. So a client registers to a client handler on startup, then starts to queue for a game, the client handler throws an event on the bus to register the user to matchmaking. The matchmaker happens to listen to those events (Observer Pattern) and adds him to matchmaking, when a match is found it throws an event on the bus to connect the user to the server, etc. One can easily imagine a banhandler throwing in a veto to cancel such an action, metrics and logging is fairly simple to add (just another service listening to all events), additionally Continuous Delivery, FRP and such are also beneficial advantages and it is said to scale well.
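(To make the bus idea concrete: at its core it's just observer-pattern plumbing where services subscribe handlers to event types and anyone can publish. A stripped-down sketch in C, all names invented:)

    #include <stdio.h>

    enum event_type { EV_QUEUE_FOR_MATCH, EV_MATCH_FOUND, EV_TYPE_COUNT };

    typedef void (*handler_fn)(int player_id);

    #define MAX_HANDLERS 8
    static handler_fn handlers[EV_TYPE_COUNT][MAX_HANDLERS];
    static int handler_count[EV_TYPE_COUNT];

    /* a service (matchmaker, ban handler, metrics...) registers interest in an event */
    static void subscribe(enum event_type ev, handler_fn fn) {
        if (handler_count[ev] < MAX_HANDLERS)
            handlers[ev][handler_count[ev]++] = fn;
    }

    /* anyone throws an event on the bus; every subscriber gets it */
    static void publish(enum event_type ev, int player_id) {
        for (int i = 0; i < handler_count[ev]; i++)
            handlers[ev][i](player_id);
    }

    static void matchmaker_on_queue(int id) { printf("matchmaker: queueing %d\n", id); }
    static void metrics_on_queue(int id)    { printf("metrics: player %d queued\n", id); }

    int main(void) {
        subscribe(EV_QUEUE_FOR_MATCH, matchmaker_on_queue);
        subscribe(EV_QUEUE_FOR_MATCH, metrics_on_queue);
        publish(EV_QUEUE_FOR_MATCH, 42);   /* the client handler fires this when a user queues */
        return 0;
    }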
The question is, would you do the same, is there maybe something i might be overlooking? Do you have better ideas?
Keep in mind that we are not too experienced, are bound to different languages (Python, C++ and Java mostly) and are a small (4 devs) team with different strengths.
Thank you for your feedback and criticism!
-
Having problems with getting user's IP address with PHP.
So basically I made a custom DDoS protection for my linux server.
It works like this: php website gathers visitor IP address when he does a certain action (in this case registers an account). All visitor ips are stored in ips.txt securely on my website ftp.
Then my linux server has iptables rules setup in a way where it blocks all traffic except my website traffic.
On linux server I have a cron job which pulls whitelisted ips every 5 minutes from my php website FTP and then whitelists all IP's in iptables.
That way only visitor IP's (of those who registered account in my website) are being whitelisted in my linux server.
In case of a DDoS attack, all traffic is dropped except for the whitelisted visitor's IP's gathered from website ips.txt
Now I'm having a problem. My PHP script is not accurate. Some visitors in my website are not being whitelisted because they might have a different ipv4 ip address than what is given from php website. So basically I am looking for some php script/library that would gather ALL ipv4 ips from a visitor, then whitelist them.
Also regarding ipv6, my iptables are all default (which means that all ipv6 visitor traffic is allowed) so problem is not with visitors that have ipv6. Problem is with my script not getting ALL ipv4 ip addresses assigned to the user.
Can you recommend me some PHP library for that? So far I've used https://github.com/marufhasan1/... but apparently it's not accurate enough.
-
looking at more DOS malware. 12 samples in this set of 80 (out of 16 looked at) read the time then overwrite the registers 5 or 6 lines later. The other 4 don't even bother.
-
-
Bruh, just teach them how a computer works with Minecraft. Inverters, And, Nand, etc... can all be made there. From there, u can make flip-flops, u can make registers, adders, multiplexers, demuxes. Ofc making anything more than a 1-bit, maybe 2-bit machine would be a pain, and don't get me started on memory that extends beyond one register. But hey, if u got the patience, u can ofc. Put it together into an ALU, combine all of them with a PC into a CPU. Ofc, you got no ROM or RAM, but hey, at least u've built the hard part.
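(The 1-bit full adder those gates give you, written out as plain boolean ops just to show how little logic it actually is:)

    #include <stdio.h>

    /* one-bit full adder: exactly the thing you wire up out of XOR/AND/OR blocks */
    static void full_adder(int a, int b, int cin, int *sum, int *cout)
    {
        *sum  = a ^ b ^ cin;                   /* two XORs           */
        *cout = (a & b) | (cin & (a ^ b));     /* two ANDs and an OR */
    }

    int main(void) {
        int sum, cout;
        full_adder(1, 1, 0, &sum, &cout);      /* 1 + 1 = 10 in binary */
        printf("sum=%d carry=%d\n", sum, cout);
        return 0;
    }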
-
You know
When I first saw Ethereum talking about a distributed state machine, I thought wow. Not very practical, but NEAT. I envisioned being able to make a bytecode that could be stored in transactions and run by individual clients in an async function, and each step of the resulting execution and the values of managed RAM would be stored at intervals, so other clients could take over, execute a few more statements and compare against what should always be identical expected results.
A grand, incredibly inefficient system, but really neato from the theoretical computer nerd standpoint!
Boy was I disappointed lol. All it is is a basic contracts language, and yet they state it could be like a world computer! How? I thought maybe if you had enough nodes participating you could store registers and the like in transaction values? Wouldn't that be the way?
Seems like as a world computer they're stuck somewhere between very simplistic JS and something prior to amptron in usability, yet they advertised it as a world computer.
Am I missing something? I mean, you could create something that would translate higher-level code into small numeric statements and then send it addition values, but what would it be useful for, and how would you actually store anything?
-
https://hahaxo.xlog.app/JetBrains-x...
this shit registers IntelliJ Ultimate until 2048, for free. And it works even for the latest version of IntelliJ.
Whoever this Chinese man is, is a fucking genius. Been looking for a fucking IntelliJ crack for way too long!
