Search - "memory tables"
-
Lads, I will be real with you: some of you show absolute contempt for the actual academic study of the field.
In a previous rant, another ranter brought up the question of being asked for a binary search implementation.
Asking a senior in the field of software engineering and computer science such a question should get a simple answer, depending of course on the type of job being applied for. Especially if you are applying as a SENIOR.
I am tired of this strange self-learner mentality that those who have a degree or a deep grasp of these fundamental concepts are somewhat beneath you because you learned to push out a website using the New Boston tutorials on youtube. FOR every field THAT MATTERS a license or degree is held in high regard.
"Oh I didn't go to school, shit is for suckers, but I learned how to chop people up and kinda fix it from some tutorials on youtube" <---- try that for a medical position.
"Nah it's cool, I can fix your breaks, learned how to do it by reading blogs on the internet" <--- maintenance shop
"Sure can write the controller processing code for that boing plane! Just got done with a low level tutorial on some websites! what can go wrong!"
(The same goes for military devices which in the past have actually killed mfkers in the U.S)
Just recently a series of people were sent to jail because of a bug in software. Industries NEED to make sure a mfker has aaaall of the bells and whistles needed for running and creating software.
During my master's degree, it fucking FASCINATED me how many mfkers were absolutely completely NEW to the concept of testing code, some of them with years in the field.
And I know what you are thinking: "fuck you, I am fucking awesome" <--- I AM SURE YOU BLOODY WELL ARE, but we live on a planet with billions of people, and millions of them have fallen through the cracks into software related positions as well as completed degrees; the degree at LEAST has a SPECTACULAR barrier of entry during that intro to Algos and DS that a lot of bitches fail.
NOTE: NOT knowing the ABSTRACTIONS over the tools that we use WILL eventually bite you in the ASS because you do not fucking KNOW how these are implemented internally.
Why do you think compiler designers, kernel designers and embedded developers make the BANK they make? Because they DO know memory-efficient ways of deploying a product with minimal overhead, built on proper data structures and algorithmic thinking. NOT EVERYTHING IS SHITTY WEB DEVELOPMENT
SO, if a mfker talks shit about a so-called SENIOR for not knowing the first mamase mamasa bloody simple-as-shit algorithm THROWN at you in the first 10 pages of an algo and ds book, then y'all should be offended at the mfker saying that he is a SENIOR, because these SENIORS are the same mfkers that at one point in time try to teach other people.
These SENIORS are the same mfkers that left me a FUCKING HORRIBLE AND USELESS MESS OF SPAGHETTI CODE
Especially to most PHP developers (my main area): y'all would have been well motherfucking served in learning how not to forLoop the fuck out of tables consisting of over 50k interconnected records, WHAT THE FUCK
"LeaRniNG tHiS iS noT neeDed!!" yes IT fucking IS
being able to code a binary search (in that example) from scratch tells me fucking EXACTLY how good your thought process is when facing a hard challenge; knowing the base motherfucking case of a LinkedList will damn well make you understand WHAT is going on with your abstractions so as to not fucking violate memory constraints, this-shit-is-important.
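And for the record, the kind of answer being asked for is tiny. A minimal sketch in C (array contents and names made up purely for illustration, not taken from any particular interview):

#include <stdio.h>

/* Iterative binary search over a sorted int array.
   Returns the index of target, or -1 if it is not present. */
static int binary_search(const int *arr, int len, int target)
{
    int lo = 0, hi = len - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2; /* written this way to avoid overflow of lo + hi */
        if (arr[mid] == target)
            return mid;
        else if (arr[mid] < target)
            lo = mid + 1;
        else
            hi = mid - 1;
    }
    return -1;
}

int main(void)
{
    int sorted[] = {2, 5, 9, 14, 23, 42};
    printf("%d\n", binary_search(sorted, 6, 23)); /* prints 4 */
    return 0;
}

A dozen lines of logic; if a SENIOR can't sketch something in this ballpark on a whiteboard, that tells you plenty.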
So, will your royal majesties at least for the sake of completeness look into a couple of very well made youtube or book tutorials concerning the topic?
You can code an entire website, fine as shit; you will get tested by my ass in terms of security and best practices, I will run these questions now, and it had very motherfucking well better be as efficient as I think it should be (I HIRE, NOT YOU, or your fucking blog posts about how much MY degree was not needed; oh and btw, MY degree is what made sure I was able to make SUCH decisions).
This will make a loooooooot of mfkers salty, don't worry, I will still accept you as an interview candidate, but if you think you are good enough without a degree, or better than me (has happened, told that to my face by a candidate) then get fucking ready to receive a question concerning: BASIC FUCKING COMPUTER SCIENCE TOPICS
* gays away into the night -
I had to open the desktop app to write this because I could never write a rant this long on the app.
This will be a well-informed rebuttal to the "arrays start at 1 in Lua" complaint. If you have ever said or thought that, I guarantee you will learn a lot from this rant and probably enjoy it quite a bit as well.
Just a tiny bit of background information on me: I have a very intimate understanding of Lua and its c API. I have used this language for years and love it dearly.
[START RANT]
"arrays start at 1 in Lua" is factually incorrect because Lua does not have arrays. From their documentation, section 11.1 ("Arrays"), "We implement arrays in Lua simply by indexing tables with integers."
From chapter 2 of the Lua docs, we know there are only 8 types of data in Lua: nil, boolean, number, string, userdata, function, thread, and table
The only unfamiliar thing here might be userdata. "A userdatum offers a raw memory area with no predefined operations in Lua" (section 26.1). Essentially, it's for the API to interact with Lua scripts. The point is, this isn't a fancy term for array.
The misinformation comes from the table type. Let's first explore, at a low level, what an array is. An array, in programming, is a collection of data items all in a line in memory (The OS may not actually put them in a line, but they act as if they are). In most syntaxes, you access an array element similar to:
array[index]
Let's look at c, so we have some solid reference. "array" would be the name of the array, but what it really does is keep track of the starting location in memory of the array. A memory address is just a number. In a very basic sense, the first sector of your RAM is memory location (referred to as an address) 0. "array" would be, for example, address 543745. This is where your data starts. Arrays can only be made up of one type, so that each element in that array is EXACTLY the same size. So, this is how indexing an array works: if you know where your array starts, and you know how large each element is, you can find the element at index 6 by starting at the start of the array and adding 6 times the size of the data in that array.
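Here is a rough sketch of that formula in actual C (the printed addresses will obviously differ per machine; the values are just for illustration):

#include <stdio.h>

int main(void)
{
    int arr[] = {10, 20, 30, 40, 50};

    /* the address of arr[i] is literally: start address + i * sizeof(element) */
    printf("start of array:            %p\n", (void *)arr);
    printf("element at index 3:        %d at %p\n", arr[3], (void *)&arr[3]);
    printf("same thing via arithmetic: %d at %p\n", *(arr + 3), (void *)(arr + 3));
    return 0;
}

Note that C scales pointer arithmetic by the element size for you, which is exactly the "size of the data" part of the formula.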
Tables are incredibly different. The elements of a table are NOT in a line in memory; they're all over the place depending on when you created them (and a lot of other things). Therefore, an array-style index is useless, because you cannot apply the above formula. In the case of a table, you need to perform a lookup: search through all of the elements in the table to find the right one. In Lua, you can do:
a = {1, 5, 9};
a["hello_world"] = "whatever";
a is a table with 4 entries (the 4th being the key "hello_world" with the value "whatever"), but a[4] is nil because even though there are 4 items in the table, a[4] looks for something keyed 4, not the 4th element of the table.
This is the difference between indexing and lookups. But you may say,
"Algo! If I do this:
a = {"first", "second", "third"};
print(a[1]);
...then "first" appears in my console!"
Yes, that's correct, in terms of computer science. Lua, because it is a nice language, makes keys in tables optional by automatically assigning values an integer key. This starts at 1. Why? Let's look at that formula for arrays again:
Given array "arr", size of data type "sz", and index "i", find the desired element ("el"):
el = arr + (sz * i)
This NEEDS to start at 0 and not 1 because otherwise, "sz" would always be added to the start address of the array and the first element would ALWAYS be skipped. But in tables, this is not the case, because tables do not have a defined data type size, and this formula is never used. This is why actual arrays are incredibly performant no matter the size, and the larger a table gets, the slower it is.
That felt good to get off my chest. Yes, Lua could start the auto-key at 0, but that might confuse people into thinking tables are arrays... well, I guess there's no avoiding that either way. -
Idea: Emoji passwords
Bdixbsufhdbe HEAR ME OUT
I know, I know, emojis belong with teenage girls on Snapchat but there are some theoretical benefits to emoji passwords.
Brute Force attacks are useless! With such a wide range of characters and so many different combinations, they just wouldn't be viable.
Dictionary attacks are less useful! Because those require...words.
They can be easier to remember. Tell a story with your emojis. Images are easier to commit to memory than combinations of letters and numbers.
Users would adopt the feature! For whatever reason, the general population fucking loves these things. So emoji passwords probably won't take very long to see use.
I don't know much about this last one, so I saved it for last, but I would imagine that decryption would be more difficult if the set of available values is quite vast. I dunno how rainbow tables and hash defucking work so I'll just put this here as a "maybe"
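For a rough sense of scale on the brute-force point (assuming roughly 3,600 distinct emoji to choose from — that count is an assumption on my part — versus the ~95 printable ASCII characters), a quick back-of-the-envelope check:

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* assumption: ~3600 emoji available to pick from; link with -lm */
    double emoji_combos = pow(3600.0, 6); /* 6-emoji password      */
    double ascii_combos = pow(95.0, 8);   /* 8-char ASCII password */

    printf("6 emoji: %.2e combinations\n", emoji_combos); /* ~2.18e21 */
    printf("8 ASCII: %.2e combinations\n", ascii_combos); /* ~6.63e15 */
    return 0;
}

So even a short emoji password would have a keyspace several orders of magnitude bigger than a typical 8-character one, at least in theory.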
😀 -
I've optimised so many things in my time I can't remember most of them.
Most recently, something had to be the equivalent of `"literal" LIKE column` with a million rows to compare. It would take around a second on average to look up each literal, for a service that needs to handle high load at low latency. This isn't an easy case to optimise; many people would consider it impossible.
It took me a couple of hours to reverse engineer the data and implement a few-hundred-line implementation that would look it up in 1ms on average, with the worst possible case being very rare and not too distant from this.
In another case there was a lookup of arbitrary time spans that most people would not bother to cache because the input parameters are too short-lived and variable to make a difference. I replaced the 50000+ line application acting as a middle man between the application and database with 500 lines of code that did the lookup faster and was able to implement a reasonable caching strategy. This dropped resource consumption by at least a factor of ten. Misses were cheaper and it was able to cache most cases. It also involved modifying the client library in C to stop it unnecessarily wrapping primitives in objects for the high level language, which was causing it to consume excessive amounts of memory when processing huge data streams.
Another system would download a huge data set for every point of sale constantly, then parse and apply it. It had to reflect changes quickly but would download the whole dataset each time, containing hundreds of thousands of rows. I whipped up a system so that a single server (barring redundancy) would download it in a loop, parse it using C, which was much faster than the traditional interpreted language, then use a custom data differential format, a TCP data streaming protocol, binary serialisation and LZMA compression to pipe it down to points of sale. This protocol also used versioning for catch-up and differential combination for additional reduction in size. It went from being 30 seconds to a few minutes behind to being able to keep within a second of changes. It was also using so much bandwidth that it would reach the limit on ADSL connections then get throttled. I looked at the traffic stats afterwards and it dropped from dozens of terabytes a month to around a gigabyte or so a month for several hundred machines. From the drop in the graphs you'd think all the machines had been turned off, because that's what it looked like. It could now happily run over GPRS or 56K.
I was working on a project with a lot of data and noticed these huge tables and horrible queries. The tables were all the results of queries. Someone wrote terrible SQL, then to optimise it ran it in the background with all possible variable values and stored the results of joins and aggregates in new tables. On top of those tables they wrote more SQL. I wrote some new queries and query generation that wiped out thousands of lines of code immediately and operated on the original tables, taking things down from 30GB (and rapidly climbing) to a couple of GB.
Another time a piece of mathematics had to generate all possible permutations and the existing solution was factorial. I worked out how to optimise it to run in n*n time, which, believe it or not, made a world of difference. It went from hardly handling anything to handling anything thrown at it. It was nice trying to get people to "freeze the system now".
I built my own frontend systems (admittedly rushed) that do what angular/react/vue aim for but with higher (maximum) performance, including an in-memory database to back the UI that had layered, event-driven indexes and could handle referential integrity (an overlay on the database only revealing items with valid integrity), or reorder and reposition events very rapidly using a custom AVL tree. You could layer indexes over it (data inheritance) that could be partial and dynamic.
So many times have I optimised things on automatic just cleaning up code normally. Hundreds, thousands of optimisations. It's what makes my clock tick. -
>Be client
>Have an issue with incredibly slow webpage load time
>Blame memcache issues
So... I look into the problem. Yes, the page either loads up fast, or times out. So, into the logs I go. Webserver is fine (except the timeout), PHP though... Error log is fine (just notices), but slow log shows the issue is the database (of course... its always the database... ugh)
So, checking the database, there is one ugly query that seems to be an issue. 5 joins and a huge where condition.
So I run EXPLAIN on the query and... Proceed to bang my head against the wall.
OF COURSE ITS SLOW YOU FU******, NONE OF YOUR TABLES HAVE ANY INDEXES.
What do they expect when the database always has to walk the whole table and do everything in memory, until it runs out and has to dump it all on disk and work with it there?
Ugh... Some clients... -
Today at work I accidentally redeployed our ELK instance without taking a backup of all of Kibana's saved objects...
I didn't realize Elastic stores all indexes, including Kibana's, in its app folder by default.
Tomorrow will be fun.... I can't decide what to do first... Recreating all the charts and tables from memory... Or fixing the deployment script to change the data dir path... -
Spent the entire morning updating a SQL query.
Client wanted to have different expiration times for different products. So the full package would be 1 year of access and a module would only be 6 months. Then when you renew your account the renewal is 1 year if you have the full package else it's 6 months.
The query takes 0.7s to run and left joins 3 tables, only to return about 100 results. Still, it's faster than the guy who wrote the original query, which just dumped the whole db into memory then looped through it appending valid entries to a new array. -
"Our supplier asks that you double the number of php child processes for this fpm pool"
"Are you aware, that that would lead to about 100% of memory overcommit, taken the current limit of 128MB/child, and that if a lot of them started at once, the system would probably go for OOM-Kill, which would most probably kill your database, that still runs on 100% MyISAM tables that do not support transactions, and you'd have to kiss your data integrity goodbye, right?"
"Uh... Nevermind then"
I get that some people are not IT-versed, but really... Hire someone who knows what they are doing and doesn't live 20 years in the past, god damn it! -
You know shit is going to hit the fan if the sentence "c++ is the same as java" is said, because fuck all the underlying parts of software, it's all the fucking same. Oh and to write a newline in bash we don't use \n or so, we just put an empty echo in there. And fuck this #!/bin/bash line, I'm a teacher, I don't need to know how shit works to teach shit. Let's teach 'em you need stdio for printf even tho it compiles fine without on linux (wtf moment number one, asking 'em leaves you with "dunno..") and as someone who knows c you look at your terminal questioning everything you ever learned in your whole life. And then we let you look into the binaries with ldd and all the good stuff, but we won't explain to you why you can see a size difference in the compiled files even tho you included stdio in the second one, and all symbol tables show the exact same thing, but dude chill, we don't know what's going on either.
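For anyone who hasn't seen that stdio thing in the wild, it looks roughly like this (what actually happens depends on your compiler and which C standard it enforces, so treat this as a sketch):

/* deliberately no #include <stdio.h> */

int main(void)
{
    /* Under old C89 rules printf gets an implicit declaration and the program
       still links, because printf lives in libc anyway. Newer compilers and
       standards warn about it or reject it outright -- which is exactly why
       "it compiles fine on linux" is not an explanation of anything. */
    printf("hello\n");
    return 0;
}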
Oh and btw don't use different directory names as we do in our examples. You won't find your own path, there is no tab key you can press to auto-fill shit.
But that's not everything. How about we fill a whole semester with "this is how to printf" but make you write a whole game with unity and c#. (not taught even the slightest bit until then btw)
Now that you half-assed everything because we put you in a group full of fucks who don't even know what a compiler is but want to tell you you don't know shit and show you their non-working unfinished algorithms in some not-even-syntax-correct java...
...how about we finally go on with Algebra II: complex numbers, how they are going to fuck up your life, how we can do roots of negative numbers all of a sudden, and let you do some probability shit no one ever fucking needs. BUT WHY DON'T YOU KNOW EVERYTHING ALREADY HMMMMM, IT'S YOUR SECOND LESSON, YOU WENT TO SCHOOL PLS BE A MATH PRO ASAP CUS YOU NEED IT SO MUCH BUT YOU DON'T NEED TO KNOW PROPER SYNTAX, HOW MEMORY MANAGEMENT WORKS, WHAT A REFERENCE IS AND PLS FINALLY FORGET THE WORD "ALLOCATION" IT DOESN'T PLAY A SINGLE ROLE YOU ARE STUDYING SOFTWARE DEVELOPMENT WHY ARE YOU SO BAD AT ECONOMICS IT MAKES NO SENSE I MEAN YOU HAD A WHOLE SEMESTER OF HOW TO GREET SOMEONE IN ENGLISH, MATHS > ECONOMICS > ENGLISH > FUCKING SHIT > CODING SKILL THAT'S HOW THE PRIORITIES WORK FOR US WHY DON'T YOU GET IT IT MAKES SO MUCH SENSE BRAH -
F**k companies whose apps use MySQL/MariaDB tables with the MEMORY table engine.
Seriously.
That engine *sucks* to work with as an admin. It's such a huge pain in the ass having to always dump the whole DB instead of taking a snapshot.
And if the replica restarts... Poof. Replication breaks. Cuz all the memory tables are suddenly empty!
Fml. Fmfl. Ugh. -
>Working on code
>Shit works as intended first try, nice
>Goes to play strange bootleg Gameboy Color ROM sent by a friend
>ROM immediately fucking dies
wtf.svg
>Pop emulator's debugger
we're executing from VRAM, stack's firmly embedded in ROM
>why
>Add execution breakpoint to entrypoint of game, restart emulated system (because i'm actually using the legit bios i hacked so it allows null/corrupted games to run)
>Step through everything, everything goes well until all of a sudden we call a function and shit hits the goddamn fan
well we have the culprit
>step through subroutine
if <unused_byte_in_HRAM> != 0 then stackPointer+=32;tryAgain();else return
>***y***
>Realize this is using a bootleg Memory Bank Controller with hard-backed encryption so none of the bytes executed or read as data are the right byte
>Find emulator that'll handle the jank MBC
>read code to try and figure out how it works
if checksumExtendedLogoBlob == some_number then set MBC_Bootleg1 else if checksumExtendedLogoBlob == some_other_number then set MBC_Bootleg2 else if...
>of course
>Spend 10 minutes finding the right bootleg MBC
>code shows 8 possible tables for real bit order based on some value in the cart header
>look for code that gets this value
>not in the header
>not in ANY header in this 1000+ file emulator
>not in any related cpp files???
>get desperate
>email author
>"Delivery failed: email doesn't exist"
fuck me i guess -
A database fetch. All rows at once. Not that many rows, maybe 50.
But oh boy, when someone forgot that the repository is wired to magically inject SQL that joins other tables and runs inefficient loops to create thousands of objects in the background.
Been fun finding memory hogs in the codebase. -
So i have been thinking..
SQL is a lang that runs on specific software on the server, and helps create data stores (databases and tables) that can be queried & manipulated.
is there a way to run SQL-like queries on the client side with no interaction from the backend at all?
Say i have 5 inter related data models. in a backend world, they will form nice little tables of a db with all their joins and composite keys. from the server, i shall be querying them like "SELECT name from x where y=z & ..."
but what if i could store them like tables in browser memory and run the same query filters via a query language... is this possible?
i know this poses a certain security risk, but we already use cookies, local storage and a lot of json based shitty client side storages. surely it might be possible to have less optimised sql tables on the frontend with extremely good querying capabilities?
or am i talking about something far-fetched here? -
Okay, so I had an object consisting of tables (basically classes) and structs (classes with only scalars as their properties).
I was about to serialize the object with vectors of classes and structs and wrote some nice tests for it, wondering why they failed to validate the data after deserialization and why I only got garbage for the vectors of structs whilst the tables worked just fine.
Turns out there is an undocumented function called CreateVectorOfStructs which has to be used for structs instead of the regular CreateVector ...
There go three hours of blaming memory issues and running Valgrind over and over again ...