Search - "multiplying"
-
I guess I can do one of these a day or so. I've collected some novelties over the years.
First up is a Curta mechanical calculator. Before electronic calculators became a thing, these were the best portable calculators in the world. Notably, they were the calculator of choice in rally car sports.
They work by a series of helical gears that act as registers. A set of internal gears and value-setting sliders applies an adjustable number of increments to those registers, and to the multiplying and tracking gears, once per "grind." The result is output as a number on top of the device. The "clear register" function is lifting the top ring, which releases the reverse lockout on the gears; a clockwise turn of the ring then resets them to their zero state.
They were designed by Curt Herzstark, partly before WWII and partly while he was imprisoned in a Nazi concentration camp. He had filed a patent for it in 1938, shortly before his family's manufactory was turned into a weapons factory. During his imprisonment, in addition to nearly starving to death, he completed his plans for manufacturing the calculator.
It had fun nicknames like the "pepper grinder" and the "math grenade."
-
The number of people who don't know the difference between a kilobyte and a kibibyte is too damn high. So much confusion.
TL;DR: Most people use kilobyte (KB) and kibibyte (KiB) wrong, and I am angry about it.
When I first got involved with software as a teenager, I always wondered why we convert kilo to mega by multiplying by 1024, when we multiply by 1000 basically everywhere else. Our physics teacher called this the SI unit system and told us it is internationally accepted. So why is there a different rule here? Did I miss something? Regrettably, I didn't ask her about it.
I just didn't fully get it as a teenager. Now that I am a developer, I understand that dealing with powers of ten is troublesome, so for convenience we lazily mess with the SI system and use it wrongly. Isn't it time we end this abomination?
Two years ago I talked to a friend about this; he said I shouldn't bother.
I talked to a teacher, who said, "You are right, but using a different brand of unit system can be overkill, since there is not much difference anyway." I said okay and left.
1 mega = 1000 kilo
1 giga = 1000 mega etc
also,
MB = Megabyte ( 1000 Kilobyte )
KB = Kilobyte ( 1000 Byte )
MiB = Mebibyte ( 1024 Kibibyte )
KiB = Kibibyte ( 1024 Byte )
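A tiny TypeScript sketch (my own, just to make the difference concrete):

```typescript
// Decimal (SI) prefixes are powers of 1000; binary (IEC) prefixes are powers of 1024.
const KB = 1000;
const MB = 1000 * KB;
const KiB = 1024;
const MiB = 1024 * KiB;

// A "1 MB" executable is NOT 1024 KB:
const size = 1 * MB;                    // 1_000_000 bytes
console.log(size / KB);                 // 1000      -> exactly 1000 KB
console.log((size / KiB).toFixed(2));   // "976.56"  -> roughly 976.56 KiB
console.log(MiB - MB);                  // 48576 bytes of difference per "mega"
```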
I am writing this because today I saw someone get it wrong on the internet, and all of this came to mind. I wonder what your approach to this is, for research purposes.
Call me a dick all you want, but I am the guy who always corrects uncertainty, no matter what. Things should be in their place, correctly. No, I don't have OCD. If you say something like "I have a 1 MB executable file, which means I have 1024 KB of it", I will find you, and I will correct you.
-
Functional programming. Because Moore's Law has moved from making processors faster to multiplying cores, and we may eventually have to write code for machines that have 1024 cores or more. Mutable state will cause all kinds of hell in those scenarios. We already have problems with it when we have just 2-3 different threads.
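A contrived TypeScript sketch (mine, not from the rant) of why shared mutable state gets ugly once work is interleaved, and what the functional alternative looks like:

```typescript
// Shared mutable state: the read-modify-write spans an await point,
// so two interleaved calls can clobber each other's updates.
let total = 0;
async function addUnsafe(values: number[]): Promise<void> {
  for (const v of values) {
    const snapshot = total;       // read
    await Promise.resolve();      // another task may run here...
    total = snapshot + v;         // ...and this write overwrites its update
  }
}

// Pure alternative: no shared state, results are combined explicitly and deterministically.
const sum = (values: number[]): number => values.reduce((acc, v) => acc + v, 0);
const total2 = sum([1, 2, 3]) + sum([4, 5, 6]); // always 21, regardless of scheduling
```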
-
Okay, story time.
Back during 2016, I decided to do a little experiment to test the viability of multithreading in a JavaScript server stack, and I'm not talking about the Node.js way of queuing I/O on background threads, or about WebWorkers that box and convert your arguments to JSON and back during a simple call across two JS contexts.
I'm talking about JavaScript code running concurrently on all cores. I'm talking about replacing the god-awful single-threaded event loop of ECMAScript – the biggest bottleneck in software history – with an honest-to-god, lock-free thread-pool scheduler that executes JS code in parallel, on all cores.
I'm talking about concurrent access to shared mutable state – a big, rightfully-hated mess when done badly – in JavaScript.
This rant is about the many mistakes I made at the time, specifically the biggest – but not the first – of which: publishing some preliminary results very early on.
Every time I showed my work to a JavaScript developer, I'd get negative feedback. Like, unjustified hatred and immediate denial, or outright rejection of the entire concept. Some were even adamantly trying to discourage me from this project.
So I posted a sarcastic question to the Software Engineering Stack Exchange, which was originally worded differently to reflect my frustration, but was later edited by mods to be more serious.
You can see the responses for yourself here: https://goo.gl/poHKpK
Most of the serious answers were along the lines of "multithreading is hard". The top voted response started with this statement: "1) Multithreading is extremely hard, and unfortunately the way you've presented this idea so far implies you're severely underestimating how hard it is."
While I'll admit that my presentation was initially lacking, I later made an entire page to explain the synchronisation mechanism in place, and you can read more about it here, if you're interested:
http://nexusjs.com/architecture/
But what really shocked me was that I had never understood the mindset that all the naysayers adopted until I read that response.
Because the bottom-line of that entire response is an argument: an argument against change.
The average JavaScript developer doesn't want a multithreaded server platform for JavaScript because it means a change of the status quo.
And this is exactly why I started this project. I wanted a highly performant JavaScript platform for servers that's more suitable for real-time applications like transcoding, video streaming, and machine learning.
Nexus does not and will not hold your hand. It will not repeat Node's mistakes and give you nice ways to shoot yourself in the foot later, like `process.on('uncaughtException', ...)` for a catch-all global error handling solution.
No, an uncaught exception will be dealt with the way any self-respecting language deals with it: by not ignoring the problem and pretending it doesn't exist. If you write bad code, your program will crash, and you can't rectify a bug in your code by ignoring its presence entirely and using duct tape to scrape something together.
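For reference, this is the Node.js catch-all being criticised (a sketch of the antipattern, not Nexus code):

```typescript
// Swallow every crash and keep running in an unknown state: Node allows it, Nexus won't.
process.on('uncaughtException', (err) => {
  console.error('Something broke, carrying on anyway:', err);
  // No re-throw, no exit: the duct tape the rant is talking about.
});
```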
Back on the topic of multithreading, though. Multithreading is known to be hard, that's true. But how do you deal with a hard problem? You simplify it and break it down; you don't just disregard it completely, because multithreading has its great advantages, too.
Like, how about we talk performance?
How about distributed algorithms that don't waste 40% of their computing power on agent communication and pointless overhead (like the serialisation/deserialisation of messages across the execution boundary for every single call)?
How about vertical scaling without forking the entire address space (and thus multiplying your application's memory consumption by the number of cores you wish to use)?
How about utilising logical CPUs to the fullest extent, and allowing them to execute JavaScript? Something that isn't even possible with the current model implemented by Node?
Some will say that the performance gains aren't worth the risk. That the possibility of race conditions and deadlocks aren't worth it.
That's the point of cooperative multithreading. It is a way to smartly work around these issues.
If you use promises, they will execute in parallel, to the best of the scheduler's abilities, and if you chain them then they will run consecutively as planned according to their dependency graph.
If your code doesn't access global variables or shared closure variables, or your promises only deal with their provided inputs without side-effects, then no contention will *ever* occur.
If you only read and never modify globals, no contention will ever occur.
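A sketch of what that looks like in plain promise-based code (my example; any Nexus-specific API may differ):

```typescript
// Each task only touches its own input: no globals, no shared closure state.
function transformChunk(chunk: Uint8Array): Promise<Uint8Array> {
  // Stand-in for real CPU-bound work (transcoding, hashing, ...).
  return Promise.resolve(chunk.map((b) => b ^ 0xff));
}

async function run(chunks: Uint8Array[]) {
  // Independent promises: free to execute in parallel on any number of threads.
  const parallel = await Promise.all(chunks.map((c) => transformChunk(c)));

  // Chained promises: run consecutively, following their dependency graph.
  const sequential = await transformChunk(chunks[0]).then(transformChunk);

  return { parallel, sequential };
}
```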
Are you seeing the same trend I'm seeing?
Good JavaScript programming practices miraculously coincide with the best practices of thread-safety.
When someone says we shouldn't use multithreading because it's hard, do you know what I like to say to that?
"To multithread, you need a pair."18 -
I earned devie middle. Bought devie left. I came in this morning and devie right is looking at me???
This may be a Tribble situation! I'm worried that when I open my office door on Monday, they may come pouring out. What will I do with thousands of devies???
-
I'm working this whole weekend to rewrite/move an old custom-made shop extension to the new shop.
The amount of possible SQL injections is too damn high, and this piece of shit the creator calls code is the most pitiable thing I have ever seen!
I don't know how you can call yourself an experienced programmer if you create SQL queries by concatenating strings and variables in raw PHP, copy the same fucking include files to 10 different folders, and use all of them in random places.
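For contrast, the parameterized approach the rewrite presumably moves to, sketched in TypeScript with a hypothetical db.query(sql, params) helper rather than the original PHP:

```typescript
// The concatenation style being ranted about (vulnerable to injection):
//   "SELECT * FROM orders WHERE customer = '" + name + "'"

// Parameterized version: values travel separately from the SQL text,
// so user input can never change the structure of the query.
interface Db {
  query(sql: string, params: unknown[]): Promise<unknown[]>;
}

async function findOrders(db: Db, name: string): Promise<unknown[]> {
  return db.query('SELECT * FROM orders WHERE customer = ?', [name]);
}
```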
I'm not angry at all, I just want to castrate you with a blunt, fake Swiss Army knife so mankind is safe from you multiplying yourself.
-
Quarantine day..... I've stopped counting...
Numbers and time have lost all meaning...
I now use my free time to fill paper sheets with various random Japanese symbols, learn linear algebra and act as a "human" Clojure interpreter......
( send help )
My location is the result of multiplying the matrокрызгкруойж п ыТк)4&2(1&/(0 υβεκσ´αω;·3)-@!}€{]¥~+~;];{*<=
<<< COMMUNICATION ABORTED >>>
-
When we subtract some number m from another number n, we are essentially creating a relationship between n and m such that whatever the difference is can be treated as a 'local identity' (a relative value of '1') for n, and the base then becomes (n/(n-m)) % 1 (the floating-point component).
for example, take any number, say 512
697/(697-512)
3.7675675675675677
here, 697 is a partial multiple of our new value of '1', whose actual value is the difference (697-512) = 185 in base 10. Proper multiples on this example number line, based on natural numbers, would be
185*1,
185*2
185*3, etc
The translation factor between these number lines becomes
0.7675675675675677
multiplying any base-10 number by this puts it on the 1:185 integer line.
Once on a number line other than 1:10, you must multiply by the multiplicative identity of the new number line (185 in the case of 1:185) to get integers on the 1:10 integer line back out.
For example,
185 * 0.7675675675675677 = 142.000000000000
This value, pulled from our example, would be 'zero' on the new line.
185 becomes the 'multiplicative' identity of the 1:185 line, and 142 becomes the additive identity.
Incidentally, the proof of this is trivial to see just by example: if 185 is the multiplicative identity derived from 697-512, and 142 is the additive identity of the 1:185 number line,
then for any integer k, (185*(k+0.7675675675675677)) % 185
should equal 142,
because on the 1:10 number line, any integer n satisfies n % 1 == 0.
We can start to think of the difference of any two integers n and m as the multiplicative identity of a new number line, and the floating-point component of the quotient of n and the difference (n-m) as the additive identity.
let n = 697
let m = 512
n-m == '1' (for the 1:185 line)
(n-m) * ((n/(n-m))%1) == '0'
As we can see, just as n % 1 == 0 on the integer number line,
the corresponding expression in the case of 1:185 equals 142, our additive identity.
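For what it's worth, the arithmetic above checks out (a quick TypeScript reproduction of the rant's numbers):

```typescript
const n = 697;
const m = 512;
const unit = n - m;             // 185, the "multiplicative identity" of the new line
const frac = (n / unit) % 1;    // 0.7675675675675677, the "translation factor"

console.log(n / unit);          // 3.7675675675675677
console.log(unit * frac);       // ~142, the "additive identity" (same as 697 % 185)
console.log(n % unit);          // 142

// The claim: (185 * (k + frac)) % 185 equals 142 for any integer k.
for (const k of [0, 1, 2, 10]) {
  console.log((unit * (k + frac)) % unit); // ~142 each time, up to floating-point error
}
```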
And now, the purpose of this long convoluted post: all so I could bait people into reading a rant on division by zero.
-
So I'm on my morning stroll. Walking, enjoying, watching the world around me.. It's nice how cherries blossom. They smell very tempting to stop there and enjoy the moment. Some flowers under the cherry...
Why do plants blossom again? Oh yeah, that's right: to exchange some specimens in order to grow fruit and seeds. To have their offspring. Just like every other living macroorganism [with a few exceptions ofc]. Life has no other way to survive but to exchange genetic material between two parties, and only then trigger the growth of new life.
And that is a very strict rule. No more, no less: it takes exactly 2 organisms to make new life. But why is that? If my memory serves, the theory of evolution says that life is like business: cut the losses and let the profits run. Over time it discards everything the organism doesn't require in order to save energy, and only successful new "investments" remain in the genome. The unsuccessful ones die before they proliferate, so the bad genes shall not survive.
It also says that very simple things, very simple changes lead to very complex outcomes. Us. Life.
But what is simple about life needing 2 other lives? Exactly 2. It's either simple or efficient, depending on perspective. BUT IT IS NOT BOTH. Look at cells. They just split in half and multiply. Dead simple. It takes one of them to make another one. But with mammals, birds, reptiles, plants and other macroorganisms [except fungi] this is not the case! Why?!? I can't think of any scenario where two generic microorganisms, following some dead simple mutations, would come up with something that inefficient and overly complex. Like they're living on their own, multiplying by division, and then something very simple happens and they can no longer divide, only mate in pairs. The primitive, efficient and simple mechanism gets terminated and replaced with a different, incredibly complex one!
Sure, we have protozoa which have similar reproductive mechanisms. They exchange genetic material to multiply.
But look at our human cells. They don't need that! Look at some reptiles, some plants where it only takes one to make another. They don't pair either! It's simple. Efficient. Why do protozoa need 2 for the species to survive?
It's not simple and efficient [though it helps us adapt, but that's not my point for now]. See, things like this make me wonder. What if we, the life, are not as accidental as we think? What if this whole mechanism was set off by someone or something billions of years ago? That would mean there are much older, much more cognitively advanced organisms than us. What if protozoa were version 3 of new life [the first two did not survive]? Viruses - v2? Sea creatures - v4, reptiles - v5, and so on until they came up with us, mammals? That would surely mean we are not alone in this universe. Are they watching us? Will they create a new species any time soon? What's our purpose? Are we just an experiment?
And so, from cherry blossoms to existential dilemma, my stroll is over. Time for breakfast :)
-
I might have just git-committed the cardinal developer sin: not multiplying estimates by 3. Torvalds help me!
So a php app I developed a few months ago when I was first starting as a dev needs an upgrade. Pretty simple since I've known about said upgrade for a while, but the feature was never needed until today.
Told my boss it would take a day or two of refactoring and additions for it to work.
How screwed am I?
-
I was writing a simple algorithm to simulate gravity, but when I tested it, it produced wildly wrong results. I looked over the whole algorithm trying to find the error, but figured that the last bit, the final position update, must be fine.
I was wrong: some misplaced brackets were multiplying (position + 0.5) by the sum of the old and new velocities, instead of adding the position to 0.5 times those velocities.
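Roughly what that bracket bug looks like (my reconstruction; the variable names and the dt factor are assumptions):

```typescript
// Position update using the average of old and new velocity.
function updatePosition(pos: number, vOld: number, vNew: number, dt: number): number {
  // Buggy version: the bracket wraps the position instead of the velocities.
  // return (pos + 0.5) * (vOld + vNew) * dt;

  // Fixed version: average the velocities, then advance the position.
  return pos + 0.5 * (vOld + vNew) * dt;
}
```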
I noticed that and fixed it, and now it runs pretty well. -
Very long, random and pretentiously philosophical, beware:
Imagine you have an all-powerful computer, a lot of spare time and infinite curiosity.
You decide to develop an evolutionary simulation, out of pure interest and to see where things will go. You start writing your foundation: basic rules for your own "universe" which each and every thing in this simulation has to obey. You implement all kinds of objects, with different attributes and behaviour, but without any clear goal. To make things more interesting, you give this newly created world a spoonful of coincidence, which can randomly alter objects at any given time, at least to some degree. To speed things up you tell some of these objects to form bonds and define an end goal for these bonds:
Make as many copies of yourself as possible.
Unlike the normal objects, these bonds now have purpose and can actively use and alter their environment. Since these bonds can change randomly, their variety is kept high enough to not end in a single type multiplying endlessly. After setting up all these rules, you hit run, sit back in your comfy chair and watch.
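A toy version of that setup, just to make the idea concrete (entirely my sketch, with arbitrary numbers; the rant doesn't specify any of this):

```typescript
// Each "bond" carries one trait; it copies itself in proportion to that trait,
// and every copy mutates a little. Coincidence is just a random number generator.
type Bond = { trait: number };

function step(population: Bond[]): Bond[] {
  const next: Bond[] = [];
  for (const bond of population) {
    if (Math.random() < bond.trait) {
      next.push(bond); // the original survives...
      const mutated = bond.trait + (Math.random() - 0.5) * 0.1;
      next.push({ trait: Math.min(1, Math.max(0, mutated)) }); // ...and makes a mutated copy
    }
  }
  return next.slice(0, 1000); // crude carrying capacity so the world doesn't explode
}

// Seed with random bonds and watch the average trait drift upward over time.
let world: Bond[] = Array.from({ length: 100 }, () => ({ trait: Math.random() }));
for (let i = 0; i < 50; i++) world = step(world);
```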
You see your creation struggle: a lot of the formed bonds die and disintegrate into their individual parts. Others seem to do fine. They adapt to the rules imposed on them by your universe, they consume the inanimate objects around them, as well as the leftovers of bonds which didn't make it. They grow, split and create duplicates of themselves. Content, you watch your simulation develop. Everything seems stable for now, your newly created life won't collapse anytime soon, so you speed up time and get yourself a cup of coffee.
A few minutes later you check back in and are happy with the results. The bonds are thriving, much more active than before, and some of them have even joined together, creating even larger bonds. These new bonds, let's just call them animals (because that's obviously where we're going), consist of multiple different types of bonds, sometimes even dozens, which work together, help each other and seem to grow as a whole. Intrigued by what will happen in the future, you speed the simulation up again and binge-watch the entire Lord of the Rings trilogy.
Nine hours passed and your world became a truly mesmerizing place. The animals grew to an insane size, consisting of millions and billions of bonds, their original makeup became opaque and confusing. Apparently the rules you set up for this universe encourage working together more than fighting each other, although fights between animals do happen.
The initial tools you created to observe this world are no longer sufficient to study the inner workings of these animals. They have become a black box to you, but that's not a problem; one of the species has caught your attention. They behave unlike any other animal. While most species adapt their behaviour to fit their environment, or travel to another environment which fits their behaviour, these special animals started to alter the existing environment to help their survival. They even began to use other animals in ways that benefit themselves, which was different from the usual bonds, since this newly created symbiosis was not permanent. You watch these strange yet fascinating animals develop, without even changing the general composition of their bonds, and are amazed at the complexity of the changes they made to their environment and their behaviour towards each other.
As you observe them build unique structures to protect themselves from their environment, and listen to their complex way of communication (at least compared to other animals in your simulation), you start to wonder:
This might be a pretty basic simulation, and these "animals" are nothing more than a few blobs on a screen, obeying their programming and sometimes getting lucky. All the complexity you created is actually nothing compared to a single insect in the real world, but at what point do you draw the line? At what point does a program become an organism?
At what point is it morally wrong to pull the plug?
-
I thought the whole point of using react-native was to make development faster.
How is multiplying the time needed by an average of 4 any faster?
FML
-
!rant
OMG fuck yeah!
Today I was workin' on my CSS framework and made a couple of cool functions for generating hsla() colors with customizable lightness and opacity, using calc() to multiply the default lightness by the value passed as a parameter to the function.
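Something along these lines (a TypeScript sketch of the idea, since the rant doesn't show the actual framework code; names are mine):

```typescript
// Returns an hsla() color whose lightness is scaled via calc(), e.g.
// shade(210, 80, 50, 1.2, 0.9) -> "hsla(210, 80%, calc(50% * 1.2), 0.9)"
function shade(hue: number, saturation: number, lightness: number, factor = 1, alpha = 1): string {
  return `hsla(${hue}, ${saturation}%, calc(${lightness}% * ${factor}), ${alpha})`;
}

const brandLight = shade(210, 80, 50, 1.2, 0.9); // lighten the base colour by 20%
```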
"It's working perfectly in Chrome and Edge, cool! Now let's check in Firefox, but if it's okay on Edge, I'm pretty confident..."
Except, that's a failure: https://bugzilla.mozilla.org/show_b...
At that point, I started to rant alone. Properly. Like: "Why is this feature still not implemented? People have been waiting for it for YEARS!! Fuckin' browsers war!!!"
I was already thinking of dropping a big angry post on here when I noticed something: https://developer.mozilla.org/en-US...
So I update Firefox Developer Edition and, IT WORKS!
This feature had been needed for years, and the FF team shipped it just when I needed it. What were the chances? I feel happy :)
Conclusion: sometimes ranting is the easy way. Calm down, try harder and you can find the solution!
-
Fuck Redux/ngrx. I'm done, I can't get my head around this ugly shit. All I wanted was to load/save API data in a clean way and display a loading indicator now and then. But definitely not multiplying my entire code base by 10. Actions, reducers, effects. What is this?! Fuck that rocket science.
-
Life is a cycle: you struggle with multiplying numbers, then that same feeling comes back when doing Assembly.