Search - "language to binary"
-
WTF Python!!! "Master" and "Slave" perfectly convey the concept. In the English language many words have different meanings based on their context. It's plainly obvious that no allusion to human slavery is meant in the context of software or hardware module relationships. I don't even think it is problematic. The real problem seems to be the people who are taking terms outside their intended space. Why are we linking a scar on human history to terminologies explaining technical relationships?
Then let's also ban 0 and 1 because they can offend non-binary people!
-
I'm 20, and I consider myself to be as junior as they come. I only started programming seriously in June 2016, and since then I've been doing mainly Android work and making my own servers and backends (using AWS/Firebase and stuff).
For the first time in life, I was approached by a recruiter for a company on LinkedIn. They "stumbled upon" my GitHub profile and wanted to see if I was interested in an internship opportunity. This company is an early stage start up, by that I mean a dude with an idea calling himself the CEO and a guy who "runs a tech blog" and only knows college level C programming (explanation follows).
So they wanted me to make the app for their startup, and for that, I was first asked to solve a couple of problems to prove my competence, and a "technical interview" followed.
They gave me 3 questions, all textbook: GCD of 2 numbers, binary search, and adding an element to a linked list, code to be written on a piece of paper. As the position was that of an Android Developer, I assumed that Java should be the language of choice. Assumed because when I asked, the 'tech blogger' said, yeah whatever.
But wait, that ain't all, as soon as I was done, Mr. Blogger threw a fit, saying I shouldn't assume and that I must write it in C. I kept my cool (I'm not the most patient person), and wrote the whole thing in C.
He read it, asked me what I'd written, and then told me how wrong I was to write 2 extra lines instead of recursion for GCD. I explained that with numbers large enough, we run the risk of a stack overflow and it's best to apply a non-recursive solution if possible. He just heard "stack overflow" and accused me of cheating. I should have left right then, but I don't know why, I apologized and again explained in detail to this fucktard what was happening. Once this was done, he asked me how, if I had to, I'd use this exact code in my Android app. I told him that I'd rather write this in Java/Kotlin since those are the languages native to Android apps. I also said that I'd export these as a library and use JNI for the task. (I don't actually know how, I figured I could study it if I had to.)
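(Side note for anyone curious: the non-recursive GCD I'm talking about is nothing exotic. A rough sketch, not my exact paper answer, looks like this and compiles as plain C or C++:)

#include <stdio.h>

/* Euclid's algorithm with a loop instead of recursion */
unsigned int gcd(unsigned int a, unsigned int b)
{
    while (b != 0) {
        unsigned int r = a % b;
        a = b;
        b = r;
    }
    return a;
}

int main(void)
{
    printf("%u\n", gcd(1071u, 462u)); /* prints 21 */
    return 0;
}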
Here's his reply: "WTF! We don't want to make the app in Java, we will use C (yeah, not C++, C). And don't use these fancy TOOLS like JNI or Kotlin in front of me, make a proper application."
By this point it was clear to me that this guy is not fit to be a technical lead and that I should leave. I said, "Sir, I don't know how, if it's even possible, we can make an Android app purely in C. I am sorry, but this job is not for me."
I got up and was about to leave the room, when he said, "Yeah okay, I was just testing you".
Yeah right, the guy's face looked like a howling monkey when I said library for C, and it has been easier for me to explain code to my 10-year-old cousin than to this dumbfuck.
He then proceeded to ask me about my availability, and I said that I can do at max 15-20 hours a week since my college schedule is pretty tight. He asked me to get him a prototype in 2 months and also offered me a full-time job after I graduate (that'd be 2 years from now). I said thank you for the offer, but I am still not sure if I am the right person for this job.
He then said, "Oh you will be when I tell you your monthly stipend."
I stopped for a second, because, money.
And then he proceeded to say 2 words which made me walk out without saying a single word.
"One Thousand".
I live in India; 1000 INR translates to roughly $15. I made 25 times that by doing nothing more than adding a web view to an activity and rendering a company's responsive website in it so it looks like an app.
If this wasn't enough, the recruiter later had the audacity to blame me for it and tell me how lucky I am to even get an offer "so good".
Fuck inexperienced assholes trying shit they don't understand and thinking that the other guy is shitsworth.
-
( rant || !rant ) && idiots
console.info( this.isLongRant );
console.warn( "contains strong language and wordpress" );
A friend of mine sent two of his "friends" to me because they wanted me to build a website for their new business (~idea).
So I had a meeting with them.
First of all they wanted me to have a look at the current (work in progress) site.
First impression of the frontend:
OH BOY!?
Well, imagine this:
- a 90s/2k background (dotted/pixelated cloud in baby-blueish as background, with repeat)
- the logo was made by the sister of one of the guys, it wasn't too bad, but badly aligned, asymmetrical
- some obvious $offTheShelfShopPlugin with $randomStockContent
- the fucking slider had a small loading bar to indicate changes; it appeared like a hyperanxious child with ADHD
- below the logo TWO FUCKING GIF SPINNERS to indicate nothing else but how fucking brain-amputated these two dudes are, including the dev who is responsible for adding this. (up to this point, they had only told me that a web agency did the setup and some basic work on the site, more on that later)
- no styling concept at all, random fonts and stuff everywhere including default styles of the shop plugin.
- FUUUUUCK, WTF will come further in this meeting?
After seeing a pile of binary puke fisted out of a 60yo nonstop-intern who changed his job title from dildo-traveling-salesman to fullstack-frontend-dev by writing it on a post-it note, I figured there had to be something wrong with the backend as well.
Boy was I right!
Yes, you guessed it! A random WordPress admin panel login appeared! OH NO....
I really wanted to leave this meeting immediately.
I was not able to hold my disgust back and I told them right to their faces what a shit pile of nutty squirrel turds this current page is, and that WordPress is not the right choice at all for a shop.
Then came the best part: They basically told me that they terminated the previous contract with the web agency because they were too expensive (they are cheap compared to others, I know people who know their prices) and that they wanted to create A BIG MARKETPLACE with multiple resellers who can have their own shops on the website. Something similar to FUCKING AMAZON. ON FUCKING WORDPRESS!?!?!?
They even asked me if I wanted to be their partner & developer and that they can't pay much at the moment until the marketplace starts to grow.
I more or less told them to go fuck themselves with a rusty pitchfork.
-
All this "political correctness" cancer spreading through the Python community at the moment over "master/slave" terminology has me wondering where will it end. When the pendulum swings will be have a pro-life movement opposed to pre-emptively killing processes? Will a branch of PETA form to oppose the taxonomic appropriation of reptilian names for the language as a whole? Are we going to need to find gender-neutral names for motherboards to avoid offending those who are offended by the oppressive digital binary? Will removing "mother" from the name motherboard invite 6th wave feminists to decry the influence of toxic masculinity in electronics? Do snake lives matter? Seriously, some people need to take a month off to go fuck themselves somewhere far far away and stop confusing "diversity" with "rampant idiocy".14
-
I love Linux, but its community can be so full of incompetent assholes..
Just now I asked in Freenode ##linux how to get the process ID of my currently running process in bash. I got my answer - it's a shell built-in called "$$".
Then people started to nitpick some more - why do you need it? How is that different from an exit? - to which my response was: well, I know the whole idea behind exit codes, and I'd use them whenever possible, in all defined behavior that allows my program to terminate itself whenever it can. This pidfile however would be used to exit itself and provide diagnostic information whenever the program enters undefined behavior - a segfault in C terms. Scenarios in which I don't have full control over the script's behavior anymore, such as the system entering an unworkable state where it stalled, still had some binaries in RAM but the rootfs became unwritable, such as now - very helpfully, thanks HP! - when my laptop likely overheated and shat itself. I issued sudo reboot into it, but even that wouldn't go through anymore due to the /sbin/poweroff binary becoming inaccessible too. I had to issue a hard power cycle.. one of the few times in which I'm thankful to HP for actually causing shit like this, lol.
Point is, that undefined behavior is what I'm trying to mitigate against. I certainly can't let any files other than diagnostics remain in nonvolatile storage like that, especially when their state should be predictable in order to ensure good operation (like files expressing whether the script is already running or not, i.e. lock files).
Back to that IRC chat. Aside from the answer, I got ridiculed by people who probably don't even know how to properly compile a kernel. Ubuntu users, overconfident scum. Sometimes I feel like I should ask questions in channels like #archlinux only, where such incompetence is ridiculed on its own.
-
First rant: but I'm so triggered and everyone needs a break from all the EU and PC rants.
It's time to defend JavaScript. That's right, the best frikin language in the universe.
Features:
- incredible async code (await/async)
- universal support on almost everything connected to the internet
- runs on almost all platforms, including natively
- dynamically interpreted but also internally compiled (like Perl)
- gave birth to JSON (you're welcome, ppl who remember that the X in AJAX stood for XML)
All these people ranting about JS don't understand that JS isn't frikin magic. It does what it needs to do well.
If you're using it for compute-heavy machine learning, or to maintain a 100k LOC project without Typescript, then why'd you shoot yourself in the foot?
As a proud JS developer I gotta scroll through all these posts gushing over the other languages. Why does nobody rant about using Python for bitcoin mining or Erlang to create a media player?
Cuz if you use the wrong tool for the right job, it's of course gonna blow up in your face.
For example, there was a post claiming JS developers were "scared" of multithreading and only stick in their comfort zone. Like WTF when NodeJS came out everything was multithreaded. It took some brave developers to step out of the comfort zone to embrace the event loop.
For a web app, things like PHP and Node should only be doing light transforms between the database information and HTML anyways. You get one thread to handle the server because you're keeping other threads open to interface with databases and the filesystem. The Nexus.js dev rants on all us JS devs and doesn't realize that nobody's actual web server is CPU bound because of writing HTML bodies; that's why we only use 1 thread. We use other worker threads to do the heavy lifting (yes, there is a C++ bridge, look it up)
Anyways TL;DR plz respect JS developers we're people too. ES7 is magic and please don't shit on ES3 or we'll start shitting on the Python 2-3 conversion (need to maintain an outdated binary just cuz people leave out ()'s in their print statements)
Or at least agree that VB.NET is an abomination and an insult to the beauty that is TI-84 BASIC
-
FUCK LINUX
now that I have your attention, and you’re probably angry, too, please, even if you don’t read this rant, never use code.org again. now, onto the rant…
god dammit, code.org sucks. I mean, anyone who created it or associates with it should, well, be considered a terrorist. they're bombing students' futures in computer science with false, useless, bullshit information. not to mention, their sponsors like bill gates, mark zuckerberg, and other rich asses, talk in a video about some boring ass shit that is hard to understand for anyone who doesn't program, and not to mention, they use a fucking five dollar microphone. ear rape. even if you look at a textual version of it, then read the information on it, it's practically useless because it's so terribly explained, and also useless. ironically enough, they focus on their animations more than their actual explanations, or their students for that matter. the fact that we had to encode a picture in binary made me about 50% dumber, give or take a 0 or 1. then, we had to do it in hex, which wasn't really much better, although more realistic I suppose. what's really the most depressing thing about this class is its application in the real world. I've learnt nothing whatsoever that will help me in the real world, or in computer science. I suppose there's two things that may be useful (that I already knew): hex, and that TCP doesn't lose packets. that's it. those two things. five seconds worth of knowledge from the first quarter of the year. the ideas just make me want to throw up. teaching the main ideas of computer science without actually teaching it? one of the teachers (probably a good one) enrolled her students in an online programming course just so they could understand, because the explanations are just so terrible. this is the only [high school] computer science course offered by code.org, and I signed up because it's an AP computer science class (tried to get into AP Java, the day I was supposed to take the test to get into an upper level class, I was told it didn't count as a tech credit). seriously, fuck code.org. it makes you dumber. their 'app lab' environment is pointless, just like everything else. the app lab is basically where you have a set of commands and have to make a dog bark() or a storm trooper miss() [and that's hell when they haven't introduced while loops yet]. the app lab is literally code.org going out of their way to make everything that their students are learning pointless in the real world. seriously, why can't we just use a <canvas> like an ACTUAL PROGRAMMER would do if they were to make a browser game, not use an app engine so slow it would be faster to update windows and android studio each time I run an 'app' in their 'environment'. their excuse is that the skills "transfer over" to the real world. BITCH! IF I DIDN'T KNOW JAVA, AND I WANTED TO MAKE A GAME IN JAVA, I'M NOT GOING TO LEARN PYTHON, THEN "TRANSFER" THE SKILLS I LEARNT, I'M GOING TO LEARN FUCKING JAVA. AND THAT GOES FOR EVERY OTHER LANGUAGE, PROJECT, ETC.
I'm begging you code.org, stop, get help.
-
I've optimised so many things in my time I can't remember most of them.
Most recently, something had to be the equivalent of `"literal" LIKE column` with a million rows to compare. It would take around a second on average per literal to look up, for a service that needs to be high load and low latency. This isn't an easy case to optimise; many people would consider it impossible.
It took me a couple of hours to reverse engineer the data and write a few-hundred-line implementation that would look it up in 1ms on average, with the worst possible case being very rare and not too distant from this.
In another case there was a lookup of arbitrary time spans that most people would not bother to cache because the input parameters are too short lived and variable to make a difference. I replaced the 50000+ line application acting as a middle man between the application and database with 500 lines of code that did the lookup faster and was able to implement a reasonable caching strategy. This dropped resource consumption by a factor of ten at least. Misses were cheaper and it was able to cache most cases. It also involved modifying the client library in C to stop it unnecessarily wrapping primitives in objects for the high-level language, which was causing it to consume excessive amounts of memory when processing huge data streams.
Another system would download a huge data set for every point of sale constantly, then parse and apply it. It had to reflect changes quickly but would download the whole dataset each time, containing hundreds of thousands of rows. I whipped up a system so that a single server (barring redundancy) would download it in a loop, parse it using C, which was much faster than the traditional interpreted language, then use a custom data differential format, a TCP data streaming protocol, binary serialisation and LZMA compression to pipe it down to points of sale. This protocol also used versioning for catchup and differential combination for additional reduction in size. It went from being 30 seconds to a few minutes behind to being able to keep up to within a second of changes. It was also using so much bandwidth that it would reach the limit on ADSL connections then get throttled. I looked at the traffic stats after and it dropped from dozens of terabytes a month to around a gigabyte or so a month for several hundred machines. From the drop in the graphs you'd think all the machines had been turned off, as that's what it looked like. It could now happily run over GPRS or 56K.
I was working on a project with a lot of data and noticed these huge tables and horrible queries. The tables were all the results of queries. Someone wrote terrible SQL, then to optimise it ran it in the background with all possible variable values, then stored the results of joins and aggregates into new tables. On top of those tables they wrote more SQL. I wrote some new queries and query generation that wiped out thousands of lines of code immediately and operated on the original tables, taking things down from 30GB and rapidly climbing to a couple GB.
Another time a piece of mathematics had to generate all possible permutations and the existing solution was factorial. I worked out how to optimise it to run in n*n, which believe it or not made a world of difference. Went from hardly handling anything to handling anything thrown at it. It was nice trying to get people to "freeze the system now".
I built my own frontend systems (admittedly rushed) that do what angular/react/vue aim for but with higher (maximum) performance, including an in-memory database to back the UI that had layered event-driven indexes and could handle referential integrity (overlay on the database only revealing items with valid integrity) or reordering and repositioning events very rapidly using a custom AVL tree. You could layer indexes over it (data inheritance) that could be partial and dynamic.
So many times have I optimised things on automatic just cleaning up code normally. Hundreds, thousands of optimisations. It's what makes my clock tick.
-
Hi every developer! My name is Allen. English is not my native language, so forgive me if I say something that does not make any sense. Let me tell you my story of how I became a programmer. (I am still learning.) My first computer was a DELL OptiPlex GX 720 desktop. My father bought it for our self-employed work. Before he allowed me to use the computer, I used to sit next to him and watch what he did, what he clicked and what he got. When he allowed me to use the computer, I was slow at typing. One or 2 WPM (words per minute). My father taught me how to use the computer. Very slowly, my typing speed improved. I understood how to use the computer. But one day, I did something I regret. I was playing with some executables; when I double-clicked them, they did not work. I used to associate files with apps. I associated music files with every player I wanted. So, I did what I was used to: I associated exe files with Windows Media Center! The computer started to open hundreds of Windows Media Center (WMC for short) instances; whenever an app was clicked, it opened Windows Media Center. Today, I realize that Windows was trying to open every app and every process that regularly runs. However, since I had associated them with WMC, instead of the app itself it opened WMC. Some days after the mistake, I wondered how apps work and how I could create my own. My father had told me before that a program is simply a binary file that the computer can read. However, it was too advanced for me at the time. I began my search with Google. Every time I searched, it said "learn to code" or something like that. I saw some C++ code, but it was disgusting. When I read just a few lines of a hello world program in Java, it was too complex.
What I saw:
#$$#% $%&$%&*#!@
~
(&*%&$ (_(*^% #&&* (^^$(&^$%^( %^*$())
~
^$70^(`*#%`*#&%^)*!" Hello world "#@
~
~
The actual code:
class helloworld
{
public static void main(String args[])
{
System.out.println("Hello World!");
}
}
I looked for an easy way but my attempts failed. Then I pushed myself to learn how to code. I tried learning Java, but it was still very complex. So I tried LibertyBASIC, and went from LibertyBASIC to Java. After learning LibertyBASIC, it was easy!
LibertyBASIC -> Java -> Ruby -> NOW, C# and XAML
Today, I am learning C# and XAML.
My first OS : Windows 7
My first Computer : DELL OptiPlex GX 720
My first successful click : The Start menu
My first used App : Microsoft Encarta 2009
My first created App : Hi-Lo (number-guessing game, written in LibertyBASIC)
Thank you for reading this long story.
-
I am a firmware developer with 4 years experience. C and sometimes assembly is my bread and butter.
Like 2 years ago, I was really interested in making a switch to application development. Got referred by my friend to her startup.
But I was a bit rusty with my data structures, high level languages and interpersonal skills.
The first question was to find the number of occurrences of each word in a paragraph. The language choice was Java. But I was allowed to use C++ since it was the closest relative to Java that I knew.
And I started implementing a binary search tree from scratch, inserting each tokenised word into it, and wrote a traversal algorithm.
The interviewer, luckily, was a patient guy. After I completed my whole mess, he asked if it was possible to do this in a slightly better way, with constant-time access and without traversal.
I said yes, we can with a hash table, but I don't know how to implement one. He replied: I don't expect you to implement the hash table, but to see you use it. I asked him if I was allowed to use the standard library, for which he said of course.
*facepalm*.
Finally I understood his expectation, referred to cppreference.com and used an unordered_map.
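With the standard library the whole exercise collapses into a handful of lines. Roughly this kind of C++ sketch (assuming whitespace-separated words; not my exact interview code):

#include <iostream>
#include <sstream>
#include <string>
#include <unordered_map>

int main()
{
    std::string paragraph = "the quick brown fox jumps over the lazy dog the end";

    // count each whitespace-separated word; operator[] default-initialises the count to 0
    std::unordered_map<std::string, int> counts;
    std::istringstream stream(paragraph);
    std::string word;
    while (stream >> word)
        ++counts[word];

    // constant-time average access, no tree traversal needed
    for (const auto& entry : counts)
        std::cout << entry.first << ": " << entry.second << "\n";
    return 0;
}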
Later there were some questions on databases, which I tried my best to answer. And I frankly replied that I am not comfortable with JS frameworks as of now. Got rejected.
So the mistakes were that I never asked basic questions, like what time complexity was expected or whether I was allowed to use the standard library, didn't spend some extra time studying the stuff needed for the domain switch, and most importantly, I panicked.
-
there's this club at my school, called STEM, and another called "science olympiad." both are pretty cringey, bad, or boring. science olympiad was just for the college credit. during the intro to the club, they said there was a coding section. "game on!" is what they dubbed it as, where basically you're timed to make a game in scratch. i'm fucking tired of it. why is scratch considered programming? don't get me wrong, i'll write an OS in PHP before i say code.org is better than scratch, but fuck it. it's a fucking interpreted language that's interpreted by another interpreted language. i don't understand why this shit is still used. scratch isn't good. please use codecademy or w3schools or just write in binary directly, but not scratch. my hand hurts from dragging and dropping, my eyes hurt from the light theme, my imaginary cat committed suicide after learning about scratch's mascot. fuck it. now onto stem club, fuck it too. not for being bad (well, kinda), but for not being more recognized. it should be above science olympiad and other clubs because you actually have to think instead of just memorize. but alas, we still were offered the choice of scratch to program the robot. sigh. arduino much? i guess not. challenging much? nope. was i elected "leader"? with three of my friends out of the eight there, i could have been, but no. effort in this would be depressing.
-
(long post is long)
This one is for the .net folks. After evaluating the technology top to bottom and even reimplementing several examples I commonly use for smoke testing new technology, I'm just going to call it:
Blazor is the next Silverlight.
It's just beyond the pale in terms of being architecturally flawed, and yet they're rushing it out as hard as possible to coincide with the .Net 5 rebranding silo extravaganza. We are officially entering round 3 of "sacrifice .Net on the altar of enterprise comfort." Get excited.
Since we've arrived here, I can only assume the Asp.net Ajax fiasco is far enough in the past that a new generation of devs doesn't recall its inherent catastrophic weaknesses. The architecture was this:
1. Create a component as a "WebUserControl"
2. Any time a bound DOM operation occurs from user interaction, send a payload back to the server
3. The server runs the code to process the event; it spits back more HTML
Some client-side js then dutifully updates the UI by unceremoniously stuffing the markup into an element's innerHTML property like so much sausage.
If you understand that, you've adequately understood how Blazor works. There's some optimization like signalR WebSockets for update streaming (the first and only time most blazor devs will ever use WebSockets, I even see developers claiming that they're "using SignalR, Idserver4, gRPC, etc." because the template seeds it for them. The hubris.), but that's the gist. The astute viewer will have noticed a few things here, including the disconnect between repaints, inability to blend update operations and transitions, and the potential for absolutely obliterative, connection-volatile, abusive transactional logic flying back and forth to the server. It's the bring out your dead approach to seeing how much of your IT budget is dedicated to paying for bandwidth and CPU time.
Blazor goes a step further in the server-side render scenario and sends every DOM event it binds to the server for processing. These include millisecond-scale events like scroll, which, at least according to GitHub issues, devs are quickly realizing requires debouncing, though they aren't quite sure how to accomplish that. Since this immediately becomes an issue with tickets saying things like, "scroll event crater server, Ugg need help! You said Blazorclub good. Ugg believe, Ugg wants reparations!" the team chooses a great answer to many problems for the wrong reasons:
gRPC
For those who aren't familiar, gRPC has a substantial amount of compression primarily courtesy of a rather excellent binary format developed by Google. Who needs the Quickie Mart, or indeed a sound markup delivery and view strategy when you can compress the shit out of the payload and ignore the problem. (Shhh, I hear you back there, no spoilers. What will happen when even that compression ceases to cut it, indeed). One might look at all this inductive-reasoning-as-development and ask themselves, "butwai?!" The reason is that the server-side story is just a way to buy time to flesh out the even more fundamentally broken browser-side story. To explain that, we need a little perspective.
The relationship between Microsoft and its enterprise customers is your typical mutually abusive co-dependent relationship. Microsoft goes through phases of tacit disinterest, where it virtually ignores them. And rightly so, the enterprise customers tend to be weaksauce, mono-platform, mono-language types who come to work, collect a paycheck, and go home. They want to suckle on the teat of the vendor that enables them to get a plug-and-play experience for delivering their internal systems.
And that's fine. But it's also dull; it's the spouse that lets themselves go, it's the girlfriend in the distracted boyfriend meme. Those aren't the people who keep your platform relevant and competitive. For Microsoft, that crowd has always been the exploratory end of the developer community: alt.net, and more recently, the dotnet core community (StackOverflow 2020's most loved platform, for the haters). Alt.net seeded every competitive advantage the dotnet ecosystem has, and dotnet core capitalized on. Like DI? You're welcome. Are you enjoying MVC? Your gratitude is understood. Cool serializers, gRPC/protobuff, 1st class APIs, metadata-driven clients, code generation, micro ORMs, etc., etc., et al. Dear enterpriseur, you are fucking welcome.
Anyways, b2blazor. So, the front end (Blazor WebAssembly) story begins with the average enterprise FOMO. When enterprises get FOMO, they start to Karen/Kevin super hard, slinging around money, privilege, premier support tickets, etc. until Microsoft, the distracted boyfriend, eventually turns back and says, "sorry babe, wut was that?" You know, shit like managers unironically looking at cloud reps and demanding to know if "you can handle our load!" Meanwhile, any actual engineer hides under the table facepalming and trying not to die from embarrassment.
-
Is your code green?
I've been thinking a lot about this for the past year. There was recently an article on this on slashdot.
I like optimising things to a reasonable degree and avoid bloat. What are some signs of code that isn't green?
* Use of technology that says it's fast without real expert review and measurement. Lots of tech out there claims to be fast but actually isn't, or is only fast by saturating resources while being inefficient.
* It uses caching. Many might find that counter intuitive. In technology it is surprisingly common to see people scale or cache rather than directly fixing the thing that's watt expensive which is compounded when the cache has weak coverage.
* It uses scaling. Originally scaling was a last resort. The reason is simple, it introduces excessive complexity. Today it's common to see people scale things rather than make them efficient. You end up needing ten instances when a bit of skill could bring you down to one which could scale as well but likely wont need to.
* It uses a non-trivial framework. Frameworks are rarely fast. Most will fall in the range of ten to a thousand times slower in terms of CPU usage. Memory bloat may also force the need for more instances. Frameworks written on already slow high level languages may be especially bad.
* Lacks optimisations for obvious bottlenecks.
* It runs slowly.
* It lacks even basic resource usage measurement.
Unfortunately smells are not enough on their own but are a start. Real measurement and expert review is always the only way to get an idea of if your code is reasonably green.
I find it not uncommon to see things require tens to hundreds to thousands of times the resources needed, if not more.
In terms of cycles that can be the difference between needing a single core and a thousand cores.
This is common in the industry but it's not because people didn't write everything in assembly. It's usually leaning toward the extreme opposite.
Optimisations are often easy and don't require writing code in binary. In fact the resulting code is often simpler. Excess complexity and inefficient code tend to go hand in hand. Sometimes a code cleaning service is all you need to enhance your green.
I once rewrote a data parsing library that had to parse a hundred MB and was a performance hotspot into C from an interpreted language. I measured it and the results were good. It had been optimised as much as possible in the interpreted version but way still 50 times faster minimum in C.
I recently stumbled upon someone's attempt to do the same and I was able to optimise the interpreted version in five minutes to be twice as fast as the C++ version.
I see opportunity to optimise everywhere in software. A billion KG of CO2 could be saved easily if a few green code shops popped up. It's also often a net win. Faster software, lower costs, lower management burden... I'm thinking of starting a consultancy.
The problem is, after witnessing the likes of Greta Thunberg, if that's what the next generation has in store, then as far as I'm concerned the world can fucking burn and her generation along with it.
-
I don't get it.
I tried Kotlin on Android just for fun, and it doesn't support binary data handling, not even unsigned types until the newest version. Java suffers from the same disease.
How does one parse and process binary data streams on such a high-end system? Not everything is high-level XML or JSON today.
And it's not only an Android issue.
Python has some support for binary data, and it's powerful, but not comfortable.
I tried Ruby, Groovy, TCL, Perl and Lua, and only Lua lets you access data directly without unnecessary overhead.
C# is also awkward when it comes to data types less than the processor register width.
How hard can it be to access and manipulate data in its natural and purest form?
Why do the so-called modern programming languages ignore this simple aspect that is needed on an everyday basis?
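For contrast, this is the kind of direct, no-ceremony access I mean: a small C-style sketch (it builds as C or C++) that pulls a 16-bit little-endian length out of a raw frame. It assumes a little-endian host, which is the usual case on x86 and typical Android targets:

#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    /* raw frame: 2-byte little-endian length, then the payload */
    const unsigned char frame[] = { 0x05, 0x00, 'h', 'e', 'l', 'l', 'o' };

    uint16_t len;
    memcpy(&len, frame, sizeof len);   /* no boxing, no ByteBuffer ceremony */
    printf("payload length: %u\n", (unsigned)len);                /* 5 */
    printf("payload: %.*s\n", (int)len, (const char *)frame + 2); /* hello */
    return 0;
}
-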
So after hosting my first project and announcing it on devRant, the users pointed out the many security faults and places where the code can be exploited (thank you so much). So I started my research on security (I'm 99% self-taught). The first thing I came across was code vulnerabilities, which I can fix, then the vulnerabilities of the language itself, and then binary exploits that overrun whatever the language is. Well, the topic gets broader and broader. If I click on a link named "xxx vulnerabilities", oh god, that is a whole new collection of hundreds of wiki-like pages. I feel like I'm lost and here I need some real help
-
Is the way people solve problems intrinsic to the native language they learned growing up? Can the shape of our thoughts be optimal for solving certain kinds of problems? Like sentence structure, grammar, etc.
If the pattern of thought a language promotes can help us solve problems, then is there a spoken language that helps promote solving computer science problems?
I know I have to work to think differently to program in different styles of programming. I wonder if we can learn from different spoken languages patterns of logic that are applicable to engineering.
Mathematics, while not a spoken language, has helped me re-frame things in programming. I think programming has also helped in other areas. Like using binary search to find the end of a pipe in the ground.
-
If you've ever tried using Go plugins raise your hand.
If you've ever tried doing plugins in Go, raise your hand.
If you think that the following rant will be interesting, raise your hand.
If you raised your hand, press [Read More]:
This is a tale of pain and sorrow, the sorrow of discovering that what could be a wonderful feature is woefully incomplete, and won't be for a very long time...
Go plugins are a cool feature: dynamically load pre-compiled code, and interact with it in a useful and relatively performant way (e.g. for dynamically extending the capabilities of your program). So far it sounds great, I know right?
Now let me list off some issues (in order of me remembering them):
1. You can't unload them (due to some bs about dlopen), so you need to restart the application...
2. They bundle the stdlib like a regular Go binary, despite the fact that they're meant to be dynamic!
3. #2 wouldn't be so bad if they didn't also require identical versions of all dependencies in both binaries (meaning you'd need to vendor the dependencies, and also hope you are using the right Go version).
4. You need to use -trimpath or everything dies...
All in all, they are broken and no one is rushing to fix it (literally, the Go team said they aren't really supporting it currently...).
So what other options are there for making plugins in Go?
There's the Hashicorp method of using RPC, where you have two separate applications, one the plugin, one the plugin server, and they communicate over RPC. I don't like it. Why? Because it feels like a hack, it's not really efficient, and it carries a fear of a limitation that I don't like...
Then we come to a somewhat more clever approach: using Lua (or any other scripting language), it's well known, it's what everyone uses (at least in games...). But, it simply is too hard to use, all the Go Lua VMs I could find were simply too hard to set up...
Now we come to the most creative option I've seen yet: WASM. Now you ask "WASM!? But that's a web thing, how are you gonna make that work?" Indeed, my son, it is a web thing, but that doesn't mean I can't use it! Someone made a WASM VM for Go, and the pros are that you can use any WASM-supporting language (i.e. any/all of them). The problems: it's inefficient, a PITA to use, and it also suffers from the same issues that were preventing me from using Lua.
Enter Yaegi, a Go interpreter created by the same guys who made (and named) Traefik. Yes, you heard me right, an INTERPRETER (i.e. like python) so while it's not super performant (and possibly suffering from large inefficiency issues), it's very easy to set up, and it means that my plugins can still be written in Go (yay)! However, don't think this method doesn't have its own issues, there's still the problem of effectively abstracting different types of plugins without requiring too much boilerplate (a hard problem that I'm actively working on, commits coming soon). However, this still feels to be the best option.
As you can see, doing plugins in Go is a very hard problem. In the coming weeks (hopefully), I'm going to (attempt to at least) benchmark all the different options, as well as publish a library that should help make using Yaegi based plugins easier. All of this stuff will go (see what I did there 😉) in a nice blog post that better explains the issues and solutions. But until then I have some coding to do...
Have a good night(/day)!
-
ARE YOU F*CKING KIDDING ME, WINDOWS?
I finally give in and install your stupid ass update, and what happens? let's just skip the part where i sit around for 30 minutes because apparently it takes 30 fucking minutes to install 300MB - by stallman's underpants, that's 150KB per second!
and when windows FINALLY feels like it has finished fondling its binary balls, what is waiting for me?
about 10 stupid-ass data-consent notices straight from satan's anus, more weird yes-or-maybe settings for cortana (bill gates' ex or whatever that is) which i don't even USE, my browser speed dial has been complemented by about 7 links to SHOPPING SITES and once i sort that mess out i get a notification that the german language pack has successfully been installed.
SUCK MY FUCKING D*CK MICROSOFT, the ONLY thing i want to do with that language pack is SHOVE IT UP YOUR ASS. i can't even uninstall that moronic piece of shit. FUCK YOU.
-
Major rant incoming. Before I start ranting I'll say that I totally respect my professor's past. He worked on some really impressive major developments for the military and other companies a long time ago. He was made an engineering fellow at Raytheon for some GPS software he developed (or led a team on, I should say) and ended up dropping the fellowship because of his health. But I'm FUCKING sick of it. So fucking fed up with my professor. This class is "Data Structures in C++" and keep in mind that I've been programming in C++ for almost 10 years, with it being my primary and first language in OOP.
Throughout this entire class, the teacher has been making huge mistakes by saying things that aren't right or simply not knowing how to teach, such as telling the students that "int& varOne = varTwo" was an address getting put into a variable until I corrected him about it being a reference (after which he proceeded to skip all reference slides), or stepping through sorting algorithms wrongly or not remembering how to do them and saying, "So then it gets to this part and....it uh....does that and gets this value and so that's how you do it *doesn't do rest of it and skips slide*".
The first presentation I did was on doubly linked lists. I decided to go above and beyond and write my own code that had a menu to add, insert at position n, delete, print, etc. for a doubly linked list. When I go to pull out my code he tells me that I didn't say anything about a doubly linked list's tail and head nodes each having a pointer pointing to null, and so I was getting docked points. I told him I did actually say it, and another classmate spoke up and said "Ya", and he cut me off saying, "No you didn't". I started to say I'll show you my slides but he cut me off mid-sentence and just yelled, "Nope!". He docked me 20% and gave me a B- because of that. I had 1 slide where I had a bullet point mentioning it and 2 slides with visual models showing that the head node's previousNode* and the tail node's nextNode* pointed to null.
Another classmate that’s never coded in his life had screenshots of code from online (literally all his slides were a screenshot of the next part of code until it finished implementing a binary search tree) and literally read the code line by line, “class node, node pointer node, ......for int i equals zero, i is less than tree dot length er length of tree that is, um i plus plus.....”
Professor yelled at him like 4 times about reading directly from slide and not saying what the code does and he would reply with, “Yes sir” and then continue to read again because there was nothing else he could do.
Ya, he got the same grade as me.
Today I had my second and final presentation. I did it on “Separate Chaining”, a hashing collision resolution. This time I said fuck writing my own code, he didn’t give two shits last time when everyone else just screenshot online example code but me so I decided I’d focus on the PowerPoint and amp it up with animations on models I made with the shapes in PowerPoint. Get 2 slides in and he goes,
Prof: Stop! Go back one slide.
Me: Uh alright, *click*
(Slide showing the 3 collision resolutions: Open Addressing, Separate Chaining, and Re-Hashing)
Prof: Aren’t you forgetting something?
Me: ....Not that I know of sir
Prof: I see Open addressing, also called Open Hashing, but where’s Closed Hashing?
Me: I believe that’s what Seperate Chaining is sir
Prof: No
Me: I’m pretty sure it is
*Class nods and agrees*
Prof: Oh never mind, I didn’t see it right
Get another 4 slides in before:
Prof: Stop! Go back one slide
Me: .......alright *click*
(Professor loses train of thought? Doesn’t mention anything about this slide)
Prof: I er....um, I don’t understand why you decided not to mention the other, er, other types of Chaining. I thought you were going to back on that slide with all the squares (model of hash table with animations moving things around to visualize inserting a value with a collision that I spent hours on) but you didn’t.
(I haven’t finished the second half of my presentation yet you fuck! What if I had it there?)
Me: I never saw anything on any other types of Chaining professor
Prof: I’m pretty sure there’s one that I think combines Open Addressing and Separate Chaining
Me: That doesn’t make sense sir. *explanation why* I did a lot of research and I never saw any other.
Prof: There are, you should have included them.
(I check after I finish. Google comes up with no other Chaining collision resolution)
He docks me 20% and gives me a B- AGAIN! Both presentation grades have feedback saying, “MrCush, I won’t go into the issues we discussed but overall not bad”.
Thanks for being so specific on a whole 20% deduction prick! Oh wait, is it because you don’t have specifics?
Bye 3.8 GPA
Is it me or does he have something against me?
-
Binary combinatory logic (BCL) is gonna change the world. Yes, it's an esolang, but it's easy to evolve populations of programs written in it. Then, when you have a winner, you can easily embed it in a C binary as a hardcoded arg to your interpreter function. Or use logic to translate it to any other programming language.
-
So I see posts about an interview question/challenge of inverting a binary tree. I don't use trees very often (mainly file related or parsing server nodes), but I thought I would learn how to do this.
I saw a page that started talking about different ways to invert, enough to understand that one type of inversion is swapping left and right nodes. So I stopped before they showed how.
Then I created a test program that has a tree structure and also can display a tree before and after modification. This was kind of fun.
So then I wrote the inversion function. It was less than 10 lines of code. Wtf? I thought it would be harder than this.
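For reference, the whole thing boils down to something like this C++ sketch (recursively swap each node's children; this is just the shape of it, not my actual test program):

#include <algorithm>
#include <iostream>

struct Node {
    int   value;
    Node* left;
    Node* right;
};

// swap the left and right children of every node
void invert(Node* node)
{
    if (!node) return;
    std::swap(node->left, node->right);
    invert(node->left);
    invert(node->right);
}

int main()
{
    Node leaf1 {1, nullptr, nullptr};
    Node leaf2 {3, nullptr, nullptr};
    Node root  {2, &leaf1, &leaf2};

    invert(&root);
    std::cout << root.left->value << " " << root.right->value << "\n"; // prints 3 1
    return 0;
}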
Then I started wondering where trees were used. So today I have been learning how they are used and why I might need one to solve a problem. One use I intuited was parsing regex or a language. Apparently it is useful there.
What I am learning is that a lot of these interview questions are really tests to see if you can comprehend instructions when stressed. Or whether you will ask questions to clarify the task. It doesn't necessarily test your ability to solve hard problems.
One thing that perplexes me: if inverting a tree is swapping nodes left<->right, then why not leave the data in place and just swap roles in the functions? Maybe I completely misunderstood what inversion means or why it would be done. I guess if this is not inverting, I have the structure to try other methods now.
-
The rear ducking continues. We've built a reliable translator in the dumbest fucking way possible, it's just lovely. I simply reused the structure for feeding data to the VM assembler, an array of arrays, where there's one array of (ins [args]) per node in the parse tree.
It's nice because nodes can be solved out of order without affecting the actual sequence in which the instructions are output. And if one statement (node) equals multiple instructions, you just push multiple entries to the corresponding array, or push nothing if you need to output nothing. Easy as goblin pie.
This is enough to convert an input language to the assembly-like intermediate representation we use for the virtual machine. So then there's doing it backwards: walk the same array of arrays, and map those virtual instructions to a physical architecture. I guess I could do the encoding to native binary myself, it'd certainly be interesting to try, but I'm burnt out already so I'll just use fasm for now.
Initial test: wrote a test program in my own stupid language, ran the translator, dumped the output to a file, assembled that with fasm, ran it with r2 -d.
Crashes? No.
Runs fine? Yes and no.
For fuck's sake, I don't have syscalls. Mainly because the VM doesn't have an operating system, lmao. I was testing virtual programs by just freezing state, terminating, then dumping the fucking registers and stack to the console, we have no I/O to speak of. Not even a real 'exit', VM handles that by reading a return value every step like a mentally damaged son of a bitch.
So anyway, I manually paste the linux mambo, you know:
mov rax,60
mov rdi,0
syscall
And NOW our program can end execution without crashing.
Okay then, so does the test code work correctly?
** DRUM ROLL **
Yes.
Ladies and gentlemen, mother fucking PESO is now a compiled language, and going forward I will be expectantly receiving your marriage proposals for reviewing. Oh, but not so fast, we still need a frontend...
Well, we'll handle that in the next few days. I'm just glad to be *nearly* finished with this fucking compiler, I want nothing to do with anything else ever, but we know that's not going to happen, so Lord please end my pain.
No sponsor as this rant has been paid for by tax evasion. -
A beginner learning Java here. I have been beating around the bush on the internet for the past decade. As per my understanding up to now: let us suppose a bottle of water. Here the bottle may be considered as a CLASS and the water in it the objects (atoms); objects may be of the same kind, and others may differ in some properties. Another way of understanding it would be: Human Being is a CLASS, and Male and Female are objects of class Human Being. Here again, in this scenario, objects may differ in properties such as gender, age, body parts. A Zoo might be a class, and animals (objects), elephants (objects), tigers (objects) and others too. The human properties above can also be added to the Zoo class: male, female, body parts, age, eating habits, crawlers, four-legged, two-legged, flying, water animals, mammals, herbivores, carnivores.. whatever. This is up to my understanding. Any corrections are always welcome. I will be happy if my answer gets modified, comment below.
And at a basic level:
Learn about input and output devices first.
Then, memory-wise: cache (quick access), RAM (runtime-access temporary memory), hard disk (permanent memory); all of these sit with the CPU in the machine. To express the above memory types clearly, as per my knowledge: right now I am writing this answer with mobile data on. If I suddenly switch off my phone during this time and switch it on again, the cache runs for instant access of navigation, network etc. RAM is temporary: my Quora answer will be lost, as it was stored in RAM before the switch-off. But my Quora app, my gallery and the others on permanent internal storage (on PCs, generally hard disks) won't be affected. This all happens around the CPU, right? Okay, now one question: who manages all these commands, inputs, outputs? That's software; it may be Windows, Mac, iOS, or Android for mobiles. These are the managers of the computer's components for the different OS's.
Java is a high-level language, whereas computers understand only binary, a low-level language made of 0's and 1's. The computer understands only 00101, 1110000101, 0010, 1100 (let these be A, B, C, D in binary); numbers are coded in 0's and 1's, and lower-case letters and other symbols are too. Our program will be converted into bytecode for the JVM (Java Virtual Machine). The program we write is given to the JVM, which acts as an interpreter. But not in C.
Let us C…
Do comment. Thank you
-
C- let's See
C is a procedural language; it follows a sequential method of solving a problem.
Example
Suppose a teacher at an institute teaches various subjects: Maths, English, Science and History.
Case 1. One student comes and asks the teacher to teach English,
the next student to teach Maths,
and the other to teach History.
Case 2. The next student comes for English.
Case 3. Another one comes for History.
So what I understood regarding C being a procedural language is:
it completes case 1 first, then case 2, and then case 3 (task after task).
Here English is taught 2 separate times
and History too 2 times separately, adding time and process complexity.
C is a platform-dependent high-level language; the built program supports only the desired platform. If I program on Windows with an i3 processor, it runs only on that same OS and processor; it won't simply run when the code is moved to other computers.
Single-threaded: if the code is interrupted in between, it stops there and doesn't allow the other parts of the code to run.
Java
In this, if the same cases above are encountered, then we tell the
computer to create a class for English and tell all the students to attend that class (time saving, no complexity, and not repetitive).
The same way, create a History class and make all students attend the class at once.
Students may be the objects created.
Multi-threaded language: if a task is interrupted, the following code cannot be stopped; it allows other parts of the code to run.
JVM - the Java Virtual Machine turns Java code into a form that can be understood by the computer, whereas C converts directly into binary code.
A class concept added to the C language becomes C++.
-
Genuine question
You're given a server with the latest Ubuntu. You can't install any deps, and you can't use docker. Your goal is to write a REST API backend that can store/retrieve data persistently, ideally with a SQL-like language. Bonus points if you can figure out a reverse-proxy.
What would you do?
I'm obsessed with the idea of having some kind of codebase that doesn't include binary files and that I can just ssh over to a fresh server, and it would work instantly.
-
I swear by all that is binary, when I see Siri I'm going to curb stomp that skank. She single-handedly makes me want to toss my iPhone X (which is heavy) into an Apple store window, then pick it up to ask a representative for assistance. I hate having to say "hey Siri!" just so she can take her sweet ass time to answer me. When I speak she's fucking deaf and misinterprets what I say 80% of the time. English is my native language. I have to talk like a robot for her to get me. Alexa, I've noticed, works faster and more accurately. Siri, take note!
-
Yet another unusual take for the Orchid STL: Unicode codepoints aren't a part of the string library.
For the purposes of a high level language, the unit of text is a grapheme. Strings can be converted between Unicode and binary blobs. In a binary, indices address bytes. In text, indices address graphemes. For example, searching a string for a substring that consists of a single letter implies the added constraint that the letter must not have accents or other modifiers.
For storage and transfer optimization it's possible to discover the byte length of a string without converting it to binary.