Search - "c is assembly"
-
I'm a self-taught 19-year-old programmer. Coding since 10, dropped out of high school and got my first job at 15.
In the early days I was extremely passionate: learning SICP and algorithms, doing Haskell, C/C++, Rust, Assembly, writing toy compilers/interpreters, tweaking Gentoo/Arch. I even got a lambda tattoo on my arm after learning lambda calculus and Church numerals.
My first job was at a company which raised $100,000 on Kickstarter. The CEO was a dumb millionaire hippie who was bored with his money, so he wanted to run a company even though he had no idea what he was doing. He used to talk about how he built our product, even though he had zero technical knowledge whatsoever. He was in the news a few times, which was pretty cringeworthy. The company had only one programmer (other than me), who was pretty decent.
We shipped the project, but soon we burned through the Kickstarter money and the sales dried up. Instead of trying to acquire customers (or abandoning the project), the boss kept looking for investors, which kept us afloat for an extra year.
Eventually the money dried up, and instead of closing the gates, the boss decreased our paychecks without our knowledge. He also converted us from full-time employees to "contractors" (also without our knowledge) so he wouldn't have to pay taxes for us. My paycheck decreased by 40%, but I still stayed.
One day, I was trying to burn a USB drive, and I did "dd of=/dev/sda" instead of sdb, thereby wiping out our development server. They asked me to stay at the company, but I turned in my resignation letter the next day (my highest ever post on reddit was in /r/TIFU).
Next, I found a job at a "finance" company. $50k/year as an 18-year-old. The CEO was a good-looking smooth-talker who had made a few million bucks talking old people into giving him their retirement money.
He claimed he had changed his ways and was now trying to help average folks save money. I've been here eight months so far, and I do not see that happening. He forces me to do sketchy shit that clearly doesn't have clients' best interests in mind.
I am the only developer, and I quickly became a back-end and front-end ninja.
I switched the company infrastructure from a shitty drag-and-drop website builder, WordPress, and shitty Excel macros to a beautiful custom-written Python back-end.
Little did I know, this company doesn't need a real programmer. I don't have clear requirements, I get unrealistic deadlines, and the boss is too busy to even communicate what he wants from me.
Eventually I sold my soul. I switched parts of it back to WordPress, because I was not given enough time to write custom code properly.
For the latest project, I switched from custom React/Material/Sass to drag-and-drop Typeforms for surveys.
I used to be an extremist FLOSS Richard Stallman fanboy, but eventually I traded my morals, dreams and ideals for a paycheck. Hey, $50k is not bad, so maybe I shouldn't be complaining? :(
I was addicted to pot for 2 years. Recently I got arrested, and it is honestly one of the best things that ever happened to me. Before I got arrested, I did some freelancing for a mugshot website. In unrelated news, my mugshot disappeared.
I have been sober for two months now, and my brain is finally coming back.
I know the average developer hits a wall at around $80k, and then you have to either move into management or have your own business.
After getting sober, I realized that money isn't going to make me happy, and I don't want to manage people. I'm an old-school neck-beard hacker. My true passion is mathematics and physics. I don't want to glue bullshit libraries together.
I want to write real code, trace kernel bugs, optimize compilers. Alas, I was born in the wrong generation.
I've started studying real analysis, brushing up on differential equations, and am now trying to tackle machine learning and neural networks, and to understand the juicy math behind gradient descent.
I don't know what my plan is for the future, but I'll figure it out as long as I have my brain. Maybe I will continue making shitty forms and collect paycheck, while studying mathematics. Maybe I will figure out something else.
But I can't just let my brain rot while chasing money and impressing dumb bosses. If I wait until I get rich to do things I love, my brain will be too far gone at that point. I can't just sell myself out. I'm coming back to my roots.
I still feel like, after experiencing the industry and pot, I'm a shittier developer than I was at age 15. But my passion is slowly coming back.
Any suggestions from wise ol' neckbeards on how to proceed?
-
Anyone looking for something interesting to do???
Step 1) understand how basic circuitry works on a breadboard, nothing too fancy (implement NAND, AND, an adder, a subtractor - a software sketch of the gate-building idea follows this list)
Step 2) learn about microprocessors and how an OS works
Step 3) learn assembly
Step 4) write a basic assembler and understand how loaders and linkers work!
Step 5) write a kernel with very basic features like memory management and process management and some drivers for IO
Step 6) write an emulator for some simple system, e.g. CHIP-8
Step 7) read about compiler theory and automata
Step 8) write a basic Python compiler (not an interpreter) that compiles to native assembly
Step 9) implement a TCP stack
Step 10) learn as much as you can about complexity measurement, data structures and algorithms, using C or C++ - it's very important (familiarity with pointers and thus computer memory)
Step 11) learn any high-level language of choice, like Python or Ruby
Step 12) stop debating tabs vs spaces, Emacs vs Vim, Angular vs Vue, PHP vs Python, OOP vs procedural vs functional (just know about all of them and when to use each, but don't fucking debate over which one is superior)
Step 13) live happily and be healthy.
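As a taste of step 1 in software form: a minimal C sketch (all the helper names are my own, purely for illustration) that derives NOT/AND/OR/XOR from NAND alone and wires them into a one-bit full adder:

#include <stdio.h>

/* every gate below is built from NAND alone, mirroring the breadboard exercise */
static int nand_g(int a, int b) { return !(a && b); }

static int not_g(int a)        { return nand_g(a, a); }
static int and_g(int a, int b) { return not_g(nand_g(a, b)); }
static int or_g(int a, int b)  { return nand_g(not_g(a), not_g(b)); }
static int xor_g(int a, int b) { return and_g(or_g(a, b), nand_g(a, b)); }

/* a one-bit full adder wired purely out of the gates above */
static void full_adder(int a, int b, int cin, int *sum, int *cout) {
    *sum  = xor_g(xor_g(a, b), cin);
    *cout = or_g(and_g(a, b), and_g(cin, xor_g(a, b)));
}

int main(void) {
    /* print the full truth table: 8 input combinations */
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            for (int c = 0; c <= 1; c++) {
                int s, co;
                full_adder(a, b, c, &s, &co);
                printf("%d + %d + %d  ->  carry %d, sum %d\n", a, b, c, co, s);
            }
    return 0;
}

Chain the carry from one adder into the next and you have the multi-bit adder; the subtractor is the same circuit with one input inverted and the carry-in set to 1.

-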
Assembly: He's the nerd. He speaks very quickly and uses short sentences. Very few people talk to him. He's considered autistic by a majority of the class because he finishes the exams insanely quickly, and he has a lot of difficulty speaking with others. He's still at school but already dressed like an engineer.
Ada: She's a four-eyed nerd. When she gets the answer, she doesn't make any mistakes. Ada often corrects the teacher when she writes a slightly ambiguous line. She's building a rocketship in her backyard and she's always talking about this weird hobby.
Python: He's Mr Popular. He likes skating and brags about all the parties he's invited to. He's good at all the subjects taught in class, but he'll do them a bit slower than the others. Everyone loves him because he explains things so well; sometimes the teacher herself asks Python to explain some part of the course. He's dressed in a hoodie and baggy pants, with glasses on top of his head ;)
Java: She is one of the toppers of the class and very popular. She’s very good in all the topics. The teacher loves her but she’s a very talkative person.
Scala/Kotlin: They are twin sisters and the best friends of Java. Unfortunately, they are not as popular, and it's often Java who takes the lead in the group. It's very difficult to distinguish one from the other. Both are far less talkative than Java, but Scala speaks a bit differently than Kotlin and Java.
C: He’s the topper of the class. He’s so fast in completing the exams that the teacher really thinks he’s copying Assembly’s work. He has a little brother C++ and they share a lot in common together. He’s the chess major and often plays chess with Assembly and his big brother.
Go: He's the new kid on the block. He doesn't like C++ and his friends, and he wants to prove he can do better than them. Of course, he prefers playing Go over chess.
APL: He's a lonely guy. No one understands him when he speaks. Even the teacher is surprised when APL shows a correct answer after several lines of incomprehensible pictograms. People think that he was born in a foreign country... or on a foreign planet?
HTML/CSS: These twin brothers are very different. One is dressed in black and white, and the other is dressed in everything except black and white. HTML is very talkative and annoying, while CSS is very artistic. CSS is the best student in Art lessons, and HTML performs well in written expression.
LaTeX: She's a friend of HTML. The teacher likes her because she has a gift for writing. LaTeX likes the mathematics courses because she can draw fancy Greek letters. The teacher knows this well, and she is often asked to write a formula on the blackboard.
VBA: He's in the back, looking through the windows. Not really interested in the courses taught in class. In the exams, he always answers with a table.
C#: He’s in the back playing yet another game on his smartphone. He likes being next to the windows also.
JavaScript: People often mix up Java and JavaScript because they have similar names. But they are definitely not the same. JavaScript spends a lot of time with HTML and CSS. He's as artistic as CSS, but he prefers things that move. He likes action and movies. CSS dreams of being a painter, whereas JavaScript wants to be a film-maker.
Haskell: He’s a goth. Dressed up in dark. Doesn’t talk to anyone. He doesn’t understand why others write pages when he can write a couple of lines to answer the same question.
Julia: She’s the newest student here. She doesn’t have any friends yet but her secret aim is to be as popular as Python and as fast as C.
Credit: Thomas Jalabert
-
A is for Assembly, a wizard's spell
B is for Bootstrap, so bland and the same. And also for Brainf*ck, will blow you away
C is for COBOL, your grandad knows that
D is for daemon, your server knows what
E is for Express.js, you node what is coming
F is for FORTRAN, which is perfect for sciencing
G is for GNU which is GNU not UNIX
H is for Haskell using functional units
I is for Instance, an action of an Object
J is for Java, which plays with them always
K is for Kotlin, Android's new toy
L is for Lisp, scheming a ploy
M is for Matlab, who knows how it works
N is for Node, a bloatware of code
O is for Object Pascal, you did not expect that
P is for programming, we all love to do that
Q is for Queries, A database is made
R is for R, statistics are great
S is for Selenium, you have to test that
S is for Smalltalk, let's make it all brief
T is for Turing Test, how human is this?
U is for Unix, build with all talents
V is for Visual Studio, built with all laments
W is for Web, lets build something cool
X is for XHTML, remember all that?
Y is for Y2K, I'm tired as f*ck
Z is for Zip, let's zip it all now.
Get yourself coffee and get back to the grind.
-
Look at this! I finally succeeded in running C code on my shitty TI-83 calculator with a Z80 CPU. This is a huge improvement, as I previously made whole games for that thing in assembly! 😁😍
-
Still trying to get good.
The requirements are forever shifting, and so do the applied paradigms.
I think the first layer is learning about each paradigm.
You learn 5-10 languages/technologies, get a feeling for procedural/functional/OOP programming. You mess around with some electronics engineering, write a bit of assembly. You write an ugly GTK program, an Android todo app, check how OpenGL works. You learn about relational models, about graph databases, time series storage and key value caches. You learn about networking and protocols. You void the warranty of all the devices in your house at some point. You develop preferences for languages and systems. For certain periods of time, you even become an insufferable fanboy who claims that all databases should be replaced by MongoDB, or all applications should be written in C# -- no exceptions in your mind are possible, because you found the Perfect Thing. Temporarily.
Eventually, you get to the second layer: Instead of being a champion for a single cause, you start to see patterns of applicability.
You might have grown to prefer serverless microservice architectures driven by pub/sub event busses, but realize that some MVC framework is probably more suitable for a 5-employee company. You realize that development is not just about picking the best language and best architecture -- It's about pros and cons for every situation. You start to value consistency over hard rules. You realize that even respected books about computer science can sometimes contain lies -- or represent solutions which are only applicable to "spherical cows in a vacuum".
Then you get to the third layer, which is about orchestrating migrations between paradigms without creating a bigger mess.
Your company started with a tiny MVC webshop written in PHP. There are now 300 employees and a few million lines of code, the framework more often gets in the way than it helps, the database is terribly strained. Big rewrite? Gradual refactor? Introduce new languages within the company or stick with what people know? Educate people about paradigms which might be more suitable, but which will feel unfamiliar? What leads to a better product, someone who is experienced with PHP, or someone just learning to use Typescript?
All that theoretical knowledge about superior paradigms won't help you now -- No clean slates! You have to build a skyscraper city to replace a swamp village while keeping the economy running, together with builders who have no clue what concrete even looks like. You might think "I'll throw my superior engineering against this, no harm done if it doesn't stick", but 9 out of 10 times that will just end in a mix of concrete rubble, corpses and mud.
I think I'm somewhere between 2 and 3.
I think I have most of the important knowledge about a wide array of languages, technologies and architectures.
I think I know how to come to a conclusion about what to use in which scenario -- most of the time.
But dealing with a giant legacy mess, transforming things into something better, without creating an ugly amalgamation of old and new systems blended together into an even bigger abomination? Nah, I don't think I'm fully there yet.
-
In my second year of high school, we started having computer science classes. I was really looking forward to it 'cause I always wanted to learn programming.
At first sight it appeared that the professor who taught the class knew something - he looked like a genuine geek with those dorky glasses, briefcase and pants like Steve Urkel - but after a couple of his lessons you could see he had no real dev experience, just a basic theoretical understanding of programming. He was more reading stuff from the book than trying to explain it to students and give some real-world examples.
So it was just one of those days: everybody got back from vacation, it's hot outside, the guy is just reading sentences from his book, half the students are talking with each other and the other half doesn't give a fuck about him or his class. Pretty sure I was the only one trying to listen to him and learn something from his recitals.
All of a sudden he notices the atmosphere in the classroom, slams the book shut, gives out a couple of Fs to the loudest students and yells out loud: "NONE OF YOU IN THIS ROOM WILL EVER ACCOMPLISH ANYTHING IN YOUR LIFE, LET ALONE IN PROGRAMMING."
At first I felt like shit, but soon after that I started thinking, "Who the hell are you to tell me what I could or will accomplish in my life?" A couple of weeks later I bought myself my first book on programming and started learning C++ late at night, since I understood that I wouldn't learn anything about programming in that school. Two years later I was correcting this same professor's claims on a whiteboard in front of the whole class.
Today, seven years after his words, I'm a developer living in a foreign country with what I could call solid experience and an understanding of how both software and the web are built, while that same professor still recites to his pupils the difference between assembly and object code, while praying nobody asks him where and how these are used. For maybe a quarter of my paycheck. So much for his psychic powers.
-
Coded in C for the first time (due to college assignments)...
Just found out that there are no strings in C 😐
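(For anyone else coming from a higher-level language: what C gives you instead is a null-terminated char array plus the string.h routines. A minimal sketch, with an arbitrary buffer size:)

#include <stdio.h>
#include <string.h>

int main(void) {
    /* a C "string" is just a char array terminated by a '\0' byte */
    char name[16] = "devRant";            /* the compiler appends the '\0' */

    printf("%zu chars\n", strlen(name));  /* walks to the '\0': prints 7 */

    strcat(name, "!");                    /* no bounds check: you must ensure it fits */
    printf("%s\n", name);                 /* prints: devRant! */

    /* == would compare pointers, not contents; strcmp compares the characters */
    if (strcmp(name, "devRant!") == 0)
        puts("equal");
    return 0;
}

-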
I learnt programming by making cheats for games and reverse engineering them. It was a fun experience; it wasn't always easy to start with C++ and assembly, but it was definitely worth it. When you come from a low-level language such as C++, looking at highly abstract languages such as JavaScript makes everything feel wrong, especially when it comes to types and how you can just switch types in the middle of the code :D. But it also gives you an understanding of how JavaScript could be implemented - what the engine is doing in the background when you create an object, etc.
-
!rant
A rather long (8 hrs long, to be precise) story.
So I just finished an amazing homework assignment. The goal was to open a new shell on Linux using a C program. We were asked to follow the instructions from http://phrack.org/issues/49/14.html . However, the instructions given were for 32-bit processors and we had to do the same for 64-bit machines. In a nutshell, we had to write 64-bit shellcode and use the buffer-overflow technique to change the return address of a function to point at our shellcode.
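(For context, the whole article revolves around a victim function shaped roughly like this. A generic sketch of the pattern, not the actual assignment code; on a modern toolchain you would typically compile with -fno-stack-protector and disable ASLR to reproduce it, as such courses usually instruct:)

#include <string.h>

/* the textbook victim: a fixed-size stack buffer filled with no bounds
   check, so a long enough input overwrites the saved return address */
void vulnerable(const char *input) {
    char buf[64];
    strcpy(buf, input);            /* the overflow happens here */
}

int main(int argc, char **argv) {
    if (argc > 1)
        vulnerable(argv[1]);       /* more than 64 bytes of argv[1] smashes the stack */
    return 0;
}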
I was able to write my own shellcode within 1 hr and confirmed it was working by compiling it with nasm and all. Also, the "show-off dev" inside me told me to execute "/bin/bash" instead of "/bin/sh" (which everyone else was going to do). After my assembly code was properly executing the shellcode, I was excited to put it in my C code.
For that, I needed the opcodes of the assembly code in a string. Following the "show-off dev" inside me again, I wrote a shell script which would extract the exact opcodes out of the objdump output. After this I put it in my C code, called my friend and told him, "Hell yeah bro, I did it. Pretty sure sir is gonna give me full marks, etc. etc. etc." I compiled the code and BOOM, IT SEGFAULTS RIGHT IN FRONT OF MY FRIEND. Worse, my friend had copied a "/bin/sh" shellcode from shell-storm and already had it working.
It really burned my ego. I sat in front of my laptop for 8 hrs straight and didn't talk to anyone, continuously debugging the code. Just a few minutes ago, I noticed that the shellcode I was actually putting in my C code was 2 bytes shorter than the actual code length. WHAT THE F. I ran objdump manually and copied the opcodes one by one into the string (like a noob) and VOILA! IT WORKED!!!
TURNS OUT I DIDN'T CUT THE LAST COLUMN OF OPCODES IN MY SHELL SCRIPT. I FIXED THAT AND IT WORKED!!
THE SINGLE SHITTY NUMBER MADE ME STRUGGLE FOR 8 HRS OF MY LIFE!! SMH
Lessons learnt:
1) Never have an ego so big it makes you think you're perfect, cuz you're not.
2) Examine your scripts properly before using them.
3) Never, I repeat NEVER, brag about your code before compiling and testing it.
That's it!
If you've read this long story, you might as well press the "++" button.
Tl;Dr - It started as an escape, carried on as fun, then as a way to be lazy, and finally as a way of life. Coding has defined and shaped my entire life from the age of nine.
When I was nine I was playing a game on my ZX Spectrum and accidentally knocked the keyboard as I reached over to adjust my TV. Incredibly, parts of it actually made a little sense to me and caught my curiosity. I spent hours reading through that code, afraid to turn the Spectrum off in case I couldn't get back to it. Weeks later I got hold of a book of example code to copy out to do various things, like making patterns on the screen. I was amazed by it. You told it what to do, and it did it! (Don't you miss the days when coding worked like that?) I was bitten by the coding bug (excuse the pun) and I'd got it bad! I spent many late nights on that thing, escaping from a difficult home life. People (especially adults) were confusing, and in my experience unpredictable. When you did things wrong they shouted at you and threatened to take you away, or ignored you completely. Code never did that. If you did something wrong, it quietly let you know and often told you exactly what was wrong. It wasn't because of shifting expectations or a change of mood or anything like that. It was just clean logic, simple cause and effect.
I got my first computer a year later: an IBM XT that had been discarded by a company and was fitted with a key on the side to turn it on. With the impressive noise it made, it really was like starting an engine. While most kids would have played with the games, I spent my time playing with batch scripts and writing very simple text adventures. And discovering what "format c:" does. With some abuse and threatened violence I managed to get Windows running on it. Windows 2.1, I think it was.
At 12 I got a Gateway 75 running Windows 95. Over the next few years I discovered many amazing games: ROTT, Doom, Hexen, and so on. Aside from the games themselves, I was fascinated by the way computers could be linked together to play together (this was still early days for the Web, and computers networked in a home were very unusual). I also got into making levels for Doom, Heretic, and years later Duke Nukem 3D (pretty sure it was Heretic; all I remember is the nightmare of trying to write levels entirely by code!). I enjoyed re-scripting some of the weapons and monsters to behave differently. About this time I also got into HTML (I still call this coding, but not programming), C, and Java. I had trouble with C as none of the example and tutorial code seemed to run properly under a Windows environment. Similar for my very short stint with assembly. At some point I got a TI-83 programmable calculator and started rewriting my old batch script games on it, including one "Gangster Lord" game that had the same mechanics as a lot of the Facebook games that appeared later (do things, earn money, spend money to buy stuff to do more things). Worried about upcoming exams, I also made a number of maths helper apps, including a quadratic equation solver that gave the steps, and a fake calculator reset to smuggle them into my exams. When the day came I panicked and did a proper reset for fear of being caught.
At 18 I was convinced I was going to be a professional coder as I started a degree in Computer Science. Three months later I dropped out, after a bunch of lectures teaching what input and output devices were, and after realising we were only going to be taught Java and no C++. I started a job in the call centre of a big company, but was frustrated with many of the boring and repetitive tasks we had to do. So I put my previous knowledge to use and quickly learned VBA to automate tasks. It wasn't long before I was promoted to Business Analyst, where I worked on a great team building small systems in Office, SAS, and a few other tools.
I decided to retrain in psychology, so I left the job I was in and started another degree. During my work and placements my skills came in useful a number of times to simplify and automate tasks. I finished my degree, then took a job as a teaching assistant while I worked out what I wanted to do next and how to pay for it. Three years later I've ended up IT technician at the school, responsible for the website, teaching a number of Computing lessons each week, and unofficial co-coordinator for Computing as a subject. I also run a team of ten-year-old Digital Leaders who I am training in online safety and as technical experts; I am hoping to inspire them to a future in coding. In September I'll be starting teacher training with a view to becoming a Computing specialist teacher. Oh, and I'm currently doing a course in Android development in my free time.
And this all started with an accidental knock on the keyboard of a ZX Spectrum.
Telling an Arch beginner to try Gentoo is the same as telling someone who just switched from Java to C to try assembly.
-
I've been writing a complex mutation engine that dynamically modifies compiled C++ code. Now, there's a lot of assembly involved, but I got it to work. I finished writing the last unit test before it was time to port it all to Windows. I switched to a release build, ready to bask in the glory of it all. FUCKING GCC OPTIMIZATIONS BROKE EVERYTHING. I had been doing all my dev in debug mode, and now some obscure optimization GCC does in release mode is causing a segfault... somewhere. Just when I thought I was done 😅
-
Haha kids, you're all dead wrong. Here's my story.
There is a thing called “emergence”. This is a fundamental property of our universe. It works 100% of the time. It can't be stopped, it can't be mitigated. Everything you see around you is an emergent phenomenon.
Emergence is triggered when a lot of similar things come together and interact. One water molecule cannot be dry or wet, but if you have many, after a certain number the new property emerges — wetness. The system becomes _wet_.
Professionalism is an emergent phenomenon too, and its water molecules are abstract knowledge. Learn tech things you're interested in, complete random tutorials, code, and after a certain amount of knowledge molecules is gained, something clicks inside your head, and you become a professional.
Unfortunately, there are no shortcuts here. Uni education can make you a professional seemingly quicker, but it's not because uni knowledge is special, it's because uni is a perfect environment to absorb a lot of knowledge in a short period of time.
It happened to me too. I started coding in Pascal in fifth grade, and kept at it till sixth. Then, seventh to ninth grade were spent in my uni's after-school program. After ninth grade, I dropped out of high school to get into this uni's experimental program. First year of uni, and we're making a CPU. Second year, and we're doing hard math, C and assembly.
And finally, in the third year, it happened. I was sitting there in the classroom, it was late, and I was writing a recursive sudoku solver in Python. And I _felt_ the click. You cannot mistake it for anything else. It clicks, and you're a changed person. Immediately, I realized I could write anything. Needless to say, I was passing everything related to code afterwards with flying colours.
From that point, everything I did was just gaining more and more experience. Nothing changed fundamentally.
Emergence is forever. If you learn constantly, even without a concrete defined path, I can guarantee you that you _will_ become a professional. This is backed by the universe itself. You cannot avoid becoming one if you're actively accumulating emergence points.
Here's the list of projects I made in the past 11 years: https://notion.so/uyouthe/...
I'm 24.
Fuck... coming from a Python background learning low level stuff is hard as shit.. gonna need to learn C/C++ and some Assembly real soon!
Gotta say though, understanding that stuff makes everything make more sense all of a sudden hahah
I am a firmware developer with 4 years of experience. C, and sometimes assembly, is my bread and butter.
Like 2 years ago, I got really interested in making a switch to application development. I got referred by my friend to her startup.
But I was a bit rusty with my data structures, high level languages and interpersonal skills.
The first question was to find the number of occurrences of each word in a paragraph. The language of choice was Java, but I was allowed to use C++ since it was the closest relative to Java that I knew.
And I started implementing a binary search tree from scratch, began inserting each tokenised word into it, and wrote a traversal algorithm.
The interviewer, luckily, was a patient guy. After I completed my whole mess, he asked whether it was possible to do this in a slightly better way, with constant-time access and no traversal.
I said yes, we can with a hash table, but I don't know how to implement one. He replied that he didn't expect me to implement the hash table, just to see me use it. I asked him if I was allowed to use the standard library, to which he said of course.
*facepalm*
Finally I understood his expectation, consulted cppreference.com, and used an unordered_map.
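(The constant-time idea he was after, sketched here in plain C for the curious. C has no standard hash table, so this hand-rolls a tiny chained one; in the interview it was simply std::unordered_map doing the same job. The bucket count, hash function and sample text are all arbitrary choices:)

#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define BUCKETS 1024

typedef struct Node {
    char *word;
    int count;
    struct Node *next;   /* collisions chain onto a per-bucket list */
} Node;

static Node *table[BUCKETS];

/* djb2 string hash, reduced to a bucket index */
static unsigned long hash(const char *s) {
    unsigned long h = 5381;
    while (*s) h = h * 33 + (unsigned char)*s++;
    return h % BUCKETS;
}

static void add_word(const char *word) {
    unsigned long h = hash(word);
    for (Node *n = table[h]; n; n = n->next)
        if (strcmp(n->word, word) == 0) { n->count++; return; }

    Node *n = malloc(sizeof *n);     /* first occurrence: new entry */
    n->word = strdup(word);
    n->count = 1;
    n->next = table[h];
    table[h] = n;
}

int main(void) {
    const char *text = "the quick brown fox jumps over the lazy dog the end";
    char buf[64];
    int len = 0;

    for (const char *p = text; ; p++) {          /* crude tokenizer */
        if (isalpha((unsigned char)*p) && len < 63) {
            buf[len++] = (char)tolower((unsigned char)*p);
        } else if (len > 0) {
            buf[len] = '\0';
            add_word(buf);
            len = 0;
        }
        if (*p == '\0') break;
    }

    for (int b = 0; b < BUCKETS; b++)
        for (Node *n = table[b]; n; n = n->next)
            printf("%s: %d\n", n->word, n->count);
    return 0;
}

Each lookup hashes straight to a bucket, so it is O(1) on average instead of the O(log n) walk of the search tree.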
Later there were some questions on databases, which I tried my best to answer. And I frankly replied that I was not comfortable with JS frameworks as of then. Got rejected.
So the mistakes: I never asked basic questions like what time complexity was expected or whether I was allowed to use the standard library, I didn't spend extra time studying the stuff needed for the domain switch, and most importantly, I panicked.
Tl;dr: it's a long introduction.
Hi Ranters,
I've been on this app for quite a while now, as a shy cat watching from a distance and reading all kinds of rants. Anywho, I feel comfortable enough to crawl out of my shell and introduce myself. Since I feel you guys have together made such a pleasant and safe community, I'm really happy to be a part of it!
Anyway, I'm Sam, 24 years old, from the Netherlands. My favorite color is green. Mostly the green you can find in nature. The one that calms you down :). I'm a very introverted person, but always very curious and eager to learn new things.
I started to program when I was 12, doing assembly and C++, because I liked making cheats for online games. Later I learned C#, Java and Python, mostly using them for web stuff, scraping, services, etc. But also chatbots (for Skype, for example).
Currently I'm 2 years in as a data scientist, mostly working in Python.
But on the side, as a hobby and with some ambition, I have a basic understanding of full-stack development.
Mostly Node.js, Express, Mongo, and frontend with no frameworks.
(I will later ask you guys some more questions about that! I could really use some advice!)
Anyway, enough about me! Tell a bit about yourselves! Happy to get to know you all a little better!
One day, browsing the internet, I found a website that was hiring web developers. I was curious, so I decided to check the requirements.
Job: To manage this website
Skills Required
6+ years experience of:
HTML
CSS
JavaScript
Node.js
Vue.js
TypeScript
Java
PHP
Python
Ruby
Ruby on Rails
ASP.NET
Perl
C
C++
Advanced C++
C#
Assembly
RUST
R
Django
Bash
SQL
Built at least 17 stand-alone desktop apps without any dependencies, in pure C++
Built at least 7 websites alone.
3+ years Hacking experience
Built 5 stand-alone mobile apps with Java, Dart and Flutter
7800+ reputation on Stack Overflow
Answered at least 560 questions on Stack Overflow
Have at least 300 repositories on GitHub, GitLab, Bitbucket.
Written 1000+ lines of code on each single repository.
Salary: $600 per month.
If he had learnt all these languages one by one starting at age 0, he would be 138 now!
From NAND to Tetris...
This book is, IMO, the best book for those who want to venture into lower-level programming.
This book retrains your thinking and teaches you from the bottom up, not the typical top-down approach.
You begin with the idea of Boolean algebra and then move on to logic gates. From there you build, in an HDL, everything you will use later.
Essentially you build your own "virtual machine": you design the instruction set, then write assembly against that instruction set to drive the gates you built in the HDL.
THEN you continue up the abstraction layers: you learn how a compiler works, and begin writing high-level code that is compiled down to the assembly of your instruction set, to be linked and run on the virtual machine you built.
All the compiler and other tools are available on the book's website. This is not a book where you copy and paste, run, and are done... you kinda have to take the concepts and apply them.
Then, once you master this book, take it a step further: learn more about compilers and write your own compiler with the Dragon Book or something.
Fantastic book, great philosophy of teaching software: ground up rather than top down. Love it! It's a unique book.
Languages like Python and R are somewhat high-level languages, with an easy syntax and very readable code. This is useful essentially to make them easier for non-programmers to use. As a software developer with 4+ years of professional programming (I started with Assembly and QuickBASIC, moved to C++ and Java, then C#), I found Python super convenient, and at times way too convenient.
At first it felt like I was cheating, and I wouldn't consider myself to be actually writing code - more like pseudo-coding.
After a year or so, I got used to it and it became my default, but it still doesn't feel right... is anyone else feeling the same?
I do believe that coding the hard way is not always the right way, but I am just wired that way.
Don't tell me I'm the only one who has searched for a direct option to use assembly in Python, like there is in C.
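(For reference, the C side of that comparison: a minimal sketch of GCC/Clang extended inline assembly on x86-64. The instruction and the values are arbitrary:)

#include <stdio.h>

int main(void) {
    int a = 6, b = 7;

    /* GCC extended inline assembly, x86-64 AT&T syntax: a *= b */
    __asm__("imull %1, %0"
            : "+r"(a)      /* output: a, read-write, kept in a register */
            : "r"(b));     /* input:  b, in a register */

    printf("%d\n", a);     /* prints 42 */
    return 0;
}

Python has nothing like this baked into the language; the closest you get is compiling the assembly separately and calling into it through ctypes or cffi.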
-
I just installed Opera Mini on my PSP. That isn't very exciting on its own, although I am stoked that my website does in fact render on a device from 2009 - with the helpful guidance of a laptop from 2004 that's doing hotspot duties for this thing.
No, what really got me stoked is that Opera still supports these old platforms, and how small they managed to make it. The .jar file for Opera Mini 4.5 is ~800kB large. There's a .jad file as well but it's negligible in size and seems to be a signature of sorts.
Let that sink in for a moment. This entire web browser is 800kB. Firefox meanwhile consistently consumes 800 MEGABYTES.. in MEMORY. So then, I went to think for a moment, how on earth did they manage to cram an entire functioning web browser in 800kB? Hell, what makes up a web browser anyway?
The answer I arrived at is as follows. You need an engine to render the web page you receive. You need a UI to make the browser look nice. And finally, you need a certificate store to know which TLS certificates to trust. And while probably difficult to make, I think it should be possible to do in 800kB. Seriously, think about it. How would you go about *making* a web browser? Because I've already done that in the past.
Earlier I heard that you need graphics, audio, wasm, yada yada backends too... no. Give your head a shake. Graphics are the responsibility of the graphics driver. A web browser shouldn't dabble with those at all. Audio, you connect to PulseAudio (on Linux at least) and you're done. Hell, I don't even care about ALSA or OSS here. And WebAssembly... God, I could rant about that shit all day. How about making it a native application? Not like actual assembly, which is used for BIOS and low-level drivers. And we already have a better language for the more portable stuff, called C.
Seriously, think about it. Opera - a reputable browser vendor - managed to do it in 800kB on a 12-year-old device. Don't go full wank about your framework shit in the comments. And don't you fucking dare tell me that there's more to it. They did it, for crying out loud. Now go take a look at your shitpile of JS code and refactor that shit already. Thank you.
Been learning C++, chapter 3.
Fuck whoever says that Java is easier than C++. If I populate memory I can just fucking clean it out. I have access to the literal fucking stack my code runs on. I can integrate assembly. I can fucking make my program run in -3 seconds, for fuck's sake. This is so much easier than Java, where I have to fuck around with scopes to nudge the GC to start cleaning up, and I need a separate engine to access JASM code.
Tried to reply to @Fast-Nop who had replied to someone wondering if C would be a good first language.
IMHO C should have been put to sleep ages ago. A few years ago I downloaded the latest, greatest C standard. For a language billed (by many) as small and simple, it was over 800 pages long. Still, there's a lot that's unspecified, like the order of evaluation of function arguments. The size of int etc. is implementation-dependent. And error handling... let's not go there. The macro assembler throws away all the semantics, leaving behind a cryptic value. It's a complex language due to the innumerable interactions possible.
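(A concrete instance of that unspecified argument-evaluation order, as a minimal sketch; both possible outputs are standard-conforming:)

#include <stdio.h>

static int next(void) {
    static int n = 0;
    return ++n;
}

int main(void) {
    /* the standard leaves the order of these two calls unspecified,
       so "1 2" and "2 1" are both correct outputs */
    printf("%d %d\n", next(), next());
    return 0;
}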
C has been called an assembly language for the PDP-11 minicomputer. I recently learned that even the VAX-11 was built from SSI chips like the 4-bit 74181 ALU. The VAX!
Anyway, I had several excellent books on programming style written by Henry Ledgard. He despaired of making C look readable. I commend his books, which are so old that the code is in UPPERCASE. A lot of what he wrote had to do with program design, naming things, writing good comments, and how the visual shape of a program assists mental clarity.
Ye, so after studying for an eternity and doing some odd jobs here and there, all I can show for it are the following traits:
* Super knowledgeable in ARM/Intel assembly language
* C veteran with knowledge of some sick and nasty C hacks/tricks that would sour even your grandma's mood
* Acquired disdain for any and all scripting languages (how dare you write in one line something I need a whole library for!)
* All-in-all low-level programmer type of guy (gimme those juicy registers to write into!)
After completing the mandatory part of my computer science studies, all I did was immerse myself in low-level stuff. I even started giving lectures and all.
Now I'm on the cusp of being let loose on the open market.
The thing is: I'm pretty sure that no company is really interested in my knowledge, as no one really writes assembly anymore.
Sure, embedded programming is still a thing, but even that is becoming increasingly more abstract, with God knows how many layers of software between the hardware and the dev, just to hide all the scary bits underneath.
So, are there people in here who're actually exposed to assembly or any hands-on hardware-programming?
Like, in a "which bit in which register/address do I need to set" kind of way.
And if so, what would you say someone like me should look out for in a company, to match my interests to theirs?
Or is it just a pipe dream, so I'd need to brace myself for a mundane software engineering career where I process one ticket at a time?
(Just for reference: even the most hardware-inclined companies I found "near" me are developing UIs with HTML5 to be used in some such environment...)
I just finished writing an Integer Java Virtual Machine in C.
Being able to write an echo server in IJVM assembly, connect to it through netcat and see it run on my machine is legitimately one of the most satisfying moments I've had so far in programming.
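(For a rough idea of what that looks like, here is the same kind of echo server sketched in plain C with BSD sockets; the rant's version ran as IJVM assembly on the hand-written VM. The port number is arbitrary and error handling is omitted for brevity:)

#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void) {
    int server = socket(AF_INET, SOCK_STREAM, 0);

    struct sockaddr_in addr = {0};
    addr.sin_family      = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port        = htons(7777);        /* arbitrary port */

    bind(server, (struct sockaddr *)&addr, sizeof addr);
    listen(server, 1);

    int client = accept(server, 0, 0);         /* connect with: nc localhost 7777 */

    char buf[512];
    ssize_t n;
    while ((n = read(client, buf, sizeof buf)) > 0)
        write(client, buf, n);                 /* echo the bytes straight back */

    close(client);
    close(server);
    return 0;
}

-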
Is your code green?
I've been thinking a lot about this for the past year. There was recently an article on this on Slashdot.
I like optimising things to a reasonable degree and avoiding bloat. What are some signs of code that isn't green?
* Use of technology that says it's fast, without real expert review and measurement. Lots of tech out there claims to be fast but actually isn't, or achieves it by saturating resources while being inefficient.
* It uses caching. Many might find that counter-intuitive. In technology it is surprisingly common to see people scale or cache rather than directly fixing the thing that's watt-expensive, which is compounded when the cache has weak coverage.
* It uses scaling. Originally, scaling was a last resort. The reason is simple: it introduces excessive complexity. Today it's common to see people scale things rather than make them efficient. You end up needing ten instances when a bit of skill could bring you down to one, which could scale as well but likely won't need to.
* It uses a non-trivial framework. Frameworks are rarely fast. Most will fall in the range of ten to a thousand times slower in terms of CPU usage. Memory bloat may also force the need for more instances. Frameworks written in already slow high-level languages may be especially bad.
* Lacks optimisations for obvious bottlenecks.
* It runs slowly.
* It lacks even basic resource usage measurement.
Unfortunately smells are not enough on their own but are a start. Real measurement and expert review is always the only way to get an idea of if your code is reasonably green.
I find it not uncommon to see things require tens, hundreds, or thousands of times the resources needed, if not more.
In terms of cycles that can be the difference between needing a single core and a thousand cores.
This is common in the industry but it's not because people didn't write everything in assembly. It's usually leaning toward the extreme opposite.
Optimisations are often easy and don't require writing code in binary. In fact the resulting code is often simpler. Excess complexity and inefficient code tend to go hand in hand. Sometimes a code cleaning service is all you need to enhance your green.
I once rewrote, from an interpreted language into C, a data parsing library that had to parse a hundred MB and was a performance hotspot. I measured it, and the results were good. It had been optimised as much as possible in the interpreted version, but it was still a minimum of 50 times faster in C.
I recently stumbled upon someone's attempt to do the same and I was able to optimise the interpreted version in five minutes to be twice as fast as the C++ version.
I see opportunities to optimise everywhere in software. A billion kg of CO2 could be saved easily if a few green code shops popped up. It's also often a net win: faster software, lower costs, lower management burden... I'm thinking of starting a consultancy.
The problem is, after witnessing the likes of Greta Thunberg: if that's what the next generation has in store, then as far as I'm concerned the world can fucking burn, and her generation along with it.
Well shit, now I (re-)learned C,
And all I want to do is program in C,
But all C jobs are like -
"C guru that merged to Linux kernel"
"Driver writing low level must know Assembly"
"Military-grade realtime hardware design"
Isn't there a C job that's like Python - "here I wrote this script in 5 minutes and spent the rest of the day playing Eve Online"? :D :D
Winter break university projects:
Option A: implement writing and reading floating point decimals in Assembly (with SSE)
Option B: reuse the reading and writing module from Option A, and solve a mathematical problem with SSE vectorization
Option C: Research the entirety of the internet to actually understand graphs, then use Kruskal's algorithm to decide whether a graph is whole or not (no separated groups) - in C++ (a rough sketch of the idea is below)
Oh, and BTW there's one week to complete all 3...
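(That "is the graph whole" check is really just the union-find half of Kruskal: merge every edge's endpoints, then see whether a single root remains. A rough C sketch; the node count and edge list are made up for illustration:)

#include <stdio.h>

#define MAX_NODES 100

static int parent[MAX_NODES];

static int find(int x) {                 /* find the root, with path compression */
    while (parent[x] != x) {
        parent[x] = parent[parent[x]];
        x = parent[x];
    }
    return x;
}

static void unite(int a, int b) { parent[find(a)] = find(b); }

/* returns 1 if all n nodes end up in a single component */
static int is_connected(int n, int edges[][2], int m) {
    for (int i = 0; i < n; i++) parent[i] = i;
    for (int i = 0; i < m; i++) unite(edges[i][0], edges[i][1]);

    int root = find(0);
    for (int i = 1; i < n; i++)
        if (find(i) != root) return 0;
    return 1;
}

int main(void) {
    int edges[][2] = { {0, 1}, {1, 2}, {3, 4} };   /* 5 nodes, 2 components */
    printf("%s\n", is_connected(5, edges, 3) ? "whole" : "separated");
    return 0;
}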
I don't need life anyways... -
Writing x86 assembly code in VS Code feels so weird. I mean, I'm using something that's built using crazily high level languages (JS, HTML, CSS), on top of a mammoth runtime environment (Node, V8), which is itself sitting on a modern and sophisticated operating system (Antergos), and I'm writing code that shifts bits and bytes around in memory in order to get one part of my C program to run just a little faster. Wow.
-
WebAssembly and .NET Blazor. Finally, other languages will become available on the frontend; .NET Blazor is a good start for C#.
-
Lying in bed at 1 AM, unable to sleep, so of course my brain is going wild and trying to convince me that learning assembly and C to make my own bootloader and OS would be a good idea... Could be fun. Think it's worth the shots and giggles?
-
As a junior dev from a sysadmin and security background, this is a list of software development concepts I never seemed to truly understand but hope to (rated from most intimidating to least):
1) Frontend web development and the whole huge world of JavaScript frameworks and tools - it's more overwhelming than the political geography of the Holy Roman Empire in the Middle Ages.
2) Machine Learning, Deep Learning and A.I- too much math that fucks with my brain.
3) Low-level programming (kernel, drivers) - sounds extremely interesting, but the code in assembly/C/C++ looks like Linear A Minoan hieroglyphics.
4) Rx(insert language here) - I never got why it is useful or why someone invented it. Seems interesting, though.
5) Code Reflection - sounds like Thelemic magick.
6) Packaging, automation, build tools, DevOps, CI, testing - seems too complicated. I just want to run an executable at the client or make a web app that does something. Why all this process?
My image of a dream career at different times of my life:
- frontend specs prodigy, css enlightenment, a member of w3c or a similar committee
- indie hacker and entrepreneur, leader of a startup community
- architecture prodigy, expert in scalability
- transsexual evangelist, popular article writer and a rockstar
- hardware engineer: Linux, C, chip and dale’s Gadget-like girlfriends, xkcd, latex, assembly, buying a radio station and a telescope
- scientist like NickyBones, papers, data, more data
- art expert
Though achieving even one of these would take an entire life, I had a chance to grasp all of them. WHY do they feel so incompatible? Why do I have to choose?
Why do I feel so sad? Why do I feel like I haven’t achieved anything even though I objectively achieved what I dreamed of like five years ago?
Is it true that it’s in my nature to always seek an environment to feel like a junior in? Is feeling like a junior only pleasant to me because it reminds me of old times when I wasn’t actually this mentally ill and was still happy?
Why do I feel like that Arduino and C shit is the equivalent of a red Corvette?
I wanna go back to the age where a C program was considered secure and isolated based on its system interface rather than its speed. I want a future where safety does not imply inefficiency. I hate Spectre, and I hate that an abstraction as simple and robust as assembly is so leaky that just by exposing it you've pretty much forfeited all your secrets.
And I especially hate that we chose to solve this by locking down everything, rather than inventing an abstraction that's a similarly good compile target but better represents CPUs and therefore does not leak.
Holy shiiittttt, I finally got 64-bit NASM working on Windows with CMake. The CMake documentation is fkn bad, man.
I've got a C++ file that calls a procedure in an assembly file that calls Win32 APIs to show dialogs and other cool shit. Compiling was working fine, linking turned out to be a bit of a pain in the ass, but figuring out how to enable NASM in CMake was a nightmare. Why are the CMake docs so horrific 🥺
I learned C with a K&R copy a friend gave me years ago. Now at university, we in CompSci get taught Python the first year and Java the next, while the engineers start with C and (I'm guessing) move on to assembly later.
This friend comes to me all worried because he has to submit, the next day, a working Reversi game for the console written in C. Turns out the game was divided across two labs, and he'd failed to submit the first one.
The guy is smart but once a week or so, when we met to smoke a joint and relax with some other friends, he was always talking about how he would prefer something like law but that would be bad business back in Egypt.
Back to the game: I got completely into it. The first hour was spent checking all the instructions he was given, then reviewing the code he'd written and copied from the Internet. We decided to start from scratch, since he didn't really get what the code he'd copied did. It took us 10 hours, stopping only to eat, but we nailed all the specifications of both labs.
A week after that he comes to me: "my TA said your code is the ugliest shit he's ever seen but he gave me a perfect score because it passed all the tests". I'm getting better (the courses I'm taking help me a lot) but what really made me happy is that he solved the next lab by himself (Reversi wasn't the first time I helped him, only the first time he was absolutely lost). Now he actually gets excited about coding and even felt confident for his programming final.
No more talking about being a lawyer after those 10 hours, totally worth it.
Whose fucking idea was it to still consider assembly (with C being optional) as the most relevant language in electrical engineering school?
Also, teaching that 74HC chips and op-amp ICs are still the most common thing in today's electronics is really grinding my gears!!! Is it still an argument that your 8 NAND gates are essentially the same price as a low-cost microcontroller?
But one can be modified within seconds, while for the other you potentially need to redesign the entire board.
Is there something you find genuinely cool and would recommend? Some webpage, program, OS, library or anything?
I mean, hey. There are SO MANY reaaaally cool things I didn't know until the last few months... things I'd have been so grateful for if I'd known them earlier. I'll list some of them, and I just know you have a few of yours too. Feel free to educate the rest!
Processing - a program that's so fun to code in + The Coding Train (YouTube channel)
Microcorruption.com - so freaking awesome if you wanna learn hacking / assembly (not x86 necessarily)
LiveOverflow - cool hacking channel
Radare - cool cmd Linux disassembler
vim-adventures.com - LEARN VIM (not just how to quit it) LITERALLY by playing a game!!!!!!!!!!!!!!
Slashdot - stay updated, like really
"BEST-WEBSITES-A-PROGRAMMER-SHOULD-VISIT" - GUYS, THIS! Sorry for the caps, but search this on GitHub and you will fucking die of happiness at how freaking useful the links there are, with no bullshit to dig through, just pure awesomeness. REALLY
HandBrake - Top media converter without bullshit and bloat stuff in it
Calibre - Best eBook management software capable of literally everything ebooks related. Kindle is a bloated joke compared to this
QubesOS - You know you can have every OS running at once - you have a Linux but are playing win games. Yup. It's there. Free
Computerphile - You all know it, it's just for completeness
Khan Academy - Same
VulnHub - download vulnerable VMs and hack them, or learn by reading writeup on how to do it!
Valgrind - MUST HAVE for C/C++ programmers
Computer Science crash course videos
That's all I can think of off the top of my head, but hey, there's more to it, so definitely add your 2 cents!
Last thing: if nothing else, just check the websites list on GitHub, that's a life-changer.
Looking forward to seeing some cool links & recommendations!
I wanna make a C+friends language. It'd be dev-friendly and would throw lots of errors on compile, to show love. It'd also compile slower with each newline, so you can always say "it's compiling". There will be classes, but called "people" instead, and instead of "new" I'll have "create". As for loops, let's go with a friendly do-while loop, with "dontdo while" as a normal while, or "dowith i while" to have a friendly for loop. Instead of ifs let's say decide(), and instead of else let's have "or". Instead of functions, well, you need no functions: you'll have jumps and tests before jumps, just like assembly has. Oh, and everything will be a pointer, because then it runs nicer. To create a variable you can't use =, because that's the equals sign in decide; you need to use "var int myint is 69", because why not. Then to print to the console: "console.outputstream.out(myint)". Instead of threads I'll have "please", like "please work", where work is a jump target. I hope you'll enjoy this language ^^
-
Bootcamps get you up and running in coding quickly. If you are a programmer, companies are only interested in how quickly, how error-free, and how cheaply you produce marketable output. Bootcamps enable this.
More or less, you are nothing more than an assembly-line worker putting parts on a car platform. Your value is not very high, as you may be exchanged at any time at their will.
Nevertheless, you can earn money quickly. You trade in your youth and time, which might be a dead end in the long term. Trends are moving toward machine learning and artificial intelligence, which will not need bootcamp people and code workers.
It is better to set up bootcamps and sell them than to attend one. Like selling shovels during the gold rush instead of working in the mud of Alaska yourself.
Your choice is: making quick money, which fades anyway, or striving for a long-term, future-proof career.
CS degrees from reputable technical universities point you in the right direction, strategically speaking. Companies which pay well, or freelancing with a solid, acknowledged background, will always call for top graduates. People from bootcamps are just OK for hammering out assembly-line code. Even worse with Scrum: one noisy room, enormous team pressure and controls, your lines of code counted per minute, pale people all around, and groups of controllers never acknowledging nor trusting your work.
As a serious degree, a Bachelor's is nothing. Here in India, a Bachelor's now is what a high school diploma used to be. You must carry a diploma or Master's degree, combined with internships at big companies with high brand recognition. This will require 4-6 years of your lifetime. You can support this financially by part-time freelancing: front- or back-end web projects, data analysis, and the like.
Bootcamp people will lose in the long term. They are the modern cannon fodder of software production.
It is your choice. Personally, I would never do a bootcamp. Quality and sustainability require time, deep study, and devotion.
-
Need help. Explain why this assembly code is wrong.
C code:
long f(long n) { return n + 1; }
Assembly:
f:
movq rdi, rbp
addq $1, rbp
movq rbp, rax
ret
-
The one that exists (C#) seems underused compared to where it could (or even should) be used. And the place that uses it the most (enterprise) butchers and mangles its use, just as enterprise tends to do with everything.
The one that I'm designing... the fact that it doesn't exist yet, and that even as I'm zeroing in on a syntax and philosophy I'm very much starting to be proud of, I still don't have a proper idea of how to implement even the most basic parser/interpreter for it. Not because it's in any way difficult or unusual, but just because... I've never done that before, so I get into weird circular thought paths that produce weird, nonsensical code...
... on top of that, I still only have a very, very fuzzy idea of how it will (sometime in the extremely distant future) actually implement the most interesting and core feature: event-based continuous (partial) re-parsing of the source code, where traversing the tokens at the leaf level of the syntax tree results in valid machine code (or at least assembly) that is the "compiled" program.
I *know* it's possible, I just don't yet know enough to have a concrete idea of how exactly to achieve it.
But imagine: a programming language where interactive programming is basically the default way of working, and basically the same as normal programming in it, except the act of parsing is also the (in-memory) compilation at the same time, so it's running directly on the hardware instead of via an interpreter/VM/any of that overhead crap.
Also then kinda open-source by definition.
And then to "only" write an OS in that, and voilà! A Smalltalk-like environment with non-exotic, C-family syntax and actual native performance!
ahhh... <3
* a man can dream *
-
YGGG IM SO CLOSE I CAN ALMOST TASTE IT.
Register allocation is pretty much done: you can still juggle registers manually if you want, but you don't have to -- declaring a variable and using it as an operand instead of a register implicitly tells the compiler to handle it for you.
What's more, spilling to the stack is done automatically, keeping track of whether a value is or isn't required, so it's only done when absolutely necessary. And variables are handled differently depending on whether they are input, output, or both, so we can eliminate redundant copies in some cases.
It's a thing of beauty, defenestrating the difficult aspects of assembly while still writing pure assembly... well, for the most part. There's some C-like sugar that's just too convenient for me not to include.
(x,y)=*F arg0,argN. This piece of shit is the distillation of my very profound meditations on fuckerous thoughtlessness, so let me break it down:
- (x,y)=; fuck you in the ass, I can return as many values as I want. You don't need the parens if there's only a single return.
- *F args; some may have thought I was dereferencing a pointer, but I'm calling F and passing it arguments; the asterisk indicates I want to jump to a symbol rather than read its address or the value stored at it.
To the virtual machine, this is three instructions:
- bind x,y; overwrite these values with Fs output.
- pass arg0,argN; setup the damn parameters.
- call F; you know this one, so perform the deed.
Everything else is generated; these are macro-instructions with some logic attached to them, and there's a step in the compilation dedicated to walking the stupid program for the seventh fucking time that handles the expansion and optimization.
So what's left? Ah shit, classes. Disinfect and open wide mother fucker, we're doing OOP without a condom.
Now, obviously, we have to sanitize a lot of what OOP stands for. In general, you can consider every textbook shit, so much so that wiping your ass with their pages would defeat the point of wiping your ass.
Lets say, for simplicity, that every program is a data transform (see: computation) broken down into a multitude of classes that represent the layout and quantity of memory required at different steps, plus the operations performed on said memory.
That is most if not all of the paradigm's merit right there. Everything else that I thought to have found use for was in the end nothing but deranged ways of deriving one thing from another. Telling you I want the size of this worth of space is such an act, and is indeed useful; telling you I want to utilize this as base for that when this itself cannot be directly used is theoretically a poorly worded and overly verbose bitch slap.
Plainly, fucktoys and abstract classes are a mistake, autocorrect these fucking misspelled testicle sax.
None of the remaining deeper lore, or rather sleazy fanfiction, that forms the larger canon of object oriented as taught by my colleagues makes sufficient sense at this level for me to even consider dumping a steaming fat shit down its execrable throat, and so I will spare you bearing witness to the inevitable forced coprophagia.
This is what we're left with: structures and procedures. Easy as gobblin pie.
Any F taking pointer-to-struc as its first argument that is declared within the same namespace can be fetched by an instance of the structure in question. The sugar: x ->* F arg0,argN
Where ->* stands for failed abortion. No, the arrow by itself means fetch me a symbol; the asterisk wants to jump there. So fetch and do. We make it work for all symbols just to be dicks about it.
Anyway, invoking anything like this passes the caller to the callee. If you use the name of the struc rather than a pointer, you get it as a string. Because fuck you, I like Perl.
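(Side note: C#'s extension methods are the same trick in respectable clothing -- a free static function whose first parameter is the receiver, fetched through the instance at the call site. A rough sketch; this assumes nothing about the post's actual language, and all names here are mine:)

using System;

struct Vec { public double X, Y; }

static class VecOps
{
    // A free function; 'this' on the first parameter lets any
    // Vec instance fetch it as if it were a method.
    public static double Length(this Vec v) => Math.Sqrt(v.X * v.X + v.Y * v.Y);
}

// var len = someVec.Length();  // sugar for VecOps.Length(someVec)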
What else is there to discuss? My mind seems blank, but it is truly blank.
Allocating multitudes of structures, with the same or different types, should be done in one go whenever possible. I know I want to do this, and I know whichever way we settle on has to be intuitive, else this entire project has failed.
So my version of new always takes an argument, don't you just love slurping diarrhea. If zero it means call malloc for this one, else it's an address where this instance is to be stored.
What's the big idea? Only the topmost instance in any given hierarchy will trigger an allocation. My compiler could easily perform this analysis because I am unemployed.
So where do you want it -- on the stack, on the heap, or do you want to reutilize any piece of ass, where buttocks stands for some adequately sized space in memory -- entirely within the realm of possibility. Furthermore, evicting shit you don't need and replacing it with something else.
Let me tell you, I will give your every object an allocator if you give me the chance. I will -- nevermind. This is not for your orifices, porridges, oranges, morpheousness.
Walruses.
-
Do y'all use Blazor? .-. the C#-based web UI (the WebAssembly one)
Thinking of going in on it hard coz I hate to think of a world where the backend is written in JS (🤮) just for better interoperability with JS-based UIs and cheaper devs to hire (JS fullstacks) 🤮🤮🤮
-
Is it weird that I hold a high degree of respect for every sector in programming? From front-end and back-end on websites, to the GUI and logic ends of desktop applications, to cloud-based microservices, I respect clean, swift, and agile developers who show a structural mindset. From the founding fathers of assembly, to higher-level languages like C, all the way up to high-level languages like C#, JavaScript and Python, I respect them and thank them for their time and dedication to relatively stable libraries. I also thank the creators of OOP and FP, as well as the developers that make great use of these paradigms. I've come to the realization that no one wants to fuck shit up; the great engineers of our past wanted to build legit, non-trash programming tools, and we can't bash them for that. Respect, courteously critique, and build applications and programming tools to a standard that someone in the future would admire and be grateful for.
-
I dunno if this is a rant or an anti-rant, but for some reason, Assembly and low-level development hardly ever tick me off. It absolutely dumbfounds me how what many regard as the most frustrating and difficult way of programming a binary digital computer is just so enjoyable. I spend hours per day teaching myself C and GNU Assembly and how they work together, with little actually getting done. It just rarely gets frustrating enough to rant about. Ladies and gentlemen, I think I have found my calling!
-
!rant
I see a lot of people complain about uni degrees and stuff because they don't learn how to code etc. Is this really the standard?
I mean, I'm only in the fourth semester of my bachelor and had coding knowledge before starting uni. But we had basic to intermediate Java in the first two semesters, and are now learning how to write secure code and OS-level stuff in C++; we had a module with practical Assembly coding, all while still learning all the theory.
At the end of the first semester we had to write a terminal game in Java. I mean of course that's not "real experience" but if you dive in you definitely learn the basics you need to get started in real life.
Or am I completely wrong / just in a weird uni?
-
My answer to their survey -->
What, if anything, do you most _dislike_ about Firebase In-App Messaging?
Come on, have you ever sat a normal dev, completely new to this push notification thing, down and asked him to get a simple app like the flutter firebase_messaging plugin example running? For sure you did not, oh dear brain dead moron that found his college degree in a Linux magazine 'Ruby special edition'.
Every fuckin' thing about that Firebase is a loose end. I read all the Medium articles, your utterly soporific documentation that never ends, and I am actually running the flutter plugin example firebase_messaging. Nothing works or is referenced correctly: nothing. You really go through life with blind eyes... you guys, right? Oh, there is a flimsy workaround in the 100th post under Github issue number 10 thousand... let's close the crash report. If I did not change 50 meaningless lines in gradle-what-not files to make your brick-of-puke work, I did not change a single one.
I dream of you, looking at all those nonsense config files, with cross side eyes and some small but constant sweat, sweat that stinks piss btw, leaving your eyes because you see the end, the absolute total fuckup coming. The day where all that thick stinky shit will become beyond salvation; blurred by infinite uncontrolled and skewed complexity; your creation, your pathetic brain exposed for us all.
For sure I am not the first one to complain... your whole thing, from the first to the last quark that constitutes it, is irrelevant; a never ending pile of nonsense. Someone with all the world's determination to sabotage could not have done worse. Thank you for making me lose hours deep down in your shit show. So appreciated.
The setup is: servers, your crap-as-a-service and some mobile devices. For Christ's sake, sending 100 bytes as a little [ beep beep + 'hello kitty' ] is not fucking rocket science. Yet you fuckin' turn it into a grinding task... for eternity!!!
You know what, you should invent and require another, new, useless key-value called 'Registration API Key Plugin ID Service' that we have to generate and sync on two machines, every day, using some obscure shit like a 'Gradle terminal'. Maybe you could also deprecate another key, rename another one to make things worse, and I propose choosing a new hash function that we have to compile ourselves. A good candidate would be some buggy C source code from a random Github hacker... who has injected some platform dependent SIMD code (he works on PowerPC and hasn't tested on x64); you know, the guy you admire because he is so much more lowlife than you and has all the Pokemon on his desk. Well, that guy just finished a really really rapid hash function... over GPU, in a serverless fashion... we have an API for it. Every new user will gain 3ms for every new key. WOW, imagine the gain over millions of users!!! Push that in the official pipe, fucktard!.. What are you waiting for? Wait, no, change the whole service name and infrastructure. Move everything to CLSG (cloud lambda service... by Google); that is it, brilliant!
And oh yeah, to secure the whole void, bury the doc for the new hash under 3000 words, lost between v2, v1 and some other deprecated docs that also run 3000 words and are still the first result on Google. Finally, come to think of it, let the doc go, fuck it... a tutorial, for 'weak asses', right.
One last thing, rewrite all your tech in the latest new in-house language, split everything into 'femto services' => ( one assembly operation per OS process ) and finally cram all those into containers... Agile, for sure it has to be Agile. Users will really appreciate the improvements to your mandatory service. -
Top 12 C# Programming Tips & Tricks
Programming can be described as the process that leads a computing problem from its original formulation to an executable computer program. This process involves activities such as developing understanding, analysis, generating algorithms, verifying the essentials of algorithms - including their accuracy and resource utilization - and coding the algorithms in the chosen programming language. The source code can be written in one or more programming languages. The purpose of programming is to find a series of instructions that automate solving a specific problem or performing a particular task. Programming requires competence in various subjects, including formal logic, understanding of the application domain, and specialized algorithms.
1. Write Unit Test for Non-Public Methods
Many developers do not write unit tests for non-public methods because those methods are invisible to the test project. C# lets you open up an assembly's internals to another assembly. The trick is to add the following to the AssemblyInfo.cs file:
// Make the internals visible to the test assembly
[assembly: InternalsVisibleTo("MyTestAssembly")]
2. Tuples
Many developers build a POCO class just to return multiple values from a method. Tuples, introduced in .NET Framework 4.0, make that unnecessary.
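(The article's snippet didn't survive; a minimal sketch using the .NET 4.0 Tuple class it refers to -- the method and names are mine:)

using System;

static class Parser
{
    // Two values out of one call, no single-use POCO class required.
    public static Tuple<int, string> ParseScore(string raw)
    {
        var parts = raw.Split(':');
        return Tuple.Create(int.Parse(parts[0]), parts[1]);
    }
}

// var r = Parser.ParseScore("42:excellent");  // r.Item1 == 42, r.Item2 == "excellent"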
3. Do not bother with Temporary Collections, Use Yield instead
When developers want to pick items out of a collection, they often create a temporary list to hold the selected items before returning them.
To avoid the temporary collection entirely, use yield: results are handed out one by one as the result set is enumerated.
Developers also have the option of using LINQ.
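(A minimal sketch of the yield version -- the method and names are mine:)

using System.Collections.Generic;

static class Picker
{
    // No temporary List<int>: each match is handed to the caller
    // as the result set is enumerated.
    public static IEnumerable<int> EvensOnly(IEnumerable<int> source)
    {
        foreach (var n in source)
            if (n % 2 == 0)
                yield return n;
    }
}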
4. Making a retirement announcement
Developers who own redistributable components and plan to deprecate a method in the near future can decorate it with the Obsolete attribute to communicate this to clients:
[Obsolete("This method will be deprecated soon. You could use XYZ alternatively.")]
Upon compilation, a client gets a warning carrying that message. To fail a client build that still uses the deprecated method, pass the additional Boolean parameter as true:
[Obsolete("This method is deprecated. You could use XYZ alternatively.", true)]
5. Deferred Execution While Writing LINQ Queries
A LINQ query written in .NET does not execute when it is declared; it only executes when the result is enumerated. This behaviour is known as deferred execution. Be aware that every enumeration of the result set runs the query again. To prevent repeated execution, convert the LINQ result to a List after the first run. Below is an example (only the method signature survived in this copy; the body that follows it is a plausible reconstruction):
public void MyComponentLegacyMethod(List<int> masterCollection)
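{
    // (reconstructed body -- only the signature above survived;
    //  assumes: using System.Collections.Generic; using System.Linq;)

    // Deferred: this line only describes the query, nothing runs yet.
    var bigOnes = masterCollection.Where(i => i > 100);

    // Each enumeration below re-executes Where over masterCollection.
    var count = bigOnes.Count();
    var first = bigOnes.FirstOrDefault();

    // Materialize once; later reads reuse the stored results
    // instead of re-running the query.
    var cached = bigOnes.ToList();
}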
6. Explicit keyword conversions for business entities
Use the explicit keyword to define a conversion from one business entity to another. The conversion operator is only invoked when the cast is written out in code:
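(A sketch -- the two entities are invented for illustration:)

class Order
{
    public decimal Total;
}

class Invoice
{
    public decimal Amount;

    // Runs only when the cast is spelled out: var inv = (Invoice)order;
    public static explicit operator Invoice(Order o)
    {
        return new Invoice { Amount = o.Total };
    }
}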
7. Absorbing the Exact Stack Trace
In a catch block of a C# program, rethrowing with throw ex; resets the stack trace: if the fault actually occurred in the method ConnectDatabase, the rethrown exception's stack trace will only indicate that the fault happened in the method RunDataOperation. Rethrow with a bare throw; instead to preserve the original trace:
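(The article's snippet was lost; this is the usual form of the tip, reusing the two method names it mentions -- ConnectDatabase is assumed to exist, and using System; is assumed:)

void RunDataOperation()
{
    try
    {
        ConnectDatabase();
    }
    catch (Exception ex)
    {
        // throw ex;  // would reset the trace: the fault now appears to be here
        throw;        // rethrows and keeps ConnectDatabase in the stack trace
    }
}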
8. Enum Flags Attribute
Decorating an enum with the Flags attribute in C# enables it to act as a bit field, letting developers combine enum values. One can use the following C# code.
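(The snippet was also lost here; this reconstruction is inferred from the output quoted below -- the enum name is mine, and the member values are chosen so they sum to 14:)

using System;

[Flags]
enum Snakes
{
    None        = 0,
    BlackMamba  = 2,
    CottonMouth = 4,
    Wiper       = 8
}

// var kill = Snakes.BlackMamba | Snakes.CottonMouth | Snakes.Wiper;
// Console.WriteLine(kill);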
The output for this code will be "BlackMamba, CottonMouth, Wiper". When the Flags attribute is removed, the output is simply 14.
9. Implementing the Base Type for a Generic Type
When developers want to constrain the type argument of a generic class so that it must implement a particular interface, they can use a where clause:
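(A minimal sketch -- the interface and class names are mine:)

interface IEntity
{
    int Id { get; }
}

// T must implement IEntity; any other type argument is a compile-time error.
class Repository<T> where T : IEntity
{
    public int KeyOf(T item)
    {
        return item.Id;
    }
}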
10. Using Property as IEnumerable doesn’t make it Read-only
When an IEnumerable property gets exposed on a class, the underlying collection can still be reached:
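(A sketch of the leak -- names are mine:)

using System.Collections.Generic;

class Team
{
    private readonly List<string> _members = new List<string> { "Ann", "Ben" };

    // Looks read-only, but hands out the live List underneath.
    public IEnumerable<string> Members
    {
        get { return _members; }
    }
}

// ((List<string>)team.Members).Add("Mallory");  // mutates the private list!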
Code like the cast in the last line above modifies the list behind the property's back. In order to avoid this, return AsReadOnly() as opposed to AsEnumerable() (or the raw list).
11. Data Type Conversion
More often than not, developers have to convert data types for one reason or another - for example, converting a decimal variable holding a set value to an int. Note that a cast and Convert behave differently:
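(For instance:)

using System;

decimal price = 99.95m;

int truncated = (int)price;            // cast truncates toward zero: 99
int rounded = Convert.ToInt32(price);  // Convert rounds to nearest: 100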
Source: https://freelancer.com/community/...
-
is there any way to convert python straight to C yet? i just barely can't get PyInstaller working on PythonD because there's no os.fork() (because DOS. no, not cmd.exe, actual fucking DOS.)
one broken function between me and victory
"just use C" DJGPP is kicking my ass all the same, random unknown segfaults are a bitch and also i can't get quite what i want in the memory layout restrictions i have to work under
"just use Assembly/BASIC" their file handling makes me wanna die and BASIC is fucking massive as well18 -
everything is going as planned! :)
Learned Rust Lang. i loved it (that doesn't mean i am done learning na? No! never stop)
new language i could do game memory hacking in without worrying about C++ memory leaks or issues. it also compiles to assembly! another of my favorite languages!
(i use rust for game development and other stuff)
i am not leaving C / C++ though, that would be harsh!
i abandoned javascript for react and typescript.
to be honest the developer just made javascript and left us with a [object Object]
finished learning the Android Java API so i'm basically set: anything i want to make, i can just go on my pc, listen to music and write it out in a couple of days.
well phazor what are you going to do now?!
i will code till i am old.
i will leave my mark like a shid that made its skid in the bowl :)
-
I'm in a big fat fucking stinking rut, as in progress on this project has absolutely stagnated.
Gonna rubber face your duck now **UNZIPS** except I don't have zippers, as joggers are the one true way; fake Adidas til I fucking drop.
Brain damage aside, I understand both how I've laid out the data and what I'm supposed to do with it. We have a virtual machine, an array of instructions and arguments for a given process within it, and we need to walk this array and map values to registers.
We also need to spill values inside registers to stack, IF they are required at a further point within that block. This also isn't terribly complex. We simply look forward in the array and see if the value is an argument to any instruction that *needs* this value to be loaded (i.e., within a register).
So this implies multiple iterations; we need to better understand how one particular value is used throughout an F before we can make a final decision on how many registers and stack space are actually needed for the whole block.
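(A bare-bones sketch of that forward scan -- in C#, since the post's own language only exists in the author's head; the data shapes are my assumptions, not the actual compiler's:)

using System;

record Instr(string Op, string[] Args);

static class Liveness
{
    // Walk forward from the current position; if any later instruction
    // consumes the value, it must survive -- keep it in a register or
    // spill it to the stack. Otherwise it's dead and its register can
    // be reused immediately.
    public static bool NeededLater(Instr[] block, int pos, string value)
    {
        for (int i = pos + 1; i < block.Length; i++)
            if (Array.IndexOf(block[i].Args, value) >= 0)
                return true;
        return false;
    }
}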
Here's where it gets tricky. If there's a call, we need to be certain that the symbol being invoked has already been fully processed. Besides the obvious fact that recursion fucks me up, there's another matter: say a private method gets invoked by another private method. We can take advantage of this, by which I mean, sacrilege incoming so put on this toga.
Looking at the output of C compilers, it would seem this is not done in practice, I would assume because it's a pain in the ass. But when you have the guarantee that F will only be called internally, as that's what "private" means, there are two ways it can go (boiled down in the sketch after the list):
0. It's well below the 13-20 cycle threshold, so you inline the fucker. No suprises there.
1. It's a more involved affair, and invoked in more than one place, so you don't inline it. Codesize matters.
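(That decision, boiled down to a predicate -- the 13-cycle figure comes from the threshold above; everything else is my guess:)

static class Inliner
{
    // Case 0: trivially small private call -> inline it.
    // Case 1: more involved and called from several places -> keep one copy.
    public static bool ShouldInline(bool isPrivate, int estimatedCycles, int callSites)
    {
        if (!isPrivate) return false;           // public symbols keep their ABI
        if (estimatedCycles < 13) return true;  // well below the threshold
        return callSites == 1;                  // involved, but only one caller
    }
}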
Recursion and [1] are the big deal things holding me back. Not because it's too hard, like I said this is kindergarten level abstraction. I'm just slow and fanatical, which is how I prefer to spell "constant obsessive paranoid delusions". I can see the potential optimization I can pull here, so I'm stuck trying to figure it out.
Idea would be, handling the register allocation and stack spill for an internal-internal (or deep internal; what we like to call a "guts" method) in synchronization with the *calling* processes. This is, fundamentally, violating all conventions -- but it's so far under the hood that no one will notice.
Let me give you an example. If we were to pass some value to a function, expecting to mutate it and get a different value back, in a lot of cases it'd be stupid to make an implicit copy by using two registers, one for input and another for the output. Dude, it's one cycle. Multiply it by a million, say sixty times per second, for every time you __needlessly__ make a copy of a value that we've already stated is mutable.
Clearly unacceptable. This is, in the strictest sense, everywhere in every single codebase. Premature micro optimization is the root of all goodness, God is great and praiseworthy. So how do we go about it?
Answer is I know and I don't know. By which I mean to say, this very thing I've done by hand. Assembly is fun. Now the issue is teaching a calculator how to do it. Not so fun.
There is a dependency chain between processes, as I believe I've kind of alluded to. I'm trying to make decisions on the side of the caller depending on the details of the callee, which is why recursion is rawdogging my soul. This is the same situation, it's inverting the direction of one or more links in the dependency chain, which makes no fucking sense.
And yet it does.
Brain, explain yourself.
How do *you* handle this without crashing?
Brain?
<<ME STEWPED; BEEP-BOOP>>
Alright then, that was a useless attempt at fuckery. Let's have a nap then, maybe it'll come to me in the morning. That's what I've been saying to myself for almost a month now.
Perhaps it is a hardcoded fuk.
-
I know this is too late to ask this question, but I'm a final-year computer science student, average in all core subjects, with zero knowledge of web development (except a few HTML tags, but not enough to make a Wikipedia-like website) or other professional streams.
I know Java and Python well enough to write OOP classes and understand code written in them.
Should i
A) study more about web dev / ML-AI / testing / other "professional" stuff,
B) learn more and strengthen my core subjects, like operating systems, algorithms, data structures, etc., or
C) learn another core language like C/C++/assembly?
-
Typically every computer science major begins with C, C#, C++, Java or Python, creating so much abstraction from the hardware that your mind just fills with questions that remain unanswered. Whenever I program something, I always think about how the underlying stuff is working. They never explain how and where software meets the hardware. Why are they keeping students away from the hardware? I think a CS graduate who doesn't know the underpinnings of a computer should not be considered a CS graduate; as opposed to a software engineer, a computer science major relates to everything that is a computer, and that includes the theoretical stuff and a little know-how of computer hardware. Instead of teaching this stuff, and assembly as a language, in the first semester they teach you Java or C++. I could not speculate on why this is so.
-
My family got our first computer when I was in the 1st grade and I really liked it a lot.
After some years I saw someone code and I was like "What's that?". After they explained me what they were doing I was totally hyped and started searching tutorial videos on how to do simple stuff on VB (this was in my 7th grade, I believe).
By the end of my 8th grade I was introduced to a Computer Engineer that lent me a RoR book and tried to teach me the basics.
(Fun fact: around this time I was doing a Habbo clone server with a friend of mine so that we could play with our friends without all the other people poking around).
In high school I took a Computer Technician course where I learnt stuff like VB, C#, PHP, MySQL, some basic CSS/HTML plus some hardware fundamentals.
After that course I tried to enter college and failed on my first try, so I took a gap year where I worked as a dev for my family's computer repair shop. It was really a good experience to have time for myself while working on what I loved.
Now I'm on the 2nd year of a Bachelor in Computer Engineering (It's more about software than hardware actually), currently working with Java, C, IA-32 Assembly and PL/SQL. My goal is to get a Masters in Software Engineering after it.