Search - "systems language"
-
So, you start with a PHP website.
Nah, no hating on PHP here, this is not about language design or performance or strict type systems...
This is about architecture.
No backend web framework, just "plain PHP".
Well, I can deal with that. As long as there is some consistency, I wouldn't even mind maintaining a PHP4 site with Y2K-era HTML4 and zero Javascript.
That sounds like fucking paradise to me right now. 😍
But no, of course it was updated to PHP7, using Laravel, and a main.js file was created. GREAT.... right? Yes. Sure. Totally cool. Gotta stay with the times. But there's still remnants of that ancient framework-less website underneath. So we enter an era of Laravel + Blade templates, with a little sprinkle of raw imported PHP files here and there.
Fine. Ancient PHP + Laravel + Blade + main.js + bootstrap.css. Whatever. I can still handle this. 🤨
But then the Frontend hipsters swoosh back their shawls, sip from their caramel lattes, and start whining: "We want React! We want SPA! No more BootstrapCSS, we're going to launch our own suite of SASS styles! IT'S BETTER".
OK, so we create REST endpoints, and the little monkeys who spend their time animating spinners to cover up all the XHR fuckups are satisfied. But they only care about the top most visited pages, so we ALSO need to keep our Blade templated HTML. We now have about 200 SPA/REST routes, and about 350 classic PHP/Blade pages.
So we enter the Era of Ancient PHP + Laravel + Blade + main.js + bootstrap.css + hipster.sass + REST + React + SPA 😑
Now the Backend grizzlies wake from their hibernation, growling: We have nearly 25 million lines of PHP! Monoliths are evil! Did you know Netflix uses microservices? If we break everything into tiny chunks of code, all our problems will be solved! Let's use DDD! Let's use messaging pipelines! Let's use caching! Let's use big data! Let's use search indexes!... Good right? Sure. Whatever.
OK, so we enter the Era of Ancient PHP + Laravel + Blade + main.js + bootstrap.css + hipster.sass + REST + React + SPA + Redis + RabbitMQ + Cassandra + Elastic 😫
Our monolith starts pooping out little microservices. Some polished pieces turn into pretty little gems... but the obese monolith keeps swelling as well, while simultaneously pooping out more and more little ugly turds at an ever faster rate.
Management rushes in: "Forget about frontend and microservices! We need a desktop app! We need mobile apps! I read in a magazine that the era of the web is over!"
OK, so we enter the Era of Ancient PHP + Laravel + Blade + main.js + bootstrap.css + hipster.sass + REST + GraphQL + React + SPA + Redis + RabbitMQ + Google pub/sub + Neo4J + Cassandra + Elastic + UWP + Android + iOS 😠
"Do you have a monolith or microservices" -- "Yes"
"Which database do you use" -- "Yes"
"Which API standard do you follow" -- "Yes"
"Do you use a CI/building service?" -- "Yes, 3"
"Which Laravel version do you use?" -- "Nine" -- "What, Laravel 9, that isn't even out yet?" -- "No, nine different versions, depends on the services"
"Besides PHP, do you use any Python, Ruby, NodeJS, C#, Golang, or Java?" -- "Not OR, AND. So that's a yes. And bash. Oh and Perl. Oh... and a bit of LUA I think?"
2% of pages are still served by raw, framework-less PHP.
-
Ranted about him before but this just came to my mind again.
The fucking windows (to the max) fanboy I had to deal with for too long.
Every time I mentioned something about what programming language to use in a project he was NOT part of:
"I know it's none of my business, BUT I think you should use .net"
(All backend JavaScript and php guys).
Every time I mentioned something about what server system to use:
"I know it's none of my business but I think you should use Windows server"
(All Linux guys)
Every time I'd say something positive about Linux he'd search as long as needed to prove that that was also a windows thing (didn't even come close sometimes)
Every time I told the devs there about a windows security issue (as in "guys they found this thing, install the next update to stay safe :)" - "ahhh will do, thanks for letting know man!") he'd search as long as needed to prove that Linux also had had security issues like that.
(Okay?!? I know?!? I'm just trying to notify people so their systems stay secure and they're genuinely happy with that so STFU)
MOTHERFUCKER.
-
string excuses[]={
"it's not a bug it's a feature",
"it worked on my machine",
"i tested it and it worked",
"it's production ready",
"your browser must be caching the old content",
"that error means it was successful",
"the client fucked it up",
"the systems crashed and the code got lost",
"this code won't go into the final version",
"It's a compiler issue",
"it's only a minor issue",
"this will take two weeks max",
"my code is flawless must be someone else's mistake",
"it worked a minute ago",
"that was not in the original specification",
"i will fix this",
"I was told to stop working on that when something important came up",
"You must have the wrong version",
"that's way beyond my pay grade",
"that's just an unlucky coincidence",
"i saw the new guy screw around with the systems",
"our servers must've been hacked",
"i wasn't given enough time",
"it's the designer's fault",
"it probably won't happen again",
"your expectations were unrealistic",
"everything's great on my end",
"that's not my code",
"it's a hardware problem",
"it's a firewall issue",
"it's a character encoding issue",
"a third party API isn't responding",
"that was only supposed to be a placeholder",
"The third party documentation is wrong",
"that was just a temporary fix.",
"We outsourced that months ago.",
"that value is only wrong half of the time.",
"the person responsible for that does not work here anymore",
"That was literally a one in a million error",
"our servers couldn't handle the traffic the app was receiving",
"your machine's processors must be too slow",
"your pc is too outdated",
"that is a known issue with the programming language",
"it would take too much time and resources to rebuild from scratch",
"this is historically grown",
"users will hardly notice that",
"i will fix it" };
-
Anyone looking for something interesting to do???
Step 1) understand how basic circuitry works on a breadboard, nothing too fancy (implement NAND, AND, an adder, a subtractor); a tiny software sketch of this step follows after the list
Step 2) learn about microprocessors and how an OS works
Step 3) learn assembly
Step 4) write a basic assembler and understand how loaders and linkers work!
Step 5) write a kernel with very basic features like memory management and process management and some drivers for IO
Step 6) write an emulator for some simple system, e.g. CHIP-8
Step 7) read about compiler theory and automata
Step 8) write a basic Python implementation that compiles (not interprets) to native assembly
Step 9) implement a TCP stack
Step 10) learn as much as you can about complexity measurement, data structures and algorithms using C or C++, it's very important (familiarity with pointers and thus computer memory)
Step 11) learn any high level language of choice like Python or Ruby
Step 12) stop debating over tabs vs spaces, emacs vs vim, angular vs vue, php vs Python, OOP vs procedural vs functional (just know about all of them and when to use them, but don't fucking debate over which one is superior)
Step 13) live happily and be healthy.
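Since step 1 is the easiest to show in a few lines, here's a minimal sketch of it in software: plain C simulating the gates instead of wiring a breadboard (the function names are just made up for the example; everything is derived from NAND alone):

#include <stdio.h>

/* Every gate below is built from NAND only, the "universal" gate. */
static int nand_g(int a, int b) { return !(a && b); }
static int not_g(int a)         { return nand_g(a, a); }
static int and_g(int a, int b)  { return not_g(nand_g(a, b)); }
static int or_g(int a, int b)   { return nand_g(not_g(a), not_g(b)); }
static int xor_g(int a, int b)  { return and_g(or_g(a, b), nand_g(a, b)); }

int main(void) {
    /* Half adder: sum = a XOR b, carry = a AND b. */
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            printf("%d + %d -> sum=%d carry=%d\n",
                   a, b, xor_g(a, b), and_g(a, b));
    return 0;
}

Chain two of these (plus an OR for the carries) and you have a full adder, the same thing you'd wire on the breadboard, just cheaper to fry.
-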
Every day.
I am a PHP developer.
Yeah, "another PHP is awful" rant... no, not really.
It's just unsuitable for some ambitious projects, just like Ruby and Python are.
First of all, DO NOT EVER use Laravel for large enterprise applications. The same goes for RoR, Django, and other ActiveRecord MVCs.
They are all neat frameworks for writing a todo app, as a better-than-wordpress flexible blogging solution, even as a custom webshop.
Beyond 50k daily users, Active Record becomes hell due to its lazy, fat querying habits. At more than a million users... *depressed sigh*.
PHP is also completely unsuitable for projects beyond 5M lines of code in my opinion. At more than 25M lines... *another depressed sigh*.
You can let your devs read Clean Code and books about architecture patterns, you can teach them about SOLID & DRY, you can write thousands of tests... it doesn't matter.
PHP is scaffolding, it's made of bamboo and rope. It's not brick or concrete. You can build quickly, but it only scales up to a certain point before it breaks in multiple places.
Eventually you run into patterns where even 100% test coverage still doesn't guarantee shit, because the real-life edge cases are just too complex and numerous.
When you're working on a multi-party invoicing system with adapters for various tax codes, or an availability/planning system working across timezones, or systems which implement geographical routefinding coupled to traffic, event & weather prediction...
PHP, Python, Ruby, etc are just missing types.
Every day I run into bugs which could have been prevented if you could use ADTs in a generic way in PHP. PHP7 has pretty good typehints, and they prevent a lot of messy behavior, but they aren't composable. There is no way to tell PHP "this method accepts a Collection of Users", or "this methods returns maybe either an Apple or a Pear, and I want to force the caller to handle both Apple/Pear and null".
Well, you could do that, but it requires a lot of custom classes and trickery, and you have to rewrite the same logic if you want to typehint a "Collection of Departments" instead of "Collection of Users" -- i.e., it's not composable.
Probably the biggest issue is that languages with a (mostly) structural type system (Haskell, Rust, even C#/JVM languages to some degree, etc) are much slower to develop in for the "startup" era of a project, so you grab a weak, quick prototyping language to get started.
Then, when you reach a more grown up phase, you wish you had a better type system at your disposal...
-
curl cheat.sh — get an instant answer to any question on (almost) any programming language from the command line
tl;dr:
do curl cht.sh/go/execute+external+program to see how to execute an external program in Go.
And this question: why should I actually start the browser, which has to download tons of JS, CSS and HTML and render it all, only to show me some small output,
some small text, a number or even some plot? Why can't I do a trivial query from the command line
and instantly get what I want?
I decided to create some service that will work as I think such a service should work.
And that is how wttr.in was created.
Nowadays you probably know how to check the weather from the command line, but if not:
curl wttr.in
or
curl wttr.in/Paris
(curl wetter in Paris if you want to know the weather in Paris)
After that, several other services were created (the point was to check how well the console
can solve the task, so I tried to create services providing information
of various kinds: text, numbers, plots, pseudo-graphics etc.):
curl rate.sx/btc # to check exchange rate of any (crypto)currency
curl qrenco.de/google.com # to QRenco.de any text
And now last but not least, the gem in this collection: cheat.sh.
The original idea behind the service was just to deliver various UNIX/Linux command line cheat sheets via curl. There are several beautiful community-driven cheat sheet repositories such as tldr, but the problem is that to use them you have to install them first, and quite often you have no time for that, you just want to quickly check some cheat sheet.
With cheat.sh you don't need to install anything, just do:
curl cheat.sh/tar (or whatever)
you will get a cheat sheet for this command (if such a cheat sheet exists in one of the most popular community-driven cheat sheet repositories; but it surely does).
But then I thought: why actually show only existing cheat sheets? Why not generate them, or better to say answers, on the fly? And that is how the next major update of cheat.sh was created.
Now you can simply do:
curl cht.sh/python/copy+files
curl cht.sh/go/execute+external+program
curl cht.sh/js/async+file+read
or even
curl cht.sh/python/копировать+файл
curl cht.sh/ruby/Datei+löschen
curl cht.sh/lua/复制文件
and get your question answered
(cht.sh is an alias for cheat.sh).
And it does not matter what language you have used to ask the question. In short, all pairs (human language => programming language) are supported.
One very important advantage of console-oriented interfaces is that they are easily
programmable and can be easily integrated with various systems.
For example, Vim and Emacs plugins were created, by means of which you can
query the service directly from the editor: you just write your
questions in the buffer and convert them into code with a keystroke.
The service is of course far from perfect;
there are plenty of things to be fixed and to be implemented,
but now you can see its contours, and the contours of this approach:
console-oriented services.
The service (as well as the other mentioned above services) is opensource, its code is available here:
https://github.com/chubin/cheat.sh
What do you think about this service?
What do you think about this approach?
Have you already heard about these services before?
Have you used them?
If yes, what do you like about them and what are you missing?
-
"Fuck JavaScript, its such a shitty language" seems to be quite a common rant today. It seems as if JS is actually getting more hate than PHP, which is certainly odd, considering the stereotype.
So, as someone who has spent a lot of time in JS and a lot of time elsewhere, here are my views. Please, discuss your opinions with me as well. I am genuinely interested in an intelligent conversation about this topic.
So here's my background: learned HTML/CSS/JS in that order when I was 12 because I liked computers. I was pretty shitty at JS until I was at least 15, but you get the point, I've had it sloshing about in my brain for a while.
Now, JS certainly has its quirks, no doubt, but there's nothing about the language itself that I would say makes it shitty. It's a very easy language to use, but isn't overdeveloped like VB.net (Or, as I like to call it, TheresAFunctionForThat)
Most of the hate is centered around JS being used for a very broad range of systems. I doubt JS would be in the rant feed so often if it were to stay in its native ecosystem of web browsers. JS can be used in server backend, web frontend, desktop and mobile applications, and even in some system services (Although this isn't very popular as of yet). People seem to be terrified that one very easy to learn language can go so far. And, oh god, it's interpreted... How can a system app run off an interpreted language? That's absurd.
My opinion on JSEverything is that it's progress. That's what we're all about, right? The technologies already in place are unthreatened by JS, it isn't a gamechanger. The only thing JS integration is doing is making tedious and simple tasks easier. Big companies with large systems aren't going to jump ship and migrate to JS. A startup, however, could save a fucking ton of development time by using a JS framework. I want to live in a world where startups can become the next Google, because technology will stagnate when you're trying to protect your fortune, (Look at Apple for fucks sake) but innovation is born of small people with big ideas.
I have a feeling the hate for JS is coming from fear of abandoning what you're already doing. You don't have to do that. JS is only another option (And a very good one, which is why it's becoming so popular).
As for my personal opinion from my experiences... I've left this part till the end on purpose. I love programming and learning and creating, so I've never hated a language, really. It all depends on what I want to do. In the times I've played around with JS, I've loved it. Very very easy. The idea of having it on both ends of web development makes a lot of sense too, no conversion, just direct communication. I would imagine this really helps with speed, as well. I wouldn't use it in a complicated system, though. Small things, medium size projects: perfect. Running a bank? No.
So what do you think about this JSUniverse?
-
My company wants to start using Node.JS.
JavaScript.
They wanna use JavaScript.
For everything.
JAVASCRIPT.
FOR EVERYTHING.
Scene;
**Asshat enters break room after meeting**
**Asshat turns to Asshole**
Asshat: “Oh here in a year or two we’ll just be rewriting all of this is Node.JS.”
Asshole: “JavaScript. You’ll be rewriting it in JavaScript. And fucking WHY?”
Asshat: “It’s better”
Asshole: “It’s not really a general use language. Why wouldn’t you guys choose Python if you wanted to write EVERYTHING in a goddamn scripting language?”
Asshat: “Google uses Node.JS”
Asshole: “For back-end web development type stuff. I doubt their accounting systems are written in fucking JavaScript...”
Asshat: “Python is oooooold.”
Asshole (to himself): No you’re old, you stupid, ancient fuck.
**Asshole rolls his eyes and walks away**
**Asshat continues his ignorant chuckling**
End Scene;
Clearly years of fixed format RPG programming have killed too many of Asshat's brain cells.
-
(I wrote most of this as a comment in reply about Microsoft buying GitHub on another rant but decided to move it here because it is rant worthy. Also, no, I'm not a Microsoft employee nor do I have any Microsoft stock).
Microsoft buying GitHub makes sense. They contribute more to the open source community on GitHub than any other company. (Side note, they also contribute/have contributed to the Linux Kernel).
Steve Ballmer isn't running the show anymore. Because of that, we have awesome things like:
* Visual Studio Code - Completely free and powerful light weight IDE for coding in just about any script or language. This IDE is also open source, hosted on GitHub. It can be installed on Win/Mac/Linux.
* Visual Studio Community Edition: fully featured flagship IDE free for solo developers and students, can be installed on Win/Mac.
* Fully featured Sql Server running in a Docker container.
* .Net Core, which can be compiled to native binaries of Windows, MacOS AND Linux. You can't even do that with Java, you have to first have the JVM installed in order to run any kind of Java code on any of those operating systems. .Net Core is also an absolutely beautiful framework with so many features at your disposal.
...and more.
Yes, they've done bonehead things in the past but who/which company hasn't. Yes, they have Cortana. Yes, they force Bing on you when searching with Cortana (does anyone actually regularly use Cortana? Or Bing?). Yes, their operating system costs money. Yes, their malware-style Upgrade-to-Windows-10 tactics were evil and they admitted such. Yes, they brought ads and other unfortunate things to Skype. I'd be lying if I said I wasn't concerned about that Skype bit translating over into GitHub. BUT, the fact that so many of their employees use GitHub daily means they are dogfooding the platform, which is a positive thing.
Despite the flaws, from the perspective of a software engineer they really should be given a lot of credit for all these new directions they are moving in now. They directly aim to help and contribute to the developer community. Plus, Windows 10 is finally getting a dark theme! haha.
I think Microsoft buying GitHub makes a lot of sense. Of course do what you want about it, feel how you want about it, but casting the same ol' shade at them for anything they do seems a bit like automatic reflex more than anything else.
I'm bracing myself for the impending wave of angry hornets from the nest I just kicked. In all seriousness though, I welcome discussion on the topic even if you feel differently than I do. I'm not saying there's no reason to dislike them, just saying there are lots of new reasons to hate them less and/or appreciate what they are doing now.
-
Still trying to get good.
The requirements are forever shifting, and so do the applied paradigms.
I think the first layer is learning about each paradigm.
You learn 5-10 languages/technologies, get a feeling for procedural/functional/OOP programming. You mess around with some electronics engineering, write a bit of assembly. You write an ugly GTK program, an Android todo app, check how OpenGL works. You learn about relational models, about graph databases, time series storage and key value caches. You learn about networking and protocols. You void the warranty of all the devices in your house at some point. You develop preferences for languages and systems. For certain periods of time, you even become an insufferable fanboy who claims that all databases should be replaced by MongoDB, or all applications should be written in C# -- no exceptions in your mind are possible, because you found the Perfect Thing. Temporarily.
Eventually, you get to the second layer: Instead of being a champion for a single cause, you start to see patterns of applicability.
You might have grown to prefer serverless microservice architectures driven by pub/sub event busses, but realize that some MVC framework is probably more suitable for a 5-employee company. You realize that development is not just about picking the best language and best architecture -- It's about pros and cons for every situation. You start to value consistency over hard rules. You realize that even respected books about computer science can sometimes contain lies -- or represent solutions which are only applicable to "spherical cows in a vacuum".
Then you get to the third layer: Which is about orchestrating migrations between paradigms without creating a bigger mess.
Your company started with a tiny MVC webshop written in PHP. There are now 300 employees and a few million lines of code, the framework more often gets in the way than it helps, the database is terribly strained. Big rewrite? Gradual refactor? Introduce new languages within the company or stick with what people know? Educate people about paradigms which might be more suitable, but which will feel unfamiliar? What leads to a better product, someone who is experienced with PHP, or someone just learning to use Typescript?
All that theoretical knowledge about superior paradigms won't help you now -- No clean slates! You have to build a skyscraper city to replace a swamp village while keeping the economy running, together with builders who have no clue what concrete even looks like. You might think "I'll throw my superior engineering against this, no harm done if it doesn't stick", but 9 out of 10 times that will just end in a mix of concrete rubble, corpses and mud.
I think I'm somewhere between 2 and 3.
I think I have most of the important knowledge about a wide array of languages, technologies and architectures.
I think I know how to come to a conclusion about what to use in which scenario -- most of the time.
But dealing with a giant legacy mess, transforming things into something better, without creating an ugly amalgamation of old and new systems blended together into an even bigger abomination? Nah, I don't think I'm fully there yet.
-
I've optimised so many things in my time I can't remember most of them.
Most recently, something had to be the equivalent of `"literal" LIKE column` with a million rows to compare. It would take around a second on average to look up each literal, for a service that needs to handle high load at low latency. This isn't an easy case to optimise; many people would consider it impossible.
It took me a couple of hours to reverse engineer the data and write a few-hundred-line implementation that would do the lookup in 1ms on average, with the worst possible case being very rare and not too distant from this.
In another case there was a lookup of arbitrary time spans that most people would not bother to cache because the input parameters are too short lived and variable to make a difference. I replaced the 50000+ line application acting as a middle man between the application and database with 500 lines of code that did the look up faster and was able to implement a reasonable caching strategy. This dropped resource consumption by a minimum of factor of ten at least. Misses were cheaper and it was able to cache most cases. It also involved modifying the client library in C to stop it unnecessarily wrapping primitives in objects to the high level language which was causing it to consume excessive amounts of memory when processing huge data streams.
Another system would download a huge data set for every point of sale constantly, then parse and apply it. It had to reflect changes quickly but would download the whole dataset each time, containing hundreds of thousands of rows. I whipped up a system so that a single server (barring redundancy) would download it in a loop, parse it using C (which was much faster than the traditional interpreted language), then use a custom data differential format, TCP data streaming protocol, binary serialisation and LZMA compression to pipe it down to points of sale. This protocol also used versioning for catchup and differential combination for additional reduction in size. It went from being 30 seconds to a few minutes behind to being able to keep up to within a second of changes. It was also using so much bandwidth that it would reach the limit on ADSL connections then get throttled. I looked at the traffic stats afterwards and it dropped from dozens of terabytes a month to around a gigabyte or so a month for several hundred machines. The drop in the graphs was such that you'd think all the machines had been turned off; that's what it looked like. It could now happily run over GPRS or 56K.
I was working on a project with a lot of data and noticed these huge tables and horrible queries. The tables were all the results of queries. Someone wrote terrible SQL, then to optimise it ran it in the background with all possible variable values, then stored the results of joins and aggregates into new tables. On top of those tables they wrote more SQL. I wrote some new queries and query generation that wiped out thousands of lines of code immediately and operated on the original tables, taking things down from 30GB (and rapidly climbing) to a couple of GB.
Another time a piece of mathematics had to generate all possible permutations and the existing solution was factorial. I worked out how to optimise it to run n*n which believe it or not made the world of difference. Went from hardly handling anything to handling anything thrown at it. It was nice trying to get people to "freeze the system now".
I built my own frontend systems (admittedly rushed) that do what angular/react/vue aim for but with higher (maximum) performance, including an in-memory database to back the UI that had layered event-driven indexes and could handle referential integrity (an overlay on the database only revealing items with valid integrity) or reordering and repositioning events very rapidly using a custom AVL tree. You could layer indexes over it (data inheritance) that could be partial and dynamic.
So many times have I optimised things on automatic just cleaning up code normally. Hundreds, thousands of optimisations. It's what makes my clock tick.
-
It has been bugging the shit out of me lately... the sheer number of shit-tier "programmers" that have been climbing out of the woodwork the last few years.
I'm not trying to come across as elitist or "holier than thou", but it's getting ridiculous and annoying. Even on here, you have people who "only do frontend development" or some other lame ass shit-stain of an excuse.
When I first started learning programming (PHP was my first language), it wasn't because I wanted to be a programmer. I used to be a member (my account is still there, in fact) of "HackThisSite", back when I was about 12 years old. After hanging out long enough, I got the hint that the best hackers are, in essence, programmers.
Want to learn how to do SQL injection? Learn SQL - write a program that uses an SQL database, and ask yourself how you would exploit your own software.
Want to reverse engineer the network protocol of some proprietary software? Learn TCP/IP - write a TCP/IP packet filter.
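The packet filter suggestion is less work than it sounds. A rough, hypothetical sketch in C (Linux-only, raw AF_PACKET socket, needs root; error handling and real header parsing left out):

#include <arpa/inet.h>
#include <linux/if_ether.h>
#include <stdio.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void) {
    /* Grab every frame the kernel sees. */
    int fd = socket(AF_PACKET, SOCK_RAW, htons(ETH_P_ALL));
    if (fd < 0) { perror("socket (are you root?)"); return 1; }

    unsigned char frame[65536];
    for (;;) {
        ssize_t n = recv(fd, frame, sizeof frame, 0);
        if (n <= 0) break;
        /* 14-byte Ethernet header; for IPv4 (ethertype 0x0800) the
           "protocol" field sits 9 bytes into the IP header, i.e. offset 23. */
        if (n > 23 && frame[12] == 0x08 && frame[13] == 0x00)
            printf("%5zd bytes  ip proto %3u\n", n, (unsigned)frame[23]);
    }
    close(fd);
    return 0;
}

Once you're staring at raw bytes like this and decoding addresses, ports and TCP flags yourself, the network stops being magic. That's the point.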
Back then, a programmer and a hacker were very much one in the same. Nowadays, some kid can download Python, write a "hello, world" program and they're halfway to freelancing or whatever.
It's rare to find a programmer - a REAL programmer, one who knows how the systems he develops for better than the back of his hand.
These days, I find people want the instant gratification that these simpler languages provide. You don't need to understand how virtual memory works, hell many people don't even really understand C/C++ pointers - and that's BASIC SHIT right there.
Put another way, would you want to take your car to a brake mechanic that doesn't understand how brakes work? I sure as hell wouldn't.
Watching these "programmers" out there who don't have a fucking clue how the code they write does what it does, is like watching a grown man walk around with a kid's toolbox full or plastic toys calling himself a mechanic. (I like cars, ok?!)
*sigh*
Python, AngularJS, Bootstrap, etc. They're all tools and they have their merits. But god fucking dammit, they're not the ONLY damn tools that matter. Stop making excuses *not* to learn something, Mr."IOnlyDoFrontEnd".
Coding ain't Lego's, fuckers.
-
Most kids just want to code. So they see "Computer Science" and think "How to be a hacker in 6 weeks". Then they face some super simple algebra and freak out, eventually flunking out with the excuse that "uni only presents overtly theoretical shit nobody ever uses in real life".
They could hardly be more wrong, of course. Ignore calculus and complexity theory and you will max out on efficiency soon enough. Skip operating systems, compilers and language theory and you can only ever aspire to be a script kiddie.
You can't become a "data scientist" without statistics. And you can never grow to be even a mediocre one without solid basic research and physics training.
Hack, I've optimized literal millions of dollars out of cloud expenses by choosing the best processors for my stack, and weeks later got myself schooled (on devRant, of all places!) over my ignorance of their inner workings. And I have a MSc degree. Learning never stops.
So, to improve CS experience in uni? Tear down students' expectations, and boil out the "I just wanna code!" kiddies to boot camps. Some of them will be back to learn the science. The rest will peak at age 33.
-
DEAR CTOs, PLEASE ASK THE DEVELOPER OF THE SOFTWARE WHICH YOU ARE PLANNING TO BUY IN WHAT LANGUAGE AND WHAT VERSION THEY ARE WRITTEN IN.
Background: I worked a LONG time for a software company which developed a BIG CRM software suite for a very niche sector. The software company was quite successful and got many customers; even big companies bought our software. The thing is: the software is written in Ruby 1.8.7 and Rails 2. Even some customer servers are still running Debian Squeeze... Yes, this setup is still in production use in 2022. (Rails 7 is the current version). I really don't get why no one asked for the specific setup, they just bought it. We always told our boss that we need time to upgrade. But every time he told us that no one pays for a tech upgrade... So there it is: many TBs of customer data are in systems which are totally old, not updated, and with possible security issues.
-
Working in the embedded systems industry for most of my life, I can tell you methodical testing by the software engineers is significantly lacking, compared to higher level language development with unit tests etc., something I think the higher level, abstracted side of the industry actually hit out of the park successfully.
The culture around unit testing and testing in general is far superior in java and the rest.
Down here in embedded all too often I hear “well it worked on my setup... it worked at my desk”.. or Oh I forgot to test that part.. or I didn't think that particular value could get passed in... etc. I've heard it all. Then I've also heard, you can't do TDD or unit tests like high level on embedded... HORSESHIT!
You most definitely can! This book is a great book to prove a point, or to use as confirmation that you are doing things correctly. My history with this book: I was already doing my own technique of unit testing based on my experience in the high level world. Was it perfect? No, but I caught much more than if I hadn't done the testing. THEN I found this book, and was like ohh cool, I'm glad I'm on the right thought process, because essentially what they were doing in the book is what I was doing, just slightly less structured and missing a few things.
I’ve seen coworkers immediately think it’s impossible to utilize host testing .. wrong.
Come to find out most of the problems actually are related to lack of abstraction or forethought put into software system design by many lone wolf embedded developers.. either being alone, or not having to think about the repercussions of writing direct register writes in application code, or creating 1500 line “main functions” because their perception is “main = application”. (Not everyone is like this) but it seems to be related to the EEs writing code (they don't know what the CS knows) and the CS writing over-abstraction that won't fit on embedded... then you have CEs that either get both sides or don't.. the ones who understand the low level needs but also get high level concepts and paradigms and adapt them to low level requirements, BOOM those are the special folks.
ANYway.. the book is great because it's a great beginner book for those embedded folks who don't understand what TDD or unit testing is and think they can't do it because they are embedded. So all they do is ad-hoc testing on the fly, no recorded results, no concluding data, a very quick spot check and done....
If your embedded software engineers say they can’t unit test or do TDD or anything other than AdHoc Testing...Throw the book at them and say you want the unit test results report by next week Friday and walk away.
Lol
-
Warning: Long rant ahead!
So we built an amazing system for managing swarms of drones, and we have flown hundreds of hours, testing, etc.
Comes a client and says, that he wants to buy our system, but he wants to integrate it in a bigger system that is supposed to orchestrate many small systems.
Sounds like a deal.
So they send me on a week course (see previous rant: https://devrant.com/rants/2049071/...) to learn how to integrate our system in theirs.
I was sure that they have some API or something and it should be a breeze. but apparently they give us an SDK that includes all their files, and we have to build and run their entire system, and then build our own API inside of it!
And the reason we needed a week-long course, was to know all the paths where the XML configuration files exist!
So for the last month, I am hacking away inside this huge program, navigating thousands of files in a language I don't know, in order to build an API for their system, so that I can use it on our side.
Yesterday they informed us that a new version is available.
And sure enough, waiting in my inbox this morning was a link to download a new SDK.
No Changelog, No Instructions, Just a zip file with over 25,000 files.
So I phone my contact in their company to ask how exactly I am supposed to update their files, and his answer was: diff them!
WHAT! 25,000 files, half of them built by the c++ compiler, tens of configuration files scattered in different places, linking all the new libraries from scratch, are they crazy or what?
And then he tells me that they are working for 15 years this way. That's why everyone hates them I guess.
going to have a long day...
P.S. many more rants to come from this integration.
-
Currently I'm working on 3D game engine and making a 3D minesweeper game with it.
Not long ago I started creating a compiler using my own implementation (no Lex, no tools, nothing, just raw application of the algorithms), so that hopefully some day I will be able to make a language that works on top of GLSL inside my game engine. I have a compiler design class this semester which hasn't even started yet, and I've already made a lexical analyser generator. I also have another class about geographical information systems, for which I will be using my engine to create demos of some 3D rendering techniques like level of detail, or maybe to create something similar to ArcGIS, which we will be using.
Oh man I have many stuff I want to do.
Here is a gif showing the state of my minesweeper game. I clearly lack artistic skills lol. One thing I will be changing is to model the sphere as squares, not triangles.
Finally I want to mention that months ago I saw someone here on devRant making a Voronoi-diagram variant of this, which inspired me to make this.
I made a long post, so
TL;DR: having fun reinventing the wheel and learning 😀
-
6 NEW Programming Languages of 2k16
1. Go
Golang Programming Language from Google
Let's start the list of the six best new programming languages with Go, also known as Golang. Go is an open source programming language developed by three employees of Google and launched in 2009, very cool, just 3 people.
Go originated and developed from popular programming languages such as C and Java; it offers the advantage of compact notation and aims to keep the code simple and easy to read/understand. Go's language designers, Robert Griesemer, Rob Pike and Ken Thompson, revealed that the complexity of C++ was their main motivation.
This simple programming language lets us complete most tasks with just the standard libraries. Combining the speed of dynamic programming languages such as Python with the reliability of C/C++, Go is one of the best tools for building high-volume distributed systems.
You also need to know that, as expressed by Tokopedia's CTO, Mas Leon, Tokopedia will switch to Golang as the main foundation of its systems. Insane, right?
2. Swift
Swift Programming Language from Apple
Apple launched the Swift programming language back at WWDC 2014 as a successor to Objective-C. Designed to be as simple as it is, Swift focuses on speed and safety.
Furthermore, in December 2015, Apple's Swift became open source under the Apache license. Since its launch, Swift has won attention and the community has grown well; it has become one of the 'hottest' programming languages in the world.
Learning Swift helps make sure you get a brighter future and gives you the ability to develop applications for Apple's vast iOS ecosystem.
3. Rust
Rust Programming Language from Mozilla
Developed by Mozilla and released back in 2014; in StackOverflow's 2016 developer survey, Rust was selected as the most preferred programming language.
Rust was developed as an alternative to C++ for Mozilla itself, and is referred to as a programming language that focuses on "performance, parallelisation, and memory safety".
Rust was created from scratch and implements a modern programming language design. The language itself is supported very well by many developers out there, and by libraries.
4. Julia
Julia Programming Language
The Julia programming language was designed to help mathematicians and data scientists. It is called "a complete high-level and dynamic programming solution for technical computing".
Julia is slowly but surely growing in terms of users, and on average its user base doubles every nine months. In the future, it will be seen as one of the "most expensive skills" in the finance industry.
5. Hack
Hack Programming Language from Facebook
Hack is another programming language developed by Facebook in 2014.
The social networking giant Facebook developed Hack and touts it as the best of its successes. Facebook even migrated its entire system developed with PHP to Hack.
Facebook also released an open source version of the programming language as part of the HHVM runtime platform.
6. Scala
Scala Programming Language
Scala is actually a relatively old programming language compared to the other languages on our list. While one view is that this programming language is relatively difficult to learn, the time you invest in learning Scala will not end up sad and disappointing.
Its complex feature set gives you the ability to write better-structured, performance-oriented code. As a programming language based on OOP (object-oriented programming) and functional programming, it provides the ability to write code that is capable of evolving. Created with the goal of designing a "better Java", Scala became one of the programming languages most needed in large enterprises.
-
I spent over a decade of my life working with Ada. I've spent almost the same amount of time working with C# and VisualBasic. And I've spent almost six years now with F#. I consider all of these great languages for various reasons, each with their respective problems. As these are mostly mature languages some of the problems were only knowable in hindsight. But Ada was always sort of my baby. I don't really mind extra typing, as at least what I do, reading happens much more than writing, and tab completion has most things only being 3-4 key presses irl. But I'm no zealot, and have been fully aware of deficiencies in the language, just like any language would have. I've had similar feelings of all languages I've worked with, and the .NET/C#/VB/F# guys are excellent with taking suggestions and feedback.
This is not the case with Ada, and this will be my story, since I've no longer decided anonymity is necessary.
First few years learning the language I did what anyone does: you write shit that already exists just to learn. Kept refining it over time, sometimes needing to do entire rewrites. Eventually a few of these wound up being good. Not novel, just good stuff that already existed. Outperforming the leading Ada company in benchmarks kind of good. At the time I was really gung-ho about the language. Would have loved to make Ada development a career. Eventually build up enough of this, as well as a working, but very bad performing compiler, and decide to try to apply for a job at this company. I wasn't worried about the quality of the compiler, as anyone who's seriously worked with Ada knows, the language is remarkably complex with some bizarre rules in dark corners, so a compiler which passes the standards test indicates a very intimate knowledge of the language few can attest to.
I get told they didn't think I would be a good fit for the job, and that they didn't think I should be doing development.
A few months of rapid cycling between hatred and self loathing passes, and then a suicide attempt. I've got past problems which contributed more so than the actual job denial.
So I get better and start working even harder on my shit. Get the performance of my stuff up even better. Don't bother even trying to fix up the compiler, and start researching about text parsing. Do tons of small programs to test things, and wind up learning a lot. I'm starting to notice a lot of languages really surpassing Ada in _quality of life_, with things like package managers and repositories for those, as well as social media presence and exhaustive tutorials from the community.
At the time I didn't really get programming language specific package managers (I do now), but I still brought this up to the community. Don't do that. They don't like new ideas. Odd for a language which at the time was so innovative. But social media presence did eventually happen with a Twitter account that is most definitely run by a specific Ada company masquerading as a general Ada advocate. It did occasionally draw interest to neat things from the community, so that's cool.
Since I've been using both VisualStudio and an IDE this Ada company provides, I saw a very jarring quality difference over the years. I'm not gonna say VS is perfect, it's not. But this piece of shit made VS look like a polished streamlined bug free race car designed by expert UX people. It. Was. Bad. Very little features, with little added over the years. Fast forwarding several years, I can find about ten bugs in five minutes each update, and I can't find bugs in the video games I play, so I'm no bug finder. It's just that bad. This from a company providing software for "highly reliable systems"...
So I decide to take a crack at writing an editor extension for VS Code, which I had never even used. It actually went well, and as of this writing it has over 24k downloads, and I've received some great comments from some people over on Twitter about how detailed the highlighting is. Plenty of bespoke advertising the entire time in development, of course.
Never a single word from the community about me.
Around this time I had also started a YouTube channel to provide educational content about the language, since there's very little, except large textbooks which aren't right for everyone. Now keep in mind I had written a compiler which at least was passing the language standards test, so I definitely know the language very well. This is a standard the programmers at these companies will admit very few people understand. YouTube channel met with hate from the community, and overwhelming thanks from newcomers. Never a shout out from the "community" Twitter account. The hate went as far as things like how nothing I say should be listened to because I'm a degenerate Irishman, to things like how the world would have been a better place if I was successful in killing myself (I don't talk much about my mental illness, but it shows up).
I'm strictly a .NET developer now. All code ported.
-
I was drunk at a party and so was this guy, that I knew from scouting and who knew that I was capable of programming, even tho he very clearly disagreed with my choice in language.
We started talking about this new system that we (all scouts in Denmark) have to use, and he told me how his work was affected by the fact that this system's API is the purest of shit.
He told me that he would really like someone to help him with his work, cause right now he was alone. They were looking for someone new, but for some reason the boss wanted a new guy to have 5 years of experience in Java... Which they don't use.
So he got my information and would put in a good word for me -
Been really busy with things haven’t got around to posting a book in like a week or so..
But I’ll post one today..
This book...
This book, available for free online (or you can buy it), was written in 1994. But it's so underappreciated that for some reason most people have never seen it or heard of it. But this is the ONLY book I know of that actually covers this topic.. the only book in existence that specifically goes through how OOP can be done with C.
NOW hold up before you say just use C++ stop and think for a second.. bear with me.
First off, this book is purely for informational and educational use, to deepen your understanding of what OOP is actually doing behind the scenes in languages like C++, where keywords exist for these things and you just blindly use them without thinking about what's under the hood.
This book contains a lot of code and builds up a complete library from scratch to do OOP in C... now I don't take this book literally, but I have implemented some concepts from this book in projects in the past, and it helps a lot.
Also, in my honest opinion, if you finish this book you will be a better C programmer AND C++ programmer. A better C programmer because it teaches you a lot about complex things that you never thought about doing with the language: it proves you can do polymorphism, inheritance and encapsulation. And it's not really bloated either.
This book is an awesome book: if you don't understand C pointers you definitely will after this book.. if you don't understand what's really going on with OOP in C++.. you will after this book. After all, C++ began as just a preprocessor of C.
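To give a flavor of the technique (a bare-bones sketch of the general idea, not the book's actual code): encapsulation through a struct that callers never see the inside of, and polymorphism through a function pointer acting as a one-entry vtable:

#include <stdio.h>
#include <stdlib.h>

/* "Base class": every shape carries its own area function. */
typedef struct Shape Shape;
struct Shape {
    double (*area)(const Shape *self);
};

/* "Subclass": the base is the first member, so a Rect * is usable as a Shape *. */
typedef struct {
    Shape base;
    double w, h;
} Rect;

static double rect_area(const Shape *self) {
    const Rect *r = (const Rect *)self;   /* downcast, C style */
    return r->w * r->h;
}

static Shape *rect_new(double w, double h) {
    Rect *r = malloc(sizeof *r);          /* error handling omitted */
    r->base.area = rect_area;
    r->w = w;
    r->h = h;
    return &r->base;                      /* callers only see the base */
}

int main(void) {
    Shape *s = rect_new(3.0, 4.0);
    printf("area = %.1f\n", s->area(s));  /* dynamic dispatch */
    free(s);
    return 0;
}

Add a second struct with its own area function and the same calling code keeps working: that's the polymorphism part. Keep the concrete struct definitions out of the public header and you get the encapsulation part.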
Great book for writing reusable, extendable large scale embedded c systems.
Anyway.. a rare book which should not be rare, considering it's free.
-
First year: intro to programming, basic data structures and algos, parallel programming, databases and a project to finish it. Homework should be kept track of via some version control. Should also be some calculus and linear algebra.
Second year:
Introduce more complex subjects such as programming paradigms, compilers and language theory, low level programming + logic design + basic processor design, logic for system verification, statistics and graph theory. Should also be a project with a company.
Year three:
Advanced algos, data structures and algorithm analysis. Intro to computer and data security. Optional courses in graphics programming, machine learning, compilers and automata, embedded systems etc. Ends with a big project that goes in depth into a CS subject, not a regular software project in Java basically.
-
WHY THE FUCK DO MY TEACHERS KEEP USING SHITTY TRANSLATIONS FOR PROGRAMMING CONCEPTS?! Like dude, everything related to programming is in english, just use the fucking terms in english for fucks sake. There are some words like "array" that fit into portuguese sentences without needing translation, so why translate it?
Why do you use acronyms in portuguese? People in the Database Systems class will later read the acronym DBMS a lot, but won't know what the fuck that is, because they teach the acronym SGBD, which is a translation.
It's so cringy and useless, so many terms the students will have to translate back to english when they get out to the real world because everything related to programming is in english.
"oh but what if the person doesn't know english" you don't even have to know english, just associate the concept (which will be explained to you in your language) with an english word. Also if you don't know english you'll have a very hard time, so I'd suggest taking english classes as your electives.
Ok I'm done, I got it out of my system.
-
Buckle up, it's a long one.
Let me tell you why "Tree Shaking" is stupidity incarnate and why Rich Harris needs to stop talking about things he doesn't understand.
For reference, this is a direct response to the 2015 article here: https://medium.com/@Rich_Harris/...
"Tree shaking", as Rich puts it, is NOT dead code removal apparently, but instead only picking the parts that are actually used.
However, Rich has never heard of a C compiler, apparently. In C (or any systems language with basic optimizations), public (visible) members exposed to library consumers must have that code available to them, obviously. However, all of the other cruft that you don't actually use is removed - hence, dead code removal.
How does the compiler do that? Well, it does what Rich calls "tree shaking" by evaluating all of the pieces of code that are used by any codepaths used by any of the exported symbols, not just the "main module" (which doesn't exist in systems libraries).
It's the SAME FUCKING THING, he's just not researched enough to fully fucking understand that. But sure, tell me how the javascript community apparently invented something ELSE that you REALLY just repackaged and made more bloated/downright wrong (React Hooks, webpack, WebAssembly, etc.)
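For anyone who has never watched a C toolchain do this, a tiny sketch (assuming a GCC/Clang-style toolchain; the flags below are the usual ones, nothing exotic):

/* mathlib.c : pretend this is the library */
int used_everywhere(int x) { return x * 2; }

int never_called(int x) {           /* public, but nothing references it */
    return x * x * x;
}

static int private_helper(int x) {  /* not even visible outside this file */
    return x + 42;
}

/* main.c : the only consumer */
#include <stdio.h>
int used_everywhere(int x);

int main(void) {
    printf("%d\n", used_everywhere(21));
    return 0;
}

/* An unreferenced static like private_helper is dropped by the compiler
   (and flagged with -Wall). Unused external symbols like never_called
   normally survive, but typically disappear from the final binary with
   link-time garbage collection, e.g.
       cc -O2 -ffunction-sections -Wl,--gc-sections mathlib.c main.c
   or with -flto. Which is exactly the "only include what's reachable"
   behavior the article rebrands as live code inclusion. */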
Speaking of Javascript, "tree shaking" is impossible to do with any degree of confidence, unlike statically typed/well defined languages. This is because you can create artificial references to values at runtime using string functions - which means, with the right input, almost anything can be run depending on the input.
How do you figure out what can and can't be reached? You can't! Since the codepaths and decision tree depend on runtime input, you run into the properties of Turing's halting problem, which cannot be solved completely.
With stricter languages such as C (which is where "dead code removal" is used quite aggressively), you can make very strong assertions at compile time about the usage of code. This is simply how C is still thousands of times faster than Javascript.
So no, Rich Harris, dead code removal is not "silly". Your entire premise about "live code inclusion" is technical jargon and buzzwordy drivel. Empty words at best.
This sort of shit is annoying and only feeds into this cycle of the web community not being Special enough and having to reinvent every single fucking facet of operating systems in your shitty bloated spyware-like browser and brand it with flashy Matrix-esque imagery and prose.
Fuck all of it.
-
We should start with demystifying tech...
For most people, modern phones, tablets and pcs are magical rectangles...
Clarke's law says that any sufficiently advanced technology is indistinguishable from magic.
And we have to tackle that.
In geography, we should talk about GPS and GLONASS
In English or foreign language lessons, we should speak about translator bots and language patterns/abstractions
In physics, we have to understand the measurement devices
In politics, we have to speak about licenses of use, we have to speak about net neutrality as a political concept, we have to speak about Snowden, the Shadow Brokers, the Vault leaks, all the laws some shady imperial bureaucrats pipe into our lives.
Trojans used by the government and so on...
In CS, concepts of operating systems, abstractions and networking should be taught, instead of using excel.
That could be done in math...
Well... No one should have to work with excel.
In maths they could use Wolfram Alpha, R and gnuplot, for example
-
Many people here rant about the dependency hell (rightly so). I'm doing systems programming for quite some time now and it changed my view on what I consider a dependency.
When you build an application you usually have a system you target and some libraries you use that you consider dependencies.
So the system is basically also a dependency (which is abstracted away in the best case by a framework).
What many people forget are standard libraries and runtimes. Things like strlen, memcpy and so on are not available on many smaller systems but you can provide implementations of them easily. Things like malloc are much harder to provide. On some system there is no heap where you could dynamically allocate from so you have to add some static memory to your application and mimic malloc allocating chunks from this static memory. Sometimes you have a heap but you need to acquire the rights to use it first. malloc doesn't provide an interface for this. It just takes it. So you have to acquire the rights and bring them magically to malloc without the actual application code noticing. So even using only the C standard library or the POSIX API can be a hard to satisfy dependency on some systems. Things like the C++ standard library or the Go runtime are often completely unavailable or only rudimentary.
For those of you aiming to write highly portable embedded applications please keep in mind:
- anything except the bare language features is a dependency
- require small and highly abstracted interfaces, e.g. instead of malloc, require a pointer and a size to be given to your application instead of your application taking it (see the sketch after this list)
- document your ABI well because that's what many people are porting against (and it makes it easier to interface with other languages)
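As an illustration of that malloc point, a minimal sketch (names invented; a real pool would handle alignment and ownership more carefully) where the library asks for memory instead of taking it:

#include <stddef.h>
#include <stdint.h>

/* The library never calls malloc; the integrator hands it memory up front. */
typedef struct {
    uint8_t *base;
    size_t   size;
    size_t   used;
} mem_pool_t;

void pool_init(mem_pool_t *pool, void *memory, size_t size) {
    pool->base = memory;
    pool->size = size;
    pool->used = 0;
}

/* Dead-simple bump allocator; returns NULL when the pool is exhausted. */
void *pool_alloc(mem_pool_t *pool, size_t n) {
    n = (n + 7u) & ~(size_t)7u;      /* keep offsets a multiple of 8 */
    if (pool->used + n > pool->size)
        return NULL;
    void *p = pool->base + pool->used;
    pool->used += n;
    return p;
}

/* The application decides where that memory lives: a static buffer,
   a heap region it had to acquire the rights to first, whatever the platform allows. */
static uint8_t app_memory[4096];

int lib_start(void) {
    mem_pool_t pool;
    pool_init(&pool, app_memory, sizeof app_memory);
    void *scratch = pool_alloc(&pool, 256);
    return scratch != NULL;
}

This way the "acquire the rights to the heap first" dance stays in the application, and the library stays portable.
-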
There are a couple of them to list! But to sum my main ones(biggest personal heroes):
John McCarthy, one of the founding fathers of Artificial Intelligence and credited with coining the term (sometime before 1960, if memory serves right), a mathematical prodigy; the man based the original model of the Lisp programming language on lambda calculus. Many modern concepts that we have in programming were implemented in one way or another from his systems back in the day, and as a data analyst and ML nut.....well I am a big fan.
Herb Sutter: C++ programmer extraordinaire. I appreciate him more for his lectures and published articles than anything else. Incredibly smart and down to earth and manages to make C++ less intimidating while still approaching it with respect.
Rich Hickey: The mastermind behind Clojure, the Lisp dialect for the JVM. Rich is really talented and his lectures behind his motivations and reasons behind everything he does with Clojure are fascinating to see.
Ryan Dahl: Awww shit y'all know how it is. The man changed web development both in the backend and the frontend for good. The concept of people writing their own servers to run their pages was not new, but the Node JS runtime environment made it more widely available to people by means of a simple to use language that was already popular with web developers. I would venture to say that Ryan's amazing contributions to JS made the language better, as it stands, the language continues to evolve and new features that make it overall better keep being added. He is currently building Deno, which would be a runtime environment for TypeScript, in Rust.
Anders Hejlsberg: This dude was everywhere man....the original author of Turbo Pascal and the lead of Delphi back in the day. These RAD tools paved the way for what would be a revolution in the computing world. The dude is also the lead architect and designer of the C# programming language as well as TypeScript.
This fucker is everywhere and I love it.
Yukihiro "Matz" Matsumoto: Matsumoto san is the creator of the Ruby programming language. Not only am I a die hard fan of Ruby, but of the core philosophies that the man keeps as the core of his language design: Make the developer happy, principle of least surprise. Also I follow: minswan which is a term made by the Ruby community that states Mats is nice so we are nice. <---- because being cool to others is better than being a passive aggressive cunt.
Steve Wozniak: I feel as if the man does not get enough recognition...the man designed the Apple || computer which (regardless of how much most of y'all bitch and whine) paved the way for modern micro computers. Dude is also accredited with designing one of the first programmable universal remotes(which momma said was shitty) but he did none the less.
Alan Kay: Developed Smalltalk and the original OOP way of doing things. Smalltalk as a concept is really fucking interesting. If you guys ever get the chance, play with Pharo, which is a modern Smalltalk. The thing is really interesting and the overall idea of Smalltalk can be grasped in very little time. It sucks because the software scales beautifully in terms of project building, the idea of hoisting a program as its own runtime environment and ide by preserving state through images is just mind blowing to me. Makes file based programs feel....well....quaint.
Those are some of the biggest dudes for me. I know that the list is large, but I wanted to give credit to the people that inspired me the most. Honorary mention goes to other language creators and engineers of course, but it would be way too large to list!9 -
(long post is long)
This one is for the .net folks. After evaluating the technology top to bottom and even reimplementing several examples I commonly use for smoke testing new technology, I'm just going to call it:
Blazor is the next Silverlight.
It's just beyond the pale in terms of being architecturally flawed, and yet they're rushing it out as hard as possible to coincide with the .Net 5 rebranding silo extravaganza. We are officially entering round 3 of "sacrifice .Net on the altar of enterprise comfort." Get excited.
Since we've arrived here, I can only assume the Asp.net Ajax fiasco is far enough in the past that a new generation of devs doesn't recall its inherent catastrophic weaknesses. The architecture was this:
1. Create a component as a "WebUserControl"
2. Any time a bound DOM operation occurs from user interaction, send a payload back to the server
3. The server runs the code to process the event; it spits back more HTML
Some client-side js then dutifully updates the UI by unceremoniously stuffing the markup into an element's innerHTML property like so much sausage.
If you understand that, you've adequately understood how Blazor works. There's some optimization like signalR WebSockets for update streaming (the first and only time most blazor devs will ever use WebSockets, I even see developers claiming that they're "using SignalR, Idserver4, gRPC, etc." because the template seeds it for them. The hubris.), but that's the gist. The astute viewer will have noticed a few things here, including the disconnect between repaints, inability to blend update operations and transitions, and the potential for absolutely obliterative, connection-volatile, abusive transactional logic flying back and forth to the server. It's the bring out your dead approach to seeing how much of your IT budget is dedicated to paying for bandwidth and CPU time.
Blazor goes a step further in the server-side render scenario and sends every DOM event it binds to the server for processing. These include millisecond-scale events like scroll, which, at least according to GitHub issues, devs are quickly realizing requires debouncing, though they aren't quite sure how to accomplish that. Since this immediately becomes an issue with tickets saying things like, "scroll event crater server, Ugg need help! You said Blazorclub good. Ugg believe, Ugg wants reparations!" the team chooses a great answer to many problems for the wrong reasons:
gRPC
For those who aren't familiar, gRPC has a substantial amount of compression primarily courtesy of a rather excellent binary format developed by Google. Who needs the Quickie Mart, or indeed a sound markup delivery and view strategy when you can compress the shit out of the payload and ignore the problem. (Shhh, I hear you back there, no spoilers. What will happen when even that compression ceases to cut it, indeed). One might look at all this inductive-reasoning-as-development and ask themselves, "butwai?!" The reason is that the server-side story is just a way to buy time to flesh out the even more fundamentally broken browser-side story. To explain that, we need a little perspective.
The relationship between Microsoft and it's enterprise customers is your typical mutually abusive co-dependent relationship. Microsoft goes through phases of tacit disinterest, where it virtually ignores them. And rightly so, the enterprise customers tend to be weaksauce, mono-platform, mono-language types who come to work, collect a paycheck, and go home. They want to suckle on the teat of the vendor that enables them to get a plug and play experience for delivering their internal systems.
And that's fine. But it's also dull; it's the spouse that lets themselves go, it's the girlfriend in the distracted boyfriend meme. Those aren't the people who keep your platform relevant and competitive. For Microsoft, that crowd has always been the exploratory end of the developer community: alt.net, and more recently, the dotnet core community (StackOverflow 2020's most loved platform, for the haters). Alt.net seeded every competitive advantage the dotnet ecosystem has, and dotnet core capitalized on. Like DI? You're welcome. Are you enjoying MVC? Your gratitude is understood. Cool serializers, gRPC/protobuff, 1st class APIs, metadata-driven clients, code generation, micro ORMs, etc., etc., et al. Dear enterpriseur, you are fucking welcome.
Anyways, b2blazor. So, the front end (Blazor WebAssembly) story begins with the average enterprise FOMO. When enterprises get FOMO, they start to Karen/Kevin super hard, slinging around money, privilege, premiere support tickets, etc. until Microsoft, the distracted boyfriend, eventually turns back and says, "sorry babe, wut was that?" You know, shit like managers unironically looking at cloud reps and demanding to know if "you can handle our load!" Meanwhile, any actual engineer hides under the table facepalming and trying not to die from embarrassment.36 -
1. Build a language interpreter from scratch. Without any tool like Gnu Bison
2. Get comfortable with modern C++
3. Find an internship related with operating systems or other similar areas. -
Why is it that virtually all new languages in the last 25 years or so have a C-like syntax?
- Java wanted to sort-of knock off C++.
- C# wanted to be Java but on Microsoft's proprietary stack instead of SUN's (now Oracle's).
- Several other languages such as Vala, Scala, Swift, etc. do only careful evolution, seemingly so as to not alienate the devs used to previous C-like languages.
- Not to speak of everyone's favourite enemy, JavaScript…
- Then there is ReasonML which is basically an alternate, more C-like, syntax for OCaml, and is then compiled to JavaScript.
Now we're slowly arriving at the meat of this rant: back when I started university, the first semester programming lecture used Scheme, and provided a fine introduction to (functional) programming. Scheme, like other variants of Lisp, is a fine language, very flexible, code is data, data is code, but you get somewhat lost in a sea of parentheses, probably worse than the C-like languages' salad of curly braces. But it was a refreshing change from the likes of C, C++, and Java in terms of approach.
But the real enlightenment came when I read through Okasaki's paper on purely functional data structures. The author uses Standard ML in the paper, and after the initial shock (because it's different than most everything else I had seen), and getting used to the notation, I loved the crisp clarity it brings with almost no ceremony at all!
After looking around a bit, I found that nobody seems to use SML anymore, but there are viable alternatives, depending on your taste:
- Pragmatic programmers can use OCaml, which has immutability by default, and tries to guide the programmer to a functional programming mindset, but can accommodate imperative constructs easily when necessary.
- F# was born as OCaml on .NET but has now evolved into its own great thing with many upsides and very few downsides; I recommend every C# developer should give it a try.
- Somewhat more extreme is Haskell, with its ideology of pure functions and lazy evaluation that makes introducing side effects, I/O, and other imperative constructs rather a pain in the arse, and not quite my piece of cake, but learning it can still help you be a better programmer in whatever language you use on a day-to-day basis.
Anyway, the point is that after working with several of these languages developed out of the original Meta Language, it baffles me how anyone can be happy being a curly-braces-language developer without craving something more succinct and to-the-point. Especially when it comes to JavaScript: all the above mentioned ML-like languages can be compiled to JavaScript, so developing directly in JavaScript should hardly be a necessity.
Obviously these curly-braces languages will still be needed for a long time coming, legacy systems and all—just look at COBOL—, but my point stands.7 -
This basically is me rambling all my thoughts that have been clouding my mind.
Learning other programming languages after learning the first is harder than I expected. I learned Python first, but that's making it harder to learn others (which I know aren't similar, but still): C, ES6, PHP, etc. I need to figure out what makes each one special and get a proper path instead of learning them all the same way. Which is easier for the web dev languages, but fuck man, I just need a good path for them and I'm good. Like: learn this, this, this, this, that and that, and I've got a basic understanding of the language, I don't need to stress, and I can casually build my knowledge from here now that I understand all this. Cause I love programming and I want to be the best I can be and just get to the level I am at with Python. And at some point I have to learn about basic electronics and how to program Arduinos with C so I can do stuff with that, because I really really REALLY want to.
It doesn't stop there. I want to learn another language, and no, I'm not talkin bout programming anymore, I mean I wanna learn Japanese and German (but Japanese primarily), but it doesn't help that I'm always either in school, studying, programming, or playing games. I just can't find time to practice Hiragana & Katakana (two basic writing systems in Japan) and it doesn't help that I'm a lazy procrastinating piece of shit that doesn't have and can't keep a proper schedule, and hell, I barely can English and it's my native tongue. Ugh. It'd be better if I had a native speaker to help me tbh.
And finally, I want to learn basic pixel animating. I have dreamed since I was a kid of doing some kind of animation and programming, and I want to do both for games. I want to program for fun, but it doesn't help that I can't draw sprites or anything for shit. I can't get it and I'm just fucked, but I'm going to ask some people I know and a few subreddits for advice/help/resources with that
Welp that was the Bubbles Power Hour none of you probably are keen followers of mine and if I had any I'd be shocked and honored but thanks for reading anyways and any advice on anything is always appreciated!random rambling electronics es6 stress language learning php python c foreign languages pixel art javascript11 -
Trying to re-type a massive essay I lost because the app refreshed for some reason. I'll try to keep it short (spoiler: I lied).
Recently, I had a conversation with a couple of non-tech people about AI and the fear of computers making humans obsolete. I have some strong (borderline ranty) opinions about this, and thought I'd post here to see what reaction it gets.
This is not a "machines will destroy us" post, it's more about the very legitimate fear of losing jobs.
- AI is a tool. Its main use would be to help optimise the more complex routine tasks and free up people's time to be more creative in their jobs. Basically, it's the next step of automation.
- Human intuition can never be replaced. Sometimes, things just seem a bit off. Sure, an AI would avoid ever getting in that situation, but only if it had learnt it in the past. A human will always have to be at the helm of any such system.
- Achieving true intelligence and sentience is like trying to travel at the speed of light. The closer you get, the more challenges you face.
- Getting hyped by sensationalist news that claims the end is nigh because two computers optimised the language they used to communicate when trying to reach a goal is stupid. All this shows is that the tech is working as expected and the systems can optimise on the fly. To me, this was a pretty awesome moment.
Now, I'm not saying dystopia is impossible, neither am I saying that it is inevitable. Just like any tool presented to us, if we use it responsibly, we can make life and society a lot better.5 -
It's been 5 years this month since I started learning programming, getting interested after learning about Linux, wanting to do operating systems and games.
I started with C++, went on to C and assembly language for about 2 years and gave up on it for the most part.
Afterward did Java for two years and hated every second of it! Switched to Python instead (been using it since 2.7.5).
Now I do Haskell and JavaScript and those languages do everything so much easier I can never see myself ever going back!2 -
I wrote a tech book several years ago for O'Reilly, which itself was a dream come true. I'm still amazed I got that deal done, and the fact that my name was on a title with a unique animal on the cover is SUPER cool.
Back then, their publishing system was based on Git with their own markup language, and it was sort of a chore to use. Easy and straightforward, but laborious. I spent 3 entire days just (re)formatting my drafts to their code. They've upgraded it since, I see, based on the same fundamental versioning idea and still using Git. Neat!
I've also done tech writing for .NET Magazine, which used Word's change tracking, and penned articles for other publications using Google Docs, or even drafts in WordPress.
Have all of you run into interesting systems used by publishers to manage content?2 -
every day I see full stack here and there...
full stack is not only DB and code, but also "every step the bit goes through" from the end user's screen/input to the server and back to them
whether it's an app or a service, the end user is only an example.
it's about knowing how the language behaves, how the server interprets and replies to requests, protocols, even how to do every single configuration on the systems you are using, and from my point of view that includes hardware.
pretty much that...
I get sick when I see a resume claiming "I'm a full stack dev" and there's nothing on it saying that the guy knows at least how to change a light bulb... lol
Even worse, when I see job offers asking for "Full stack Dev, with no experience" ...
that's not possible without experience! sorry9 -
Please share C++ advice for devs new to the language. What do you guys think is the fastest route to self-learning C++ development for infrastructure systems? And please suggest resources.11
-
Lots of good suggestions up in here.
My personal prefference:
Just as there are governing bodies dictating how a programming language evolves, and a web consortium, there should be a computer science one. One that dictates fundamental approaches covering everything that belongs to this wonderful branch of science. Everything from math to different scientific branches, all the way down to turtles. And for it to be standardized and updated. Indeed, if you want to spend your entire existence gobbling JS in the form of web sites then that is fine, but you should have sufficient knowledge to branch out into more academic pursuits if required.
Also, updated tools would be better; every aspiring computer scientist should be able to navigate all major operating systems and programming environments regardless of their beliefs and/or preferences, and schools should provide said environments in their classrooms.
Data Structures and Algorithms should be a must. Software engineering principles should be a must. Calculus, Algebra and Statistics as well as Physics should be a must.
And successfully navigating across different engineering areas should be a must.
Not to cleanse the industry. Fuck your elitist mentality. If you think that programming is a sacred art that should exclude people then I really hope you fucking disappear from existence. No, not to cleanse. But to expand the industry and maybe show people that there is more than fucking around between node modules or gemsets.
Peace pendejos
**drops your mom's fatass...i mean mic** -
Been programming one language or another since the 90s. So I have been exposed to a lot of things and worked on a lot of different systems. However I have never heard of Fizz Buzz before. I heard it was something they use to test people's programming skills during an interview. I figured I better look it up in case I get asked this during an interview. Of course I found a nice explanation on wikipedia:
https://en.wikipedia.org/wiki/...
I was shocked. This is being used to test programmers for competency? This is so trivial a non programmer could write the pseudocode to solve this problem. Is the bar really this low?
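For reference, the entire exercise fits in a dozen lines of C; a minimal sketch:

/* FizzBuzz: print 1..100, replacing multiples of 3 with "Fizz",
 * multiples of 5 with "Buzz", and multiples of both with "FizzBuzz". */
#include <stdio.h>

int main(void)
{
    for (int i = 1; i <= 100; i++) {
        if (i % 15 == 0)      puts("FizzBuzz");   /* divisible by 3 and 5 */
        else if (i % 3 == 0)  puts("Fizz");
        else if (i % 5 == 0)  puts("Buzz");
        else                  printf("%d\n", i);
    }
    return 0;
}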
I remember I didn't want to pay for the C programming class in college. So I bought a book on C++, read it cover to cover and wrote a bit of code. I then tested out of the C course (I didn't know C was much different from C++ then; I started with Pascal). I didn't do that great on the written test. The coding test, however, I passed easily. I formatted the text in nice rows and columns using the modulus operator. The instructor said: "I have never seen anybody make it look this nice." Then I was shocked because that is "just how you do it".
It just seems to me that if fizz buzz is hard, then this may not be the right field for you. Am I egotistical in that opinion? None of this programming stuff has ever been particularly difficult for me.2 -
Last week me and my friend were moved from a legacy PHP project to a new Ruby on Rails-based setup. What at first looked like a great improvement is now becoming a nightmare.
All this convention-over-configuration is awesome - but only if you already know the conventions, or if somebody told 'em to you.
And everything is getting even more out of control because the damn project is based upon the Spree gem and several other extensions that MUST be changed to meet our company's needs.
I'm getting really mad with all this pressure. Ruby seems to be a great language, but I'd rather be working with Laravel. Its overall organization, the centralization of CLI commands in artisan, and the astoundingly clear, eloquent, direct and well-designed documentation made my adoption curve there a little more pleasant.
I mean, legacy PHP systems are awful, but Laravel framework sounds way more easy-to-learn and well-constructed when compared to rails.
But given all this nightmare, I really want to be proven wrong.1 -
"What language should I learn?" Wellll.
[0]
43 PERCENT Of banking systems are built on COBOL
80 PERCENT Of in-person transactions use COBOL
95 PERCENT Of ATM swipes rely on COBOL code
220 BILLION Lines of COBOL in use today
"Experienced COBOL programmers can earn more than $100 an hour when they get called in to patch up glitches, rewrite coding manuals or make new systems work with old." [1]
Found this pretty interesting/crazy.
Source:
[0] http://tmsnrt.rs/2nMf18G
[1] http://reuters.com/article/...6 -
Because of the amount of complaining I do at work concerning legacy PHP applications, the HOD is trying to push for different technologies to use for backend services. We have met multiple times to discuss the proper way of handling the situation, since there are a lot of very obvious things to consider regarding the push for a new tech stack. The typical names have come about, but my biggest issue will be training people for these stacks.
Testing environments with Docker and so forth, a push for CERTAIN applications to be more API-centric, and the use of better frontend frameworks that will remain standard for years to come (hard to bet on this one, but I tend to prefer React), among other things, are the topics of conversation.
Personally I would love to move the shop to something geared towards Golang, thing is, the lead dev is complaining about it saying that the training for a new language would just take time. After a couple of examples he is still not convinced.
I think it's wrong of him to center himself on just PHP and jQuery as the main development stack he uses, and learning new things should be part of the job. I also have a case against the spaghetti code that results from just using vanilla PHP with no proper development practices (Composer-based systems, OOP, etc., you get the gist).
In the end I am starting to think that it will become one of those "fuck off, I am the boss" type of deals, since I am going to be here for a long time and he has about 2 years before he medically retires.1 -
Like age 8?
As a kid I really liked Flash games and animations and wanted to get into it. I couldn't do Flash, it looked too complicated, but I found a little piece of software by the name of KoolMoves that was just a simpler Flash animation tool.
I did a bunch of shitty stick figure animations in it (hello to everyone from Stick Figure Death Theatre) but eventually I realized that I could make it do things (interactive menus, choose-your-story kinda things, move the player around, shoot...!)
I fell in love with AS1 and later AS2.0 and made a bunch of demos and proofs of concept for systems and games. (Most are lost to time and data rot by now.)
Age 12
Eventually I found out I can make the entire Windows machine do what I want using first Batch files and later Visual Basic script (made a skype bot!) At this point I was also really into graphics and logo/web design
Age 15 - 20 or so
Then it was pretty natural to move to actual Visual Basic, then C#, and finally to C++. And I had the C family in my heart forever. I managed to get a bit into 3D graphics too and got a part-time job in archviz.
Even by this point I never believed I could be a programmer as a profession. I thought of it just as something I love, but have no chance getting into compared to some of the names out there. I half expected to be either doing graphics (cause I found it simple at the time) or some shitty random job in an office.
20+
Finally I decided to go to uni and study software development, see if I can touch the future I always dreamed of! And... Well... I found out more than 80% of the people there never touch a language up until now and most people are just as retarded as I thought..
For a while I also worked as a game designer (still not being comfortable calling myself a programmer, so I chose a non programming position) but I ended up going into the code and improving and fixing game designer tools (it was unity and C#)
After seeing actual programmers at work in a company, and talking to a bunch of them I realized I already have everything I need to do this seriously and with that experience out of the way I breezed through uni, learned to love Linux and landed a proper job :)
I kinda hope my experience with long lasting self doubt will be useful for someone -
So I figure since I straight up don't care about the Ada community anymore, and my programming focus is languages and language tooling, I'd rant a bit about some stupid things the language did. Necessary disclaimer though, I still really like the language, I just take issue with defense of things that are straight up bad. Just admit at the time it was good, but in hindsight it wasn't. That's okay.
For the many of you unfamiliar, Ada is a high security / mission critical focused language designed in the 80's. So you'd expect it to be pretty damn resilient.
Inheritance is implemented through "tagged records" rather than contained in classes, but dispatching basically works as you'd expect. Only problem is, there's no sealing of these types. So you, always, have to design everything with the assumption that someone can inherit from your type and manipulate it. There's also limited accessibility modifiers and it's not granular, so if you inherit from the type you have access to _everything_ as if they were all protected/friend.
Switch/case statements are only checked to ensure that all valid values are handled. Read that carefully. All _valid_ values are handled. You don't need a "default" (what Ada calls "when others"). Unchecked conversions, view overlays, deserialization, and more can introduce invalid values. The default case is meant to handle this, but Ada just goes "nah you're good bro, you handled everything you said would be passed to me".
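Since this isn't Ada code, here is a rough C analogy of the same hazard (my own illustration, not taken from any Ada source): every declared value of the enum is handled, no default is required, and a value smuggled in by a raw conversion just falls out of the switch without a peep.

/* Rough analogy only: all *declared* enum values are handled, so the
 * compiler is satisfied, but an out-of-range value introduced by a cast
 * (or deserialization) silently matches nothing. */
#include <stdio.h>

enum state { IDLE, RUNNING, DONE };

static void handle(enum state s)
{
    switch (s) {                      /* "all valid values are handled" */
    case IDLE:    puts("idle");    break;
    case RUNNING: puts("running"); break;
    case DONE:    puts("done");    break;
    /* no default: anything else falls straight through */
    }
}

int main(void)
{
    enum state s = (enum state)42;    /* the "unchecked conversion" */
    handle(s);                        /* prints nothing, fails nothing */
    return 0;
}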
Like I alluded to earlier, there's limited accessibility modifiers. It uses sections, which is fine, but not my preference. But it also only has three options and it's bizarre. One is publicly in the specification, just like "public" normally. One is in the "private" part of the specification, but this is actually just "protected/friend". And one is in the implementation, which is the actual "private". Now Ada doesn't use classes, so the accessibility blocks are in the package (namespace). So guess what? Everything in your type has exactly the same visibility! Better hope people don't modify things you wanted to keep hidden.
That brings me to another bad decision. There is no "read-only" protection. Granted this is only a compiler check and can be bypassed, but it still helps prevent a lot of errors. There is const and it works well, better than in most languages I feel. But if you want a field within a record to not be changeable? Yeah too bad.
And if you think properties could fix this? Yeah no. Transparent functions that do validation on superficial fields? Nah.
The community loves to praise the language for being highly resilient and "for serious engineers", but oh my god. These are awful decisions.
Now again there's a lot of reasons why I still like the language, but holy shit does it scare me when I see things like an auto maker switching over to it.
The leading Ada compiler is literally the buggiest compiler I've ever used in my life. The leading Ada IDE is literally the buggiest IDE I've ever used in my life. And they are written in Ada.
Side note: good resilient systems are a byproduct of knowledge, diligence, and discipline, not the tool you used. -
Hmm... My first experience with computers was in 1991 or so, when my then best friend had C64. And I was 7. My first PC arrived in 1993. Prince of Persia is the first game I remember from that time. I started programming in 1995 or '96, writing useless things in Pascal. Using PHP since 2000. Still that’s my main programming language. And sadly, my kids have different hobbies than me, so they aren’t even trying to program.
I remember the sound of modem connecting thru phone line to some BBS systems and later to the first public and free internet service in Poland. I remember simple, really „computer-like” voice of my dad’s speech synthesizer (he’s blind person). I remember, when our time to „play on PC” was limited to max 1hr a day... What will our kids remember? -
Need some dev feedback here, went to twitter and got nothing and thought here is probably the best place...
I'm working on a dev terminal for my game engine and I'm building basic app development support for it (CLI and CLGUI), but I'm not sure if I should allow full RGB via hex or if I should just stick with the standard CGA 16-colour palette...
And I'm thinking of building a basic scripting language that will transpile into an obfuscated JSON structure (Mostly because I have a lot of experience at building systems that use JSON as a scripting language) but just want to know if anyone could recommend things to try2 -
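On the colour question above: if the dev terminal speaks ANSI-style escape sequences, the two options look roughly like this. A hedged sketch, not engine code; the hex-parsing helper is made up for illustration, and the 24-bit "truecolour" form (38;2;R;G;B) only works on terminals that support it.

/* Classic 16-colour palette vs. full RGB parsed from a hex string,
 * both emitted as ANSI escape sequences. Illustrative only. */
#include <stdio.h>
#include <stdlib.h>

/* parse "RRGGBB" into its three components */
static void hex_to_rgb(const char *hex, int *r, int *g, int *b)
{
    long v = strtol(hex, NULL, 16);
    *r = (v >> 16) & 0xFF;
    *g = (v >> 8)  & 0xFF;
    *b =  v        & 0xFF;
}

int main(void)
{
    /* CGA-style palette: one code per colour, supported basically everywhere */
    printf("\x1b[31mred from the 16-colour palette\x1b[0m\n");

    /* full RGB: needs truecolour support in the terminal */
    int r, g, b;
    hex_to_rgb("FF8800", &r, &g, &b);
    printf("\x1b[38;2;%d;%d;%dmorange from a hex value\x1b[0m\n", r, g, b);
    return 0;
}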
I don't know what to do because union and sum types both totally suck but I need them for my scripting language
Union types are fun and intuitive because they can be used with type refinement, but they're not hierarchical and thus bad for generics.
Sum types (or tagged unions) are great because they're hierarchical and can be nested properly but they need ugly type matching constructs.
The positive thing is I'm not making a systems language anymore, so I only wanna jump off a bridge every second day5
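A tagged union written out in plain C shows both halves of that trade-off: the explicit tag gives you the nestable, hierarchical shape, but every consumer pays for it with the matching construct complained about above. Purely an illustration, not a suggestion for the actual language design.

/* A sum type (tagged union) by hand: clear shape, but every use site
 * needs the switch over the tag. */
#include <stdio.h>

enum value_kind { VAL_INT, VAL_STR };

struct value {
    enum value_kind kind;              /* the tag */
    union {
        long        i;                 /* valid when kind == VAL_INT */
        const char *s;                 /* valid when kind == VAL_STR */
    } as;
};

static void print_value(const struct value *v)
{
    switch (v->kind) {                 /* the unavoidable match */
    case VAL_INT: printf("%ld\n", v->as.i); break;
    case VAL_STR: printf("%s\n",  v->as.s); break;
    }
}

int main(void)
{
    struct value a = { VAL_INT, { .i = 42 } };
    struct value b = { VAL_STR, { .s = "hello" } };
    print_value(&a);
    print_value(&b);
    return 0;
}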
Even if he's a younger guy than most other examples, my mention is:
Jordan Walke
He's the inventor of React, which probably changed the way a lot of people write (web) apps, and which was based on a prototype written in Standard ML.
He also created ReasonML, which is not only in many ways a more fitting language to write React in, but also a good systems language (props to OCaml and its unbreakable type system). Many React concepts/patterns have their origins in functional language concepts, including reducers and hooks.3 -
What to choose for first job?
1. Small company (10-15 people) working with CRM systems, C# full stack.
2. Big company (1000+ people) working with a Java backend.
Same pay and language doesn't really matter right now. What's best for the future?16 -
I missed this last week... so too bad ;)
My introduction into programming was rather slow. When I was a child, we had an Apple IIc, but there were no disks. When you'd boot it up, you got a prompt and I recall being able to type commands into it that someone told me was "Apple BASIC".
At the same time, our family computer was a 386 and it came with something called GWBasic. I was a huge Mortal Kombat fan as well, and I recall finding the moves for the game on an AOL usenet. I took them all and wrote a program in BASIC that let you search and find moves for your character. I distributed this on some floppies to friends.
After that I lost interest. My "Information Systems" shop in high school was more about how to use Office than it was about programming. A few years later I found out that you could run your own text-based games (MUDs) and I quickly jumped into that and the C language.
From there, I was in and out of programming - C, to C++. Java and PHP, then back to Java. It would be about 15 years later until I finally realized I wasn't bad at this and land a job doing it. :) -
First and foremost, students should be carefully taught the logic and mentality behind programming. Most of the time I see that introductory programming courses waste so much energy on teaching the language itself. So students kinda just get fucked, cause many people end up finishing the course without having actually gained the "programming perspective".
Stop teaching pointers and lambdas, and even leave the object-oriented stuff till later. If a student doesn't know why we use a For loop, then how can they learn anything else?
I believe once that thing in your brain clicks about programming, everything goes smooth from there... kinda :P
Second of all, and this pertains mainly to the engineering and science disciplines.
We need a fundamental and strong mathematical foundation. And no I don't mean taking fucking double integrals. Teach us Linear Algebra, Graph theory, the properties of matrices, and Probability theory.
One of the things I suffered from most and regret in university is having a weak foundation in math and having to spend more time catching myself up to speed.
It's so annoying reading a paper on a new algorithm or method and feeling like an idiot because I can't understand what magic these people did.
Numerical Methods...
Ok this is more deeper, maybe a 2nd year course.
But this is something we take for granted.
Computers don't magically add and subtract and multiply.
They fuck up.
And it'll bite you in the ass if you're not even aware that the computer we all love so much isn't as perfect as we think
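A tiny C example of the kind of thing meant here (the classic one): binary floating point cannot represent 0.1 exactly, so the "obvious" sum quietly drifts.

#include <stdio.h>

int main(void)
{
    double sum = 0.0;
    for (int i = 0; i < 10; i++)
        sum += 0.1;                   /* ten times 0.1 "should" be 1.0 */

    printf("%.17f\n", sum);           /* prints something like 0.99999999999999989 */
    printf("%d\n", sum == 1.0);       /* prints 0: the comparison fails */
    return 0;
}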
Some hardware knowledge.
Probably a basic embedded systems course with arduinos
just so you can get a feel for how our beautiful software actually makes those electrons go weeeeeeeee
And finally
Practice practice
Projects projects
like honestly
just give me the internet and some projects
Ill learn everything else
Projects are the best motivation
I hate this purely theoretical approach
where we memorize or read code and write these stupid exams
Test what we are capable off
make us do projects that take sleepless nights and litres of coffee
And judge our methods, documentation, team work, and output
Team work skills and tools (VCS, communicating, project management, etc.)
Documentation and Reporting
Properly
:)
maybe even with LaTeX :D
Yeah that's the gist of whats on my mind at the moment regarding an ideal computer science education
At least the foundations
The rest I leave it to the next dude. -
It's a shame that people don't want to use F# but praise C# for how cool it has become and continues to become. At the same time, little do they know that many of the features were simply drawn from F#.
It's just ridiculous how far this OO and C-style syntax crap has progressed. They keep copying things from functional languages, turning the original language into a monstrosity like C++ is now, instead of just using languages like F#. I mean, it was all right there before C#: async/task, immutability, records, indexes, lambdas, non-null by default, who the hell knows what else.
Besides, many people (in my company at least) are just blindly overengineering with patterns and shit, where a simple function would be just enough.
Watch some NDC talks about F#, in particular those of Scott Wlaschin. It's just better in so many ways: less noise (I'm looking at you, brackets, commas and semicolons), a whole LOT of type inference and less duplication (just look at the C# signatures of LINQ methods - it's difficult to read them), immutability by default, non-nullable by default, ADTs and pattern matching, some neat features like type providers (how many times have you used "paste special" or an online tool to create C# classes from a JSON/XML file, and how many times have you regenerated it because of schema changes?) and units of measure.
Of course, in some cases it's not optimal; in some cases the mutable data structures of C# are better for performance. But dude, how many performance-critical systems have you written in C#? I mean, if it comes to performance you should use Rust or C++ or C after all.
*sighs*15 -
I really hate working with learning management systems (LMS).
I make training simulations for retail companies and some of these have the worst, backwards LMS's out there.
The providers who install and manage these LMSs for the companies always insist we make our training run inside their own environment, but we can't since it's a 3D training made in Unity that doesn't run well in a browser.
Luckily some of these are fine to figure out. Just a few API calls here and there for authorization and reporting progress, but some are an absolute nightmare.
Just now one of the providers handed me 2000 pages of documentation covering all the functions of the API of the LMS our customer is using. All I need are like 5 pages that explain what URL to call with what data and what the responses look like, but now I'm stuck spending days trying to find the 0.5% of this documentation that I actually need to communicate with their API.
And of course, the documentation is vague as all hell. Minimal descriptions of what each endpoint does. Subject names are super vague, as in: do I look for course progress or lesson completion state? What the heck is a Learning Event, and is it relevant to me?
And the errors in this document, too.
Bullet-point lists with duplicate items.
language errors everywhere.
Property lists where they copy-pasted the description of properties.
An entire EMPTY chapter, literally a page with only the chapter's title.
I just can't stand how these providers barely seem to know anything about the API of the LMS's they provide to customers.
(for clarity, the LMS is produced by some big tech company, it's installed and maintained by some 3rd party which is our main line of communication when rolling out trainings to these).
It always goes like: "Hey, we want to use your training." "Oh, that's great, we have our own, simple LMS where you can view your employee's progress." "Nah, we want to use our backwards LMS. Here's a giant manual about it's API, go figure it out!"
And then I'm left here tearing my hair out trying to figure out which 3 calls I need to send their API, out of the tons of extra stuff it can do which is completely unnecessary, while being unable to rely on the provider, because they lack the knowledge and have such thick skulls about the implementation of the LMS itself that they also seem completely unwilling to help to begin with!
Just another day at the office. -
I think my first encounter with a PC was when my cousins invited me to play a video game. I had never used a keyboard or mouse, I did not know how to turn a computer on or off.
For that reason my parents encouraged me to study basic computing, which helped me get a part-time job, and I realized that knowing how to use computer systems gave me a certain advantage over my other colleagues.
That led me to study engineering related to telecommunications, but I didn't know how to program and I didn't have the required level, obviously I failed the first course. But there was a teacher who supported me to study programming with the C language. I will always thank that teacher for helping me and seeing that I had programming skills, which helped me a lot to finish my degree.1 -
Anyone else feel, in hindsight, that college was a huge waste of money, basically just 4 years of partying/independence from parents?
Watched Accepted on Prime yesterday, which in hindsight seems to be the truth...
https://m.imdb.com/title/tt0384793/
I majored in finance and information systems... Well the finance stuff I remember (for stock trading) I could've learned reading some books.
IS... I didn't even need to try since I started coding when I was a kid. SQL, know it already... Matlab/weka, just another language/tool.21 -
What's people's opinion on using one platform's design language on other operating systems?
For example, using the UWP design language on the Xbox android app.
I absolutely hate it... Makes the app look terrible and out of place in my opinion, even worse when they use their square-style icon and just put it in a circle for it to 'match' the Pixel launcher, for example, LinkedIn -,-
How is a programming language created? Because I want to make my own.
I am learning C and next I will learn C++, SQL, DS&A, Assembly, Lex & Yacc, Operating Systems, Computer Architecture, and Computer Networks, because I think that's enough for my goal. The only reason I am learning all this is to make my own C++ clone with my own knowledge. But I really don't know how I can create my own programming language like C++ from scratch, like what the first steps to begin with are. As far as I know, for C the first step is the preprocessor, then the compiler, then the assembler, then the linker/loader.
Anyone please give me a step-by-step guide, like learn this language first, then this, then this, so I can finally reach the amount of knowledge I need to create my own programming language like C++.6
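Since the question above explicitly wants to avoid Lex & Yacc, here is a minimal sketch of the kind of hand-written lexing plus recursive-descent parsing such a project usually starts from. It only evaluates integer expressions with + - * / and parentheses, skips real error recovery, and all names are illustrative; a real language adds a proper token stream, an AST, and then interpretation or code generation on top of the same idea.

/* Hand-written recursive-descent evaluator for integer expressions.
 * Grammar:  expr   := term (('+'|'-') term)*
 *           term   := factor (('*'|'/') factor)*
 *           factor := NUMBER | '(' expr ')'                          */
#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>

static const char *src;                 /* cursor into the source text */

static void skip_ws(void) { while (isspace((unsigned char)*src)) src++; }

static long parse_expr(void);           /* forward declaration */

static long parse_factor(void)
{
    skip_ws();
    if (*src == '(') {
        src++;                          /* consume '(' */
        long v = parse_expr();
        skip_ws();
        if (*src != ')') { fprintf(stderr, "expected ')'\n"); exit(1); }
        src++;                          /* consume ')' */
        return v;
    }
    if (isdigit((unsigned char)*src))
        return strtol(src, (char **)&src, 10);
    fprintf(stderr, "unexpected character '%c'\n", *src);
    exit(1);
}

static long parse_term(void)
{
    long v = parse_factor();
    for (;;) {
        skip_ws();
        if      (*src == '*') { src++; v *= parse_factor(); }
        else if (*src == '/') { src++; v /= parse_factor(); }
        else return v;
    }
}

static long parse_expr(void)
{
    long v = parse_term();
    for (;;) {
        skip_ws();
        if      (*src == '+') { src++; v += parse_term(); }
        else if (*src == '-') { src++; v -= parse_term(); }
        else return v;
    }
}

int main(void)
{
    src = "2 + 3 * (10 - 4)";
    printf("%ld\n", parse_expr());      /* prints 20 */
    return 0;
}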
!rant
Ok so I'm about to start working on an OS, but I am going to run through a few tutorials to get the base systems down, then I'll incorporate an interpreter for BASIC and my custom scripting language.
Just curious if anyone can point me in the direction of a few well-written tutorials that explain the systems being used. (I want to use Assembly and C only btw, but am open to others)
I only have 1 decent tutorial but it's older and complete (https://github.com/cfenollosa/...)3 -
Fiddled since the days of DOS, fell in to the world of Linux ~15 years ago, fiddled some more.
In 2010, though, I jokingly/enthusiastically commented on @anderwebs twitpic about how he was adding theming to the ADW Android launcher and I was excited about a BuuF theme for Android.
He replied with something like, "cool, you gonna do it?". And I thought to myself, sure why not...and I did. Great learning experience.
Since then, I've stuck doing more of the systems/backend side of things...and I still, to this day, wouldn't consider myself a programmer as I'm not proficient in any one language....I'm a copy/paste weekend coder. I take advantage of software and my skills to manipulate it whenever/however I can.
I need some inspiration to move forward with my education and immersion with programming. I continue to take intro courses, but have not gotten to an advanced level.
Any recommendations for getting started with Android programming, without using much Java? I'd imagine I would have really gotten in to it if it had been Python, for some reason. -
So today my teacher told me to do that project for some competition or something(frankly, I don't remember clearly what this is for). He gave us the machines we need, the CDs with the systems we have to work with. We are supposed to make a properly working Beowulf cluster from the things I've been given.
Well, no.
Fucking no.
I am really okay with making this the way my teacher wants us to do. I am okay with installing an ubuntu 16.04 server that is completly irrevelant to the project, because it's not part of the cluster. I am really okay with using some weird linux distribution on the master nobody has ever heard of. But I'm not okay when the software we've been given(including operating system) has seven pages of documentation, escpecially when fucking screenshoots of how PXE booting should look like are roughly 70% of it. No, I couldn't find a thing on the internet about it. I couldn't read the fucking manual. There was no fucking manual. There was no fucking --help. There was no motherfucking english language. Everything was motherfucking spanish, including that 7 pages long document that was supposed to guide us through our work. It was planned to be done until march. The only reason I can think of about why doing the stuff the document tells us to do would take four motherfucking months is that we'd have to learn spanish to do this. And I'm not going to do that. Not because I don't like spanish or learning. Simply because I didn't sign up for this to learn languages.
And no. I can't switch to other, human purposed software. I am only allowed to use the things the teacher has given us. Because somebody has worked on it already couple of years ago and they had left a pdf file about how to install that ubuntu server I've been writing about a while ago. Which, by the way, was the "installation guide for animals". Showing how to install a system, screenshoot after screenshot.
It took about an hour to figure out the thing supposed to handle pxe booting computers all the time was telling us that it can't work because we had to configure ethernet interface manually. Because why the fuck not. -
I’ve been interviewing at a few companies lately. I’m a dev with ~6 years of experience with a specific language. Most of the experience comes from working in companies that developed their own software, not talking about cms stuff. Analytical, data tracking systems. Now working at a fintech. I’ve got an offer to work as a senior developer in a smaller tech team, with more salary. I’ve approached the current company about the offer and they told me that they don’t think I’m a senior dev and rather a strong mid level dev. The Hr also told me to think about if I’m really a senior and if the other companies expectations would be met. They would increase my salary, but not quite match it. It’s not too far off though. Their reasoning for this was that you need a lot of experience with their product (which does not correlate with seniorness of a developer, only the worth of specific employees for a company IMHO) and system architecture design. The problem is that we don’t see any tasks that could implement any system design for as log as I’ve worked here, so I don’t see how I could work into a senior role at this company. Of course imposter syndrome kicked in and I’m triple guessing myself if I should join the other company as a senior now. How should I aproach this? The current company is stressful to work at because of big workload, a lot of my coworkers think the same thing about the workload.11
-
Damn, lots of you knew this shit before coming of age.
I didn't code a single line until I went to college.
I tried to, but it was just too fucking complicated and I didn't understand a thing. Tried to grasp how to use some tools like Unity or an Adventure Maker of sorts and something called Flix for Flash games. Didn't understand shit.
I decided to study systems engineering due to a career aptitude test I took, hoping that somehow that way I could learn something.
First thing I was taught was bash.
When I realised I already knew enough to code a whole text adventure from scratch with such a simple language I felt really hyped.
Always loved text and graphic adventures.
Afterwards I was taught the Z80 assembly language and how CPU registers worked and it blew my fucking mind.
That was the first half-year.
Then I was taught C. And boy was it hard. Didn't get how memory was being handled until the very end.
I happened to be one of the few passing a stupidly complicated semifinal test with triple indirection pointers.
That felt goood.
Learning other languages afterwards was a piece of cake. C#, Java, X86 assembly, C++...
It was a hard door to open. Fucking heavy. But now nothing seems black magic anymore and boy isn't that something to be proud of! :D -
Question: What are 3 or 4 hard development skills I can focus on learning in the next two months or so to make me more marketable, given my lack of real development experience?
Details: I graduated college with a compsci degree, but have been doing systems/service administration since then. Aside from some small scripts for work, I don't have any post-college development experience. And even the skills I got from college aren't phenomenal because I was convinced I would be satisfied on the admin -> engineer -> architect ladder that I'm on right now.
But things have changed. My interest has dwindled in my current field, and I want to switch into a development role.
I am extremely comfortable with the Python language, but not so much with its many frameworks for frontend and web development.12 -
Most people who talk about language performance are just repeating what they heard from others.
For 98% of use cases, Python or Ruby, for example, are more than fine for running production systems at scale.
Also, a language does not necessarily guarantee speed/performance if you write shit code.
I've seen a properly written Python application perform better than a Java application.
I'd love to stop having this debate with folks every time.12 -
Please don't use OS specific libraries/binaries/build tools...etc
I'm talking to C/C++ users here. Once in a while I see something on GitHub, maybe I'm just curious, maybe I find your niche code useful, but then you use make (who the hell still uses make?) or your library depends on another library that can only be mindlessly installed in a Unix environment. And the most obscene of all, a solution file...
thank god for rust.14 -
[CONCEITED RANT]
I'm frustrated that I'm better than 99% of the programmers I've ever worked with.
Yes, it might sound so conceited.
I work mainly with the C#/.NET ecosystem as a fullstack dev (so also SQL, backend, frontend etc.), but I'm also forced to use that abhorrent horror that is JS and Angular.
I write readable code, I write simple code that works and rarely, RARELY causes any problem. The only fancy stuff I do is using the new language features that come with new C# versions, which in the latest versions were mostly syntactic sugar to make code shorter/more readable/easier.
The people I have worked with (lots of them) mostly try to overdo, overengineer and overcomplicate code, subdividing it into methods when not needed, fragmenting the code and adding tons of variables.
People only needed me to explain my code when the codebase was huge (200K+ lines, mostly written by me), so they didn't have to spend hours understanding what's going on, or, if the customer requested a new technology, to explain that technology so they didn't have to study it (which is perfectly understandable). (For example, it happened that I was forced to use the DevExpress package because they wanted to port a huge application from .NET 4.5 to .NET 8, and rewriting the whole DevExpress logic had a HUGE impact on costs, so I explained it thoroughly and gave support during development because they didn't know DevExpress.)
I don't write genius code or clever tricks and patterns. My code works, doesn't create memory leaks or slowness, and mostly passes unit tests on the first run. Of course I also introduce bugs and everything, but that's part of the process.
The point is that other people make unreadable code, and when they pass code around you hear rising chaos, people cursing "WTF does this even mean, why did he put that here, what the heck is this even supposed to do", you get the drill. And this happens when I read everyone's code too.
But it doesn't happen the other way around. My code is usually readable because I do code triple backflips only on personal projects, where I don't have to explain anything to anyone and I can learn new things and new coding styles.
Instead, people want to impress at work, and this results in unintelligible, chaotic code, full of bugs, that people can't read. They want to mix in the coolest technologies because they feel their virtual penis growing when they show off that they are the latest bleeding-edge technology experts and all.
They want to experiment on business code at the expense of all the other poor devils who will have to manage it.
Heck, I even worked with a few Microsoft MVPs.
Those are deadly. They're superfast code-throughput people who combine a lot of stuff.
Then they leave the problems to you once they're gone.
This MVP guy, on a big project for digital paperwork acquisition for a big company, did this huge project I got called to work on, which consisted of a backend and a frontend web portal, and pushed at all costs to put in the middle another CDN web project and another Identity Server project, both to do caching with the CDN "to make it faster" and Identity Server for SSO (single sign-on).
We had gruesome work dealing with the browser's poor caching management, and when he left, the SSO server started to loop after authentication at random intervals, and I had to spend days debugging that nasty stuff he put in.
People definitely can't code, except me.
They have this "first of the class syndrome" which goes to the extent that their skill allows them to and try to do code backflips when they can't even do code pushups, to put them in a physical exercise parallelism.
And most people is like this. They will deny and won't admit, they believe they're good at it, but in reality they aren't.
There is some genius out there that does revoluitionary code and maybe needs to do horrible code to do amazing stuff, and that's ok. And there is also few people like me, with which you can work and produce great stuff.
I found one colleague like this and we had a $800.000 (yes, 800k) project in .NET Technology, which consisted in the renewal of 56 webservices and 3 web portals and 2 Winforms applications for our country main railway transport system. We worked in 2 on it, with a PM from the railway company.
It was estimated 14 months of work and we took 11 and all was working wonders. We had ton of fun doing it because also their PM was a cool guy and we did an awesome project and codebase was a jewel. The difficult thing you couldn't grasp if you read the code is if you don't know how railway systems work and that's the only difficult thing.
Sight, there people is macking me sick of this job11 -
There needs to be a new (MOOC) class for people like me.
Hi, I'm William. I can't get my head around designing systems. I've read GoF and a few breakdowns of it as well. I find some patterns obvious for my field of interest (game dev, woot!) while I'm reading through the stuff, but have a pretty hard time retaining much of it. I'm aware of the danger of over using patterns, so I don't worry that much about it. I'll look something up when I'm sure I need it.
Still, I'm tired of the tutorial blues. I can watch a few different people write entire games, usually not in the language of choice, but that only helps me so much.
How do I fight scope creep? In the meantime, how can I make things extensible? Scope does need to creep some, after all.
People joke about starting with (visual) BASIC ruining you forever. I don't believe in that crap, but is this just denial? Am I too dumb for this? Not that I'd ever seriously blame a language for that.
I've been a hobbyist for well over 10 years, please don't make me count exactly how long I've been unsuccessful.
I'm baffled by Löve. I think it's the coolest shit I've seen, maybe ever (unless we're counting IPFS).
I think what really prompted this rant, apart from the obvious degradation of my mental health, was my search for an entity component system for Löve/Lua. Hold your replies. I know there's a few of them, and I'm positive that they're fantastic. I'd roll my own, but that requires actual Lua specific knowledge that I just haven't dug all that deep into yet. I can't wrap my head around the ones that exist, even though I can tell their complexity is next to none really.
I have severe tool anxiety, I'm shocked that I've stuck with ZeroBrane Studio as long as I have. It feels good though.
Sorry to use this as "Devs Anonymous", but I think that's how this community helps (me) best.
I feel like I should stop now and just say: Advice? before this gets much deeper/less readable. -
Complain about your build systems/pipelines here please! I want to hear about it.
----
I'm finally ready to publicly say I've been designing a build system. It's a culmination of around a decade of studying, thought, ideas and prototypes.
If you have any sort of build step in your project (any language, any compiler) that is unusual, custom, weird, or has a lot of requirements, extensive, etc. - anything even slightly outside the box - please let me know about it below. I want to know as much as possible about it.
Any strong opinions, hateful comments, gripes or annoyances, etc. please don't hesitate. I'd love to hear what issues you face with build systems. I want to make sure such cases are covered.
I'll also answer any questions for the curious.6 -
DevOps With Ruby and Chef on FreeBSD (and Linux)
I am Ops and Dev by heart. I have always automated *nix systems long before any automation framework was invented because I am pretty lazy. Doing stuff more than once manually is just one time too often for me. Imho Ruby is a really elegant language. The same applies for the tools that are built around it. The Chef ecosystem fits into this with its own elegance and stability perfectly because the server is Erlang driven and the rest is Ruby.
Being a Linux and BSD user since the early 90s, I have always loved a *nix system for its concepts and simplicity. One command for exactly one purpose, and everything is combinable, like letters are combinable into words in my mother language. I have always loved FreeBSD more though. Imho it is even more focused on simplicity, because it is a really clean approach to system design that keeps a base system and 3rd-party software cleanly separated, for example. It also values classic UNIX philosophies that most Linux distros these days abandon, but which saved my life multiple times through better design and execution that also focuses a lot more on stability, fault tolerance and ease of use than any Linux I have come across. The hardcore guys should read "Design and Implementation of the FreeBSD Operating System", compare the readings to the Linux way of things and see for themselves.*
*The author acknowledges that this text is his opinion and just his wet dream alone, and may not be of any relevance for the sexual lives of everybody else
Why is Java the go-to language for most backend systems, distributed and scalable, especially in big organizations like Amazon?12
-
Orchid lesson #many:
Church tuples exist only to demonstrate how general substitution is. Just like Church numerals, they aren't meant to be used for real computation and cause a lot of problems. Few type systems and fewer optimizers can deal with them, they're a pain to pass through FFI boundaries, and they're much slower in an interpreted context than a native smart array. And in a lazy language the tuple is almost always lighter than the code that generates it, so you want to generate the tuple eagerly and thunk the actual elements, if thunk you must.
I'll go write a vector based tuple and end this madness tomorrow. New version soon, probably.
With dynamic dispatch.7