Search - "computing"
WINDOWS USER VS LINUX USER
A Windows User's view on computing
I have the blue screen of death again
You'll never hear me say
I'm happy with my computer
At the end of the day
my operating system
in programs I use and the features I've
I have complete control
I lose sleep
worrying about getting viruses
Microsoft patching vulnerabilities in time
I don't have time to think about
something better.
to live with old software issues
There's no way I'm planning
to change, and
it's worth it to me
A Linux User's view on computing
(read this bottom to top)
My brother and I have been messing with our IBM 5150 and doing cool stuff with it. I got it to play a YouTube video via telnet via my bro's Mac, via mplayer with libcaca (ASCII video output) + youtube-dl (a YouTube downloader). The Mac is doing all the heavy lifting, but it is still cool to see these images on an IBM 5150, just by typing a few commands on that old keyboard... more fun projects to come with this old thing.
Let's get rid of the developer training: Pair Programming
Let's get rid of the software testers: Test First Programming
Let's get rid of the project managers: Agile
Let's get rid of the project planners: Scrum
Let's get rid of the system admins: DevOps
Let's get rid of the security guys: DevOpsSec
Let's get rid of the hardware budget: Bring Your Own Device
Let's get rid of the servers: Cloud Computing
Let's get rid of the other scruffy guys: Outsourcing
Let's get rid of the office space: Home Office
Let's get rid of the whole fucking company: Takeover
After I submitted a code review:
Coworker: What did you mean with this comment?
Me: **translating the comment to Portuguese** Your Footer component isn't rendering any footer element.
Coworker: **blank stare** what?
Me: There is no footer tag here. **points to Footer component**
Coworker: **computing... found approximate result** I'm rendering the Footer here. **shows me where the Footer component is being rendered**
Me: **internal facepalm** Yes, I know, but I'm not talking about that. I'm saying that inside the Footer component you should be rendering a footer element.
Coworker: **segmentation fault** what?
And then I had to explain that there is an HTML footer element. To a mid level frontend developer (or so they say).
HTML is not only divs, for fuck's sake.
Before iPads took over the general population of home computing, I used to do house calls to help people with their computers for some extra folding money. One day I get a call from a regular saying that ever since I last worked on his computer it won't stay on.
He says it comes on for a few seconds, then just shuts down. It never did that before I upgraded the RAM.
So I drive over to his house and turn on the computer. He says, "See, it starts fine, but in a few seconds it'll just shut off. Just watch"
The computer boots up without any issues.
He says, "Well, of course it doesn't do it now that you're here!"
I reboot it a few times, boots fine every time. Suddenly I realize what's going on. I say to him, "Hey, why don't you try turning it on for me?"
He says, "What difference will that make?"
I say, "Just trust me, turn it on."
He bends down, presses the power button, looks up at the monitor and watches it boot. But he doesn't release the button! He just keeps holding it down until it shuts off.
"See!" he says, "why does it only do that when I turn it on!"
I then have to explain to him how holding down the power button forces a shutdown.
But, it never did that before I worked on it!
Recipe for a Great Programmer:
-Books for a computer science curriculum from a top university
1. Cover computer science books with lighter fluid
2. Light books on fire
3. Use flames to cook an energy-rich meal for the thousands of hours ahead
4. Pick an IDE
5. Choose a project beyond current capabilities. Good ways to push boundaries:
- Unfamiliar domain (e.g. large scale data processing, UI programming, high performance computing, games)
- Exotic programming language
- Larger in scope than any project before
6. Shut up about your IDE
7. Attempt to build
8. Stop procrastinating on Hacker News
9. Re-attempt to build
10. Squeeze stress ball and scream into pillow as necessary to keep sanity
- Paste stack traces into Google
- Find appropriate mailing list to get guidance
- Realize that real learning happens when you are stuck, uncomfortable, and/or frustrated
- Seek out books, classes, or other resources AFTER you have a good understanding of your deficiencies
11. Repeat #4 to #10 for at least 10 years
12. Results guaranteed! (to the same extent static types guarantee bug-free programs)
Skype interview with a Chinese IT VP:
vp: do u know cow-computing?
me: sorry what?
vp: cow computing
me: really can't hear you, did u mean actual Cow computing?
vp: i mean cow! you know like in the sky.
me: oohhhh, cloud computing.. (face turns red over embarrassment)
First day at CERN: done!
Nothing to rant about :) The place and the people are beautiful, lots of support and it's easy to navigate through things even for very young people like me! Couldn't ask for better stuff.
The welcome event in the Globe of Science and Innovation is already an experience on its own :) so many people to meet and share words with! Later on one of my senior colleagues showed me around the surface datacenter of ATLAS, as well as its control room and a (physically) separate computing testing environment to run simulations and software on to later be deployed at Point 1 (ATLAS). I am stunned, humbled and excited to say the least! More to come soon! Post your curiosities below and I'll gladly answer!
Larry Tesler, a computer scientist who created the terms "cut," "copy," and "paste," has passed away at the age of 74 (17 Feb 2020).
In 1973, Tesler took a job at the Xerox Palo Alto Research Center (PARC) where he worked until 1980. Xerox PARC is famously known for developing the mouse-driven graphical user interface and during his time at the lab Tesler worked with Tim Mott to create a word processor called Gypsy that is best known for coining the terms "cut," "copy," and "paste".
In addition to "cut," "copy," and "paste" terminologies, Tesler was also an advocate for an approach to UI design known as modeless computing. It ensures that user actions remain consistent throughout an operating system's various functions and apps. When they've opened a word processor, for instance, users now just automatically assume that hitting any of the alphanumeric keys on their keyboard will result in that character showing up on-screen at the cursor's insertion point. But there was a time when word processors could be switched between multiple modes where typing on the keyboard would either add characters to a document or alternately allow functional commands to be entered.12
Saw a video of an interview on Cloud Computing...
That genius guy says: "Cloud computing is highly risky. Because if it rains, all the data will be lost."
Not using blockchain to color my cryptocurrencies pink so that my AI knows which cloud computing would be best for GDPR
Downside to being a computing student:
I need my PC to study, but all my distractions are on my PC so it's really hard not to get distracted while studying.
Every job description out there:
" JUNIOR XY position.
Requirements: 50 years experience of Assembly, Java and Masonry, HTML, cloud based computing and artificial intelligence. Must be able to write algorithms like Hummingbird. Fluent in English, Mandarin and Latin. Must have five doctoral and two bachelor's degrees. Experience in leading a Fortune 500 company beneficial.
Remuneration: 5 rice grains"
Open source block chain neural network binary tree growth hacker synergy vertically integrating cryptocurrency game changing GDPR compliant internet of things node.js quantum computing start up that'll disrupt and pivot the cloud based ecosystem
Companies: We are committed to Linux and it is truly the future!
Developers: Awesome! So are you going to port your most popular softw-
Companies: AI! Machine Learning! Cloud computing! Streaming!
In computing class - Teacher asks for disadvantages of open source.
"It may end up like Linux..." <I stopped listening after that>
My first dev job was a paid internship at Oak Ridge National Laboratory. But I wasn't in the computing division with the supercomputer and the 30-foot 18-screen wall display. In a way, I was doing something more exciting. I was in the Holifield Radioactive Ion Beam Facility.
That meant that I was working next to a radioactive ray gun that they fired at different targets to try to make new kinds of particles. To refine the beam components, there was a tower with the world's highest-voltage Van de Graaff generator at 25,000 kilovolts. I got training on how to put on a radiation suit, and was told that if I got locked in the wrong room and red lights began to flash, I had about five seconds to run to the far wall and push the E-stop before I got irradiated and died slowly over the next five weeks.
But, I was reassured, that never happened. Radiation leaks are rare too (that's why we wore dosimeters). More likely, there would be a leak in the generator tower. To explain why that's bad, that tower wasn't filled with normal air. 25,000 kilovolts would punch through that like nothing, arc against the walls, and we'd lose the electric charge. No, instead, the tower was filled to a few atmospheres of pressure with sulfur hexafluoride gas. You know how helium makes your voice go up? This stuff makes your voice go down. It's heavier than air, and it kills you by displacing and starving your lungs of oxygen.
So, while I was happily coding away on PHP, CSS and the Bash shell, making a log book for all the ion gun settings and targets the scientists used in their experiments, I was keeping an ear out for the oxygen alarm. I had a blast!
It all began yesterday in my math lesson. My teacher introduced us to a new subject: the Heron method.
This is a method for iteratively computing the square root of a number, devised by Heron of Alexandria.
But here's the cool part:
When my teacher said "method" I had the thought that I could make a program on my TI-82 Stats (my calculator).
So I used the next break to make a little program that does in a single step exactly what my classmates have to do in several.
Here's how the program works:
First it asks for the root that it should calculate, then for a value to begin with, and finally for the number of arithmetic operations.
Then it goes through the algorithm and displays the interim results (this is important for later...).
So, back to the story. It wasn't surprising that we got exercises with this method as homework, but thanks to my program I only needed 5 minutes instead of 20 (like my classmates).
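For reference, here is a minimal Python sketch of the same Heron iteration the calculator program performs (the TI-BASIC original isn't shown above, so the structure and names here are my own guess at it):

```python
def heron_sqrt(n, x0, steps):
    """Approximate sqrt(n) with Heron's method, starting from the guess x0."""
    x = x0
    for i in range(1, steps + 1):
        x = (x + n / x) / 2          # average the guess and n/guess
        print(f"step {i}: {x}")      # interim results, like the calculator program shows
    return x

heron_sqrt(2, 1, 5)  # converges towards 1.41421356...
```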
I got a used computer case in a second-hand hardware store and it still has the sticker with the specs of the computer they wanted to sell it with. It was going to be a moderate to shitty PC. I built an absolute computing monster in it (i7-6900K, 32 GB RAM, 23 TB storage). I like having visitors over and telling them this is the primary computer I use to do my high-particle-count fluid simulations and little big-data projects with.
"I think my next laptop is going to be a Chromebook. I can do everything from browser and any heavy computing/coding I will just ssh to home. Sounds good."
3 days later, my Chromebook is running Ubuntu 80% of the time since I bought it.
I hope computing heavens have:
-One brand of hardware
-No closed source software
-One monitor aspect ratio
-One fucking programming language with a fucking big standard library.
-Phones run exactly the same OS as computers, not stupid adaptations.
-All pages are only HTML/CSS, without JS.
-Since there is only one browser and one OS, when you need a dynamic page you can display a desktop app in the browser by downloading its binary.
-There is one fucking brand of printer, with standard drivers included in the OS.
We are so far from heaven.
Python Tools to Get Started with Machine Learning
SciPy - the most fundamental stack, bundling essential packages such as NumPy, Matplotlib, pandas, and SymPy.
NumPy - gives you the ability to play with your data as arrays, using powerful array functions and linear algebra routines. Essential, since most computing is done with arrays of numbers.
Matplotlib - to visualize data and model outputs using 2D plotting with some 3D functionality.
Pandas - a highly flexible package which introduces dataframes to Python, a type of in-memory data table. Makes it easy to understand the data's structure and provides easy to use SQL-like commands to play with the data.
SymPy - is a package used for symbolic mathematics and computer algebra.
StatsModels - commonly used package for statistical methods and algorithms.
Scikit-learn - Most popular and easy to understand library filled with machine learning algorithms. A good start for beginners and practitioners working with smaller data loads.
RPy2 - A cross between Python and R. Allows you to call R functions from within Python.
NLTK (Natural Language Toolkit) - this toolkit in Python has functions and methods for text analysis.
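A toy end-to-end example (my own illustration, not part of the list above) of how NumPy, pandas, and scikit-learn typically fit together on a small dataset:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Fake data: y is roughly 2x + 1 plus noise
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 2 * x + 1 + rng.normal(0, 0.5, size=100)

df = pd.DataFrame({"x": x, "y": y})                  # pandas holds the tabular data
model = LinearRegression().fit(df[["x"]], df["y"])   # scikit-learn fits the model
print(model.coef_, model.intercept_)                 # should land near 2 and 1
```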
nephew: what's the meaning of word "Enterprise", particularly in computing context?
me: No worries about that. Once You end up in enterprise, You will know
nephew: How do I know?
me: when a bug in your software prevents at least 250 people from doing their job, congratz, You are in Enterprise! And You will know that instantaneously, trust me :)
Prof: So yeah this is going to be difficult. We're going to make the scalable math library. Then we have to make a functional finite elements library using that. Then make a multiphysics engine using that library. This could easily take your entire PhD. Are you prepared for that?
Me: May I show you something?
Prof: Sure, sure.
Me, showing him: We can use moose to code in the multiphysics. It's built atop libmesh for the finite elements. Which can be built with a petsc backend. Which we can run on GPUs and CPUs, up to 200k cores. All of this has been done for us. This project will, at worst, take a couple months.
Guys, libraries. Fucking. Libraries. Holy fucking shit.
My mother is the one that introduced me to computers from a young age. She would tell me that they were the future and that people could do amazing things with them. Fast forward at me graduating from uni with a B.S in Computer science and she was the happiest :) she tells everyone that I am a computer scientist, she seldom says "programmer" or "developer". She is super well versed in general computing and can use Linux and Mac, so yeah :) mom is awesome. My dad has lil idea of what I do, to him its just magic, my step dad is the same way but he will be the first to tell everyone that I am a wizard.
My brother and sister couldn't care less... my sister tells everyone that I am the smartest person she knows, but that I spend most of my time glued to the screen "playing with a bunch of weird code!"
The rest of my family is pretty meh about it, 2 of my uncles are super proud of it and normally ask for my input regarding tech or about life as a dev.
Finally, the wife. The wife knows how to code from before I even knew what code was :) so she knows exactly what I do :)
Okay, story time.
This rant is about the many mistakes I made at the time, specifically the biggest – but not the first – of which: publishing some preliminary results very early on.
So I posted a sarcastic question to the Software Engineering Stack Exchange, which was originally worded differently to reflect my frustration, but was later edited by mods to be more serious.
You can see the responses for yourself here: https://goo.gl/poHKpK
Most of the serious answers were along the lines of "multithreading is hard". The top voted response started with this statement: "1) Multithreading is extremely hard, and unfortunately the way you've presented this idea so far implies you're severely underestimating how hard it is."
While I'll admit that my presentation was initially lacking, I later made an entire page to explain the synchronisation mechanism in place, and you can read more about it here, if you're interested:
But what really shocked me was that I had never understood the mindset that all the naysayers adopted until I read that response.
Because the bottom-line of that entire response is an argument: an argument against change.
Nexus does not and will not hold your hand. It will not repeat Node's mistakes and give you nice ways to shoot yourself in the foot later, like `process.on('uncaughtException', ...)` for a catch-all global error handling solution.
No, an uncaught exception will be dealt with like any other self-respecting language: by not ignoring the problem and pretending it doesn't exist. If you write bad code, your program will crash, and you can't rectify a bug in your code by ignoring its presence entirely and using duct tape to scrape something together.
Back on the topic of multithreading, though. Multithreading is known to be hard, that's true. But how do you deal with a difficult solution? You simplify it and break it down, not just disregard it completely; because multithreading has its great advantages, too.
Like, how about we talk performance?
How about distributed algorithms that don't waste 40% of their computing power on agent communication and pointless overhead (like the serialisation/deserialisation of messages across the execution boundary for every single call)?
How about vertical scaling without forking the entire address space (and thus multiplying your application's memory consumption by the number of cores you wish to use)?
Some will say that the performance gains aren't worth the risk. That the possibility of race conditions and deadlocks aren't worth it.
That's the point of cooperative multithreading. It is a way to smartly work around these issues.
If you use promises, they will execute in parallel, to the best of the scheduler's abilities, and if you chain them then they will run consecutively as planned according to their dependency graph.
If your code doesn't access global variables or shared closure variables, or your promises only deal with their provided inputs without side-effects, then no contention will *ever* occur.
If you only read and never modify globals, no contention will ever occur.
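That reasoning is easy to demonstrate outside of Nexus, too. Here is a generic Python sketch (nothing to do with the Nexus API; the names are mine): workers that only touch their own inputs and read-only shared data cannot race, no matter how they are scheduled.

```python
from concurrent.futures import ThreadPoolExecutor

CONFIG = {"scale": 2}  # shared state, but the workers only ever read it

def work(x: int) -> int:
    # Uses only its own input and read-only shared data: no contention possible.
    return x * CONFIG["scale"]

with ThreadPoolExecutor() as pool:
    results = list(pool.map(work, range(10)))

print(results)  # [0, 2, 4, ..., 18]
```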
Are you seeing the same trend I'm seeing?
When someone says we shouldn't use multithreading because it's hard, do you know what I like to say to that?
"To multithread, you need a pair."18
Everybody talking about Machine Learning like everybody talked about Cloud Computing and Big Data in 2013.
When you're a great, quick-to-write programming language suited to many general computing tasks, but everyone thinks you're a language for kids and a shit language because you're slow.
To be able to learn, is an opportunity. To be able to teach, is a privilege.
Cheers to another successful iteration of The #HourOfCode, by Team ACM BVP in association with Code.org. It was amazing teaching the students of 5th standard the basics of programming and logic building, and quite surprising to see how quickly they were able to grasp the concepts!
I automatically don't trust people who use pictures of clouds in the background for anything related to cloud computing.
This f***ing government college faculty member crossed out my complete answer on a f***ing bubble sort, in my 3rd year of Mathematics & Computing, by saying, and I quote, "Why is this i loop inside of the j loop?" After getting back on my feet after listening to and understanding this absurd statement, I tried to explain, to which he asked me to show him any book where it is written like this.
By "i loop" and "j loop" he meant the variable names in the for loops, 🤬🤬🤬🤬
these f***ing reserved government professors in elite institutions like the IITs
So Here's a story of how I severely messed up my mental health trying to fit in university.
But the bonus: Found my passion.
Here we go,
Went to university thinking it'll be awesome to learn new stuff.
1st sem was pure shock - Programming was taught at the speed of V2 rockets.
Everything was centred around marks.
Wanted to get a good run in 2nd sem, started to learn Vector design, but RIP- Hospitalized for Staph infection, missed the whole sem and was in recovery for 3 months.
So I asked uni for financial assistance, as I had to re-register the courses the next semester. They flat out refused, not even in a case this serious.
So, time to register courses for third semester, turns out most of the 2nd year courses are full, I had to take 3rd year courses like:
Social and Informational Networks
Human Computer Interaction
Parallel and Distributed Computing (They had no prerequisites listed, for the cucks they are: BIG MISTAKE)
Turns out the first day of classes that I attend, the Image proc. teacher tells me that it's gonna be difficult for 2nd years so I drop it, as the PDC prof. also seconds that advice.
Time travel 2 months in: The PDC prof is a bitch, doesn't upload any notes at all and teaches like she's on Velocity-9 while treating this subject like a competition on who learns the most rather than helping everyone understand.
Doesn't let students talk to each other in lab even if one wants to clear their friend's doubt, "Do it on your own!" What the actual fuck?
Time for term end exams and project submission: Me and 3 seniors implement a Distributed File System in python and show it to her, she looks satisfied.
Project Results: Everyone else got 95/100
I got 76.
She's so prejudiced that she thinks that 2nd years must have been freeloaders while I put my ass on turbo for the whole sem, learning to code while tackling advanced concepts to the point that I hated to code.
I passed the course with a D grade.
People with zero consideration for others get absolutely zero respect from me.
Well it's safe to say that I went Nuclear(heh.. pun..) at this point, Mentally I was in such a bad place that I broke down.... Went into depression but didn't realise it.
I met a senior in my HCI class that I did a project with, after which I discovered we had lots of similar interests.
We became good friends and started collaborating on design projects and video game prototyping.
Enter the 4th sem, and holy mother of God did I get some bad, bad profs....
Then it hit me
I have been here for two years, put myself through the meat grinder and tore my soul into shreds.
This Is Not Me
This Wont Be The End Of Me
I called up my sister in London and just vented all my emotions in front of her.
Been a long time since I felt that.
I decided to go for what I truly feel passionate about: Game Design
So I am now trying to apply for Universities which have specialised courses for game design.
I've got my groove again, learnt to live again.
Learning C# now.
It's been a long hello, and If you've reached till here somehow, then damn, you the MVP.
That awkward moment when you ask your final-year CS project mentor to clone your git repo for his feedback and he says
Oh. CLOUD COMPUTING!!!!
You get the feeling of being an INDIAN.
Since I moved from pure dev to Code Forensics and started studying with Forensic Computing students (who do one module on security), the number of Kali Linux wallpapers on Windows machines is overwhelming.
It's like the entire class watched three episodes of Mr Robot and now thinks they can change the world with a goddamn semester of teaching!
dammit. I fucking hate it when I get stuck because of low level computing concepts and there is no explanation on Google.
Like... I understand the difference between an int and a float, but no one ever explains how you convert 32-bit signed vectors to floats. Or how BGRA and RGBA differ. Or how to composite two images on a GPU. Etc. The internet is great and all, but fuck, sometimes it seems as if everyone is just as dumb as I am.
"Sleep" is the last frontier in high performance computing. Is your code still slow? Just Sleep™, and you'll have your results instantly*.
* Speed benefits apply only to the sleeper. Sleep is not a solution for immediate deadlines.
My computing teacher says that html is his favourite programming language to teach.
Needless to say he's not very good at teaching us HTML and JS.
Realising that if you'd only studied a degree in computing, you would be useless in the real world as a developer.
Tl;Dr - It started as an escape, carried on as fun, then as a way to be lazy, and finally as a way of life. Coding has defined and shaped my entire life from the age of nine.
When I was nine I was playing a game on my ZX spectrum and accidentally knocked the keyboard as I reached over to adjust my TV. Incredibly parts of it actually made a little sense to me and got my curiosity. I spent hours reading through that code, afraid to turn the Spectrum off in case I couldn't get back to it. Weeks later I got hold of a book of example code to copy out to do various things like making patterns on the screen. I was amazed by it. You told it what to do, and it did it! (don't you miss the days when coding worked like that?) I was bitten by the coding bug (excuse the pun) and I'd got it bad! I spent many late nights on that thing, escaping from a difficult home life. People (especially adults) were confusing, and in my experience unpredictable. When you did things wrong they shouted at you and threatened to take you away, or ignored you completely. Code never did that. If you did something wrong, it quietly let you know and often told you exactly what was wrong. It wasn't because of shifting expectations or a change of mood or anything like that. It was just clean logic, simple cause and effect.
I get my first computer a year later: an IBM XT that had been discarded by a company and was fitted with a key on the side to turn it on. With the impressive noise it made it really was like starting an engine. While most kids would have played with the games, I spent my time playing with batch scripts and writing very simple text adventures. And discovering what "format c:" does. With some abuse and threatened violence I managed to get Windows running on it. Windows 2.1 I think it was.
At 12 I got a Gateway 75 running Windows 95. Over the next few years I discovered many amazing games: ROTT, Doom, Hexen, and so on. Aside from the games themselves, I was fascinated by the way computers could be linked together to play together (this was still early days for the Web, and computers networked in a home were very unusual). I also got into making levels for Doom, Heretic, and years later Duke Nukem 3D (pretty sure it was Heretic; all I remember is the nightmare of trying to write levels entirely by code!). I enjoyed re-scripting some of the weapons and monsters to behave differently. About this time I also got into HTML (I still call this coding, but not programming), C, and Java. I had trouble with C as none of the examples and tutorial code seemed to run properly under a Windows environment. Similar for my very short stint with assembly. At some point I got a TI-83 programmable calculator and started rewriting my old batch script games on it, including one "Gangster Lord" game that had the same mechanics as a lot of the Facebook games that appeared later (do things, earn money, spend money to buy stuff to do more things). Worried about upcoming exams, I also made a number of maths helper apps, including a quadratic equation solver that gave the steps, and a fake calculator reset to smuggle them into my exams. When the day came I panicked and did a proper reset for fear of being caught.
At 18 I was convinced I was going to be a professional coder as I started a degree in Computer Science. Three months later I dropped out after a bunch of lectures teaching what input and output devices were and realising we were only going to be taught Java and no C++. I started a job on the call centre of a big company, but was frustrated with many of the boring and repetitive tasks we had to do. So I put my previous knowledge to use, and quickly learned VBA to automate tasks. It wasn't long before I ended up promoted to Business Analyst where I worked on a great team building small systems in Office, SAS, and a few other tools.
I decided to retrain in psychology, so left the job I was in and started another degree. During my work and placements my skills came in use a number of times to simplify and automate tasks. I finished my degree, then took a job as a teaching assistant while I worked out what I wanted to do next and how to pay for it. Three years later I've ended up IT technician at the school, responsible for the website, teaching a number of Computing lessons each week, and unofficial co-coordinator for Computing as a subject. I also run a team of ten-year-old Digital Leaders who I am training in online safety and as technical experts; I am hoping to inspire them to a future in coding. In September I'll be starting teacher training with a view to becoming a Computing specialist teacher. Oh, and I'm currently doing a course in Android Development in my free time.
And this all started with an accidental knock on the keyboard of a ZX Spectrum.
Just started playing with Microsoft's Quantum Computing Kit and it's so amazing 😍!
Well done Microsoft!
This is the state of desktop computing: when a web browser uses twice as much RAM as a full virtual machine.
To be fair, I did have 5 windows with >10 tabs each, but still...
OCR (The exam board for my course) are fucking thick in the head when it comes to anything computing.
- I get a mark or two for saying open source software is worse than its proprietary counterparts
- ALL open source software forks must also be made open source. They spend so much time going over the legal stuff BUT HAVE NEVER HEARD OF OPEN SOURCE LICENSING!
- One exam paper had a not gate picture with 2 inputs...
- I have to differentiate between portable and handheld! YOU MEAN HANDHELD DEVICES ARE NOT PORTABLE!?!!?!?
- In level 2 education, OCR say 1 MB = 1024 KB - In level 3, they say 1 MB = 1000 KB and 1 MiB = 1024 KiB, and expect you to differentiate (see the snippet at the end of this rant). Why do you expect the wrong answer in level 2!?
- INFORMATION FORMATS AND STYLES ARE COMPLETELY DIFFERENT THINGS! If you look up synonyms for "style", "form" is there, and if you look up synonyms for "format", "style" is there.
- When asked for storage devices, I have to say "smartphone", "tablet", "desktop PC" - I mean yeah they store data but when you ask me for storage devices I will say "hard disk drive", "solid state drive", "SD card", etc. >.>
I could probably go on an on about this...
I sure do love being asked to copy-paste existing HTML/JS/CSS and being asked to just tweak it here and there, and then wait for other people's incompetence in copy-pasting... I sure do love being stuck with this sort of "education" ._.
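For what it's worth, the distinction the level 3 spec is after is just decimal (SI) versus binary (IEC) prefixes; a tiny Python illustration (my own, not from the exam board):

```python
# Decimal (SI) vs binary (IEC) prefixes
MB = 1000 * 1000    # 1 MB  = 1,000,000 bytes
MiB = 1024 * 1024   # 1 MiB = 1,048,576 bytes

size = 5 * MiB
print(f"{size} bytes = {size / MB:.2f} MB = {size / MiB:.0f} MiB")
# 5242880 bytes = 5.24 MB = 5 MiB
```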
Sitting in the computing lab with my lab partner and the debugging duck. Shortened for hilarity.
Partner: Urgh, there's something wrong with my python code.
Me: Hmm, let's see. Oh, by the way you look good in that shirt.
P: Okaaay, thanks... Are you hitting on me?
M: Seriously, James? IT'S BEEN 8 BLOODY MONTHS AND YOU'VE ONLY JUST NOTICED!?
P: Oh, I'm straight.
M: Oh, eh, so what's wrong with your program?
Trust me to make everything awkward. Not looking forward to tomorrow 🙄🤐
Other than being an a**hole, Linus. The guy changed computing as we know it with a little pet project.
6 New Programming Languages of 2k16
Golang Programming Language from Google
Let's start the list of the six best new programming languages with Go, also known as Golang. Go is an open source programming language developed by three Google employees and launched in 2009. Very cool that it was just three people.
Go originated from and builds on popular programming languages such as C and Java; it offers the advantage of compact notation and aims to keep code simple and easy to read and understand. Go's designers, Robert Griesemer, Rob Pike and Ken Thompson, have said that the complexity of C++ was their main motivation.
This simple programming language lets you complete most tasks with just its standard library. Combining the programming speed of dynamic languages such as Python with the reliability of C/C++, Go is a great tool for building high-volume distributed systems.
You should also know that, as stated by Tokopedia's CTO, Leon, Tokopedia will switch to Golang as the main foundation of its systems. Impressive, no?
Swift Programming Language from Apple
Apple launched the Swift programming language at WWDC 2014 as a successor to Objective-C. Designed to be simple, Swift focuses on speed and safety.
Furthermore, in December 2015 Apple made Swift open source under the Apache license. Since its launch Swift has caught a lot of attention, its community is growing well, and it has become one of the 'hottest' programming languages in the world.
Learning Swift sets you up for a brighter future and gives you the ability to develop applications for Apple's vast iOS ecosystem.
Rust Programming Language from Mozilla
Developed by Mozilla and released in 2014; in Stack Overflow's 2016 developer survey, Rust was voted the most loved programming language.
Rust was developed as an alternative to C++ for Mozilla itself, and is described as a programming language that focuses on "performance, parallelisation, and memory safety".
Rust was created from scratch and implements modern programming language design. The language is very well supported by many developers and libraries out there.
Julia Programming Language
The Julia programming language is designed to help mathematicians and data scientists. It is called "a complete high-level and dynamic programming solution for technical computing".
Hack Programming Language from Facebook
Hack is another programming language developed by Facebook in 2014.
Social networking giant Facebook developed Hack and touts it as one of its best successes. Facebook even migrated its entire PHP-based system to Hack.
Facebook also released an open source version of the language as part of its HHVM runtime platform.
Scala Programming Language
Scala is actually a relatively old language compared to the others on this list. While it is often viewed as relatively difficult to learn, the time you invest in learning Scala will not leave you sad and disappointed.
Its rich feature set gives you the ability to write better-structured, performance-oriented code. As a language that is both object-oriented (OOP) and functional, it gives you the ability to write code that is capable of evolving. Created with the goal of designing a "better Java", Scala has become one of the programming languages most in demand in large enterprises.
Happy birthday, Dr. Nabil Ali!
Today’s Google Doodle celebrates Egyptian pioneer of Arabic language computing, Dr. Nabil Ali, on his 82nd birthday. Dr. Nabil Ali’s innovations in the field of computational linguistics propelled the Arab world into the Information Age by creating programs that enabled computers to understand Arabic in digital form.
Dr. Nabil Ali was born in Cairo on this day in 1938. Expressing an interest in art at a young age, he was inspired to apply his creative passion for visual aesthetics to the world of engineering. After obtaining his PhD in Aeronautical Engineering at Cairo University, he spent over 20 years working as an engineer with the Egyptian Air Force, as well as with various computer and electronics companies throughout the world.
For Dr. Nabil Ali, digitization of Arabic, with its complex linguistic rules and morphology, was a way to connect Arabic speakers with the world.
Over the course of his career, Dr. Nabil Ali published a number of papers, books, and technical reports in support of the developments he was making in the field of computational linguistics. His work won him several awards, including the prestigious Saudi Arabian award, the King Faisal Prize, in 2012, recognizing his pioneering contributions to the Arabic Language and Literature.
TL; DR: Bringing up quantum computing is going to be the next catchall for everything and I'm already fucking sick of it.
Actual convo i had:
"You should really secure your AWS instance."
"Isnt my SSH key alone a good enough barrier?"
"There are hundreds of thousands of incidents where people either get hacked or commit it to github."
"Well i wont"
"Just start using IP/CIDR based filtering, or i will take your instance down."
"But SSH keys are going to be useless in a couple years due to QUANTUM FUCKING COMPUTING, so why wouldnt IP spoofing get even better?"
"Listen motherfucker, i may actually kill you, because today i dont have time for this. The whole point of IP-based security is that you cant look on Shodan for machines with open SSH ports. You want to talk about quantum computing??!! Lets fucking roll motherfucker. I dont think it will be in the next thousand years that we will even come close to fault-tolerant quantum computing.
And even if it did, there have been vulnerabilities in SSH before. How often do you update your instance? I can see the uptime is 395 days, so probably not fucking often! I bet you "dont have anything important anyways" on there! No stored passwords, no stored keys, no nothing, right (she absolutely did)? If you actually think I'm going to back down on this when i sit in the same room as the dude with the root keys to our account, you can kindly take your keyboard and shove it up your ass.
Christ, I bet that the reason you like quantum computing so much is because then you'll be able to get your deepfakes of miley cyrus easier you perv."10
Who the fuck decided that serverless computing is a good name for something that isn't serverless?
I hate it when marketing people decide they're technical - quote from a conference talk I regrettably sat through:
"The fourth industrial revolution is here, and you need to make sure you invest in every aspect of it - otherwise you'll be left in the dust by companies that are adopting big data, blockchain, quantum computing, nanotech, 3D printing and the internet of things."
2 things I'm working on now:
#1 a personal project I am hoping to commercialize and turn it into my moneymaker. Hoping it'd at least be enough to pay the bills and put food on my table so I could forget 9/5 for good. But it has a potential of becoming a much, MUCH bigger thing. This would need the right twist tho, and I'm not sure if I am "the right twister" :) We'll see.
#2 smth I'm thinking of open-sourcing once finished -- a new form of TLS. This model could be unbreakable even by quantum computing, once quantum computing is mature enough to crack conventional TLS. I'm probably gonna use MD5 or smth even weaker - I'm leveraging the weakness of hashing functions to make my tool stronger :)
I mean, how long can we keep racing against more powerful computers, eh? Why not use our weaknesses to make them our strengths?
Unit tests are already passing, I just haven't polished all the corner cases and haven't worked out a small piece of the initialization process yet. But it's very close.
In an age of GitHub and cloud computing, how can a freelance dev using their own laptop be classed as a security risk?
These crude rules laid down by corporate IT depts just make companies look silly.
After 10 years of thinking of getting into gamedev, I just joined a team game jam and it's going somewhere.
4 months ago I wrote a rant about how difficult it was for me to get into gamedev.
I guess I finally started because:
a) I'm not doing this alone
b) Another person takes care of the art
Regarding "a", computing, programming can be a very lonely task. I realized how much I missed the college years where I was paired up with other people to do something
There's something magical about being in a team.
You may not be a fan of your mates personalities. You may even hate their guts.
But working on something together, when everyone does the thing they should do, when things just flow... it's just magical.
When that happens, "all the bullshit goes away"™, and it's just you and your team sharing the same hope.
As for "b", I think I realized that, at least for my way of thinking, art (even in an initial, rudimentary state) is what ends up creating a game.
While I always tried to do it the other way around, first the game, then the art.
Maybe now I could dabble into pixel art and then use that as the thing that would define the game.
I was also an emotional mess for most of my 20s (and still kinda am, but not that much), so I guess that made getting into gamedev hard too.
Now, here's the negative part: the guy that does the art (and also codes) sucks balls at communicating and at git.
He takes a shitload of time to respond, doesn't address the things I state are important, doesn't join the damn trello, sometimes gives me some sass on his comments.
And he accidentally overwrote my changes on git three times.
The good thing is that he acknowledges his fuckups and fixes them.
I'm not really mad though. I'm almost 30, he's 20 or so.
When I was 20 I was a goddamn mess.
And it's just a week, and the pleasure of working with someone is far greater.
I joined the ACM (Association for Computing Machinery) when I helped my friends found our school's chapter.
I haven't had time to explore all it offers (other than free access to books I'm using for my certs), but I got an email saying they elected Cherri Pancake as President and I can't stop laughing. I feel a bit bad for the lady, as she may have had no say in her name (if it's her maiden name), but it's a wonderful name that makes me happy.
Perhaps more of a wishlist than what I think will actually happen, but:
- Everyone realises that blockchain is nothing more than a tiny niche, and therefore everyone but a tiny niche shuts up about it.
- Starting a new JS framework every 2 seconds becomes a crime. Existing JS frameworks have a big war, until only one is left standing.
- Developing for "FaaS" (serverless, if I must use that name) type computing becomes a big thing.
- Relational database engines get to the point where special handling of "big data" isn't required anymore. Joins across billions of rows don't present an issue.
- Everyone wakes up one day and realises that Wordpress is a steaming pile of insecure cow dung. It's never used again, and burns in a fire.
Yes, my Python scripts are not remotely pretty. But then, neither was my nonexistent formal training in scientific computing. And no, I will not 'write two lines of comments for every line of code'. Physics major programmer problems.
How to get investors wet:
“My latest project utilizes the microservices architecture and is a mobile first, artificially intelligent blockchain making use of quantum computing, serverless architecture and uses coding and algorithms with big data. also devOps, continuous integration, IoT, Cybersecurity and Virtual Reality”
Doesn’t even need to make sense12
Anything I (am able to) build myself.
Also, things that are reasonably standardized. So you probably won't see me using a commercial NAS (needing a web browser to navigate and up-/download my files, say what?) nor would I use something like Mega, despite being encrypted. I don't like lock-in into certain clients to speak some proprietary "secure protocol". Same reason why I don't use ProtonMail or that other one.. Tutanota. As a service, use the standards that already exist, implement those well and then come offer it to me.
But yeah. Self-hosted DNS, email (modified iRedMail), Samba file server, a blog where I have unlimited editing capabilities (God I miss that feature here on devRant), ... Don't trust the machines nor the services you don't truly own, or at least make an informed decision about them. That is not to say that every compute task should be kept local (search engines, AI and the like are best suited to centralized use), but ideally I do most of my computing locally, in a standardized way, and in a way that I completely control. Most commercial cloud services unfortunately do not offer that.
Edit: Except mail servers. Fuck mail servers. Nastiest things I've ever built, to the point where I'd argue that it was wrong to ever make email in the first place. Such a broken clusterfuck of protocols, add-ons (SPF, DKIM, DMARC etc), reputation to maintain... Fuck mail servers. Bloody soulsuckers those are. If you don't do system administration for a living, by all means do use the likes of ProtonMail and Tutanota; their security features are nonstandard but at least they (claim to) actually respect your privacy.
What you are expected to learn in 3 years:
digital signal processing,
signal and control system,
NLP, data algorithm,
Java, C++, Python,
ASP.NET web development,
computer security ,
Android app development,
iOS app development,
3D game development,
introduction to DevOps,
how-to -fix- computer,
Project of being entrepreneur,
and 24 random unrelated subjects of your choice
This is a major called "computer engineering"
Being a programmer in a scientific discipline can be infuriating.
using "no one" ="almost no one"
using everyone = "almost everyone"
1. No one knows what even the very idea of good practice is. And everyone refuses to learn. 3k lines of repetitive copy pasted main. 500 lines of plotting method.
2. Raw C-style pointer based array creation. Won't use develope array libraries because what if development stops. FUCKING HAVE YOU SEEN YOUR CODE WHAT IF DEVELOPMENT ON YOUR CODE STOPS. FUCK.
3. LOOP VARIABLES DECLARED AT THE BEGINNING OF THE METHOD WHY.
4. Everyone wants to make modular, independent code. No one wants to use OOP. NOPE. ALL IN ONE FILE. WRITE C++ LIKE A FUCKING PYTHON NOTEBOOK. FUCK.
5. LIBRARIES OH MY GOD PLEASE DO NOT CODE UP YOUR MATRIX MULTIPLICATION. PLEASE DO NOT TRIPLE LOOP IT. NO. THE LINEAR ALGEBRA LIBRARY WILL STAY IN DEVELOPMENT.
6. Please realize that literally not one comment over an 1800 line file does not help anyone.
FUCKING. WHY. WHY ARE WE SCIENTISTS SO GOOD AT SCIENCE AND SO FUCKING SHIT AT THE CODE THAT MAKES OUR SCIENCE HAPPEN. WHY. FUCKING. WHY. FUCK.
Fuck you Intel.
Fucking admit that your hardware has a problem!
"Intel and other technology companies have been made aware of new security research describing software analysis methods that, when used for malicious purposes, have the potential to improperly gather sensitive data from computing devices that are operating as designed. Intel believes these exploits do not have the potential to corrupt, modify or delete data"
With Meltdown one process can fucking read everything that is in memory. Every password and every other sensitive bit. Of course you can't change sensitive data directly. You have to use the sensitive data you gathered... Big fucking difference, you dumb shits.
Meltdown occurs because of hardware-implemented speculative execution.
The solution is to fucking separate kernel and user address space.
And you're saying that your hardware works how it should.
Shame on you.
I'm not saying that I don't tolerate mistakes like this. Shit happens.
But not having the balls to admit that it is because of the hardware makes me fucking angry.
Follow-up to my previous story: https://devrant.com/rants/1969484/...
If this seems too long to read, skip to the parts that interest you.
~ Background ~
Maybe you know TeamSpeak, it's basically a program to talk with other people on servers. In TeamSpeak you can generate identities, every identity has a security level. On your server you can set a minimum security level you need to connect. Upgrading the security level takes longer as the level goes up.
~ Technical background ~
The security level is computed by doing this:
SHA1(public_key + offset)
Where public_key is your public key in Base64 and offset is an 8 Byte unsigned long. Offset is incremented and the whole thing is hashed again. The security level comes from the amount of Zero-Bits at the beginning of the resulting hash.
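A CPU-side Python sketch of that computation (the exact way the offset is serialised before hashing isn't spelled out above, so this assumes it is appended as its decimal string form):

```python
import hashlib

def security_level(public_key_b64: str, offset: int) -> int:
    """Number of leading zero bits in SHA1(public_key + offset)."""
    digest = hashlib.sha1((public_key_b64 + str(offset)).encode()).digest()
    level = 0
    for byte in digest:
        if byte == 0:
            level += 8                      # a whole zero byte adds 8 zero bits
        else:
            level += 8 - byte.bit_length()  # leading zero bits of the first non-zero byte
            break
    return level

def find_offset(public_key_b64: str, target_level: int) -> int:
    """Brute-force the offset until the security level reaches target_level."""
    offset = 0
    while security_level(public_key_b64, offset) < target_level:
        offset += 1
    return offset
```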
My plan was to use my GPU to do this, because I heard GPUs are good at hashing. And now, I got it to work.
~ How I did it ~
I am using a start offset of 0, create 255 Threads on my GPU (apparently more are not possible) and let them compute those hashes. Then I increment the offset in every thread by 255. The GPU also does the job of counting the Zero-Bits, when there are more than 30 Zero-Bits I print the amount plus the offset to the console.
~ The speed ~
Well, speed was the reason I started this. It's faster than my CPU for sure. It takes about 2 minutes and 40 seconds to compute 2.55 Billion hashes which comes down to ~16 Million hashes per second.
Is this speed an expected result, is it slow or fast? I don't know, but for my needs, it is fucking fast!
~ What I learned from this ~
I come from a Java background and just recently started C/C++/C#. Which means this was a pretty hard challenge, since OpenCL uses C99 (I think?). CUDA sadly didn't work on my machine because I have an unsupported GPU (NVIDIA GeForce GTX 1050 Ti). I learned not to execute an endless loop on my GPU, and so much more about C in general. Though it was small, it was an amazing project.
Did you know that the national center of supercomputing in Switzerland (CSCS) dedicates 30% of its resources to computations for the Swiss weather channel?
Talk about priorities 😂
Hey guys it's not a rant, but i feel this place might help...
I am a 20 yr old, second year guy ...have got some experience in core Java and after that, i have been doing android for 8months... Yeah , i coded some basic apps got my hands dirty on firebase, sql libraries and some connectivity...
Even got landed in an internship.
Today I feel I'm an intermediate Android dev, and I know there are many things that can be learnt in Android that I don't know..
But what after that? Development as a career interests me, but I fear for job security... I could learn more of Android, maybe learn iOS after that, but there are always articles coming out that React is the future, web apps will replace Android, and stuff like that...
I have also heard stuff like companies today want to squeeze more out of their techs, so they want fewer but more complete developers with experience in both web and mobile app design and other stuff like that.
Are you freakin kidding me? Android and iOS alone are like drinking the Pacific and Indian oceans, and adding web development is like drinking every drop of ocean in the world.
I guess there are guys out there with knowledge of all three; maybe I can cover them all too (someday), but that would take my whole college life of 4 years.. (I guess)
And no ,I don't have problems with that too.. I actually like developing but again i hear big words like cloud computing, AR,VR AI, data sciences, automation, graphics designing, game dev, and many more...
Basically i hear too much and i fear too much 😅 and i don't think closing my ears would be a good choice...
So, which ocean of a career should I aim for? And are my fears real? Do companies really prefer some web guy designing Amazon-like apps over Android-only guys like me? Are automation and templates really gonna take all our developer jobs? Should I look into AI/data science?
Well, I am a simple guy who got his first PC at 17, so naturally I am fascinated even by the working of a calculator app and anything related to tech, so I am open to pursuing my interests in any field.
It was not until 20 that I had access to regular computing. In school I had to take up Finance as my Maths was weak. I couldn't take Sciences, including computers, and how could I? My childhood wasn't as fortunate as my peers'.
When I entered college I got my brother's old gaming PC, as we had a couple of work laptops at home. I was always the inquisitive one. I got interested in web development just out of curiosity while I was on my first job, and I hated it. I used to write articles, freelance and run a website for friends, where I learned a lot by trial and error. I single-handedly learned MySQL, PHP and basic web development.
The main job was a core night shift from 11 pm to 8 am. It drained me and my social life drowned. I lost my brother in an accident. Silver lining: I quit my job.
I understood I was interested in computers like nothing else. I single-handedly learned a programming language. After leaving the job I took up classes to learn from root level in a structured manner: web design and development.
Now, though I am jobless and searching for my second job, it is for something I love. :)
Honestly, I'm crazy fucking excited about ML, blockchain and containerisation.
Their applications outside traditional computing are what does it for me. I'm hoping to learn a shit ton of Python and ML so I can apply it to help my significant other's family's research using Molecularly Imprinted Polymers to find and treat cancers.
I'd like to build statistical models of varying types and progressions of cancer and map how the MIPs help and interact so they can be better optimised.
Okay, I think I am losing it. How do you explain to a customer, who is clearly a big-ass bullshit eater/buzzword bitch, that a distributed computing VM still has to execute code on some hardware at some point?
Because if I can't, I may buy a plane ticket to Canada and an axe, and that is not for cutting lumber.
The TA for my computing lab in uni consistently shows up 45 minutes late. I'm usually done in 20 because I use the rest of the time to work on the next lab.
He walks through the door, lets out the biggest sigh, sits down, sighs again, opens up his laptop, and sighs once more. When someone asks for help, he sighs so hard you can see his lungs shrivel up as he exhales, and then provides them with a pointless answer.
The best part about the cs department here is that when you join cs, you are given an account to use with the ubuntu machines in the computer labs. They send you the password over school email, and you can't change it on any system they provide.
Informative article on why Golang is relevant in today's computing ecosystem. I too see a lot of server-side programming being done in Golang nowadays. I like its C-like features and its simplicity over complexity.
Once again I have loads.
My best teachers were...
The contractor that taught me C#, ASP MVC and SQL Server. Dude was a legend, so calm and collected. He wanted to learn jQuery and Bootstrap, so at the same time as teaching, he was learning from me. Such an inspirational person, to know your subordinates still have something to teach you. He also taught me a lot about working methodically and improving my pragmatism.
The other: in school I studied computing A-Level. I scored 100% in at least one of the exams... basically I knew my stuff.
But, as a kid, I didn’t know how to formulate my answers, or even string together coherent answers for the exams. This dude noticed; the first thing he did was say “well you’re better at this bit than me, practice but you’ll be fine” (manually working out two’s complement binary of a number).
Second thing he did was say “you know what man, you know what you’re on about but nobody else is ever going to know that”.
He helped me on the subjects I wasn’t perfect on, then he helped me on formulating my answers correctly.
He also put up with my shit attendance; being a teenager with a motorcycle who thinks he knows it all has its downsides.
As a result, I aced the hell out of that course, legendary grades and he got himself a bit of a bonus for it to use on his holiday. Everyone’s a winner.
Liam, Jason, if you guys are out there I owe you both thanks for making me the person I am today.
The worst, I’ve had too many to name... but it comes down to this:
- identify your students’ strengths and weaknesses, and focus on the weaknesses
- identify your own and know when to ask for help yourself
- be patient, learning hurts.
You can always tell a passionate teacher from one who’s there for the paycheck.
i honestly hate the ap computer science principles curriculum. we're taking an ap test soon, so for the past few weeks, we've been constantly taking practice tests.
it pisses me off so much. the questions, the criteria, it's all bs.
we have questions like "what will reduce the digital divide?" with choices like "education for low income families on computers." like, I DON'T FUCKING KNOW.
frankly, I DON'T FUCKING CARE. giving electronics to people who can't afford it is great and all, BUT IT DOESN'T INVOLVE ANYTHING ABOUT COMPUTING.
HEY, COLLEGE BOARD, KNOWING IF AN ALGORITHM IS TECHNICALLY AN "ABSTRACTION" DOESN'T FUCKING MATTER. WHAT MATTERS IS THAT I CAN IDENTIFY WHAT'S MORE EFFICIENT, WHERE A BUG IS, CONCEPTS INVOLVED IN PROGRAMS, THINGS LIKE THAT.
NOT IF DNS IS SIMILAR IN STRUCTURE TO THE US POSTAL SYSTEM.
god i hope whoever wrote this gets hit in the head by a github server that was dropped from the 2^8th floor.2
The Fibonacci sequence, also known as the fingerprint of God, is the most efficient memory maintenance method in computing as well
There are a couple of them to list! But to sum up my main ones (biggest personal heroes):
John McCarthy, one of the founding fathers of Artificial Intelligence and credited with coining the term (sometime before 1960 if memory serves right), a mathematical prodigy; the man based the original model of the Lisp programming language on lambda calculus. Many modern concepts that we have in programming were implemented in one way or another from his systems back in the day, and as a data analyst and ML nut.....well I am a big fan.
Herb Sutter: C++ programmer extraordinaire. I appreciate him more for his lectures and published articles than anything else. Incredibly smart and down to earth and manages to make C++ less intimidating while still approaching it with respect.
Rich Hickey: The mastermind behind Clojure, the Lisp dialect for the JVM. Rich is really talented and his lectures behind his motivations and reasons behind everything he does with Clojure are fascinating to see.
Ryan Dahl: Awww shit y'all know how it is. The man changed web development both in the backend and the frontend for good. The concept of people writing their own servers to run their pages was not new, but the Node JS runtime environment made it more widely available to people by means of a simple to use language that was already popular with web developers. I would venture to say that Ryan's amazing contributions to JS made the language better, as it stands, the language continues to evolve and new features that make it overall better keep being added. He is currently building Deno, which would be a runtime environment for TypeScript, in Rust.
Anders Hejlsberg: This dude was everywhere man....the original author of Turbo Pascal and the lead of Delphi back in the day. These RAD tools paved the way for what would be a revolution in the computing world. The dude is also the lead architect and designer of the C# programming language as well as TypeScript.
This fucker is everywhere and I love it.
Yukihiro "Matz" Matsumoto: Matsumoto-san is the creator of the Ruby programming language. Not only am I a die-hard fan of Ruby, but of the core philosophies that the man keeps at the core of his language design: make the developer happy, principle of least surprise. I also follow MINASWAN, a term coined by the Ruby community that stands for "Matz is nice and so we are nice" <---- because being cool to others is better than being a passive aggressive cunt.
Steve Wozniak: I feel as if the man does not get enough recognition...the man designed the Apple II computer which (regardless of how much most of y'all bitch and whine) paved the way for modern microcomputers. Dude is also credited with designing one of the first programmable universal remotes (which momma said was shitty) but he did it nonetheless.
Alan Kay: Developed Smalltalk and the original OOP way of doing things. Smalltalk as a concept is really fucking interesting. If you guys ever get the chance, play with Pharo, which is a modern Smalltalk. The thing is really interesting and the overall idea of Smalltalk can be grasped in very little time. It sucks because the software scales beautifully in terms of project building, the idea of hoisting a program as its own runtime environment and ide by preserving state through images is just mind blowing to me. Makes file based programs feel....well....quaint.
Those are some of the biggest dudes for me. I know that the list is large, but I wanted to give credit to the people that inspired me the most. Honorary mention goes to other language creators and engineers of course, but it would be way too large to list!7
VIM! ViM! vim! Vi Improved! Emacs (wait, ignore that one). What's this mysterious VIM? Some believe mastering this beast will provide them with untold mastery over the forces of command line editing. Others would just like to know how you exit the bloody thing. But in essence VIM is a command line text editor at heart, and its learning curve is so steep it's a circle.
There are a lot of posts on the inter-webs detailing how to use that cruel mistress that is VIM. But rather than focus on how to be super productive in VIM (because honestly I've still not got a clue), this focuses on my personal journey: my numerous attempts to use VIM in my day to day work, to eventually being able to call myself a novice.
My VIM journey started in 2010, around the same time I was transitioning some of my hobby projects from SVN to Git. It was around that time that I attempted to run "git commit" in order to commit some files into one of my repositories.
Notice I didn't specify the "-m" flag to provide a message. So what happened next? A wild command line editor opened in order for me to specify my message; foolish me assumed this editor was just like similar editors such as Nano. After much CTRL + C'ing, CTRL + Z'ing, CTRL + X'ing and a good measure of Google, I was finally able to exit the thing. Yeah…exit it. At this moment the measure of the complexity of this thing should be kicking in already, but it's unfair to judge it based on today's standards of user-friendliness. It was born in a much simpler time, before even the mouse graced the realms of the personal computing world.
But anyhow I'll cut to the chase, for all of you who skipped most of the post to get to this point: it's ":q!". That's the keyboard command to quit…well, kinda; this will quit the program. But…you know what, just go here: The Manual. In fact that's probably not going to help either, I recommend reading on :p
My curiosity was piqued. So I went off in search of a way to understand this VIM thing. It seemed to be pretty awesome; looking at some videos on YouTube, I could do pretty much what Sublime Text could, but from the terminal. Imagine ssh'ing into a server and being able to make code edits, with full autocomplete et al. That was the dream; the practice…was something different. So I decided to make the commitment and use VIM for editing one of my existing projects.
So I fired the program up and watched the world burn behind me. Ahhh…why can't I type anything? No matter what I typed, nothing seemed to appear on screen. Surely I must be missing something, right? Right! After firing up the old Google machine again, it would appear there is this concept known as modes. When VIM starts up it defaults to a mode called "Normal" mode; hitting keys in this mode executes commands. But "Insert" mode, entered by hitting the "i" key, allows one to insert text.
Finally, I thought, I think I understand how this VIM thing works: I can just use "Insert" mode to insert text and the arrow keys to move around. Then when I want to execute a command, I just press "Esc" and type the command, such as the one for saving the file. So there I was, happily editing my code using "Insert" mode and the arrow keys, but little did I know that my happiness would be short lived; the arrow keys were soon to be a thorn in my VIM journey.
Join me for part two of this rant in which we learn the untold truth about arrow keys, touch typing and vimrc created from scratch. Until next time..
Gdpr thing aside...
Does anybody read the new policy...
I just did for 2 apps (intel driver software and 9ga.. You know it)...
The things that they collect on the side without you knowing include:
Device memory, language, battery level, timezone, unique device identifiers, compass, accelerometer and microphone... even service provider and signal strength.
Websites you visit, how you use your computer (vague, too vague) and your computing environment.
Did anybody know for sure before this that their apps are listening to them? That they just made a profile of you with all the data?
With all this they don't even need your IP, they already know who you are and what you do on a daily basis...
There are 20 more but it would be waaay too much to write about. These 2 are way worse since 9ga doesn't use the microphone for anything... And why does a driver read what websites I am going to?4
I hate when people ask you to find their deleted files. Fucking people! It is like asking an architect to recover something from their trash bin. People are idiots that don't want to learn. Some people think that they know a lot about computing and can barely power on their monitors. At this level of average stupidity, people should have to get licenses to use computers.7
Hey guys, I have a serious question for you: How do you define science?
And yes this is going to be a long Rant. This topic really pisses me off.
A bit of context first. I come from a "humanities" background. I study history and dude, I love it. The problem is that even though we fucking pull our brains out studying historical phenomena with a fucking ton of conceptual tools, our work is mostly seen as literature to entertain the elderly during their lonely evenings. But that's not really the point of this rant.
My fucking problem is that while we try to do some serious work; actual work that could help society for real, it all goes into that magical fucking kingdom called "humanities". HOW THE FUCK DO THEY DARE TO CALL SOMETHING "HUMANITIES". IT'S A FUCKING HISTORICAL TERM THAT MEANS "TO FULFILL MEN IN ALL THEIR ASPECTS", AND NOW THEY'VE REPURPOSED IT, MAKING IT CONTAIN ANY STUDY THAT ISN'T "EMPIRICAL", "OBJECTIVE", ADD ANY FUCKING SCIENTIFIC DELUSIONARY TERM YOU CAN THINK OF.
And don't get me started on "objectivity". Oh boy, your fucking objectivity is as hollow as a kid's balloon. There is no such thing as an objective study, even when it applies your "rational", "godly" scientific method. Some guys follow that shit as if it was a fucking religion. I do understand it's useful and all that, but in the end it's just a tool; you can't fucking define "science" by its tools.
"""Q: What is carpintery?
A: Well, it's hammers, nails and wood. Yep. Hammers, nails and wood."""
THE SCIENTIFIC METHOD WAS FUCKING INVENTED DURING THE XVIII CENTURY, WHAT THE FUCK DO YOU THINK WAS GALLILEI BEFORE THAT? "HUMANITIES"?
Why do I say objectivity isn't possible? Well, guess what? YOU ARE FUCKING HUMAN. Everything you know is full of preconceptions and fucking cultural subjectivities invented to understand the world. And it's ok, because if you understand your own subjectivity, at least you can see yourself in a critical sense, and at least "tend" to objectivity, in the same way functions tend to infinity.
And here comes the best part: people studying "cs" in my university spend most of their time studying a ton of shit that isn't really science, but is taken as scientific because it is related to "science". These guys spend entire semesters just learning programming foundational stuff that in my opinion isn't really science; it's just subjective conceptual constructs built to make the coding process better. They only have TWO fucking classes on discrete mathematics and another 3 or 4 in actual scientific fields related to computing. THESE GUYS AREN'T FUCKING BEING TAUGHT TO BE COMPUTER SCIENTISTS; THEY ARE BEING TAUGHT TO BE PROGRAMMERS. THERE'S A HUGE DIFFERENCE BETWEEN CS AND PROGRAMMING AND THAT IS THE WORD SCIENCE. And yes, I'm being drastic on the definition of science on purpose because guess fucking what? I'M PISSED OFF.
"Hey, what are you doing?"
"Just doing science with scrum and agile development."
I understand most of you guys would think of science as "the application of the scientific method", "knowledge by experimentation and peer-review", "anything techy". Guys, science is a lot broader than that. I define it as "the search for truth", mainly because that's what we are all doing, and what humans have been doing to gain knowledge through the ages. It doesn't matter what field of truth you are seeking as long as you do it seriously and on solid foundations. I don't fucking care if you can't be objective: that's impossible. Just acknowledge it and continue investigating accordingly.
I believe during the last centuries the concept of science has been deformed by the popular rise of both the natural and applied sciences. And I love the fact that these science fields have been growing so much all this time, but for fucks sake don't leave every other science (science as I define it) behind. Governments and corporations make huge mistakes because they don't treat history, politics and other sciences seriously. Yes, I called history a "science", fuck you.
And yes, by my definition programming is not a science. I don't know what most of you think programming is, but for me it's a discipline that builds stuff, similar to carpentry or blacksmithing. Now if you are pushing the limits, seeking ways to make computing go further, then that's science. The guys that are figuring out AI are scientists; the guys that are using it to detect hotdogs aren't - unless they are the same person - deal with it. I guess a lot of you guys are with me on this point.
In the end, we are all artisans building abstract tools by giving orders to a machine.
I still have some characters left, so I want to thank the community as a whole for letting me vent my inner rage. I don't have many ways to express myself on these matters, so for me DevRant is a blessing.8
Everything will be about the same, but faster. Quantum computing will allow brick-shitting speeds of data processing, Nvidia will at some point develop a quantum GPU and call it Fuckall architecture or something that will allow to simulate all the atomic-level physics of a whole car (and stuff), 1Tb network speeds will be common, websites and databases work in a blink of an eye.
Also someone will find a spectre/meltdown-level vulnerability in quantum CPUs and everyone will get f-d in the a. Again. Almost.14
These were back in high school when I was around 13 or 14, and no one taught me any HTML; I had to figure it out myself by reading scarce references:
*When I started to try configuring my Friendster profiles with CSS;
*when I successfully made cute sites for me and my friends on Geocities with personalized free domain names;
*Oh, I made little pages locally for my favorite bands;
*and, when I experienced computing shit at DOS level
Those are little things that drove me into learning indepth programming.
Studying software development in the evenings. More so to get the piece of paper than to learn. Just reading up the lecturer's definition of cloud computing...
"It was a fluffy shape which represented something we couldn’t contemplate in its entirety"
I fear for the others in my course1
Today I'm beginning my third year of Bachelor in embedded computing. And just as last year, I'm bored as fck. "Learning" the same stuff over and over, and wasting my time when I could be at work as a PHP developer ... FML7
What is fucking wrong with Windows? When shit doesn't respond it's impossible to kill it and it freezes other processes. NEVER happens in Linux, all I do is kill the PID. When you can't open Task Manager or "end the process" you are shit out of luck. You'd think they'd have fixed this in the decades they've had to build a computing platform. I'd use Linux exclusively but some work and tools at my company necessitate Windows.8
I dunno if you gents remember the Nickelodeon show known as Drake and Josh.
It was pretty big in Mexico and the U.S.
Well, one of the characters from that show is the singer/actor Drake Bell.
For a while, Drake Bell would **constantly** tweet about how much Justin Bieber sucks.
I aint denying that Justin Bieber sucks, i don't like his music at all.
But the constant attacks came out as jealousy, at least to me.
What does this have to do with development or even computers? Well, this is EXACTLY how I feel about Louis Rossmann CONSTANTLY making videos about Apple products.
We get it man we really do, sadly for a lot of us the only way to get ios development done is through a fucking Mac
EVEN if his whiny ass is right about the hardware not being top notch and all that shit, I am still not able to explain a 2013 (early... as in January) MacBook Pro still working with literally NO fucking problems. Before that, the other MacBook was only replaced because we wanted the 2013 model. The thing worked, the one before did so too, and the 2017 model that I have works, amazingly so I will add.
Still, the army of Dell, HP and Lenovo laptops that I've had before just died or are not functioning properly. Either it is my shit luck or Apple's "shitty hardware" got something really fucking right.
I think it's retarded really. If you don't like them then fine, you don't have to; personally I fucking love all computers and OSes, but I don't get fanboys hating for the sake of hate.
Why the fuck do you care if I spend 2500 on a computer? I'd spend the same on your mom and the computer would last me longer.
Does owning multiple macs make me better than you? No
Does this mean that you are piss poor and can't afford shit and that is why you are hating? No
Will I call you <insert number of insults> for your choice of PC or OS? No
What is retarded is this: you all are DEVELOPERS(at least a good chunk) and your ass better fucking know that some people USE a certain tool because IT IS THE RIGHT ONE FOR THE JOB.
It is a damn fine operating system, a really good computing experience. It ain't your taste? Fine, das cool, but for fucks sake it does not mean that the other people are idiots or whatever.
Grow the fuck up and get yourself an opinion.20
Every god damn time I have to interact with windows I hate it more and more. I sold my old Dell PC, and said I'd install Win10 on it for the dude.
It's now been 2 hours of me trying to figure out why the Win10 installer complains about missing drivers, and trying to figure out exactly what drivers are missing, because they sure as absolute fuck don't tell you.
"A media driver your computer needs is missing. This could be a DVD, USB or Hard disk driver. If you have a CD, DVD, or USB flash drive with the driver on it, please insert it now."
Well how the fuck am I supposed to know what driver you need? The least helpful error message in the history of computing....12
I'm not going to lie, the surge of bootcamps really irks me. Not because I'm afraid of competition, or because I'm an elitist. Mainly because a lot of people who attend these bootcamps have no real interest in software engineering. I sometimes attend a meetup, and it's a beginner meetup. I try to help out. And a lot of people clearly have no patience for learning software engineering. I try to be encouraging, but sometimes I just want to be a dick and tell them "Why the hell do you want to be a dev, if you're not interested in how computers work?"
I'm a 100% self-taught developer. Granted, I'm 38 and taught myself programming at 14. But it came out of an earnest desire and love for technology in general. So I never shied away from learning. C and assembler? Bring it on. Theoretical computer science? I can get with that. I loved computers so much that I was willing to learn about anything in the realm of computing.
This is what annoys me with the adult bootcamp crowd. I feel they're only willing to learn as long as it's easy. If something gets complicated or complex, then they check out. And a lot of their questions are "tell me how to do this/that". But they don't know why they would do it.
To me it feels like they're trying to fast track themselves to a dev job. Yet you would think if they're trying to do this all professionally, they would be open to learning as much as possible, and not closing themselves off.
My semi-friend who runs the meetup is trying to start a bootcamp himself. So I severely hold my tongue when I attend those meetups. And I want to be supportive. I certainly don't want to be the reason why people are turned off by programming. But at the same time, I hate how people are abusing this profession because they think it's fast money and an easy way to earn 6 figure salaries.5
I’ve been trying to implement an alarm clock as an example for my physical computing lecture but the merge of my existing low power clock with my existing state machine based timer is driving me nuts. And I’m the lecturer of this course!2
My first rant here, I just found out about it. I don't have much of a programming background, but it has always triggered my interest. Currently I am learning many tools; my aim is to become a data scientist. I have done SAS, R and Python for it (not proficient yet though), I'm also working on Google cloud computing and database resources, and I'm going to start Machine Learning (Andrew Ng's Coursera).
Can anybody advise me: am I doing it right or not?2
Develop my first mobile app with a restful backend for consumer usage
Learn more about cloud architecture/computing
Finish learning calculus
Learn linear algebra, discrete math, statistics and probability
Maybe start ML this year depending on math progress and time2
Thought I'd mention that there's more books on the study of sexuality than there are of computing at my uni.
- Buy some big old computer.
- Sell them as quantum computer.
- Get cash and run away.
Currently this idea may work. 🤔3
Just finished Microsoft's newest CEO Satya Nadella's book "Hit Refresh." It was actually really great. He talks about changing Microsoft's culture and global impact, inspiring makers, as well as what the needs are going forward in technology.
It's going to be a long rant here and probably my first rant! And yes, I am pissed off with a community growing in the dev world.
There are so-called framework experts who are so good that they can spin up a Node.js server with Express and MongoDB.
So to the people who bash on PHP, who bash on MySQL for no fucking reason other than they have heard these are not so cool: fuck yourself, incompetent piece of crap!!! I hear all day from these people about how algorithms and data structures are not important. Fuck you, because if you don't know / understand / want to understand the basics of computing, how the fuck can your brain be trusted with anything serious?? If you can't write down proofs of basic / standard algorithms and still bash on people who do, please fuck you, because those are the people indirectly responsible for your job, so that you can work on fancy frameworks and cool IDEs.
Instead of whining, dedicate some time to your maturity and knowledge, because that's what we devs are all about. We like solving problems, right?
I repeat: if you are just starting an IT career, in your mid 20s maybe, leave everything if you can. Forget all the fucking frameworks and technologies and start with the basics of computing, right at the instruction level, using assembly. Then move to a higher-level language when you know and can reason about what your CPU is actually doing.
If you can't do that and keep on crying and bashing things down without proper explanations, fuck yourself with a cactus.5
So for those of you keeping track, I've become a bit of a data munger of late, something that is both interesting and somewhat frustrating.
I work with a variety of enterprise data sources. Those of you who have done enterprise work will know what I mean. Forget lovely Web APIs with proper authentication and JSON fed by well-known open source libraries. No, I've got the output from an AS/400 to deal with (for the youngsters amongst you, AS/400 is a 1980s IBM mainframe-ish operating system that originally ran on 48-bit computers). I've got EDIFACT to deal with (for the youngsters amongst you: EDIFACT is the 1980s precursor to XML. It's all cryptic codes, + delimited fields and ' delimited lines) and I've got legacy databases to massage into newer formats, all for what is laughably called my "data warehouse".
But of course, the one system that actually gives me serious problems is the most modern one. It's web-based, on internal servers. It's got all the late-noughties buzzwords in web development, such as AJAX and jQuery. And it now has a "Web Service" interface, at the request of the bosses, that I have to use.
The programmers of this system have based it on that very well-known database: Intersystems Caché. This is an Object Database, and doesn't have an SQL driver by default, so I'm basically required to use this "Web Service".
Let's put aside the poor security. I basically pass a hard-coded human readable string as password in a password field in the GET parameters. This is a step up from no security, to be fair, though not much.
It's the fact that the thing lies. All the files it spits out start with that fateful string: '<?xml version="1.0" encoding="ISO-8859-1"?>' and it lies.
It's all UTF-8, which has made some of my parsers choke, when they're expecting latin-1.
But no, the real lie is the fact that IT IS NOT WELL-FORMED XML. Let alone Valid.
THERE IS NO ROOT ELEMENT!
So now, I have to waste my time writing a proxy for this "web service" that rewrites the XML encoding string on these files, and adds a root element, just so I can spit it at an XML parser. This means added infrastructure for my data munging, and more potential bugs introduced or points of failure.
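For what it's worth, the shim can stay tiny. A minimal sketch of the idea, assuming the upstream service is plain HTTP GET and using only the Python standard library (the URL, credentials and root tag name here are placeholders, not the real system's values):

```python
# Hypothetical sketch: fetch the "XML", fix the lying encoding declaration,
# and wrap the rootless fragment soup so a standard parser accepts it.
import urllib.request
import xml.etree.ElementTree as ET

UPSTREAM = "http://internal-server/webservice?user=me&password=notasecret"  # placeholder

def fetch_fixed_xml(url: str = UPSTREAM) -> ET.Element:
    raw = urllib.request.urlopen(url).read()
    text = raw.decode("utf-8")            # the payload is really UTF-8, not ISO-8859-1
    if text.startswith("<?xml"):
        text = text.split("?>", 1)[1]     # drop the incorrect XML declaration
    return ET.fromstring("<root>" + text + "</root>")  # add the missing root element
```

It does not make the upstream service any less broken, but it keeps the workaround in one place instead of leaking into every parser downstream.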
Let's just say that the developers of this system don't really cope with people wanting to integrate with them. It's amazing that they manage to integrate with third parties at all...2
At work we get points each month if we don't miss work for the entire month. I have enough points to get a Canon EOS eSLR, or a new mobo-cpu-mem combo. I hate having to make decisions and choices.6
I'm learning Advanced Computing at an institute. Prior to that I have done many projects with Unity 3D, MEAN and C#. But the guy who is teaching is very rude to the class every time: 'you people are stupid', 'you can't do anything on your own', 'that's why you're here'. And I'm a self-taught programmer, so I'm getting really angry at such comments. How do I deal with this?? 😥😫5
Since learning how to program, I have started to see the world in a different way. I have adapted the algorithmic and mathematical way of approaching computing problems to approach all of my problems. Everything is just a problem that can be solved by taking a logical approach!
Every few fucking years some new shit comes up. Everyone and their dog have either a programming language or a JS framework and each and every one of them is the very best to learn and use immediately otherwise you are weak and will not survive the winter.
And now this shit with qubits and what have you? "they are installing the first quantum computer in germany" and "IBM have been making quantum computers for the military for 375 years now".
Just started learning gnuplot yesterday. Sure, it's not the shiniest of tools, but I'd heard enough about its performance to give it a go.
It's like learning vim. You Google thrice to write a single functional line. You spend hours trying to find a single command for a single task.
But. GODDAMN. This thing's the fastest plotting framework I've ever dealt with. I love Matplotlib, but as great as its plots are, when I need to plot shit up in half a second, I've found a new friend.
Also, tutorial suggestions appreciated.1
Unlike the built-in ** operator, math.pow() converts both its arguments to type float. Use ** or the built-in pow() function for computing exact integer powers.
Well who knew?
source: python docs3
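For instance, a quick check of the quoted behaviour in plain Python:

```python
import math

# ** and the built-in pow() stay in integer land and are exact.
print(3 ** 40)                            # 12157665459056928801
print(pow(3, 40))                         # 12157665459056928801

# math.pow() converts to float, so large integer powers get rounded.
print(math.pow(3, 40))                    # ~1.2157665459056929e+19
print(int(math.pow(3, 40)) == 3 ** 40)    # False: the low digits are already gone
```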
"I see you're computing the same result multiple times, you shouldn't do that, here's how you optimize that out"
Okay listen you fuck, that's a null guard which goes directly into throwing an exception. The most optimal path is getting past the null guard as quickly as possible, which is what I do. Once you've failed the null guard, throwing an exception faster doesn't do you any good.
I swear plenty of FOSS programmers don't even really look at the project, they just find "errors" that make them feel smart.5
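For anyone who hasn't hit this review comment yet, a hypothetical sketch of the pattern being described (not the actual project code, just an illustration):

```python
def handle(request, resolve_user):
    # Happy path: one cheap guard check, then straight on to the real work.
    user = resolve_user(request)
    if user is None:
        # Failure path only: the "duplicate computation" happens here, purely
        # to build a richer error message. We are about to raise anyway, so
        # caching it would not speed up any path a caller actually cares about.
        raise ValueError(
            f"no user for request {request!r} "
            f"(resolver returned {resolve_user(request)!r})"
        )
    return user.id
```

Hoisting the second call into a variable is harmless, but calling it an optimization misreads which branch is hot.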
Was wondering if there are any open source or free-for-non-commercial-use projects (or approaches to solving this) that I might not have heard of, related to what I'm about to describe (I'd like to tinker with it / learn about it as my weekend adventure first)...
Usually, when you've got 'n' amount of machines (be it VMs, bare metal or just a bunch of Raspberry Pis) the usual approach is one of the following:
1) Run them as independent machines, each serving its own purpose;
2)Run them as independent machines but "tied together" with software configured in a cluster and featuring some sort of replication;
3)Pool them together into a "swarm"/resource pool and provision VM's across them, with resources spread across the machines;
But here's where I started wondering: what if I had 3 identical desktop PCs with relatively decent computing power and resources allocated to each... stay with me here - I can already feel some of you writing this off completely or already preparing to answer how impossible this is... what if I was ready to sacrifice a portion of the combined I/O or computing power so I could set these 3 computers up in some form of "grid computing" (2-3 machines working as one)? Are there any solutions out there that could take care of the related problems (I/O error/fault handling, fragmentation, CPU governing, timing, synchronization.... all the stuff that sane and normal people don't have to worry about because it's already been taken care of at the lowest levels of programming/computing)?
I'm also thinking about average consumer-grade x86/x86_64 and ARM architectures here, not some exclusive space-grade architectures normally used in supercomputers.7
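Not the transparent "three boxes act as one" setup being asked about, but for the weekend-tinkering angle, here is a minimal sketch of the tied-together flavour (options 2/3) using only the Python standard library; the hostname, port and authkey are placeholders:

```python
# One box plays head node and publishes a shared work queue over TCP;
# the other boxes connect as workers and pull items from it.
import queue
import sys
from multiprocessing.managers import BaseManager

class QueueManager(BaseManager):
    """Exposes a single shared task queue over the network."""

def run_head():
    tasks = queue.Queue()
    for n in range(100):                      # enqueue some dummy work
        tasks.put(n)
    QueueManager.register('get_tasks', callable=lambda: tasks)
    mgr = QueueManager(address=('', 50000), authkey=b'secret')
    mgr.get_server().serve_forever()

def run_worker(head_host):
    QueueManager.register('get_tasks')
    mgr = QueueManager(address=(head_host, 50000), authkey=b'secret')
    mgr.connect()
    tasks = mgr.get_tasks()
    while True:
        item = tasks.get()                    # blocks until work arrives
        print('worker result:', item ** 2)

if __name__ == '__main__':
    run_head() if sys.argv[1] == 'head' else run_worker(sys.argv[2])

# python jobqueue.py head            (on the head node)
# python jobqueue.py worker <head>   (on each of the other machines)
```

True single-system-image clustering (what projects like OpenSSI or Kerrighed attempted) is a much harder problem; the sketch above only spreads work, it does not fuse memory or I/O.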
I don't really understand how my mid-range computer seems to outperform my top-of-the-line 12-core computer on some seriously intensive I/O and memory applications; it's like the bus in my mid-range laptop is better.3
What is the probability of an alien rootkit signal being intercepted by a satellite and then executed on modern computers to create an AGI that can use cloud computing and digital currency to take over our world?
From my perspective pretty high 🤣🤣🤣
Let's convince some government people and create an intergalactic cyber attack defense institution that would keep Earth safe from alien invasion, with big money grants so we can prevent those threats.
Maybe Ernest Cline's Armada is already a thing.
What do you think?2
Ok, so the new programming language Q# is out. VERY exciting for me! I love the idea of quantum computing! Then I realize that developers will need to know the basics of quantum physics to use it effectively. Yay or nay? Welp, those extremely big, expensive machines won't program themselves (yet).2
Today I learned:
My computing teacher has never thought about having the front end and back end as different apps. He always just connects to the database from the client and calls the client's non-GUI parts the backend.
I now understand why he was so confused when I said I was thinking about the backend accepting web-based and desktop-based clients.2
One of my biggest epiphanies came through this fundamental critique in SICP of the assignment operator. After years of imperative programming it seems so innocent, doesn't it? But that you lose referential transparency, run into the aliasing problem and the fundamental difficulty of determining object equality (or that of their instances) - that was kind of eye-opening, considering all the pain I had already experienced with state in concurrency.
(It led me so far as to think it's an ontological issue, that even in the discrete computing universe we have not come much further than Zeno's paradoxes on change.)7
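A few lines of Python illustrate the aliasing point, for anyone who hasn't been bitten yet:

```python
a = [1, 2, 3]
b = a                  # assignment makes b an alias of a, not a copy
b.append(4)
print(a)               # [1, 2, 3, 4]: mutating through b changed "a" too
print(a == b, a is b)  # value equality vs. identity, now two separate questions

c = [1, 2, 3]
d = c + [4]            # the "functional" version: a new list, c keeps meaning [1, 2, 3]
```

Without mutation there is nothing to alias, which is exactly the property SICP is mourning.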
what do you recommend for me to learn about next?
I have learnt about:
- web frontend/backend (php)
- android and java
- c, c++, nasm, gnu assembler
- parallel computing
- cli operating systems
with that background, what would you recommend?
- neural networks
- making a server
- ethical hacking
- starting a blog7
1 - I love coding because ever since I was a kid I have really loved solving problems and creating things
2 - I always tried to understand how computers worked, and how you could make a program, because when I was a kid I was almost always on the computer and my dream was to create a virus 😂
3 - I was studying for my baccalaureate and I hadn't decided what to study at university. I was only playing videogames and installing software to make jokes. So, my computing teacher taught me to code in VB.net and how to manage a local network, so I decided to study an IT degree before going to university, and while studying that I fell in love with programming, so I'm currently at university studying software development engineering
How far we've come. Like two or three years ago quantum computing was something from a sci-fi movie, and now there is a fucking programming language for quantum computers that anyone can learn, in theory!3
in case you don't know what Cloud Computing is, or if you'd like to know more about it, check out this video. https://youtube.com/watch/...2
The CS instructor (maybe an adjunct?) who, no matter what homework we turned in, gave us randomly different grades. We actually all handed in the same hw one time to see what would happen (only changing variable names and such), and confirmed it. Also, he used to call me J-Lo when I'd raise my hand to answer questions. (I was fit and am Latina) The second time he did it, I sternly corrected him in front of the whole class. He stopped after that. And yes he was gone from the school soon after!5
Oh, me? I am so excited about all the computing power that's gonna be stolen from people who had updated their Intel CPUs last month.
I dunno what they're up to but I'm sure it's very exciting. I'm torn between Skynet and one-world-government cryptocurrency mining as what they'll use the power for.
What do you think?3
Quantum computing is at least trying to be the next "ML".
People seemed to ignore it for decades but suddenly, a few months back, everyone got excited about Google's headline progress.
Later, people realized it is not a big deal and everyone moved on.8
In computing class, we were asked to identify different connectors/ports in one lesson.
Someone couldn't tell what a USB port looked like >.>3
The moment I knew I wanted to be a dev was very early in life, but I didn't realize it until I had gotten out of high school. My parents gave me my first computer when I was like 8 and it was my grandfather's old Windows 95 PC. I loved to play the Army Men game with the plastic figures like from Toy Story. I also tinkered around and found out how Word and some of the other programs worked. About two years later, I got his old Windows 98 PC. I continued to play around in Windows and discover some nuances of the operating system. My parents had a Windows XP machine at the time and they called me in every time they needed help. I got on their computer from time to time to use the Internet, where I discovered so many cool things. In junior high, we were forced to take a typing course where I honed my typing skills through playing games. I soon was able to easily complete all of the challenges. To understand my persona, you must know that I was bullied throughout elementary and high school. I was "the nerd" of our class and I wore that badge even with all of the negative energy that it came with. I received constant criticism, ridiculed for being intelligent (my paycheck isn't too funny now, is it losers?). I didn't care, though, my mission has and always will be to show them their wrong doing. I actually can't wait to have a reunion just to see how UNSUCCESSFUL they are. My parents didn't like my interest in gaming and technology either, but that's a rant for another day. After junior high, I wasn't exposed to much else until I got to college four years ago, where I took Fundamentals Of Computing. My professor was a true nerd (major Zelda fanatic), and he taught us how to program in Python. I began to love being able to create something literally out of nothing. He opened my eyes to a world where there was order and I could have control in a world where I've never had any control in before. Since then, I've only began to love my profession more and more. This is truly what I was born to do.
A long time ago: Joking about QuantumComputingAiBlockchainVR ...
Today: Reading articles about QuantumAi ..
What comes next? 🤦1
!rant Big ++ to all who encouraged us as we slowly shared this project on DevRant.
@qberry1 and I have 1 chapter in the books, with big props to DevRant
@compSci @klonky @tachoknight @n1had @dfox1
Scheduled an on-site. *internal screaming*
Does anyone have any resources for studying distributed computing and operating system topics or have any pointers for studying for a systems design interview?
Also, how did y’all get comfortable with recursion? I don’t have issues with problems I already know the solutions to but it’s like when that’s not the case my brain just goes into panic mode for a bit.
Teach me your ways?7
Okay. We believe in atmospheric physics when we're running weather forecast models. But somehow half the population thinks the science of anthropogenic climate change is in dispute.17
Learning to teach to speed up learning.
Using a new cooperative learning technique, AI Lab researchers cut by half the time it took a pair of robot agents to learn to maneuver to opposite sides of a virtual room.
A combination of deep learning and reinforcement learning algorithms are responsible for computers achieving dominance at challenging board games like chess and Go, a growing number of video games, including Ms. Pac-Man, and some card games, including poker. But for all the progress, computers still get stuck the closer a game resembles real life, with hidden information, multiple players, continuous play, and a mix of short and long-term rewards that make computing the optimal move hopelessly complex.
Image: Dong-ki Kim1
Thousands of PH/s computing power spent on mining Bitcoin.
Meanwhile, it took me an hour to explain to my DevOps guy why I need a t2.medium as compared to a t2.small.3
Starting the process of applying for developer jobs without any computing qualifications (I'm self taught) and I'm convinced that I'll not hear from anyone 😣 any tips from Dev rant to help me find that first job?10
Thought experiment time:
Imagine that this whole universe is a simulation created by a Group Of Developers (GOD).
- Who would make up this group?
- What kind of design patterns would they follow?
- What type of programming language would they use?
- What kind of bugs are there if any?
- How do they test?
- Assuming the use of quantum computing, what are the implications? Parallel simulations? All possibilities play out?
- Would the controller input be life?
- Who is AI and who are players?
- Has all time already been rendered?
- Do we respawn?
- What would the leaderboard look like?
- What kind of stats are tracked
- What are dreams, nightmares, lucid dreams, sleep paralysis, birth and death?
- How is memory stored, accessed and pruned?
- What kind of neural net is used and where?
etc etc, if you can think of any other interesting fire away7
In my high school we just finished our prelims (aka test exams, just to see how we are doing). I failed everything except computing. In Higher Computing I got a B (3-4 marks off an A), and I was the only person in my class to pass, and we are the only Higher Computing class in the school. And there is no Advanced Higher Computing class.
And I was one of only 3 S5s taking it; everyone else was S6.
I feel more proud than I probably should.2
Aagh, fuck college subjects. Over my last 4 years and 7 semesters in college, I must have said this many times: fuck college subjects. But later I realized that, if nothing else, they are useful in government/private exams and interviews.
But Human computer Interaction? WHAT THE FUCK IS WRONG WITH THIS SUBJECT???
This has a human in it, a comp in it, and interaction in it: sounds like a cool subject to gain some robotics/AI design info. But its syllabus, and the info available on the net, is worse than that weird alienoid hentai porn you watched one night (I know you did).
Like, here is a para from the research paper I am reading; try to figure out whether even its English is correct:
Looking back over the history of HCI publications, we can see how our community has broadened intellectually from its original roots in engineering research and, later, cognitive science. The official title of the central conference in HCI is "Conference on Human Factors in Computing Systems" even though we usually call it "CHI". Human factors for interaction originated in the desire to evaluate whether pilots could make error-free use of the increasingly complex control systems of their planes under normal conditions and under conditions of stress. It was, in origin, a-theoretic and entirely pragmatic. The conference and field still reflects these roots not only in its name but also in the occasional use of simple performance metrics.
However, as Grudin (2005) documents, CHI is more dominated by a second wave brought by the cognitive revolution. HCI adopted its own amalgam of cognitive science ideas centrally captured in Card, Moran & Newell (1983), oriented around the idea that human information processing is deeply analogous to computational signal processing, and that the primary computer-human interaction task is enabling communication between the machine and the person. This cognitive-revolution-influenced approach to humans and technology is what we usually think of when we refer to the HCI field, and particularly that represented at the CHI conference. As we will argue below, this central idea has deeply informed the ways our field conceives of design and evaluation.
The value of the space opened up by these two paradigms is undeniable. Yet one consequence of the dominance of these two paradigms is the difficulty of addressing the phenomena that these paradigms mark as marginal.
I don't know if this can be classified as a legit "regret" or not, but anyway (hence no wk78 tag).
I've always chosen to focus more on the theory behind computers and computing rather than on practical dev skills. Not saying that the more theoretical things aren't fun - concepts from theoretical CS and maths still regularly blow my mind, as do the more "esoteric" languages like Haskell, Idris, and Coq. However, after seeing you fine folks here at dR talk about practical development, it feels like there's a whole world of stuff that I've missed about computers and programming, especially web programming. I think I'll tackle that next when I have some free time, maybe spend some time learning PHP to see what all the hate's about... (really though, it must do something right if it has such a huge userbase, plus, I think devRant uses it too...?)
Anyway, just wanted to say that you folks are really cool and an awesome source of inspiration. Best community ever.3
Which cloud hosting provider do you use or prefer and why?
I've been using Digital Ocean for two years, but I'm thinking about switching to AWS or Google, because two friends of mine recommended them. For me, at least AWS, feels way more complicated than DO. But if they are clearly better, I will switch. What's your recommendation, if you have any?
Thanks a lot!9
Thinking really hard about starting my own retro PC collection, starting with the NEC PC-98......hmmmmmm, wonder how my wife would feel about me spending money on this shit
Recently I have taken to all things retro tech; always liked it really, especially since my mom showed me pics of me playing with an old Commodore 64 when I was younger, as well as another of a family friend showing me the Sharp 68k. This shit fuels my appetite for knowing more about the programming ways of the old-school coders. Some pretty interesting stuff; I feel that the newer generations would benefit greatly from knowing the things we had to do in order to build efficient programs back in the day. Not to say that I was part of that at all. I was born in 1991; how I came to see these systems is unknown and forgotten by me, but it is nonetheless part of my story in computing.
Because of the industry that surrounds me I have been dealing with web development, but that is really not much of a passion of mine. Had I the skills more than the academic knowledge, I would love to work with low-level C code all day; I just feel that the things developers do there are so much more interesting than handling web development. Web development is tedious and a current shitstorm, not to say that things were not like that for the programmers I am referencing, but I just want more.
Web development has made me a successful man; at 28 I am the head of my department. I might sound like a Disney princess but I want more, I want more knowledge and more experience in different areas of Computer Science. I want to know it all and it seems like time continuously goes against me.
Oh well, here is to a new year lads, see what I can do.3
Ideas I've had over the years that could pan out and be useful:
SMS-DB: Stands for SMS-Data Burst. Used to allow those with low cell signal or no data plan to transfer data between a phone and some client via the standard SMS text space. Would be slow, but would act kinda like dial-up over SMS (as mobile lines are compressed on all service levels, even LTE, so traditional dial-up wouldn't work!) I have a general idea on how packets would be laid out, but that's about it so far...
everything2PNG: Allows one to transpose any file's data into a PNG with 3 bytes per pixel (full color RGB), which allows for a "compression" of sorts (about 91-93% on preliminary tests) AND allows further, more efficient compression of the resulting file. (Plus... it's just kinda cool to see files transposed as PNGs.) I actually have a simple transposer to go to PNG, but can't yet go back (a rough sketch of the round trip is after this list). Large files (around 600MB) use upwards of 4GB even with efficient paging and other optimizations via NumPy so far, so it's not *viable* yet, but it's coming along nicely.
RPi-GPIO Interconnection Bus: A master/slave or round robin method to allow for Raspberry Pis to communicate using GPIO, which can help free up network bandwidth in RPi cloud computing clusters. At most, this'd allow for 4 bits used for pushing to the GPIO "bus", and 4 bits used for pulling from the "bus". 8 pins total are usually unused minimum, so either 3 or 4 pins for upload, 3 or 4 for download, and potentially 1 or 2 for commands, general non-data communication, etc. I made a version of this concept using Round Robin for a client, but it was horribly slow. (I also don't have distribution rights for the code, so i'm working from scratch.) Definitely doable.
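The round trip mentioned in the second idea is mostly bookkeeping. A rough sketch, assuming Pillow and NumPy are available (the fixed width and the way the original length is carried back are arbitrary choices, not part of any spec):

```python
# Pack arbitrary bytes into an RGB PNG (3 bytes per pixel) and unpack them again.
# The original byte length must be stored out of band (filename, sidecar, header...).
import math
import numpy as np
from PIL import Image

def bytes_to_png(data: bytes, path: str) -> int:
    width = 1024                                   # arbitrary fixed width
    height = math.ceil(math.ceil(len(data) / 3) / width)
    buf = np.zeros(width * height * 3, dtype=np.uint8)
    buf[:len(data)] = np.frombuffer(data, dtype=np.uint8)
    Image.fromarray(buf.reshape(height, width, 3), mode='RGB').save(path)
    return len(data)                               # caller keeps this for the way back

def png_to_bytes(path: str, original_length: int) -> bytes:
    img = Image.open(path).convert('RGB')
    flat = np.asarray(img, dtype=np.uint8).reshape(-1)
    return flat[:original_length].tobytes()        # strip the zero padding
```

PNG being lossless is what makes the reverse direction possible at all; the zero padding and the stored length are the only extra state needed.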
Well this is interesting: just stumbled upon Microsoft's Q#, a quantum computing language, complete with compiler and sim.
It has a similar syntax to C# but is slightly different, and it also depends on C# for doing something with the computed bits.1
We had this teacher in uni who was teaching several lectures, one of them being Mobile Computing (the actual name, but it was just Android dev).
So in the first lecture he started adding a single button to the screen and trying to add onClick functionality. But once he started to write the code he got errors (he didn't import Button) and said to everyone:
"Ok, this is normal, and now when I click the IDE's save button this will go away." Of course it didn't go away.
So after 5 minutes of trying to write the full code from memory, he just opened another project, copied the code he needed and tried to run the app (it crashed).
So after about 2/3 of a lecture I stopped laughing and went over to his desk and just hit alt+enter to import the lib and built the project without errors :D
Never went back to those lectures but I passed the class with highest grade by just demonstrating an app I built for fun without any proof that it is actually mine.
anybody ever work with ProjectQ or QISKit? I'm doing a project for my algorithms class on Shor's algorithm, and I'm trying to find a guide for an implementation.
Is it doable to install macOS on a hypervisor on aws/google/azure and use it via VNC screensharing?5
After reading the script for the architect scene in Matrix Reloaded I was determined to use the word 'concordantly' in a sentence. I am proud to say I have succeeded, and with reference to cloud computing no less.1
Share your thoughts of General AI /Strong AI
How far away?
Will it need Quantum Computing?
What company will get General AI working first?
Do you fear Strong AI?9
Stranger meetup conversation
Me: just sleep bro, u can't get a girl here.
Stranger: can't sleep r8 now, reading a quantum computing book.
Me : interesting
Yes we are now friends (#~#)
For a computing project at school, I need to do some market research, I'd be very grateful if anyone could be bothered to fill in this survey: https://surveymonkey.co.uk/r/...9
What is the cheapest and closest to "decent" cloud computing provider you've come across? I'm currently using scaleway ARMs -- all thanks to someone posting scaleway's name and comparing server prices to a cup of morning coffee :) . It's OK, really can't complain (although it's somewhat silly to sync ssh keys on-boot only IMO). Is there anything cheaper with no less quality?6
IBM Cloud seems to be the only cloud computing platform that has a responsive website.
Admittedly I have only used GCP and AWS, I haven't touched Azure yet. Both GCP and AWS have incredibly slow web portals that take ages to load after every single click.
IBM Cloud is the only cloud service platform where I clicked a button and it loaded the next page like a normal website. It honestly felt surreal navigating through all of their services. I have no clue why AWS and GCP are both so bad; it reflects really poorly on their services. If they can't get their own web portals to run quickly, why should I expect their services to be fast and reliable?3
University labs. I’ve been lucky enough to have 2 brand new computing labs constructed during my time here. Ultra wide QHD USB-C monitors 😍
So I got my GCSE results back yesterday. It's the first year with the new 1-9 grading system, and I was really hoping that I could get a 9 in computing (I did in the mock).
What I got? Well. I got 138/160 marks total. What did I need to get a 9? 140/160 marks. 2 marks off of the best grade. 2. I was so damn close.5
How do you become a PM?
Do you need computing science knowledge? And at which point in your career do you lose it?5
What are the thoughts of privacy-conscious people about quantum computers? As far as I understand, the encryption methods used by current TLS versions are vulnerable to quantum computers, so if your ISP or other agencies store all your traffic data right now, they'll be able to decrypt it after gaining access to quantum computers.
One way to secure your privacy would be to use your own VPN that uses encryption method that is quantum-resistant, but again the VPN would be using TLS to connect to the Internet.6
Silly question, but why is it that in this age of 64-bit computing and gigabytes of RAM applications still have trouble with text files/SQL dumps over 1MB in size? Surely for something so simple it should be able to store it all in memory without any issues, no?11
I can already imagine in the future:
Remember back in the '10s when there were quantum computers the size of a room costing tens of thousands of dollars? Now everyone has one implanted in their head with 100 times the computing power! With the old hashing algorithms we could mine hundreds of blocks every second just by thinking about it1
Things I say to my clients when I know that a reboot is required to fix their issue but I don't have enough evidence to prove it to them :
"... On any computing platform, we noted that the only solution to infinite loops (and similar behaviors) under cooperative preemption is to reboot the machine. While you may scoff at this hack, researchers have shown that reboot (or in general, starting over some piece of software) can be a hugely useful tool in building robust systems.
Specifically, reboot is useful because it moves software back to a known and likely more tested state. Reboots also reclaim stale or leaked resources (e.g., memory) which may otherwise be hard to handle. Finally, reboots are easy to automate. For all of these reasons, it is not uncommon in large-scale cluster Internet services for system management software to periodically reboot sets of machines in order to reset them and thus obtain the advantages listed above.
Thus, when you indeed perform a reboot, you are not just enacting some ugly hack. Rather, you are using a time-tested approach to improving the behavior of a computer system."
Evolution of servers: A normal fucking server -> Cloud -> Serverless -> Fog computing
Holy shit, can't wait for what will be next...7
Had a client who was using the staging system on my server as a CDN, remote computing, etc... because his prod server was a cheap vhost while the VM was a beast compared to it. I shut it down without telling him. I just got a call that his site is now slow as f and full of errors.
I kindly told him that there was a recent security vulnerability called Dirty COW. Then I told him that I shut the VM down because it would be a security risk for him since there were no patches available yet, and that I'd only power it on again when there was work for me to do.
If you want resources pay for them
I have a few months left until graduation and I don't know what I should learn. There are so many things (web dev, AI/ML, blockchain, Android, cloud, hybrid apps, gaming, AR/VR, data analysis, security, etc.) and as a CS student, I feel I should know them all.
In the last 6 years,
Techs that I liked or had success in:
Java, Android, Python, data analysis, hybrid apps (Flutter)
Techs that I didn't like or failed in: AI/ML, cloud computing, web dev (CSS/JS), hybrid apps (React/Angular/Ionic/...)
Techs that I didn't try: security, cryptography, blockchain, OpenCV, AR/VR, gaming
I am not bound by my likes or successes.
My failures were mainly because I didn't like those techs enough to continue further with them. And my successes amount to just launching a few apps, passing some certifications or grabbing an internship opportunity because of those skills.
But if you think a particular skill is necessary to have as a CS professional then let me know. I just want to earn a lot of money and get out of this mess asap1
Hobby coders, what’s your favourite vintage platform to develop on? I recently started dipping my toes into vic20 and Commodore 64.
Feelin like a time traveller 🛸3
eBay's APIs make me want to cry.
Take the sandbox for example:
- Every time you log into a session, it logs you out.
- When you create an order (eventually!) and want to retrieve it, tough shit it doesn't feel like doing that today.
- Functionality both exists and doesn't exist at the same time on both the LIVE and Sandbox APIs. I don't know how they've managed to get quantum computers in their server room, but their GOD DAMN API LIBRARIES ARE NOT THE BEST USE CASE FOR QUANTUM COMPUTING!!
I don't know if I despise eBay or Magento more...
Everything I know is self-taught... from when, I dunno; I'm 20, so likely just after the year 2000.
From my perspective I think differently from most more formally trained devs, which can be to my advantage. The downside of this is I'm terrible with names, and everything in computing has an acronym.
I'm bad with names anyway... dyslexic 😉. But if it's explained to me, I know what it is you're on about.
I consider myself a good dev, not experienced but otherwise good. But I want to be the best...
I'm also a hacker (a nice one), which I think helps me build better, more secure programs, knowing common vulnerabilities.
I'm proud of what I've achieved so far. Whilst I'm not perfect, nor is my work, that's what I work towards... as should every dev
- Finish "Introduction to algorithms"
- Learn some genetic algorithms
- Get my hands dirty on reinforcement learning
- Learn more about data streaming applications (my current app is still using plain stupid REST to transport images). I don't know, maybe Kafka and RabbitMQ.
- Learn to implement some distributed system prototypes to get fitter at this topic. There must be more than REST for communicating between components.
- Implementing a searching module for my app with elastic search.
- Employ redis at sometime for background tasks.
- Get my handy dirty on some operating system concepts (Interprocess Communication, I am looking at you)
- Take a look at Assembly (I dont want to do much with Assembly, maybe just want to implement one or two programs to know how things work)
- Learn a bit of parallel computing with CUDA to know what the hell Tensorflow is doing with my graphic card.
- Maybe finishing my first research paper
- Pass my electrical engineering exam (I suck at EE)1
After the "Cloud Computing" trend, the new trends these days seems to be ML, VR and AI. And while I am very excited about all these techs and the possibilities it can bring, I can't help feel that most of us are using the term "AI" a bit incorrectly.
What we are trying to do here, as far as I can see, is VI, not AI. The intelligence we see in the so called "AI"s available so far are simulated and fails to emulate real intelligence, let alone demonstrate actual intelligence and awareness. They are not fully aware. But I guess that is why there is the singularity constraint. It is no doubt that when a VI finally becomes are fully aware AI, that is indeed the point of singularity.
Anyway, leaving the future dystopian thoughts aside, a mixture of ML, "AI" and VR have made some very interesting concepts, especially in the gaming industry, which I would love to see bear fruit in the near future.2
rant / question (there is a question at the end and I'd appreciate your opinion)
8 months ago, I agreed to help a not too distant relative of mine to do his master thesis at the company where I work. He was supposed to build something really MVP, but useful for us and I'd help him get some scientific questions out of it, and provide him with (computing) resources to test his theories / implementations under simulated and much heavier load.
Since then, he hasn't gotten anything even remotely useful done: always stuck on very rudimentary issues, claiming things are almost ready (I wrote a quick smoke test to prove that the whole application blows up when you touch it). In short, a disaster, and then he went over to radio silence.
In the meanwhile, we didn't need it anymore, so 1.5 months ago, I got in touch with him again with an even more technical proposal, something that, at least I'd think, is even cooler to do. He asked me some questions about the hypothetical load the system should eventually be able to handle, to come up with alternative implementations to compare against each other. He said that his exam period was going to be over soon and he'd get back to me with some initial version.
2 weeks ago, I got back in touch with him, trying to urge him to finally get started and get something done. If he'd actually sit down and do it during the holidays as a "full time job", he'd probably be done in 2 weeks. Last week, he came back to me and said he has an initial PR ready to review.
I was excited about it, but basically froze when I realized what he did. He deleted all his previous work - some infrastructure stuff which took us basically 3 months of back and forth to get running - and as far as I could see, all the new code was only auto-generated clients based on a swagger specification. In short - I could do it in less than an hour. If you really have no idea what you're doing, it might take you half a day, but definitely nowhere near a week.
His brother, who is a good friend of mine, thinks I'm being too hard on him. His argument was that it's too hard, and he has to do it in C#, but he only knows Java (I gave him access to some of our repositories to copy paste code together, he didn't need to invent anything. I also prefer C# but wrote my master thesis in Java). Personally, I'm just pissed because he promises stuff that he never does. I totally understand him - I was like that as a student as well, I guess karma is a ... but still, he's wasting my time.
Right now I'm thinking how to get out of this, without having even more time wasted. I doubt he'd ever deliver anything useful. He got plenty of input from me about what he could consider for his scientific question, how to measure performance, ... He can keep his credentials to access our test environment with the test data, but I won't give him access to any additional computing resources, to compare how his solutions might scale on our company's cost. (mainly it's not the money, but I'd have to provide that stuff, and probably help him set it up)
does it sound like a fair deal (saying, I'm done with you. You can finish your topic on your own, but don't expect any help from me)? or am I being a dick about it and too demanding?1
I am attending a lecture about IBM mainframe computing and I have no idea about what the lecturer is talking about1
Can any of you recommend on good universities (mainly in EU) with strong computer vision/visual computing/computer graphics labs?
For M.Sc studies.2
Time for an exam about cloud computing and deploying microservices to the cloud using Kubernetes, followed by another exam about usage of scientific C libraries. These feel so disconnected for being part of the same degree.2
Interplanetary networking and quantum computing 🤔 something to ponder about... I want to be there now!
Linux users, be honest: if I switch over to the penguin, how much time am I going to spend wondering why things don't work as they should and trying to fix them? Will my experiences of development and personal computing merge in this way?14
If you ever have to deal with distributed computing in Java, jgroups.org and Atomix are your best friends. Use them.
Unless you hate your coworkers, then just invent a custom gossip protocol...
Not before long, I guess we may have to put up with bots during code review...😂
Meet the Bots That Review and Write Snippets of Facebook's Code
Front-end development leaves me slightly in awe of the developers. How do you do it?
I come from a background in scientific computing. I can write boundary element code that's fast, performant and safe. I can build Monte Carlo simulations that work well. I'm even decent with backend development in Flask somehow. But ask me to build a simple web form and... argh!3
Thought I'd take a look into how Cloud computing works and what it's all about.
I regret everything.1
My two best friends have been the most influential mentors I've ever had. One is a compiler engineer at a major computing company and the other is a security engineer at a major company in Japan.
Both have sat down and taken the time to not only teach me different aspects of the computing environment, but empowered me to learn more on my own. One project I was working on ended up tapping into both of their teachings. I took a moment to think back on when they were teaching me and felt so grateful to have such patient teachers.
The moral here is that not everyone knows what you do. What makes a good teacher is someone who takes the time to teach and empower the individual. It really goes a long way.
Nobody is interested in the pioneers and the early history of computing. We have to know our history in order to understand the present! I have friends who say that it's lousy to dive deep into the history, but I enjoy it thoroughly.7
Replace every other profession eventually; actually, screw that, remove the computing profession too and just chill and let things burn3
My university is offering Mobile Computing and Large Scale Data Processing as electives. Which one do you think would be more useful in the future?4
I agree with Python being very useful due to library availability. Not sure what I think about C beating out C++ though. I would much prefer programming in C++ to C any day. I don't like Java, but a LOT of people use Java.
I find it interesting that a lot of people talk about Rust, but I am not seeing it in the top 10. Is it just too new?
What I find most interesting is that this is a good list of languages to learn. These are what are being used in the field. Well, at least from the perspective of IEEE.
Argh! Learn to debug for your bleeding selves, you are supposed to be a bunch of senior developers; it's the same bloody issues all the freaking time. So I create a step by step guide of what buttons to click and what text to enter, because I'm so f***ing through with the same issues you bug me with day in, day out! A 12 year old with no computing knowledge can follow the guides, yet you don't even bother reading them half the time, or choose to completely miss steps out and bug me with your issues.
Damn it why do I bother you bunch of ass hats get paid more than me too I know it!
How do you recharge and turn on an ancient computing device ?
I'm taking a class in my university about Cloud computing. In 2 weeks we made a simple web app to upload videos and then a simple job that converts all videos to mp4.
Now we took the app to the cloud using AWS. We created different instances for the web servers, changed the database to NoSQL, used SQS to queue the video conversion jobs onto the different worker instances, and used SES, S3, CloudFront and ElastiCache. All that stuff.
And all that is worthless because I cannot get my Ubuntu instance to run a fucking command on reboot. I don't really know how and I feel that all my work was wasted.
Feels bad man2
There doesn't seem to be an easy way to automatically horizontally scale your VMs... Why is that? I'd think that I'd be able to check a box on a VM to configure all that bullshit. Guess not....
What I hate most about studying computing? Getting exams about shit I hate - fucking stats exam tomorrow, wasted my time coding and now I'm afraid I'll fuck up big time1
So we have to do a final project for a course in groups of four people. The project's about multimodal user interfaces and physical computing. Apparently they decided to randomly assign the groups. No biggie, I thought. So once we got in touch with each other, it turns out the three other people had a lot in common.
1. "I'd prefer to take care of the design and visual stuff, coding isn't really my strength"
2. "I don't know python, but we can use it as long as I don't have to touch the codebase"
3. "Do we have to use git? It was so hard the last time."
Come on, you're 3rd/4th year students with quite a lot of studies in Java/Scala, how hard can it be to grasp the basics of Python?
It's gonna be a long two weeks... Oh well, it's a learning experience.1
I would like your opinion on this fellow devranters. Right now at my university I have to pick an elective. My options are AI, Cloud Computing and the .Net framework. I'm leaning towards AI and also considering taking both AI and Cloud computing (if they'll allow me). What do you think I should pick career-wise?6
I can't wait until I feel like Dr. Frankenstein when I build my PC this week. My first real computing rig!!
Some backstory: My main dev machine is my old Lenovo laptop running Ubuntu (my baby). I took Windows off of it when I got a Surface through a job and have been using that for Windows specific work. I'm going to be giving that to my little sister next time I travel home. In short, this is the first computer that will be able to cut through anything I throw at it and play games that aren't at least 6 years old.
The build is centered around an i5-8400 and a GTX 1060 6GB, and I'll be running Windows as the primary OS for gaming. However, I fell in love with programming on Linux and there is no way Linux won't be on my machine. I understand the differences between dual-booting and virtualization, but I want to hear how you guys run Linux on your Windows gaming machines, or if you go about it another way. I have also heard horror stories about drivers for Linux, and wonder if my graphics card being certified for Ubuntu LTS actually means that it will operate correctly. I have also only run VMs on crappy computers, so I haven't had any experiences where that performed better than dual-booting. I'd love some feedback or to hear about all of your setups, as hardware has never been my strong suit.
I'll post a pic of my setup when it's done too.4
I know it's a stupid question but I still want to ask because I am very confused...
Recently I started learning about cloud computing and I have a question: what actually is the cloud?? (Please don't tell me the advantages or what we can do with the cloud, etc.)
Is it a collection of hardware, or have many companies built some special servers that are put together for the purpose??7
I don't have any experience in teaching, but I'd venture to say that teaching anything is hard. For most subjects, teaching has been refined over thousands of years to be easier and meaningful. Not CS. As has been mentioned by many people CS is a very new subject when compared to the likes of maths, for example, and education systems haven't been able to cope with it adequately (nor should they be expected to).
That the CS industry is rapidly evolving certainly doesn't help matters, but in reality that shouldn't really be that big of a problem (at least in earlier years of education). The basics of computer systems and programming don't really change that much (please correct me if I'm wrong) and logic stays the same. Even if you learn stuff that's a bit out of date it can still be useful and good lessons should be able to be applied to new technologies and ideas.
Broken computers are a big inconvenience, but a lot of very useful things can be done without a computer, and I should think the situation is a lot better than it was 5 years ago. What I think would be good, instead of trying to use broken computers, would be to get students to set up and use a Raspberry Pi each; you learn about something other than Windows, learn how to install an OS, and you don't need that much computing power for teaching people computer science.
I think the main problem is a lack of inspiring teachers. Only a very few teachers will be unable to get you through the exams if you put in the effort, but quite a lot of the time students don't put in the effort because they can blame it on the teacher.
My solution would be to try and get as many students into computer science as possible and the rest will follow: more people will become teachers, more will be invested in the subject, more attention will be paid to the curriculum.
That's not to say I don't agree that many of the problems that have been mentioned need to be fixed for CS education to work properly, just that there is no way that I can see to fix them currently without either creating more problems or some very rich person giving a load of money.
This has gone on a lot longer than I expected so I'll stop now.2
Sometimes I love the things you can get away with using the excuse "it's for computing science!" This story is more electronics and less programming but I'm sure a lot of people can relate.
I was in a group at uni making part of an audio system. We had to test it and tune the potentiometers by playing something. I said "I know what we can play!"
We ran nyan cat on loop on a speaker in the computer lab and no one could really stop us.
Welp, how much longer till someone builds some magic to crack any modern encryption in the blink of an eye?
tl;dr Google claimed it has managed to cut calculation time to 200 seconds from what it says would take a traditional computer 10,000 years to complete.
!rant does anyone know what sustainable computing is? I googled it but I don't think I understand much... like, if I took this as my major for uni then what would my potential careers be like? Is it a better choice than software engineering? 😕😕😕3
I just felt like saying, as I'm not sure how prevalent this is, but the reason I got into computing and programming is essentially this: I want to change the world and make a better society, and to be quite frank I honestly believe that what you make and its impact on people is far more important than your personal character5
Top 12 C# Programming Tips & Tricks
Programming can be described as the process which leads a computing problem from its original formulation to an executable computer program. This process involves activities such as developing understanding, analysis, generating algorithms, verification of the essentials of algorithms - including their accuracy and resource utilization - and coding of the algorithms in the chosen programming language. The source code can be written in one or more programming languages. The purpose of programming is to find a series of instructions that can automate the solving of specific problems, or the performing of a particular task. Programming needs competence in various subjects including formal logic, understanding of the application, and specialized algorithms.
1. Write Unit Tests for Non-Public Methods
Many developers do not write unit tests for non-public methods because internal members are invisible to the test project. C# lets you make an assembly's internals visible to another assembly: the trick is to add [assembly: InternalsVisibleTo("MyTestAssembly")] (i.e. make the internals visible to the test assembly) to the AssemblyInfo.cs file.
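A minimal sketch of what that can look like; the assembly name and the internal class below are made up for illustration:
// AssemblyInfo.cs (or any source file) of the production assembly
using System.Runtime.CompilerServices;
// Make the internals visible to the test assembly
[assembly: InternalsVisibleTo("MyTestAssembly")]
// An internal class that the test project can now see and unit test
internal class PriceCalculator
{
    internal decimal ApplyDiscount(decimal price, decimal percent)
    {
        return price - (price * percent / 100m);
    }
}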
2. Use Tuples to Return Multiple Values
Many developers build a POCO class just to return multiple values from a method. Tuples, introduced in .NET Framework 4.0, cover that case.
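A rough example of returning two values with a Tuple instead of a dedicated POCO; the method name and usage are illustrative:
using System;
using System.Collections.Generic;
using System.Linq;

public static class Stats
{
    // Returns both the minimum and the maximum of a sequence in one call
    public static Tuple<int, int> GetMinMax(IEnumerable<int> numbers)
    {
        var list = numbers.ToList();
        return Tuple.Create(list.Min(), list.Max());
    }
}
// Usage:
// var minMax = Stats.GetMinMax(new[] { 3, 1, 7 });
// Console.WriteLine(minMax.Item1 + " / " + minMax.Item2);  // 1 / 7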
3. Do not bother with Temporary Collections, Use Yield instead
When developers want to pick items out of a collection, they often create a temporary list that holds the selected items and is then returned.
To avoid that temporary collection, developers can use yield instead: results are handed out one at a time as the result set is enumerated.
Developers also have the option of using LINQ.
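A small sketch of the temporary-list version versus the yield version; the method names and the even-number filter are made up:
using System.Collections.Generic;

public static class Filters
{
    // Without yield: builds a throwaway list before returning anything
    public static List<int> EvenNumbersList(IEnumerable<int> source)
    {
        var result = new List<int>();
        foreach (var n in source)
            if (n % 2 == 0)
                result.Add(n);
        return result;
    }

    // With yield: items are streamed to the caller as the result is enumerated
    public static IEnumerable<int> EvenNumbers(IEnumerable<int> source)
    {
        foreach (var n in source)
            if (n % 2 == 0)
                yield return n;
    }

    // Or simply with LINQ: source.Where(n => n % 2 == 0)
}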
4. Making a retirement announcement
Developers who ship re-distributable components and plan to deprecate a method in the near future can decorate it with the Obsolete attribute to communicate this to their clients:
[Obsolete("This method will be deprecated soon. You could use XYZ alternatively.")]
Upon compilation, a client gets a warning with that message. To fail a client build that still uses the deprecated method, pass the additional Boolean parameter as true:
[Obsolete("This method is deprecated. You could use XYZ alternatively.", true)]
5. Deferred Execution While Writing LINQ Queries
When a LINQ query is written in .NET, it is only executed when the LINQ result is actually enumerated. This behaviour is known as deferred execution. Developers should understand that every time the result set is accessed, the query gets executed over and over. To prevent the repeated execution, convert the LINQ result to a List after the first execution. Below is an example:
public void MyComponentLegacyMethod(List<int> masterCollection)
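A sketch of how that example might continue, keeping the same signature and using a hypothetical filter on the collection:
using System;
using System.Collections.Generic;
using System.Linq;

public class MyComponent
{
    public void MyComponentLegacyMethod(List<int> masterCollection)
    {
        // Nothing is executed yet: the query is only defined here
        var bigValues = masterCollection.Where(i => i > 100);

        // The query runs once for this Count()...
        Console.WriteLine(bigValues.Count());

        // ...and runs again, from scratch, for this loop
        foreach (var value in bigValues)
            Console.WriteLine(value);

        // Materialize once to avoid re-executing the query on every access
        var bigValuesList = masterCollection.Where(i => i > 100).ToList();
    }
}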
6. Explicit keyword conversions for business entities
Use the explicit keyword to define the conversion of one business entity into another. The conversion operator is invoked whenever the cast is applied in code.
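A rough sketch with two made-up entities, OrderDto and Order:
public class OrderDto
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}

public class Order
{
    public int Id { get; set; }
    public decimal Total { get; set; }

    // Explicit conversion operator: invoked by a cast such as (Order)dto
    public static explicit operator Order(OrderDto dto)
    {
        return new Order { Id = dto.Id, Total = dto.Total };
    }
}
// Usage: var order = (Order)someOrderDto;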
7. Preserving the Exact Stack Trace
In the catch block of a C# program, if the exception is rethrown with throw ex; and the fault actually occurred in the method ConnectDatabase, the resulting stack trace only indicates that the fault happened in the method RunDataOperation. Rethrowing with a bare throw; preserves the exact stack trace.
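A sketch of the difference, using the two method names from the text (their bodies are hypothetical):
using System;

public class DataComponent
{
    public void RunDataOperation()
    {
        try
        {
            ConnectDatabase();
        }
        catch (Exception ex)
        {
            // throw ex;  // would reset the stack trace, so the fault appears to originate here
            throw;        // rethrows and keeps the original stack trace pointing into ConnectDatabase
        }
    }

    private void ConnectDatabase()
    {
        throw new InvalidOperationException("connection failed");  // stand-in for a real failure
    }
}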
8. Enum Flags Attribute
Decorating an enum with the Flags attribute in C# lets it act as a set of bit fields, which allows developers to combine enum values. One can use C# code along the following lines.
The output for this code will be "BlackMamba, CottonMouth, Wiper". When the Flags attribute is removed, the output will just be 14.
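An enum along these lines would reproduce that output; the member names come from the text, and the numeric values are chosen so the combination adds up to 14:
using System;

[Flags]
public enum Snakes
{
    None = 0,
    BlackMamba = 2,
    CottonMouth = 4,
    Wiper = 8
}

public static class FlagsDemo
{
    public static void Main()
    {
        var snakes = Snakes.BlackMamba | Snakes.CottonMouth | Snakes.Wiper;
        Console.WriteLine(snakes);        // BlackMamba, CottonMouth, Wiper (with [Flags])
        Console.WriteLine((int)snakes);   // 14; without [Flags], snakes.ToString() prints just "14"
    }
}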
9. Implementing the Base Type for a Generic Type
When developers want to constrain the generic type provided to a generic class so that it must inherit from (or implement) a particular interface, they can add a where clause on the type parameter.
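For example, with a hypothetical IEntity interface and repository class:
public interface IEntity
{
    int Id { get; }
}

// T is constrained so that only types implementing IEntity can be used
public class Repository<T> where T : IEntity
{
    public void Save(T item)
    {
        // persistence logic would go here; the constraint guarantees item.Id exists
    }
}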
10. Using Property as IEnumerable doesn’t make it Read-only
When an IEnumerable property is exposed on a class, callers can still cast it back to the underlying list and modify it. To avoid this, expose it with AsReadOnly as opposed to AsEnumerable.
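A sketch of the problem and the fix; the class, property and member names are made up:
using System.Collections.Generic;
using System.Linq;

public class Team
{
    private readonly List<string> members = new List<string> { "Ana", "Bo" };

    // Leaky: AsEnumerable returns the same list instance, so callers can cast it back and mutate it
    public IEnumerable<string> MembersLeaky { get { return members.AsEnumerable(); } }

    // Safe: AsReadOnly returns a read-only wrapper that cannot be modified
    public IEnumerable<string> Members { get { return members.AsReadOnly(); } }
}

// ((List<string>)team.MembersLeaky).Add("Mallory");   // succeeds, mutating internal state
// ((List<string>)team.Members).Add("Mallory");        // throws an InvalidCastException instead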
11. Data Type Conversion
More often than not, developers have to convert between data types for different reasons, for example converting a decimal variable that already holds a value to an int or Integer.
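A small sketch of the kinds of conversion being described; the values and variable names are illustrative:
using System;

public static class ConversionDemo
{
    public static void Main()
    {
        decimal price = 99.95m;
        int truncated = (int)price;               // explicit cast: 99, fractional part dropped
        int rounded = Convert.ToInt32(price);     // Convert: 100, rounds to the nearest integer
        int parsed = int.Parse("42");             // string to int, throws on bad input
        int safe;
        bool ok = int.TryParse("not a number", out safe);  // TryParse avoids the exception, returns false
        Console.WriteLine($"{truncated} {rounded} {parsed} {ok}");
    }
}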
I hope there's a pill that I could take to master vim and tiling desktop in an instant. I feel so envious just by looking at a co-worker who's good with that and rocking a cool tiny 60% keyboard. I'm TOO damn comfortable with the normies way of computing.2
My first computer exposure was on a mainframe (CDC Cyber 180). My university in Kerala, India had a collaboration with the Indian defence organisation DRDO. The operating system was something called NOS/VE, though as I remember it could run some Unix version virtually. I had Fortran 77 programs to develop as part of the course (finite element methods). As I remember, the machine had built-in routines for the same. The screen was a green-on-dark terminal connected to the thing. No windowing or graphics.
Today kids have more powerful machines at home (or in their pockets). The famous computing power law be praised.
Google Cloud Computing services to run a node.js app that is just my blog / personal website?
Is that a terrible idea? I'm slowly learning node.js as well as the whole ethos of hosted services without necessarily having an entire shared or virtual server to keep maintained.7
Ok so I'm doing a project about interpreters for college, and need people to answer questions for it.
If you've ever made an interpreter could you answer these, thanks!
1) how long have you been in the computing industry?
2) what got you into interpreters?
3) what do you think is the hardest part about creating an interpreter?
4) what do you think are the best practices for creating an interpreter?
5) do you think it's best to implement an existing language or create your own?9
About #wk97, many trends aren't new things. For example, IoT is an evolution of ubiquitous computing, and NoSQL reminds me of XML databases and OO databases; but the good part is that there are people improving these things and it's amazing :)
Flowcharting actual computing processes and using flowcharts to code. For someone who is more visual than logical like me, it helps as a guide to code and it also serves as documentation for clients.
I think I found out why Cengage hasn't gotten back to me on their root-server issue: They're leased by next.tech (that's their name and URL) and it's literally an iframe from them inside like 7 Cengage iframe wrappers (which is also why it runs like ass apparently!)
next.tech supplies cengage with the actual heavy lifting, and cengage is literally a shitty wrapper for it.
"Our SmartScaled infrastructure ensures your users have a secure computing environment available in seconds." fucking bullshit i'm already root in my own personal server you've handed to me
TLDR: Opinions of area of interest between these subjects (specializations):
2 Programming languages
3 Business analytics
4 Pervasive computing
Hi, I'm about to choose specialisation of my software development masters. I'm almost certain what I'll go with (algorithms), but I wondered what other people thought and would choose if they had the opportunity. I'm still not too experienced in all of these areas, making the choice a bit hard :-)4
I thought of learning about cloud computing. Went looking for AWS courses on Udemy.
It's so confusing which AWS certification I should prepare for.
Which one should I go for to begin with? Suggestions are appreciated. Thanks.11
Can we make a moderately powerful cluster using all the free cloud computing services available online, like Google Cloud Platform, AWS, Oracle Cloud, Microsoft Azure, etc.?3
Area of focus: Volunteer computing
It's a fascinating and beautiful branch of computer science, not known well enough. I love working with it, hoping I can change the world.
I would end up on top of any other trend thing needed by humanity.
- building better AI's, building better robots
- neural networking
- quantum computing
- robot dating service
- artificial life
- holodeck design and construction
- free energy (any kind)
- running a private space shipyard
- research into new and unknown technology
And last, if nothing works, I would open up a deli on Mars. The robots would make the food anyway, I would probably only program the menu and fix them when they malfunction.
Larry Ellison laughed at the term cloud computing years ago; now he offers Oracle Cloud but failed to come up with an effective strategy, and wants to monetize Java.
(but still a billionaire though)
Not a rant, but does anyone know of good cluster computing software for Windows? Can't find any on google2
A lot has changed in computing over the last few years. One thing people seem to HATE for varying reasons. I personally don't mind, since they won't be going away and I can handle their little screwups when they happen. But now EVERYONE is doing it. Apple, Google, Microsoft, and their partners have ushered in this era where machine control is placed in the hands of the OS developers. What I find funny about that is that they say they are doing it to help less tech-savvy people stay safe, yet a good portion of the problems people ask me about come right after a forced upgrade. Come on! If you're gonna do it, at least make it worth the problems!
Anyone know why using "OS" instead of "Operating System" in AQA A-Level computing loses you marks?9
I wish I could be a quantum computing programmer! This way I can work on my side project while this other me attends all the boring meetings
Can anyone suggest some open source projects? I went through a lot of articles where everyone said you should select a project according to your interest.
I also went through this:
Still, I am finding it hard to select a project. I am intermediate in python, PHP, openCV, highly interested in OpenCV, cloud computing and web development.6
A prayer from a colleague:
Our silicon god which art in the SSD
Italic be thy name
Thy computing come
Thy bus be done
On the screen
As it is on the hdd
Give us this day our daily blue screen
And forgive us our keystrokes, as we
forgive our keyboards.
And lead us not into restarts, but
deliver us from memory leaks: For thine is the
memory, and the cpu, and the
bus, for ever. Amen
Beautiful is it not :)
This semester I'm taking a class in my university about Cloud computing. You know, how to use the cloud better, when to use it, and we are using AWS in the class. That mother fucking class takes a lot of my time, I couldn't sleep for 2 nights in a row doing homework, and now EVERY TIME I go to YouTube to chill and see a video I GET A UDEMY AD TO LEARN AWS. WHY??3
I finally wrote my exemption test for computer literacy (covers MS Office products and basic computing theory). Never have to go to anything concerning that course ever again, and I got 93%+ for all the tests required to get exempted
Hey, getting bored here, does anyone know of any cool APIs that provide computing puzzles, deciphering, math shit like that?
For example, one of the ones that I have completed is the
I'd like something along those lines
Just came across the NERSC Docs (https://docs.nersc.gov), absolutely wonderful open source docs by the National Energy Research Scientific Computing Center.
I believe they were written as a guide for people who would be using their supercomputers, but it's a very good Linux beginner's guide.
Plus, it looks nice (no visible dark mode tho...).
Can human language take over from programming languages? Is it utopian to think I could write what I want to get my job done in English and have it parsed, compiled and run on any computing environment, or am I still daydreaming?5
Working on a mix of a toolbox/lxc-like container manager built on systemd-nspawn, and a really cool university research project that basically involves playing with quantum computing frameworks and writing out their comparative pros and cons
Next year I have to decide which branch I'll be studying. I am between computing and software, and I like security too. Any help?7
So let's say you're theoretically hosting a website on Google's cloud platform with GoDaddy, but you have the code on your local PC. How would you go about updating it?
For now, I've just been SFTPing into the cloud server and updating it.9
F U C K
Recently in our school our final year class choice forms are starting to be handed out. 5 lists, you pick one subject from each.
Now, I really wanted Advanced higher computing, to the point where I nearly begged on the survey choice forms. There's two of us that really want it. What happens? IT'S NOT ON THE FINAL FORM.
The only two subjects I could get were engineering and maths. Three of my other lists are completely, to say it politely, fucking shite.
Has anyone done Bsc in Computer and Information Systems from University of London International Programmes? Seems like the closest thing to a computer science degree I can get. Would like to know if it's any good.5
I was watching an Ancient Aliens episode called "Beyond Roswell". The show described the idea of some of our tech being seeded slowly by introducing alien technology to specific companies. They suggested that computing technology has advanced very fast and introducing this tech could be part of that.
At first I was kinda pissed about this. I have read about the creation of the first transistor back in the 40s or 50s. WWII really advanced our need for computing devices such as what Turing built. Then I realized a lot of the explosion of computer tech did occur after key ET events. This kind of made me wonder how much is "us" and how much is ET tech. I also realized it can take a lot of effort to understand something really advanced. So reverse engineering can take a LOT of effort to figure these things out. Being seeded by external tech does not take away from humans at all.
A parallel to this is a programmer that learns how to use a C++ compiler. They could go their whole career without ever understanding how the compiler itself is doing its job. I find myself wanting to learn how compilers work and started down this path. I look at the simple grammar I have learned to parse. Then I look at the C++ grammar and think "How can I ever learn to do that?" So I see us viewing potentially advanced things and wondering how the heck can we ever learn to do that. The common reaction when faced with such tech would be disbelief and in some cases ridiculing the messenger. When I was a kid the idea of sending a picture over a phone was laughable. Now this is common and expected. It was literally a scifi concept when I was a kid.
So, back to the alien tech. I am now thinking it would be cool to be working with alien technology through computing. This is like scifi stuff now! So what if what we have was not all invented here (Earth). If anything this will prepare us programmers to get jobs working for alien corporations writing ship level programs and brain interfaces. Think of it as intergalactic resume building. 😉
Not enough disk space error..just when I am done writing code and unzipping the bigger dataset.
Hours later.. Now mounted 200Gigs to machine.
Feels like a boss.!
Anyone ever thought about what would happen if the cloud bursts and it starts raining? Well, this guy did.
Guys, I am a beginner in networking. I want to create my own cloud computing server (IaaS). Currently, I want to provide storage to users.
How should I proceed? Any site links or guidance? Thanks in advance.1
Am I the only one here who is looking for a cloud computing service because I can't host things at home?2
Just found some of Andrei Alexandrescu's videos on YouTube. Specifically the CppCon 15 talks. Does anyone else here know of talks or books and the like that can satisfy my near-juvenile love for fast code?
So I'm sitting here, one monitor has a project I am working on for computing, which once done, I need to write somewhere between 30 and 70 pages of documentation, and the other screen has a half completed 6 page documentation for a game I made in Game Dev.
If I go into backend programming, am I really going to need to do all this documentation, or is it just one of those things that colleges do that has no relation to reality?
(Also if I go to uni, will I need this level of documentation there too?)10
Longshot: Has anyone ever had experience with a Link 480Z?
They were used in my school and were big, bulky machines reminiscent of the BBC Micro.
Who or what company do you think did the greatest contribution to the computing world we know today? Turing? Xerox?3