About: I'm a dude who lives in Some Place, Somewhere, codes, and sysadmins. But I've still got a lot to learn, so that's what I do in the meantime!
Skills: Lua/Roblox, Bash, Processing/Arduino, Learning Go
Location: Some Place, Somewhere
Joined devRant on 1/5/2018
Alright, part 2/4 of my new project is done!
For those who are curious about what this project is, I'll be posting some more info later this week! I just want to finish the v0.1 initial alpha/MVP, and then I'll be revealing all the details.
Just wrote the first part of what will become a rather large Go project!
Almost the whole thing is gonna be pluggable, which adds an extra challenge, but nevertheless, this is gonna be a fun ride!
Also, it's gonna be Open Source, but until I'm at a reasonably stable version, I'm gonna be leaving it closed. Lmk if you're looking for something to do and want to join, this is gonna be a big project.
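To give a rough idea of what "pluggable" tends to mean in Go (purely illustrative; none of these names are the project's real API), it usually boils down to a small interface plus a registry:

```go
package plugin

// Plugin is the kind of small contract a pluggable design hangs off.
// Every name here is made up for illustration, not the actual project's API.
type Plugin interface {
	Name() string
	Run(args []string) error
}

var registry = map[string]Plugin{}

// Register adds an implementation under its name.
func Register(p Plugin) { registry[p.Name()] = p }

// Get looks a plugin up by name; the bool reports whether it exists.
func Get(name string) (Plugin, bool) {
	p, ok := registry[name]
	return p, ok
}
```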
I hate this line with my soul.
The fact that I need to convert a dict to a string and then invoke the Python parser to read it feels so wrong!
I'm quite pleased to show off the initial rendering of my CaaB (Cluster as a Box)!
This is a project that I haven't put enough time into, but finally I decided to get a move on with it. The estimated cost of the whole device(s) is somewhere around $600. It's gonna be fun!
P.S. Suggestions welcome, particularly if it's about a substitute for the ODroid HC1.
Why do we still use floating-point numbers? Why not use fixed-point?
Floating-point has precision errors, and for some reason each language has a different level of error, despite all running on the same processor.
Fixed-point numbers don't have precision issues (unless you get way too big, but then you have another problem), and while they might be a bit slower, I don't think there is enough of a difference in speed to justify the (imho) stupid, continued use of floating-point numbers.
Did you know some (low-power) processors don't even have a floating-point unit? That effectively makes it pointless to use floating-point: it offers no advantage over fixed-point.
Please, use a type like Decimal, or suggest that your language of choice adds support for it, if it doesn't yet.
There's no need to suffer from floating-point accuracy issues.
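To make the precision point concrete, here's a minimal Go sketch; the Cents type is a made-up toy stand-in for a real fixed-point/decimal library, not a standard library type:

```go
package main

import "fmt"

// Cents is a toy fixed-point type: hundredths stored as an integer.
type Cents int64

func main() {
	// Floating-point: the classic 0.1 + 0.2 representation error.
	var a, b, c float64 = 0.1, 0.2, 0.3
	fmt.Println(a+b == c)      // false
	fmt.Printf("%.17f\n", a+b) // 0.30000000000000004

	// Fixed-point: integer arithmetic on hundredths is exact.
	x, y := Cents(10), Cents(20) // 0.10 and 0.20
	fmt.Println(x+y == Cents(30)) // true
}
```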
Disclaimer: the project I'm about to mention contains the first lines of Go I have ever written.
Still, I'm quite proud of how quickly I got it working considering it's also my first time working with GTK.
This project that I've been working on for the past few days is finally done. But it's 50% spaghetti, so it's refactoring time. I decided to have a look at my cyclomatic complexity numbers, and my biggest function (not main()) had a score of 7.
As it was quite large, I split it up into two parts: the preparation and the actual timer loop. As I appear to need to use a goroutine, by the time I'm done passing channels and all hell to handle them, my loop function now has a score of 9 for cyclomatic complexity.
So fixing one bug leaves two in its place?
But I still need to learn Go better. Anyone have a good (relatively painless, informative, quick-ish) course they can recommend? I've been thinking of trying out Codecademy's...
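For the curious, the shape of the loop is roughly the sketch below (all names are made up, not the actual project code): the preparation happens outside, and the timer loop runs in a goroutine where every extra channel adds another case to the select, which is exactly where the complexity score creeps back up.

```go
// Hypothetical sketch of the timer-loop shape described above, not the real project code.
package main

import (
	"fmt"
	"time"
)

// runTimer ticks every interval and reports each tick on out until stop is closed.
// Each extra channel handled here adds another case to the select.
func runTimer(interval time.Duration, stop <-chan struct{}, out chan<- time.Time) {
	ticker := time.NewTicker(interval)
	defer ticker.Stop()
	for {
		select {
		case t := <-ticker.C:
			out <- t
		case <-stop:
			close(out)
			return
		}
	}
}

func main() {
	stop := make(chan struct{})
	out := make(chan time.Time)
	go runTimer(200*time.Millisecond, stop, out)

	// Preparation and consumption stay outside the loop function.
	time.AfterFunc(time.Second, func() { close(stop) })
	for t := range out {
		fmt.Println("tick:", t.Format(time.StampMilli))
	}
}
```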
Anyone remember my rant about Go and GTK? Well, the main problem was that it was 3 AM. I've since completed it and have begun to refactor that spaghetti mess.
I feel so good right now.
Good news: I've finally started learning Golang! I've wanted to do it for a while, and now I am!
Bad news: I'm getting screwed by a GTK MessageDialog that, when opened for the second time, simply panics! The dialog is in Glade because I've got no patience to master the art of giving 5 different flags (maybe that's just 'cause it's 3 AM). Plus, I'm having a similar issue with my about dialog! COME ON!!!!
P.S. Just wanna say hi again, haven't been around in a long while, so: Hi!
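One common way around that kind of panic (a rough gotk3 sketch with made-up names, and not necessarily how the project ended up fixing it) is to build a fresh MessageDialog for every use instead of re-running a single Glade-defined one that has already been destroyed:

```go
package main

import "github.com/gotk3/gotk3/gtk"

// showInfo builds a new MessageDialog each time instead of reusing one
// Glade-defined dialog; once a dialog has been destroyed, showing it
// again is what tends to blow up.
func showInfo(parent *gtk.Window, text string) {
	dlg := gtk.MessageDialogNew(parent,
		gtk.DIALOG_MODAL|gtk.DIALOG_DESTROY_WITH_PARENT,
		gtk.MESSAGE_INFO, gtk.BUTTONS_OK, "%s", text)
	defer dlg.Destroy() // the next call simply creates a new dialog
	dlg.Run()
}

func main() {
	gtk.Init(nil)
	win, _ := gtk.WindowNew(gtk.WINDOW_TOPLEVEL)
	showInfo(win, "Hi!")
	showInfo(win, "And again, without panicking.")
}
```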
I've recently gotten this idea to take a Chromebook and run Linux on it.
Why a Chromebook, you ask? Well, ChromeOS runs a Linux kernel, boots using coreboot, and Google now requires that OEMs support LVFS (fwupd), i.e. the Chromebook is a very Linux-friendly laptop.
Some other reasons I would use a Chromebook: it's cheap, it's basically useless otherwise, amongst other things.
So, what do you guys think: good idea or not?
Argh! (I feel like I start a fair amount of my rants with a shout of frustration)
TL;DR: How long do we need to wait for a new version of Xorg!?
I've recently discovered that Nvidia driver 435.17 (for Linux, of course) supports PRIME GPU offloading, which (for the unfamiliar) is where you're able to render only specific things on a laptop's discrete GPU (vs. all or nothing). This makes it significantly easier (and more power efficient) to use the GPU in practice.
There used to be something called Bumblebee (which was actually more power efficient), but it became so slow that one could actually get better performance out of Intel's integrated GPU than out of the Nvidia GPU.
This feature is also already included in the nouveau graphics driver, but (at least to my understanding) it doesn't have very good (or any) support for Turing GPUs, so here I am.
Now, being very excited for this feature, I wanted to use it. I have Arch, so I installed the nvidia-beta drivers, and compiled xorg-server from master, because there are certain commits that are necessary to make use of this feature.
But after following the Nvidia instructions, it doesn't work. Oops, I realize, Xorg probably didn't pick up the Nvidia card, let's restart Xorg. And boom! Xorg doesn't boot, because obviously the modesetting driver isn't meant for the Nvidia card, it's meant for the Intel one, but Xorg is too stupid for that...
So here I am, back to using optimus-manager and the ordinary versions of Nvidia and Xorg, because of some crap...
If you have some (good) idea of what to do to make it work, I'd be happy to hear it.
Just came across the NERSC Docs (https://docs.nersc.gov), absolutely wonderful open source docs by the National Energy Research Scientific Computing Center.
I believe they were written as a guide for people who will be using their supercomputers, but it's also a very good Linux beginner's guide.
Plus, it looks nice (no visible dark mode tho...).
What's a good password manager for Linux?
A few (optional) conditions (in order of preference):
1. It's free
2. It supports ssh, gpg, etc.
3. It has a GUI (a nice one with gtk/qt support)
4. It's (properly) secure
5. It has FIDO U2F support (i.e. supports physical security keys like the YubiKey or Solo)
6. It has a browser extension
7. It's compatible/non-conflicting with gnome-keyring
Everybody saw this coming! A privacy-breaching Telegram vulnerability!
I find myself using explainxkcd more often on my phone...
Because Firefox doesn't show the whole title text...
I was casually browsing some issues for a project on GitHub, and I came across an issue where someone wanted support for URI handling (there's a good reason for his request).
I was bored, and I just wrote a script to do exactly that. I'm gonna polish it on Sunday and then upload it (plus add a PR).
I just started maintaining a few AUR packages, and I've got to say it's rather fun and rewarding, just to know that you're responsible for making sure that something is up to date, and that no one using the package is getting anything bad.
Anyone ever heard of the Solo? It's basically an open-source, FIDO-compliant U2F USB key (with planned support for PGP/SSH key storage!).
The guys who made it are now miniaturizing it into the "Somu" (Secure Tomu).
Please support it! It's a great project and a great (and cheap) addition to basic system security.
What would all you guys say is a good (preferably easy) language for writing CLI applications? Something that runs fast, the fewer dependencies at runtime the better, and (this goes lower on the list) the less logic required for argument handling the better.
TL;DR: This guy thinks Apple is poised to switch the Macs to a custom ARM-based chip over x86! He's now on my idiot list.
"They've made a custom GPU", great! That's as helpful as "The iPad is a computer now", and guess what, Arm Mali GPUs exist! Just because they made their own GPU doesn't make it suitable for desktop graphics (or ML)!
"They released compilation tools right when they released their new platform, so developers could compile for it right away", who would be an idiot not to...
"Because Android apps run on so many platforms, they're not optimized for any. But Apple can optimize their apps for a specific user's device", what!? What did I miss? What do you optimize? Sure, you can optimize this, you can optimize that... But the reason why iOS software is "optimized", and runs better/smoother (only on the newest devices of course), is because it's a closed-loop, proprietary system (quality control), and because they happen to have done a better job writing some of their code (yes, Android desperately needs optimization in numerous places...).
I could go on... "WinTel's market share has slowly plateaued", "tHeY iNtRoDuCeD a FiElD pRoGrAmMaBlE aRrAy"
For Apple to switch Macs to ARM would be a horrible idea. Face it: ARM is slower than x86, and was never meant to be faster; it was meant for mobile usage, with a good performance-to-Wh ratio favoring the Wh side.
Ladies and Gentlemen! It's my pleasure to present to you the next level in the Linux desktop: MATERIAL SHELL!!!
Demo video: https://i.imgur.com/2UVZTnk.mp4
I've been waiting all week for the Manjaro maintainers to release the next update wave.
Ahh... finally something to do in the terminal :)
I just upgraded to an SSD (finally! I know, right?), and holy crap, SDDM showed up in less than 2 seconds!
Oh I've been missing out on life!
IT'S SOOO FAST!!!!