Search - "8 bits"
-
Things have been a little too quiet on my side here, so it's time for an exciting new series:
practiseSafeHex's new life as a manager.
Episode 1: Dealing with the new backend team
It's great to be back folks. Since our last series, where we delved into the mind-numbing idiocy of former colleagues, a lot has changed. I've moved to a new company and taken a step up as a dev manager / tech lead. Now I know what you are all thinking: sounds more dull and boring, right? Well it wouldn't be a practiseSafeHex series if we weren't ...
<audience-shouting>
DEALING! ... WITH! ... IDIOTS!
</audience-shouting>
Bingo! So let's jump right in and kick off with a good one.
So for the past few months I've been on an onboarding / fact-finding / figuring-out-this-shit-storm mission to understand more about what it is I'm supposed to do and how to do it. Last week, as part of this, I had the esteemed pleasure of meeting face to face with the remote backend team I've been working with. Let's rattle off a few facts to catch us all up:
- 8 hour time difference to me
- No documentation other than a non-maintained swagger doc
- Swagger is reporting errors and several of the input models are just `Type: String`
- The one model that seems accurate has every property listed as optional, including what must be the primary key
- Properties go missing and get removed at the drop of a hat and we are never told.
- First email I sent them took 27 days to reply, my response to that hasn't been answered so far 31 days later (new record! way to go team, I knew we could do it!!!)
- I deal directly with 2 of them, the manager and the tech lead. Based on how things have gone so far, I've nicknamed them:
1) Ass
2) Hole
So let's look at some examples of their work:
- I was trying to test the new backend and saw no data in QA. They said it wouldn't show up until midday their time, which is the middle of the night for us. I said we need data in our timezone and I was told: a) "You don't understand how big this system is" (which is their new catchphrase) b) "Your timezone is not my concern"
- The whole org started testing 2 days later. The next day a member from each team was on a call and I was asked to give an update of how the testing was going on the mobile side. I said I was completely blocked because I can't get test data. Backend were asked to respond. They acknowledged they were aware, but that mobile don't understand how big the system is, and that the mobile team need to come up with ideas for the backend team, as to how mobile can test it. I said we can't do anything without test data, they said ... can you guess what? ... correct "you don't understand how big the system is"
- We eventually got something going and I noticed that only 1 of the 5 API changes due on their side was done. Opened tickets. 2 days later I asked them for progress and was told that "new findings" always go to the bottom of the backlog, and they are busy with other things. I said these were supposed to be done days ago. They said you can't give us 2 days' notice and expect everything done. I said the original ticket was opened a month ago *sends link* ......... *long silence* ...... "ok, but you don't understand how big the system is, this is a lot of work"
- We were on a call. Product was asking the backend manager (aka "Ass") a question about a slight upgrade to the new feature. While they were trying to talk, the tech lead (aka "Hole") kept cutting everyone off by loudly saying "but that's not in scope". The question was "is this possible in the future" and "how long would it take", coming from management and product development. Hole just kept saying "it's not in scope", until he was told to be quiet by several people.
- An API was sending down JSON with a string containing a message for the user with 2 bits of data inside it. We asked for one of those pieces to also come down as a property as the string can change and we needed it client side. We got that. A few days later we found an edge case and asked for the second piece of data to be a property too. Now keep in mind, they clearly already have access to them in order to make the string. We were told "If you keep requesting changes like this, you are going to delay the release of the backend by up to 2 weeks"
Yes folks, there you have it: the most minuscule JSON modifications can delay your release by up to 2 weeks ........ maybe I should just tell product that they don't understand how big the app is, and claim we can't build it on our side? Seems to work for them.
That's all the time we have for today,
Tune in for more, where we'll be looking into such topics as:
- If god himself was an iOS developer ... not
- Why automate when you can spend all day doing it by hand
- It's more time-efficient to just give everything a story point of 5
- Why waste time replying to emails ... when you can do nothing instead
See you all next week,
practiseSafeHex
-
If money is lost,
Nothing is lost.
If health is lost,
Something is lost.
If CHARACTER is lost,
8 bits are lost.
-
guys my linux is not booting!
reply:
Find a forest, where no human has ever set foot.
Wait until the full moon rises, and then sacrifice x virgins, where x is the month of the year.
Spill their blood on your device, and wrap it in Parma ham (if your religion forbids you to make contact with ham, replace it with high-quality carpaccio. If you're vegan, get yourself a rope).
Then, build an altar to the GNU God, with feet spelling GPL, and a head like that of a gnu.
When this is done, you shall bow down three times to the altar (thirty if you use tabs), place your wrapped bloody system on the altar, and proceed with dancing to Staying Alive, except you will have adapted the lyrics to your system.
When you are done dancing and chanting, you shall lie down in front of the altar, and you shall not gaze upon your system till daybreak.
Then when the sun rises (sorry if you're in the UK, or at one of the poles) you will marvel at your system, thanking the ever-potent GNU God forever.
The funniest shit I have ever read 🤣 ... had to share
-
>>>> Followed link to a post
* Do you Accept Cookies?: Yes
* Our customer support is online: Okay, I know
* Subscribe to Newsletters?: Click Click Accept
* Website wants to turn on Notification?: Okay
* Seen Our New Product?: No, not today
* We require you to be over 18?: Yes, I am
* We value your privacy?: I Agree
* Looks like you're using an ad blocker?: Turn Off
* Don't forget to follow us on...: Okay!!! I get it already, just show me the f*cking post!
* What next
***** 1 million ads appear around a single post broken to bits having (1-2-3-4-5-6-7-8-9 next>>) *****
Just wondering who invented this money-making strategy.
-
If money is lost,
Nothing is lost...
If Health is lost,
Something is lost...
But, if character is lost,
.
.
.
8 BITS are lost 😜
-
I finally fucking did it!
I strapped up, strapped in, strapped on... uh wait what?
I finally made the full dedicated switch to Linux on my personal computer. Blew away Windows and installed Linux. I was able to get about 95% of the games that I actually play on PC to run under a combo of Proton/Lutris-Wine.
I feel like after working in a (primarily) Linux shop for almost 2 years now, I've learned enough to be able to actually troubleshoot if/when something goes wonky. I've been a Windows/sysadmin type in my career for about 8 years and only touched small bits of Linux here and there, or for fun little projects like a RetroPie setup.
But thanks to this gig I'm working at now, as a DevOps engineer, I've learned so got'damn much about Linux, and I've been developing scripts/tools that run on Linux, so I figured I could, or better yet should, take the full plunge.
So, I've decided that if there's something I absolutely need on Windows that Linux doesn't support, instead of knee-jerking and going back to Windows, I'm going to just setup a VM of windows and daily drive Linux from now on.
Some gfx tweaks for games were definitely necessary, and it's still not quite as plug-and-play as Windows for games, but the fact that it only took like 1.5 hours to sort out all of my games' performance is really impressive. Especially considering none of these games actually supports Linux out of the box and Wine/Proton is being used to get them to work.
-
So this is what happened!
It was a rainy Friday. I was asked to add a quick bug fix to a JS application, so I spent my Friday coding, testing ..., bam, the patch is ready ... I wrote a nice commit message explaining the problem and the fix, but I didn't push the code.
On Monday the fuckin code disappeared: no commit, no code, no nothing, no trace ... To be honest I don't know what happened. I rewrote everything that Monday morning (you can only imagine how pissed I was).
I use vim with tmux.
I have done everything I could to figure out what happened to that commit. I even doubted whether I had actually written the fix that Friday, but it's not possible to forget a few hours of a day.
I checked my commit history on the different branches, I did everything.
No trace ...
Conclusion
My machine is haunted ...
Or I have multiple personalities and one of them is a programmer and he is fucking with me.
-
Follow-up to my previous story: https://devrant.com/rants/1969484/...
If this seems too long to read, skip to the parts that interest you.
~ Background ~
Maybe you know TeamSpeak, it's basically a program to talk with other people on servers. In TeamSpeak you can generate identities, every identity has a security level. On your server you can set a minimum security level you need to connect. Upgrading the security level takes longer as the level goes up.
~ Technical background ~
The security level is computed by doing this:
SHA1(public_key + offset)
Where public_key is your public key in Base64 and offset is an 8-byte unsigned long. Offset is incremented and the whole thing is hashed again. The security level comes from the number of zero bits at the beginning of the resulting hash.
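To make that concrete, here's a minimal CPU-side sketch of the check in Python — note I'm assuming the offset gets appended as its decimal string representation, so treat the exact concatenation as an assumption:
import hashlib

def security_level(public_key_b64: str, offset: int) -> int:
    # SHA1(public_key + offset), offset appended as ASCII decimal (assumption)
    digest = hashlib.sha1((public_key_b64 + str(offset)).encode()).digest()
    # count leading zero bits of the digest
    level = 0
    for byte in digest:
        if byte == 0:
            level += 8
        else:
            level += 8 - byte.bit_length()  # leading zeros within this byte
            break
    return level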
My plan was to use my GPU to do this, because I heard GPUs are good at hashing. And now, I got it to work.
~ How I did it ~
I am using a start offset of 0, create 255 threads on my GPU (apparently more are not possible) and let them compute those hashes. Then I increment the offset in every thread by 255. The GPU also does the job of counting the zero bits; when there are more than 30 zero bits I print the amount plus the offset to the console.
~ The speed ~
Well, speed was the reason I started this. It's faster than my CPU for sure. It takes about 2 minutes and 40 seconds to compute 2.55 billion hashes, which comes down to ~16 million hashes per second.
Is this speed an expected result, is it slow or fast? I don't know, but for my needs, it is fucking fast!
~ What I learned from this ~
I come from a Java background and just recently started C/C++/C#. Which means this was a pretty hard challenge, since OpenCL uses C99 (I think?). CUDA sadly didn't work on my machine because I have an unsupported GPU (NVIDIA GeForce GTX 1050 Ti). I learned not to execute an endless loop on my GPU, and so much more about C in general. Though it was small, it was an amazing project.
-
imagine having kernel memory leaks in 2020
AT&T or Huawei, whichever, pushed an update for my already-struggling-to-exist phone that made the kernel memory leak go from 480KB/hr avg to 22.5MB/hr avg. When my free RAM is never under 50% of 2GB after the kernel starts loading other shit, and I'm able to express free RAM, at any time in use, in megs, with 8 bits... this means my phone crashes, with no apps running aside from a trimmed list of stock apps, every 3-4 hours due to running out of RAM. The only usable (read: not R/O because unrooted) swapfile is located on a tmpfs, so it's completely fucking useless (and eats another 100MB of RAM that I could be using for LITERALLY anything else; that's like another 3 hours of full idle between crashes) and I can't unlock the bootloader to fix any of this, as Huawei no longer hands out keys and it'd take 7 years or so to brute-force (32-bit @ 10/sec).
tl;dr: fuck
-
ArmA 3, a great sandbox that I "wasted" a lot of time scripting, modding or, if you like to call it that, developing for.
A game so great, the multiplayer server browser stores the number of players on a server in an 8-bit integer.
Someone complained a few years ago.
Response: "be happy, it was 4 bits not too long ago"
There were servers that ran into that problem.
To clarify, that only affected the shown number, not the number of players, at least not directly.
Who likes to be lonely in a multiplayer game?
-
I went out partying with a couple of friends last night, it was nice ...
So I met this girl. She was nice and beautiful, outgoing; basically we were getting each other's vibes and the mood was right ... We had a drink together and everything was cool until I learned her last name ...🤣
I couldn't help it, I cracked up, it was so intense ...
her last name was "chrooto"
I know that I have ruined my chances with her, but I don't think I could've held it in.
-
!rant !notrant !confession_maybe? Bit of a read.
Last year, around September (around 8 months into my first job in the industry), I started losing motivation to be a developer. By then I had consistently dropped out of 3 or 4 courses for my degree (no penalties, as it was pretty much within the starting weeks of each course). I was thinking that I did not want to do this. It got so bad that I was looking for other jobs and even trade apprenticeships (I am old-ish so chances of that are so bloody low).
I had my mind set. Including not wanting to finish the degree I had started, which had only 1 year full-time left to complete.
My missus supported me in my decision making, but she insisted that I finish the degree, as the years I had spent on it would have been a waste if I didn't. So I agreed, with the idea that I would do it part-time when I found another job.
Fast forward to New Year's and a very spontaneous decision was made. I resigned from my dev job and we ended up moving away to another city, two weeks later. By this point I was so certain that I did not want to be in the IT industry. I had not done any dev work (personal projects or learning new technology etc) outside of the job for months. It had been months since I'd visited devRant (to be honest it was not even installed on my phone, mainly because I broke my phone and after having it replaced I had not reinstalled a large portion of the apps I used). I had sold my custom-built PC, thinking that we did not need two PCs (we kind of don't, she's fine with her laptop), which meant no more dev stuff, as none of this was set up on my missus' PC. I was looking for all kinds of jobs outside of the IT industry, anything really.
But then something happened. And this is that something. I mean this: devRant. I was flicking through the apps list on the Google Play Store, and I saw devRant, and I chose to reinstall it. I began reading rants and comments and I am certain that this made me realise why I want to be a developer. Within about 2 weeks of redownloading devRant I was enrolled full time as a uni student, fully motivated to earn my degree.
There are bits and pieces left out of the story. I don't regret leaving my first ever dev job and moving away; it does seem drastic, but it changed me for the better I believe. I have the experience from that role and a fresh start, so to speak. I think my missus knew this was just a phase, although I felt so certain about it.
I am more of a lurker than a ranter or a commenter on this social platform, but I felt that I needed to share this. Thanks for reading. Not really sure what to tag this. Has anyone else experienced this before?
-
CONTEST - Win big $$$ straight from Wisecrack!
For all those who participated in my original "cracking prime factorization" thread (and several I decided to add just because), I'm offering a whopping $5 to anyone who posts a 64-bit *product* of two primes which I can't factor. Partly this is a thank you for putting up with me.
FIVE WHOLE DOLLARS! In 1909 money that's $124! Imagine how many horse and buggy rides you could buy with that back then! Or blowjobs!
Probably not a lot!
But still.
So the contest rules are simple:
Go to
https://asecuritysite.com/encryptio...
Enter 32 for the number of bits per prime, and generate a 64-bit product.
Post it here to enter the contest.
Products must be 64 bits, and the result of just *two* prime numbers. Smaller or larger bit lengths for products won't be accepted at this time.
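If you'd rather generate an entry locally, something like this should do it (a sketch, assuming you have sympy installed):
from sympy import randprime

p = randprime(2**31, 2**32)  # random 32-bit prime
q = randprime(2**31, 2**32)  # another one
n = p * q                    # ~64-bit semiprime to post
print(n)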
I'm expecting a few entries on this. Entries will generally be processed in the order of submission, but I reserve the right to waive this rule.
After an entry is accepted, I'll post "challenge accepted. Factoring now."
And from that point on I have no more than 5 hours to factor the number, (but results usually arrive in 30-60 minutes).
If I fail to factor your product in the specified time (from the moment I indicate I've begun factoring), congratulations, you just won $5.
Payment will be made via venmo or other method at my discretion.
One entry per user. Participants from the original thread only, as well as those explicitly mentioned.
Limitations: Factoring shall be done
1. without *any* table lookup of primes or equivalent measures, 2. without using anything greater than an i3, 3. without the aid of a GPU, 4. without multithreading, 5. without the use of more than one machine.
FINALLY:
To claim your prize, post the original factors of your product here, after the deadline has passed.
And then I'll arrange payment of the prize.
You MUST post the factors of your product after the deadline, to confirm your product and claim your prize.
-
Typical code life?
1. Write rough comments
2. Write more detailed comments
3. Write pseudo code
4. Write semi-working but definitely ugly code
5. Write working but very ugly code
6. Refactor the code to be nicer, check for patterns, bottlenecks and other bits and pieces
7. Push to git the "final" code
8. After a few months, blame whoever wrote the code
9. Refactor all the things!
---
This has happened in my career more than once, and still it seems like the best option out there to get things done. What do you guys think? Should something be added/removed from this? Is this over-complicated or what?
-
1 - Spend 6 months building an app with Flutter
2 - Try to add in-app purchase. Must upload an apk to google console and register a product
3 - Must launch the app so the product is activated
4 - Console complains the apk must contain a 64-bit version
5 - Go to the issues tab on github and find a solution
6 - Implement the solution, recompile and send 2 apk files to Console
7 - Did not work...
8 - Find out the Console maybe only allows it in closed alpha and beyond
9 - Put the apks on closed alpha and fail, because the Console wants new apks with increased version codes...
10 - Recompile and send apks
11 - Console won't allow launching unless the format is using the new App Bundle
12 - Flutter does not support App Bundle with 32 and 64 bits...
13 - See issue about it saying the possible fix is in beta version, just need to update... What could go wrong?
I just wanted to release a damn app
I hate that shit.
-
So recently I had an argument with gamers about the memory required in a graphics card. The guy suggested the 8GB model of.. idk, I forgot the model of GPU already, some Nvidia crap.
I argued on that: well, why does memory size matter so much? I know that it takes bandwidth to generate and store a frame, and I know how much size and bandwidth that is. It's a fairly simple calculation - you take your horizontal and vertical resolution (e.g. 2560x1080, which I'll go with for the rest of the rant) times the number of subpixels (so red, green and blue) times the bit depth (i.e. the number of values you can set the subpixel/color brightness to, usually 8 bits i.e. 0-255).
The calculation would thus look like this.
2560*1080*3*8 = the resulting size in bits. You can omit the last 8 to get the size in bytes, but only for an 8-bit display.
The resulting number you get is exactly 8100 KiB or roughly 8MB to store a frame. There is no more to storing a frame than that. Your GPU renders the frame (might need some memory for that but not 1000x the amount of the frame itself, that's ridiculous), stores it into a memory area known as a framebuffer, for the display to eventually actually take it to put it on the screen.
Assuming that the refresh rate for the display is 60Hz, and that you didn't overbuild your graphics card to display a bazillion lost frames for that, you need to display 60 frames a second at 8MB each. Now that is significant. You need 8x60MB/s for that, which is 480MB/s. For higher framerate (that's hopefully coupled with a display capable of driving that) you need higher bandwidth, and for higher resolution and/or higher bit depth, you'd need more memory to fit your frame. But it's not a lot, certainly not 8GB of video memory.
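Putting that arithmetic in one place, as a quick sanity check in Python:
width, height = 2560, 1080
bytes_per_pixel = 3                 # RGB, 8 bits per subpixel
frame = width * height * bytes_per_pixel
print(frame / 1024)                 # 8100.0 KiB per frame, i.e. ~8 MB
print(frame * 60 / 1e6)             # ~497.7 MB/s of bandwidth at 60 Hz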
Question time for gamers: suppose you run your fancy game off an iGPU in a laptop or whatever, with 8GB of memory in the system you're resorting to running the filthy iGPU from. Are you actually using all that shared general-purpose RAM for frames and "there's more to it" juicy game data? Where does the rest of the operating system's memory fit in such a case? Ahhh.. yeah it doesn't. The iGPU magically doesn't use all that 8GB memory you've just told me that the dGPU totally needs.
I compared it to displaying regular frames, yes. After all that's what a game mostly is, a lot of potentially rapidly changing frames. I took the entire bandwidth and size of any unique frame into account, whereas the display of regular system tasks *could* potentially get away with less, since most of the frame is unchanging most of the time. I did not make that assumption. And rapidly changing frames is also why the bitrate on e.g. screen recordings matters so much. Lower bitrate means that you will be compromising quality in rapidly changing scenes. I've been bit by that before. For those cases it's better to have a huge source file recorded at a bitrate that allows for all these rapidly changing frames, then reduce the final size in post-processing.
I've even proven that driving a 2560x1080 display doesn't take oodles of memory because I actually set the timings for such a display in order for a Raspberry Pi to be able to drive it at that resolution. Conveniently the memory split for the overall system and the GPU respectively is also tunable, and the total shared memory is a relatively meager 1GB. I used to set it at 256MB because just like the aforementioned gamers, I thought that a display would require that much memory. After running into issues that were driver-related (seems like the VideoCore driver in Raspbian buster is kinda fuckulated atm, while it works fine in stretch) I ended up tweaking that a bit, to see what ended up working. 64MB memory to drive a 2560x1080 display? You got it! Because a single frame is only 8MB in size, and 64MB of video memory can easily fit that and a few spares just in case.
I must've sucked all that data out of my ass though; I've only seen people build GPUs out of discrete components and go down to the realms of manually setting display timings.
Interesting build log / documentary style video on building a GPU on your own: https://youtube.com/watch/...
Have fun!
-
Age 8 - Gets first computer and struggles with dial-up Internet and my parents yelling that they needed to use the phone
Ages 12 to 18 - Gets first laptop, starts messing around with and getting interested in websites, gets involved with SMF, an open-source message board system written in PHP, and starts helping people out, eventually getting paid work for setting up websites etc., which led on to learning HTML/CSS and picking up bits and pieces of PHP (and also Photoshop/Illustrator etc.)
Age 18 - Goes to college to study Multimedia, refreshes knowledge of HTML/CSS, learns a bit of ActionScript and some PHP
Age 20 - Finishes Multimedia degree, ends up working as an IT consultant for a small business, which leads me to pick up a bit of bash scripting and a small bit more PHP. Leaves this after 3 months and decides to do a small Software Dev course. Gets my first taste of Java and Visual Basic there
21 - Enter into a Software Dev degree. Dive deep into Java and a small bit of Javascript.
23 - After 2nd year of college get taken on an internship with a large multinational where I learn and get hands on experience with Angular, JS, Coffeescript and C#
Present Day - currently coming up to the end of my degree and can switch between Java, C#, Python, Coffeescript/Javascript (front-end or Node) , C and Golang, C and Python introductions from college modules which I kept playing with in my spare time, Golang I just heard of and decided to write a few things in it because why not, I've picked up various frameworks (spring, echo, express etc.) at some point. I basically learn by doing, if something interests me and I enjoy it, I seem to pick it up quickly by diving in and trying to use it.1 -
C# has become shit.
I've been working with C# (and the whole .NET stack) since 2013, and I was so happy with it.
Compared to Java it was much leaner; compared to all the shitty new edge frameworks that looked like unfinished middle-school projects, it was solid and mature.
It had its problems, but compared to everything else that I tried, it was the quickest and most robust solution.
It all went downhill, leading to a rotten shit lake, when all this JavaScript frenzy began to pop up and everyone wanted to get on the trendy bandwagon.
First they introduced MVC, then .NET Core, now .NET 5-6-7-8.
Now I'm literally engulfed by all these tiny bits of terror JavaScript provoked, which they've implemented in all parts of their framework.
Everything has to be null-checked at compile time, everything pops up errors: "this might be nulll heyyyyy it's important put a ! or a ? you silly!!!" everywhere.
There are JS-ish constructs and syntax shit everywhere.
It's unbearable.
I avoid JS like the plague whenever I can (and you know that's not a luxury you get often in the current state of a developer's life) and they're slowly turning C# into some shit JS-hybrid deformed creature.
I miss 2013-2018, when it was all up to me to decide what to do with code, and I did some big projects for big companies (200-300k lines of code without unit tests, and yes, for me it's a lot) without all this hassle.
I literally feel C# needs a compiler switch you can quickly toggle, called "Senior developer mode", that doesn't trigger alarms and bells for every little stupid thing.
I'm sure you can turn this crap on/off with some hidden setting somewhere, but heck, I feel it needs to be a proper option, so whoever keeps it on should see a big red label on top of the IDE saying "YOU HAVE RETARDED DEV MODE ON"
So they get a reminder that if they use it they are either some fresh junior dev or they are mentally challenged.
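(For what it's worth, the null-check nagging specifically can be switched off — nullable reference types are opt-out per project or per file. A sketch; I'm fairly confident these two knobs exist as shown:)
<!-- in the .csproj -->
<PropertyGroup>
  <Nullable>disable</Nullable>
</PropertyGroup>

// or at the top of a single .cs file
#nullable disable
-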
Early 1970s, when I was around 8 years old. I read about Artificial Intelligence and it blew me away. I knew nothing about computers, other than I wanted to program them.
I still have old computer magazines, starting from around 1978 not long after the microcomputer revolution started.
My first computer had 2K RAM. That's 2048 bytes. I expanded the memory 1K at a time, and it took 2 chips - they were 4 bits by 1024, so you needed 2 chips to have 8-bit-wide memory.
2114 static RAM, 300ns.
I think they still make them!
-
I had to reinstall Windows because (hear me out) an old webcam, so old I used to use it for Messenger when I was like 8, couldn't get its drivers right on 64-bit Windows 10.
Not 100% tested, but I'm 88.9% sure that's what corrupted my OS.
After reinstalling, I had to look up the model (super generic but distinctive enough to make me search for a whole 2 hours), learnt a lot (wow) and now it's... let's call it working.
Now I have to reinstall the 6 programs and 10 games I had. It could be worse, but still, damn.
-
Jesus fucking christ.
I posted two rants today, both about trying to get my SD card reader to work on my Arduino Due...
After spending my whole day debugging, rewriting my code multiple times, cutting off anything that wasn't strictly necessary, writing my code procedurally, I finally decided to go and see what the library I HAVE to use is doing under the hood.
APPARENTLY...
uint_fast8_t is NOT 8 bits wide on my Due...
The bitshifts were "overflowing" (not really, just taking more space, which the type had access to), meaning that my data was getting corrupted.
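For anyone else who hits this: uint_fast8_t only has to be *at least* 8 bits, and on ARM toolchains like the Due's it's typically a full 32-bit int. A quick C sanity check (a sketch — the exact width depends on your toolchain):
#include <stdio.h>
#include <stdint.h>

int main(void) {
    // often prints 4 on bare-metal ARM, 1 on x86-64 Linux
    printf("uint_fast8_t is %u bytes\n", (unsigned)sizeof(uint_fast8_t));
    uint_fast8_t v = 0x80;
    v <<= 1;  // a true 8-bit type would wrap to 0x00; a wider one keeps 0x100
    printf("0x80 << 1 = 0x%x\n", (unsigned)v);
    return 0;
}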
FUCK YOU FOR TAKING A DAY OFF MY LIFE
-
!rant
I tend to do a lot of sketching and note taking and like to use pen on paper. But I'm sick of tearing out notes and accumulating bits of paper with notes here and there. I was thinking of going digital with this specific task (for cheap). I don't own an iPad, so I was thinking of getting a Fire HD 8" and a BoxWave EverTouch stylus. I'm all Apple so I don't know Android; would this hardware do the trick, and what would be a decent note-taking and organising app for it? Appreciate any advice or comments. Other uses for it are irrelevant.
-
Ideas I've had over the years that could pan out and be useful:
SMS-DB: Stands for SMS-Data Burst. Used to allow those with low cell signal or no data plan to transfer data between a phone and some client via the standard SMS text space. Would be slow, but would act kinda like dial-up over SMS (as mobile lines are compressed on all service levels, even LTE, so traditional dial-up wouldn't work!) I have a general idea on how packets would be laid out, but that's about it so far...
everything2PNG: Allows one to transpose any file's data into a PNG with 3 bytes per pixel (full-color RGB), which allows for a "compression" of sorts (about 91-93% on preliminary tests) AND allows further, more efficient compression of the resulting file. (Plus... it's just kinda cool to see files transposed as PNGs.) I actually have a simple transposer to go to PNG (see the sketch after this list), but can't yet go back. Large files (around 600MB) use upwards of 4GB with efficient paging and other optimizations via NumPy so far, so it's not *viable* yet, but it's coming along nicely.
RPi-GPIO Interconnection Bus: A master/slave or round-robin method to allow Raspberry Pis to communicate using GPIO, which can help free up network bandwidth in RPi cloud computing clusters. At most, this'd allow for 4 bits used for pushing to the GPIO "bus", and 4 bits used for pulling from the "bus". 8 pins total are usually unused minimum, so either 3 or 4 pins for upload, 3 or 4 for download, and potentially 1 or 2 for commands, general non-data communication, etc. I made a version of this concept using round robin for a client, but it was horribly slow. (I also don't have distribution rights for the code, so I'm working from scratch.)
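Here's roughly what the to-PNG direction of everything2PNG looks like — a minimal Pillow-based sketch; the names are mine, not the real tool's, and going back would need the original length stored somewhere (the part I haven't solved yet):
from PIL import Image
import math

def file_to_png(src_path, dst_path):
    data = open(src_path, "rb").read()
    data += b"\x00" * ((-len(data)) % 3)  # pad so every pixel gets a full RGB triple
    pixels = [tuple(data[i:i + 3]) for i in range(0, len(data), 3)]
    side = math.ceil(math.sqrt(max(len(pixels), 1)))  # smallest square that fits
    pixels += [(0, 0, 0)] * (side * side - len(pixels))
    img = Image.new("RGB", (side, side))
    img.putdata(pixels)
    img.save(dst_path, "PNG")
-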
Started playing around with HTML and CSS when I was about 8. Tried JavaScript but it never stuck. Started to learn a bit of Python when I was about 13 and enjoyed it, but never applied it to anything other than some maths. Used some basic ActionScript in Flash animations. Wrote some simple VBA in Excel. Learnt Matlab during my Engineering degree. Now I use Mathematica for my PhD work, Python for fun and useful bits of software for myself, and the occasional bit of PHP and whatever else I need at the time to get something working.
-
8 bits is always an octet, and I fucking hate it when people say 8 bits is equal to a byte, because on some systems that's not always true.
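C even spells this out — CHAR_BIT is only guaranteed to be at least 8, and some DSPs really do ship with 16- or 32-bit bytes:
#include <stdio.h>
#include <limits.h>

int main(void) {
    // 8 almost everywhere today, but the standard only promises >= 8
    printf("a byte here is %d bits\n", CHAR_BIT);
    return 0;
}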
-
I earned the title of "SQL king" due to the complexity of the SQL I write (not proud of that, because it makes maintainability hell). It's pretty cool, except sometimes I feel like I can write shit code and ship it to production just because no one can review my code properly, or rather spend enough time understanding it;
basically I am not challenged enough...
What do you guys do if you are not challenged or bored?
Never contributed to open source, but it might be the solution.
-
If anyone has a moment, I'm curious if I'm fucking something up.
model:
self.linear_relu_stack = nn.Sequential(  # assumes: import torch.nn as nn
    nn.Linear(11, 13),  # 11 input features, see Inputs below
    nn.ReLU(),
    # nn.Linear(20, 20),
    # nn.ReLU(),
    nn.Linear(13, 13),
    nn.ReLU(),
    nn.Linear(13, 8),  # 8 outputs, one per control signal, see Outputs below
    nn.Sigmoid()
)
Inputs:
def __init__(self, targetx, targety, velocityx, velocityy, reloadtime, theta, phi, exitvelocity, maxtrackx, maxtracky,splashradius) -> None:
# map to 1 and 2
self.Target: XY = XY(targetx, targety)
# map to 3 and 4
self.TargetVel: XY = XY(velocityx, velocityy)
# TODO: this may never be necessary as targeting and firing is the primary objective
# map to 5, probably not yet needed may never be.
self.ReloadTime:float = reloadtime
# map to 6 and 7
self.TurretOrientation: Orientation = Orientation(theta, phi)
# map to 8
self.MuzzleVelocity:float = exitvelocity
# map to 9 and 10, see i don't remember the outcome of this
# but i feel it should work. after countless bits of training data added.
# i can see how this would fuck up if exact values were off or there was a precision error
# maybe firing should be controlled by something else ?
self.MaxTrackSpeed: Orientation = Orientation(maxtrackx, maxtracky)
# these are for sigmoid output, any positive value of x will produce between 0.5 and 1.0 as return value
# from the sigmoid function.
self.OutMin = 0.5
self.OutMax = 1.0
# this is the number of meters radius that damage still occurs when a projectile lands.
# to be used for calculating where a hit will occur.
self.SplashRadius:float = splashradius
Outputs:
def __init__(self, firenow, clockwise,cclockwise,up,down,oor, hspeed, vspeed) -> None:
self.FireNow = float(firenow)
self.RotateClockWise = float(clockwise)
self.RotateCClockWise = float(cclockwise)
self.MoveUp = float(up)
self.Down = float(down)
self.OutOfRange = float(oor)
self.vspeed = float(vspeed)
self.hspeed = float(hspeed)
-
Started working on a library for manipulating bit sets. It will read in bits in 1-to-8-bit packets and tack them onto a structure that is represented by sequential bits. It will include ways to interpret the bits at 1 to 8 bits per mapping. Each mapping will be able to do logical operations on the bits. The whole point is to be able to take a stream of possibly malformed bits and try to make sense of them — something like the sketch below.
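A minimal Python sketch of that packet reader, assuming MSB-first bit order within each byte (the names are placeholders, not the final API):
class BitReader:
    def __init__(self, data: bytes):
        self.data = data
        self.pos = 0  # absolute bit position into the stream

    def read(self, n: int) -> int:
        # pull the next n bits (1..8) off the stream as an int
        assert 1 <= n <= 8
        value = 0
        for _ in range(n):
            byte = self.data[self.pos // 8]
            bit = (byte >> (7 - self.pos % 8)) & 1
            value = (value << 1) | bit
            self.pos += 1
        return value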
The inspiration for this is this sequence:
http://therendleshamforestincident.com/...
Yes, it is possible this data is utter bullshit, but I want the library all the same. I think it will be a fun one to write and to use for digital forensics of arbitrary data.