Search - "compute"
-
So I accidentally published my AWS keys to GitHub, stupid me. I realized this the next day.
$ git reset
$ git push
Reset keys in AWS
I was too late. A bot had already stolen the keys and started up 53 EC2 instances, racking up $4000+ of compute time (Bitcoin mining, I'd assume)
4 weeks later, I finally have this shit disputed and settled.
Don't test with hardcoded keys. You WILL forget about them. Env vars always. That is all.
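For what it's worth, the AWS SDKs will read credentials from the environment on their own. A minimal sketch of the env-var approach (assuming boto3 is installed and the two variables are exported in your shell):

import os
import boto3  # assumed available; any AWS SDK behaves the same way

# Read the keys from the environment instead of hardcoding them.
# boto3 would actually find these on its own; passing them explicitly
# just makes the point.
session = boto3.Session(
    aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
)
ec2 = session.client("ec2")
print(ec2.describe_instances()["Reservations"])  # and no secrets in the repo

-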
One of my favorite aspects of devRant has always been getting to learn more about the awesome people who use it. Beyond just the awesome stories posted by many here, one of my favorite ways to learn about and feel connected to the people here has always been desk/setup reveals. I personally love seeing different kinds of setups from all over the world, knowing that’s what the people here use to do their work and compute in general.
As an experiment, we want to try a few different things to highlight desk/setup/remote coding location posts. First, we’ve created the first devRant Instagram account, which is completely focused on developer desks/setups/workstations/remote coding. Please check it out here and follow: https://www.instagram.com/devdesks/
I want to use the account to bring more attention to the wide assortment of setups the awesome members of the devRant community post from all over the world. We’ll promote cool desk/setup/remote work images that are posted on devRant to the Instagram account for more exposure/additional audience.
Beyond that, I also want to try to come up with a way to better organize all of the desk/setup posts on devRant and encourage more of them. One kind we don’t see that often that I personally really enjoy is people coding with their laptops in locations that show the culture of their country or something special about the region they are from. Personally, I’m going to try to post some of those for where I live and work.
So how can you help with this effort? It's easy! We encourage people to post their setup/working-remotely pics and we will start featuring them on the Instagram account and hopefully elsewhere in the devRant app for some increased visibility/searchability over what we have now (since pics are kind of hard to search).
Also, we plan to make the weekly rant this week “post your setup,” so maybe wait until then to post, and you can work now on getting that awesome shot :) I know a lot of people here love photography like I do, so I think that part is fun too.
Please let me know if you have any ideas or questions about this, and I’m looking forward to seeing the desks/setups of many more devRanters in the next few days!
P.S. It's not a requirement, but looking through a lot of these photos, one thing that I think makes them better is when there is code visible in some way.
-
Person: HTML is a programming language
Me: No it's not
Person: Yes it is it can compute things
Me: No it can't, and what do you mean?
Person: Have you ever heard of a script tag
Me: That's not fucking HTML, that's JavaScript.
-
C'mon people! Spread the word! "The cloud" is not "just someone else's computer", it's a completely different way to compute!
I'm so tired of the oversimplifications people use when trying to explain the concept. The massive amount of work, sweat and tears put into the orchestration, automation and abstraction layers to deliver truly elastic, scalable and self-healing infrastructure, applications and services deserves a fuckload more respect than "just someone else's computer"!
Hosting and time-sharing have been with us almost as long as we have had computers (mainframes etc), but dismissing the effort of thousands upon thousands of devs and ops people to make systems robust and automated enough to literally being able to throw a wrench in the engine any time during production and not have the systems suffer is fucking insane!
The whole reason the term "cloud" is so fitting is not just because it was coined from the cloud shape used in technical and non-technical drawings and illustrations symbolising the internet, but also because of the illusion of magic it gives the end user, who can't see "what's inside the music box".
-
Wasting some time on codewars.com
~~~~
3 kyu challenge:
Given a string with mathematical operations like this: '3+5*7*(10-45)', compute the result
~~~~
*Does a quick and easy one-liner in Python using eval()*
*Sees people actually writing some 100 lines parsing the string and calculating using priority of operations*
Poor them...
(Btw, passed to lvl 4 kyu thx to this)
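The one-liner in question is more or less this (a sketch; eval() on untrusted input is a whole other rant):

def calc(expression: str) -> float:
    # Python's own parser handles precedence and parentheses for free
    return eval(expression)

print(calc("3+5*7*(10-45)"))  # -1222

-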
Yesterday we started coding with my eldest son, with a board game (based on Scratch), and it was so fucking amazing! I'm partial, but he's a fucking code genius!!!
In the game, the child "codes" some functionality with cards and the adult (me) "computes" them by doing the actions.
I'm so fucking proud!!! Well, I'm always proud of my children, and here the English language doesn't convey my thinking very well, as the verb "to be" doesn't differentiate between the intrinsic state of a subject and a passing state:
SOY TAN ORGULLOSO DE MIS HIJOS! (I AM SO PROUD OF MY CHILDREN!)
-
Is it really unreasonable that I wish AWS would just name their fucking products after what they are? Why the fuck is DNS called Route 53? Why the fuck is a VM an Elastic Compute Cloud instance? Stop being pretentious dicks and just name things what they are!
Am I being unreasonable?
-
Accidentally left an AWS RDS connection open overnight
I finally understand those memes about how AWS means paying for what you forgot to turn off
-
First rant, but I'm so triggered and everyone needs a break from all the EU and PC rants.
It's time to defend JavaScript. That's right, the best frikin language in the universe.
Features:
- incredible async code (await/async)
- universal support on almost everything connected to the internet
- runs on almost all platforms, including natively
- dynamically interpreted but also internally compiled (like Perl)
- gave birth to JSON (you're welcome, people who remember that the X in AJAX stood for XML)
All these people ranting about JS don't understand that JS isn't frikin magic. It does what it needs to do well.
If you're using it for compute-heavy machine learning, or to maintain a 100k LOC project without TypeScript, then why'd you shoot yourself in the foot?
As a proud JS developer I gotta scroll through all these posts gushing over the other languages. Why does nobody rant about using Python for bitcoin mining or Erlang to create a media player?
Cuz if you use the wrong tool for the right job, it's of course gonna blow up in your face.
For example, there was a post claiming JS developers were "scared" of multithreading and only stick to their comfort zone. Like WTF, when NodeJS came out everything else was multithreaded. It took some brave developers to step out of the comfort zone to embrace the event loop.
For a web app, things like PHP and Node should only be doing light transforms between the database information and HTML anyways. You get one thread to handle the server because you're keeping other threads open to interface with databases and the filesystem. The Nexus.js dev ranting on all us JS devs doesn't realize that nobody's actual web server is CPU-bound because of writing HTML bodies; that's why we only use 1 thread. We use other worker threads to do the heavy lifting (yes, there is a C++ bridge, look it up)
Anyways TL;DR plz respect JS developers, we're people too. ES7 is magic and please don't shit on ES3 or we'll start shitting on the Python 2-3 conversion (need to maintain an outdated binary just cuz people leave out ()'s in their print statements)
Or at least agree that VB.NET is an abomination and insult to the beauty that is TI-84 BASIC
-
TL;DR
I accidentally surpassed(?) my user permissions and closed some of my classmates browsers and locked up a terminal for me
In school we have 2 primary operating systems: Windows and Ubuntu. Windows is hell in general, but not as hellish as the Firefox installation on Ubuntu.
"Just loaded this page. Now wait half a minute so that I can render it"
"Woah, woah, woah. Slow there. You just made an input event. Give me those 5 seconds to compute what you just did"
Executing "top" or "htop" shows you a long list of firefox processes with a cpu usage of 99.9%, since the whole school shares that linux environment.
Anyway, one day it was way more severe than normal and I was forced to kill my Firefox instances. So I pressed CTRL+ALT+T for that terminal, waited 5 minutes until it accepted input, typed "killall firefox" with a delay of half a minute per character, and smashed that enter key.
At this very point in time I could hear confusion from every corner of the room. "What happened to firefox?"
Around 30% of the opened browsers were abruptly stopped. I looked back at my screen and noticed I was logged out. I couldn't log in from that terminal for the rest of that day.
Our network admin, who happened to be there, since the server is just next door, said that this was just coincidence, but the timing was too perfect, so I highly doubt that.
I felt like a real hackerman even if it was by accident :)
-
Personally the coolest was the program I built for my father's use at his job.
It was my first program to be used commercially in the real world.
That was a very big thing. I was 17 at the time and used Turbo Pascal 5.5, and he used it to compute how well all the machinery was doing; they rented out diggers and other construction equipment to construction sites, and manually computing this with a calculator took up to three days. (This was 1987, so there were not very many ready-made programs for business; you often had to build your own.)
With this program he had it done in around 30 minutes.
The next best was recently, when I got my Raft distributed consensus cluster server working. It's a little bit like ZooKeeper.
Building that purely from the research paper was rewarding but a bit of a challenge.
-
Casually debugging some CUDA code today. Something's not working so I add a breakpoint in the suspicious kernel. For some reason I set the display GPU as the active device from my code *GENIUS* (I have two GPUs installed, one for compute, one for the monitors).
Starts CUDA debugging... Control flow reached the kernel and eventually the breakpoint. Suddenly the whole system freezes. Mouse doesn't move, keyboard seems dead. I realize I have unsaved code in the open text editor 😲 *panic*. Keyboard shortcut to stop debugging doesn't work *panic^2*. My colleague says I have to hard reset the machine *panic^3*. I don't remember the last time I saved *panic^4*.
I take a deep breath. I reset. *sidenote: WINDOWS DECIDED TO FUCKING UPDATE ON REBOOT* Once I logged in, 50% of my code was lost. I didn't save 😢
Fuck you Nvidia 😢
-
I give math lessons to high school people in my "free" time. One of my guys needs a calculator to compute sums with 0, and yet he wants to become a programmer 🤦🏻♂️🤦🏻♂️🤦🏻♂️
-
Follow-up to my previous story: https://devrant.com/rants/1969484/...
If this seems too long to read, skip to the parts that interest you.
~ Background ~
Maybe you know TeamSpeak; it's basically a program to talk with other people on servers. In TeamSpeak you can generate identities, and every identity has a security level. On your server you can set a minimum security level needed to connect. Upgrading the security level takes longer as the level goes up.
~ Technical background ~
The security level is computed by doing this:
SHA1(public_key + offset)
Where public_key is your public key in Base64 and offset is an 8-byte unsigned long. Offset is incremented and the whole thing is hashed again. The security level comes from the number of zero bits at the beginning of the resulting hash.
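On the CPU side the check is roughly this (a sketch, assuming the offset is appended to the key as a decimal string; the placeholder key is obviously not a real one):

import hashlib

def security_level(public_key_b64: str, offset: int) -> int:
    # SHA1(public_key + offset), then count the leading zero bits
    digest = hashlib.sha1((public_key_b64 + str(offset)).encode()).digest()
    value = int.from_bytes(digest, "big")
    return 160 - value.bit_length()  # SHA1 digests are 160 bits

# brute force: bump the offset until the level is high enough
offset = 0
while security_level("MEwDAgcA...myPublicKey...", offset) < 20:
    offset += 1
print(offset)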
My plan was to use my GPU to do this, because I heard GPUs are good at hashing. And now, I got it to work.
~ How I did it ~
I am using a start offset of 0, creating 255 threads on my GPU (apparently more are not possible) and letting them compute those hashes. Then I increment the offset in every thread by 255. The GPU also does the job of counting the zero bits; when there are more than 30 zero bits I print the amount plus the offset to the console.
~ The speed ~
Well, speed was the reason I started this. It's faster than my CPU for sure. It takes about 2 minutes and 40 seconds to compute 2.55 billion hashes, which comes down to ~16 million hashes per second.
Is this speed an expected result, is it slow or fast? I don't know, but for my needs, it is fucking fast!
~ What I learned from this ~
I come from a Java background and just recently started C/C++/C#. Which means this was a pretty hard challenge, since OpenCL uses C99 (I think?). CUDA sadly didn't work on my machine because I have an unsupported GPU (NVIDIA GeForce GTX 1050 Ti). I learned not to execute an endless loop on my GPU, and so much more about C in general. Though it was small, it was an amazing project.
-
fml
All week working with sun outside, and when you're supposed to enjoy the weekend the weather gods don't let you :/
-
Anything I (am able to) build myself.
Also, things that are reasonably standardized. So you probably won't see me using a commercial NAS (needing a web browser to navigate and up-/download my files, say what?), nor would I use something like Mega, despite it being encrypted. I don't like lock-in into certain clients to speak some proprietary "secure protocol". Same reason why I don't use ProtonMail or that other one.. Tutanota. As a service, use the standards that already exist, implement those well, and then come offer it to me.
But yeah. Self-hosted DNS, email (modified iRedMail), Samba file server, a blog where I have unlimited editing capabilities (God I miss that feature here on devRant), ... Don't trust the machines nor the services you don't truly own, or at least make an informed decision about them. That is not to say that every compute task should be kept local (search engines, AI, or whatever else is best suited for centralized use).. but ideally, I do most of my computing locally, in a standardized way, and in a way that I completely control. Most commercial cloud services unfortunately do not offer that.
Edit: Except mail servers. Fuck mail servers. Nastiest things I've ever built, to the point where I'd argue that it was wrong to ever make email in the first place. Such a broken clusterfuck of protocols, add-ons (SPF, DKIM, DMARC etc), reputation to maintain... Fuck mail servers. Bloody soulsuckers those are. If you don't do system administration for a living, by all means do use the likes of ProtonMail and Tutanota, their security features are nonstandard but at least they (claim to) actually respect your privacy.
-
My preferred stack is Rails/NginX/Postgres, or Node using the same.
I have a fair amount of material for this week's rant, but in my stack's defense, the quantity is primarily because I've been using it for so long, and I'm apparently a talented breaker. I may share other stories if the motivation arises.
However, today I ran into something that definitely deserves calling out.
The default date+time column type in Postgres is `timestamp`, which means "date+time without time zone" (while `timestamptz` is the time-zone-aware variant).
Apparently when comparing a `timestamp` with a `timestamptz`, Postgres doesn't compute the timezone difference correctly, leading to some very unexpected and confusing query results.
Today, I had a record that was both pending (expires_at > now) and expired (expires_at <= now), where now is a DateTime (with tz) literal from Rails. After half an hour's frustrated delving and baffled expressions at query results, I finally figured out that the database's math was incorrect when comparing UTC (+0) and PST (-7).
This, during a semi-high-priority bugfix that's blocking a coworker.
While Time and all of its nuances are honestly extremely difficult to handle correctly, I didn't expect Postgres to get this relatively simple part wrong.
Shame on you, Postgres.
I expected better.
-
Can someone explain AI/ML/DL to me in a traditional algorithmic way, without the AI jargon?
What I currently understand is that they convert the training data to numbers based on a complex black-boxed mathematical algorithm, and then when new data comes in, the same conversion is done and a decision is taken based on where the new number fits in within the geometry/graph plot of the old numbers from training. The numbers are then updated. Is this what they call AI? Nearest number/decision search?
Kindly try to avoid criticism, I am having a difficult time understanding the already-trending AI stuff. People say that the algorithms have existed for a long time, but only now do we have the compute power.
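For what it's worth, the mental model described above does exist as a real (if very basic) method: nearest-neighbour classification. A toy sketch of that one idea (an illustration of the question's model, not of what deep learning actually does):

def predict(training, new_point):
    # training: list of (feature_vector, label) pairs built from past data
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    # answer with the label of the closest known point
    return min(training, key=lambda pair: sq_dist(pair[0], new_point))[1]

data = [((1.0, 1.0), "cat"), ((9.0, 8.0), "dog")]
print(predict(data, (2.0, 1.5)))  # "cat"

-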
Our company has an internal webpage to request software, be it freeware or licensed.
Today, I found there a "Software engineering bundle" designated for "software developers and data scientists who require advanced compute and data processing tools".
The software bundle contains PuTTY, 7-zip and Notepad++.
-
Sometimes I just don't know what to say anymore
I'm working on my engine and I really wanna push high triangle counts. I'm doing a pretty cool technique called visibility rendering and it's great because it kind of balances out some known causes of bad performance on GPUs (namely that pixels are always rasterized in quads, which is especially bad for small triangles)
So then I come across this post https://tellusim.com/compute-raster... which shows some fantastic results, and just for the fun of it I implement it. Like, not optimized or anything, just a quick and dirty toy demo to see what sort of performance I can get
... I just don't know what to say. Using actual hardware accelerated rasterization, which GPUs are literally designed to be good at, I render about 37 million triangles in 3.6 ms. Eh, fine but not great. Then I implement this guy's unoptimized(!) software rasterizer and I render the same scene in 0.5 ms?!
IT'S LITERALLY A COMPUTE SHADER. I rasterize the triangles manually IN SOFTWARE and write them out with 64-bit atomic image stores. HOW IS THIS FASTER THAN ACTUAL HARDWARE!???
AND BY LIKE A ORDER OF MAGNITUDE AT THAT???
Like I even tried doing some optimizations like backface cone culling on the meshlets, but doing that makes it slower. HOW. I'm rendering 37 million triangles without ANY fancy tricks. No hi-z depth culling, which a GPU would normally do. No backface culling, which a GPU would normally do. Not even damn clipping of triangles. I render ALL of them ALL the time. At 0.5 ms
-
New idea: Fuck raytracing for global illumination because you just need too many rays for it to converge
What if we do surfels (to keep the number of probes down and relevant to our scene) and we update the 4x4-ish sized hemisphere irradiance maps not by tracing a single ray per frame per surfel, but by rasterizing? I have a fast as shit compute shader rasterizer... What if I just raster each surfel each frame? Should be around the same number of pixels as the primary visibility, so totally feasible....
Each frame just jitter the projection a bit and voila. Should have extremely high quality diffuse global illumination at well below 1 ms. Holy shit this might just work
-
Writing an efficient, modern renderer is truly an exercise of patience. You have a good idea? Hah, fuck you, GPUs don't support that. Okay but what if I try to use this advanced feature? Eh, probably not going to support exactly what you would like to do. Okay fuck it I'm gonna use the most obscure features possible. Congratulations, it doesn't work even on the niche hardware that supports that extension
If I sound jaded, ya better believe I f*cking am! I cannot wait for more graphics cards to support features like mesh shaders so we can finally compute shader all the things and do things the way we want to god dammit -
Dear Coffee,
I ask for your help.
I need to pass this exam, and at the same time a client is angry.
I invoke you.
Like the function I'm in.
A function of time, a function that will probably never halt but you cannot prove it. You hope it will stop soon, but deep inside you know it will continue to compute.
I beg you, Coffee. Make this function of procrastination stop. Please.
I see no escape.
It is a tail-recursive function. You realize it as soon as you reach the end.
You can do nothing about it, you're trapped inside this loop. At each iteration you hope to reach the bottom, but you never know. You can only hope that the bottom is close.
This is the last one, you keep repeating to yourself.
Please Coffee, let it be a non-pure function.
Make the environment change.
Only then can we be saved.
-
A few months ago I ranted about how my first encounter with Assembly was hopefully the last one
Here I am, again, with my second Assembly encounter. However, this time I'm able to read and understand it more, such that I'm even able to compute stack layouts. I don't even hate it that much anymore.
I guess I'm walking the path I couldn't defeat
*cries in %rax*
-
While testing the newly discovered "primesieve" C library, I forgot to change the limit variable from 1e10 to just 10 when passing the value through pow to make it more explicit.
Now my PC is dying in front of me while trying to compute the googolth prime number. Nice.
-
I'm reinventing the wheel by making yet another neural network library. It's not any good yet but I learn as I go along.
The only documentation that exists now is the admittedly quite comprehensive code comments. I'm writing it because Keras (using TensorFlow) requires a 3.5 compute capability rating for CUDA acceleration (which I don't have) and it doesn't support OpenCL. Eventually, I will make my implementation support both, with varying levels of acceleration for different compute capabilities, the oldest supported being my hardware. If I ever get around to it.
I'd say wish me luck but determination would be infinitely more useful.
-
At the institute where I did my PhD, everyone had to take on some role apart from research to keep the infrastructure running. My part was admin for the Linux workstations and supporting the admin of the calculation cluster we had (about 11 machines with 8 cores each... hot shit at the time).
At some point the university had some euros of budget left that had to be spent so the institute decided to buy a shiny new NAS system for the cluster.
I wasn't really involved with the stuff, I was just the replacement admin so everything was handled by the main admin.
A few months on and the cluster starts behaving ... weird. Huge CPU loads, lots of network traffic. No one really knows what's going on. At some point I discover a process on one of the compute nodes that apparently receives commands from an IRC server in the UK... OK code red, we've been hacked.
First thing we needed to find out was how they had broken in, so we looked at the logs of the compute nodes. There was nothing obvious, but the fact that each compute node had its own public IP address and was reachable from all over the world certainly didn't help.
A few hours of poking around, not really knowing what I'm looking for, I resort to a tcpdump to find whether there is any actor on the network that I might have overlooked. And indeed I found an IP address that I couldn't match with any of the machines.
Long story short: It was the new NAS box. Our main admin didn't care about the new box, because it was set up by an external company. The guy from the external company didn't care, because he thought he was working on a compute cluster that is sealed off behind some uber-restrictive firewall.
So our shiny new NAS system, filled to the brim with confidential research data (and, as it turns out, a lot of login credentials), was sitting there with its quaint little default config and a DHCP-assigned public IP address, waiting for the next best rookie hacker to try U:admin/P:admin and take it over.
Looking back this could have gotten a lot worse and we were extremely lucky that these guys either didn't know what they had there or didn't care. -
The people that religiously put "missing docstring" on code reviews... Like, I get why but... Who hurt you? Especially in typed languages, where there's a function called "can compute stuff" with a foo type, and I add a docstring that's "whether we can compute stuff for the given foo"...
-
The PS3 has 2 OS types: GameOS (the XMB menu and what you use to play games) and OtherOS (anything else you'd wanna load, usually Linux). There's a problem with this: there's a build of GDB meant for OtherOS. That's great, but I need some background debugger for GameOS. Why, oh fucking WHY, has no one made a debugger like this? We have the ability to reserve compute units (SPUs) and/or areas of RAM for code to continue running when something else is loaded, so why the FUCK isn't there a game debugger???
-
"Fauna's free plan has been adjusted from 5GB to 1GB going forward"
huh? I thought "tech was eating the world", "cloud is everything", "things are getting more efficient"
so why the hell do all these cloud providers keep DECREASING the free tier. annoying as all hell
where's all the storage and compute going? fucking crypto?
each day the 🤡🌎🎪 grows stronger
-
Okay, I'm doing the whole leetcode bs, interviewing with a FAANG-like company.
I'm genuinely curious to see if their engineers are actually any good. It seems backwards to me to hire someone based on something they most likely know by heart.
It's like trying to stress test an API by calling a cached endpoint. It will look fast AF, and it will be, but it won't compute shit.
Anyway, if I get the job and the engineers aren't crappy, then I'll forever stfu about how lame this is. But if I get the job and the devs are crappy, oh boy you'll hear me for a long time.
-
Node: The most passive-aggressive language I've had the displeasure of programming in.
Reference an undefined variable in a module? Prepare to waste your time hunting for it, because the runtime won't tell you about it until you reference a property or method on the quietly undefined module object.
Think you know how promises work? As a hiring manager, I've found that less than 5% of otherwise well-experienced devs are out of the Dunning Kruger danger zone.
Async causes edge cases and extra dev effort that add to the cost of making a quality product.
Got a bug in one of your modules? Prepare yourself for some downtime because a single misplaced parentheses can take out the entire Node process, killing unrelated pages and even static file hosting.
All this makes for a programming experience that demands much higher cognitive load, creates more categories of bugs, and leads to code bloat/smell much more quickly than other commonly substituted languages.
From a business perspective, the money you save on scaling (assuming your app is more compute efficient under Node) is wasted on salaries and opportunity costs stemming from longer dev time, more QA, and more frequent outages.
IMO, Node is an awesome experiment, a fun language, a great tool for specific use cases, and a terrible fucking choice for an entire website.
-
PSA: The smaller the compute shader workgroups, the more efficient they are, down to the wave size (32 on NVIDIA). Not exactly sure why, but it looks like if you don't need group shared memory you should always make your workgroups wave-sized (e.g. layout(local_size_x = 32) in; in GLSL)
Just this alone gave me a 30%+ performance increase. And combined with a few other changes it got me from 50 µs to 10 µs, yay!
-
24" Vertical - Dell U2415 (1920x1200)
31.5" - HP Z32x (3840x2160)
30" - HP Z30i (2560x1600)
Compute:
MBPr - 15" - Mid 2015
HP Z620 (Xeon CPU E5-2620 v2 @ 2.10GHz x 6 Core, 128GB RAM, 512GB SSD)
To be honest, the Mac is just a nice ssh console to the workstation that runs Ubuntu.
I was using Ubuntu directly on the workstation, but needed to use things like Outlook, Lync, and other tools that dislike Linux :(
-
AI so far....
2012: We can do more than 5 layers whoa
2013: It works on text too!
2014: Let’s build infras with frameworks & cloud compute
2015: AlphaGo! Singularity!
2016: Wait it’s racist & sexist
2017: Deepfakes scary
2018: No idea how it works
2019: Whatevs time to productize $$$
2020: ??
-
Name one thing more fun than atomically writing values into a GPU buffer and having them mysteriously vanish into the aether immediately after the compute shader invocation
I can literally see them in the buffer using RenderDoc and then as soon as I go to the next command the buffer is completely filled with zeros again as if the values never existed
?? like how ??
-
I have an idea
Imagine an objective social media platform
which is like a cms
everything you add to the platform can be anything you want it to be
depending on the properties of any given thing, the client will render it differently, and the blockchain will compute differently.
You go to site.com
you create some "thing"
you give that thing a name,
a facebook post
the platform looks up all the schemas used for things named "a facebook post", and suggests the most popular, which would be a "thing"/"object" with the properties: comments, with the type list of other things; reactions, which is a list of reactions (likes, loves, laughs, etc.); a property called shares.. etc.. etc..
so the platform is a cms which can adapt, create, and display data based on what that thing objectively is, depending on its properties. You could have tweets, reddit posts, youtube videos, all on the same platform.
If you get my drift, hit me up, ireply@myleisure.com.au,
first principles
-
So I got my compute shader rasterizer working pretty well now which is great. I now also have a fallback to hardware rasterization for triangles which are a bit sussy (mostly just too large) and getting that implemented without tanking performance (gazillion threads hitting the same atomic variable at the same time) involved some tricky workgroup/subgroup hackery but I'm happy with it
Only problem... I have like 90%+ SM occupancy (which is great) but I also have 90%+ SM occupancy (which means the NVIDIA drivers think I'm mining cryptocurrency and start bottlenecking my compute performance at random). It slowly goes up to 3x, then it slowly goes down again, then it slowly goes up again... argh
Thanks, miners 😐
-
I've decided to switch my engine from OpenGL to Vulkan and my god damn brain hurts
Loader -> Instance -> Physical Devices -> Logical Device (Layers | Features | Extensions) | Queue Family (Count | Flags) -> Queues | Command Pools -> Command Buffers
Of course each queue family only supports some commands (graphics, compute, transfer, etc.) and everything is asynchronous, so it needs explicit synchronization (both on the CPU and with GPU semaphores) too
-
"The value used to compute the x position of either side of the rectangle."
and
"The value used to compute the x position of the other side of the rectangle."
I get it ... but man, documentation, if you don't already get it ... like nobody is gonna get that.
-
In order to sharpen my algorithm and data structure skills,
I implemented a complete *eval()* function for arithmetic expressions in Java.
It can compute any kind of arithmetic expression, even with parenthesis grouping.
Here is the github repo
https://github.com/Afrographic/...
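The classic approach behind something like this is a recursive-descent parser with one function per precedence level. A rough sketch of the idea in Python (not the repo's Java code; unary minus and error handling left out):

def evaluate(expr: str) -> float:
    tokens = expr.replace(" ", "")
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def expression():  # terms joined by + or -
        nonlocal pos
        value = term()
        while peek() in ("+", "-"):
            op = tokens[pos]; pos += 1
            value = value + term() if op == "+" else value - term()
        return value

    def term():  # factors joined by * or /
        nonlocal pos
        value = factor()
        while peek() in ("*", "/"):
            op = tokens[pos]; pos += 1
            value = value * factor() if op == "*" else value / factor()
        return value

    def factor():  # a number or a parenthesised sub-expression
        nonlocal pos
        if peek() == "(":
            pos += 1          # consume "("
            value = expression()
            pos += 1          # consume ")"
            return value
        start = pos
        while peek() is not None and (peek().isdigit() or peek() == "."):
            pos += 1
        return float(tokens[start:pos])

    return expression()

print(evaluate("3+5*7*(10-45)"))  # -1222.0

-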
Tryna decide what I want my next job to be. I currently span some performance stuff, some data stuff. I'm torn between going hardcore C++ high-performance compute or pure data science.
-
I guess you could say that my speciality is cloud at scale. I’d say it chose me more than I chose it.
Looking back on it though, I think what I like about my speciality is the unique challenges it brings.
Every speciality has its own set of challenges, like tight resource limits in embedded, or client-server synchronisation in native/mobile.
The challenge of cloud at scale is throughput. Designing systems that can support 100K users making a bazillion requests a second, or a data pipeline firing events that you need to process in near real time without dropping a single one.
The real challenge of course is doing all this within a sensible budget. We have virtually infinite compute, but we don't have infinite dollars to spend on it.
It's a fun problem to solve.
-
2 days ago I started solving the problems on https://projecteuler.net/ recommended to me by @AlmondSauce, and I already regret knowing Python: a relatively simple-ish calculation took roughly 4 hours to compute
-
After a year of using Mongo in prod and personal projects, I have realised some things. It's really nice early on in a project, especially when there are changing requirements, and for small projects or proofs of concept.
But when you make commercial software, things tend to get more complex and relational. Stakeholders want reporting, and even a report builder, which a document store isn't the best at.
With most projects, when they get big, things get relational, and this becomes more and more expensive to handle in terms of compute power and developer time.
I don't doubt Mongo has its place, maybe as a secondary specialised data store, or if the project is inherently document oriented.
Blog over.
-
I thought I found a way to compute PI, but I actually just found a super shitty way to print a variable..
const precision = 1000000;
// convert degrees to radians (note: this already uses Math.PI, which is the punchline)
function rad(degree) {
  return degree * (Math.PI / 180);
}
function calculatePI() {
  // [x, y]
  // take the first point at the start of the unit circle
  const point1 = [Math.cos(0), Math.sin(0)];
  // take the second point at 1/precision degrees
  const point2 = [Math.cos(rad(1 / precision)), Math.sin(rad(1 / precision))];
  // estimate the length of a 1/precision-degree slice of the circle
  const dist = Math.sqrt((point2[0] - point1[0]) ** 2 + (point2[1] - point1[1]) ** 2);
  // scale back up to a half circle (180 degrees), i.e. PI radians
  const perimeter = dist * precision * 180;
  return perimeter;
}
console.log(calculatePI());
-
I love this documentation about System.nanoTime()
Differences in successive calls that span greater than approximately 292 years (2^63 nanoseconds) will not correctly compute elapsed time due to numerical overflow.
Can I confidently say that no service will ever break this limit?
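For scale, a quick sanity check of that number (assuming 365.25-day years):

# 2**63 nanoseconds, expressed in years
print(2**63 / 1e9 / 86_400 / 365.25)  # ~292.3

-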
Testing requirements, some of these are pretty specific such as 'don't do X before you compute Y'... OK, check that off
Now we have some independent analyst saying how can you prove 'X' isn't done... "Look at the source code we've provided"
"OK, but where in the source should I look [for something which isn't there]?" -
So I'm currently working on a chat app that deals with astrology... dealing in the sense that we are building an AI which gives predictions based on one's date of birth, time of birth and place of birth. You can ask it questions (currently only career related) and you get some prediction. It's an in-house project; we have a client who is an astrologer who gives us the logic to compute the predictions. It's still a long way from being an AI.
So our CEO walks in one day with his huge plans for the product... decides to ditch the app completely, on which we have invested 4 months of our time, and instead make an appointment-scheduling webapp for our client, as he felt that would fetch us some green stuff. So I was like, why ditch the app when we can have the same module in the app itself and ask the astrologer to make his clients install it if they want to book future appointments? He completely disregarded my idea, said that it is bad marketing and all that other shit, and went on to explain his other ideas. I didn't think much of it at that time. Then the CEO and the director of technology had a separate meeting, where the director made the same points I had made to the CEO: that it is a bad idea to chuck the existing application and build a new one (I wasn't aware of this meeting until later).
So after a week we have a team meeting with the CEO and the director of technology... where he starts telling us how it is not so wise to chuck the existing application and build a new one, which would be totally unnecessary when we can have it as a module in the existing one... and I'm sitting there thinking to myself, da fck is he talking about. So I decided to stay silent and listen to his bs. My marketing lead leans over and asks, why so silent? I tell her that whatever he is talking about now is the same thing I told him last week, which he rejected blatantly... And then he had the nerve to ask me for any inputs on this plan... I couldn't hold back... I told him that this is the exact same thing I told you last week, to which his reply was: focus on the future and forget the past... I was like, mother fckr, woooooot... I realised the power of position !!
-
*rewrites rust mpsc*
you did it wroooong
I thought my threads were locking if I had thousands of jobs spawning thousands more jobs. Turns out it's fine. Actually, if I organize my data locks the way everyone wants them done, my CPU fans go off, but with my original way you don't feel jack shit and it processes faster
Turns out it's because 320k jobs is a bit much for mpsc. Because my jobs can spawn more jobs, the whole thing just grinds to a halt. And there's sync-mpsc, which allows you to set a maximum amount of data you send through it, so I can just have 245 sent jobs instead of 320k. But then this locks all the threads, because for a thread to finish it needs to finish sending jobs, and a sync mpsc won't let you send a job if current jobs are over the specified limit. So all the threads get stuck sending jobs. Smart. Not. What's even the point of that?!
and evidently there's no built-in way to prioritize certain jobs. the AI thinks you should just send jobs in and each thread should have a priority queue. I don't know sounds dumb to me. then you could by random luck have threads with lots of jobs that need to be prioritized to be done and other threads stuck hanging waiting for previous jobs / the other threads. no thanks
so clearly the solution is to rewrite mpsc but allow prioritization when a thread goes in to ask for a job to do
since my jobs are intended to start other jobs, it makes sense to have no actual upper bound limit to the number of jobs in the queue but to favour doing jobs that won't start new jobs to lower the RAM and compute necessary to juggle all this
hope this is the actual problem. cuz the code works for like 200 jobs spawning 500 jobs each, which is 100k jobs total
but it stalls to a halt doing 8300 jobs spawning 500 jobs each (which, if I do the math -- in my tests it stalls at 320k jobs, and it seems the number should be 4,150,000 jobs -- yeah, I think this is probably the damned problem)
-
!rant, more of an incredulous/cruelly amused "you had ONE job..."
so: biggest IT/PC/electronics store in my (and neighboring) country. their webpage, of course with the function to buy online, because of course.
the big green "Buy" button does nothing. doesn't work. doesn't react. I keep clicking it multiple times, shorter, longer, etc, because maybe their JS scripts are just shit so they slow.
nope.
okay. open devtools, JS console.
hover over the button: "Error: isMobile is not a function".
click the button: "Error: isMobile is not a function"
WAT.
search for isMobile in the script.
173 occurrences.
fuck this.
console: isMobile = function(){return false;}
because I'm not on my phone.
click the "Buy" button.
works flawlessly.
...HOW?
THE WHOLE PAGE IS AN ESHOP YOU COMIC RELIEF INCOMPETENTS! =D
173 uses of a non-existent function that blocks the business-critical feature, THE ONLY CORE FEATURE FOR WHICH YOUR SITE EVEN EXISTS, and NOBODY, not the dev who fucked it up, NOT EVEN QA, noticed it??? =D =D
if I was the boss of the devs, or even boss of the whole company...
git blame
...and then I'd go down the whole chain, from the dev who caused the bug through all of the QA people who "tested" that version before deploy, and I would personally, on the spot, fire each and every single one of them.
mainly because of who knows how much money this stupid not-even-a-proper-bug lost them.
but secondarily, because clearly none of those people give a single shit (n)or have an idea how to do their jobs.
=D =D
yeah but I was a good guy, filed a bug report in the "Complaints" section of their Contact form.
it goes to some call-center-like peon, so it starts with a sentence "forward this to your site's dev people outright to file as a bug, thank you".
but... HOW.... =D
HOW can you let something like this through? =D
the bottleneck of your whole user interaction, which forms the first of the three steps OF THE MAIN AND MOST IMPORTANT FUNCTION of your whole business... =D =D
...I...
...does not compute =D
...BUT THEY USING ANGULAR, SO THEY ALL MODERN AND HIGH-TECH AND EVERYTHING'S FINE!!! =D =D
-
Using the company's Google Cloud compute as a proxy server for browsing 9gag.
I don't know if it's something to be proud or ashamed of.
-
PLZ tell me I'm not the only idiot who without a fkn exception has to rewrite environemnt, I mean enviroment, SHIT, sorry environment multiple times cause my brain doesn't compute the fkn spelling of it.
9/10 times I have to google it....
-
Anyone built a self-hosted compute server before? I'm debating making one. I am a big fan of laptops and don't have a PC. I also don't play video games, so I would mostly be using the PC for code projects, which led me to the idea of a self-hosted compute server
-
My first rant. Which isn't really a rant but it is kind of...
Took a new job supposedly as a software developer. Ends up being CTO position. Now responsible for understanding the code of 6 people in a different country so as to move code dev to the country we're in...(not retaining the 6 after 2.0 release) Been 3 months.. Too much data. Cannot compute. Had to learn too many new things and the fuckers switched the front-end midway from Vue to React. First weeks essentially wasted. Now at the end and I'm supposed to know everything.
Also, I hate Symfony with a passion now. Loved it when it was hidden under Laravel. -
So I'm working on an assignment for my Computer Science class, and we basically have to compute hash values from strings, then modulo them by 1 million and put them into the hash table. But the value keeps overflowing and turning into negative values. Anyone know how I can calculate the key?
(BTW, the hash code is the same value that is calculated by the .hashCode() function in Java)
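For what it's worth: Java's int wraps into negative values by design, and % keeps the sign of the dividend, so Math.floorMod(s.hashCode(), 1_000_000) or (s.hashCode() & 0x7fffffff) % 1_000_000 avoids the negative bucket. A sketch of the same hash in Python (where % is already non-negative), with the 32-bit wraparound simulated:

import ctypes

def java_hash_code(s: str) -> int:
    # Java's String.hashCode: h = 31*h + ch, with signed 32-bit overflow
    h = 0
    for ch in s:
        h = ctypes.c_int32(31 * h + ord(ch)).value
    return h

def bucket(s: str, table_size: int = 1_000_000) -> int:
    # Python's % already returns a non-negative result;
    # in Java, mask the sign bit or use Math.floorMod instead
    return java_hash_code(s) % table_size

print(java_hash_code("hello"))  # 99162322, same as Java
print(bucket("a string long enough to overflow the hash"))

-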
Ok, just wanna share some things that got me stuck for hours on my recent project, and their solutions. I hope it's gonna help someone.
To start with, when I was implementing svg-to-png, I set an image object's source with a data url. Normally this is going to trigger the onload hook. However, for some fucked up reason it never triggered. The solution is to use the setAttribute function, and then the hook will be triggered.
Second, you can get a rounded triangle by setting the stroke width and setting stroke-linejoin and stroke-linecap to round. But remember, if the stroke width is 6, then it's 3 inside and 3 outside.
Third, if you have a rotation on an svg element, and later on you want to manually compute the rotated point's position, it's most likely some vanilla code is not going to work. You see, when you rotate by x degrees, it is actually rotating by -x degrees. I'm not sure if it's a bug in my code, but it's there.
And now the worst thing: if you look up how transform on svg is performed, stackoverflow is going to tell you it's applied in order. But that's somehow not true for my project. If I set transform to do translation then rotation, the order they are applied in is actually reversed. It's rotation first, then translation. Like ffs why? Who the fuck said it was in order? It's clearly in reverse fucking order.
Ok, last thing: you can scale svg around its center, but absolutely don't do that, because it's gonna fuck up translation and rotation applied to the svg. If you need to scale, translate it first, then scale; it will be better.
Anyway, just some things I encountered. I'm gonna stay away from svg for at least two months now
-
My own server infra without config management. Thanks to Puppet, I have been able to finally give my brain some more compute resources for other things in life, because managing them all by hand is almost the equivalent of a medieval monk copying the bible over and over again.
Now I can manage servers in the tens of thousands with relative ease.
Fixing some terrible rushed code from a group assignment at the last minute. What could've taken hours to compute finished in under a second afterwards.
-
Client: Any way we can speed up that image load? Maybe compress the image?
Me: No, we need to compute if they have earned that image...that's what takes so long.
Client: Oh, I thought it would just somehow show if they've earned it! lol -
Anyone ever use gRPC with Objective-C and can help me out?
Following this tutorial:
https://cloud.google.com/solutions/...
Been dealing with this for a few days and can't figure it out 😫
-
Sound off below - I need recs for a good cloud compute service that gives me VMDK (or similar) golden image import and complete control over network topology, other than AWS, Azure, GCP or DO. Linode is also preferably off the table unless someone has a good reason for them (they are very privacy invasive).
What do you recommend?
-
The 1080 Ti is rated at a little over 11 teraflops. GPUs with over 1 teraflop of compute performance were released in the early 2000s.
It's 2017 and we are stuck with fancy gen-xxx CPUs.
I smell a huge waste of compute performance.
-
Trying to...
- Visual Studio 2017 released in 2016 with internal version number 15.9.38 with MSVC v14.0
- CUDA 8.0 with NVidia Nsight VS integration 5.3
- GTX 1080 GPU with compute capability 6.1
- Windows 10 SDK with 10.0.17763.0
Will it work? I don't fucking know because your versioning and documentation SUCKS!
For some time now it has become a number one mission for basically every tech company to rebrand, re-version and whatnot their products. It's obviously done with the purpose of confusing the customers, leading them on to buy/work with the wrong item, which of course leads to another purchase and hours of frustration and wasted time. This is not how business should be conducted, you dumbasses! -
WHAT THE FUCK IS THE GOOGLE SDK FOR?
ya I get it you connect to it.
It doesn't give local directory to Google Directory, it doesn't run ssh commands nor python commands. WHAT THE FUCK IS IT FOR?
DO I MAKE A BUCKET NOT COMPUTE ENGINE?
DO I SHOOT MYSELF IN THE FOOT AND DELETE THE PROJECT DUE TO HAVING AN OVERFLOW OF PYTHON FILES IN WRONG DIRECTORIES?
LIKE FOR REAL