Search - "shared bandwidth"
-
!rant
I was in a hostel in my high school days... I was studying commerce back then. Hostel days were the first time I ever used Wi-Fi, but it sucked big time. I barely got 5-10 Kbps, mainly due to overcrowding and download accelerators.
So, I decided to do something about it. After doing some research, I discovered NetCut, and it did help my purposes to some extent. But it wasn't enough. I soon discovered that my floor shared the bandwidth with another floor in the hostel, and the only way I could get the full 1Mbps was to go to that floor and use NetCut. That was riskier, and I was lazy enough to convince myself to look for a better solution rather than go to that floor every time I wanted to download something.
My hostel used Netgear's routers back then. I decided to find some way to get into those. I tried the default "admin" and "password", but my hostel's network admin knew better than that. I didn't give up. After searching all night (literally) about how to get into that router, I stumbled upon a blog that gave brief info about the "telnetenable" utility, which could be used to access the router from the command line. At that time, I knew nothing about telnet or the command line. In the beginning I just couldn't get it to work. Then I figured I had to enable telnet from Windows settings. I did that and got a step further. I was now able to get into the router's shell by using the default superuser login. But I didn't know how to get the web access credentials from there. After some googling and a bit of trial and error, I got comfortable using the cd, ls and cat commands. I hoped that some file in the router would have the web access credentials stored in cleartext. I spent the next hour just using cat to read every file. Luckily, I stumbled upon NVRAM, which stores all of the router's config details. I went through all the output from cat (it was a lot of output) and discovered http_user and http_passwd. I tried that in the web interface and when it worked, my happiness knew no bounds. I literally ran across the floor screaming and shouting.
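Looking back, the whole hunt boils down to something like this. A rough sketch, not my exact session: it assumes a Netgear build whose BusyBox shell ships the nvram tool, and the IP, the MAC placeholder, the telnetenable superuser login and the key names are just the usual defaults floating around online, not necessarily what my hostel's router used.
# On your own machine: unlock telnet on the router, then connect.
# telnetenable takes the router's IP, its MAC, and the default superuser
# login (often Gearguy/Geardog on old Netgear firmware; may differ).
telnetenable 192.168.1.1 AA:BB:CC:DD:EE:FF Gearguy Geardog
telnet 192.168.1.1
# Inside the router's BusyBox shell:
nvram show | grep -E '^http_(user|passwd)='   # web UI credentials, in cleartext
# ...or do it the long way I did: cd / ls / cat every config file until
# something that looks like a username and password falls out.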
I knew nothing about hiding my tracks and soon my hostel’s admin found out I was tampering with the router's settings. But I was more than happy to share my discovery with him.
This experience planted a seed inside me and I went on to become the admin next year and eventually switch careers.
So that’s the story of how I met bash.
Thanks for reading!
-
At the office we sometimes lose our internet connection. The strange part is that it's not fully gone: if you (for example) ping an IP directly, it's fine, but if you try to load any web pages or do any other kind of internet usage, it won't work.
We finally know why...
It's because another company in the same building is uploading some huge thing and using all of the available upload bandwidth (200 Mb/s).
So that's nice... Let's put a limiter on that so they DON'T FUCKING KICK US OFFLINE WHEN THEY NEED TO UPLOAD SOME.... WHATEVER THEY MAKE...
-
So recently I had an argument with gamers about the memory required on a graphics card. The guy suggested an 8GB model of... idk, I already forgot the model of GPU, some Nvidia crap.
I argued against that: well, why does memory size matter so much? I know that it takes bandwidth to generate and store a frame, and I know how much size and bandwidth that is. It's a fairly simple calculation - you take your horizontal and vertical resolution (e.g. 2560x1080, which I'll go with for the rest of the rant) times the number of subpixels (so red, green and blue) times the bit depth (i.e. the number of values you can set the subpixel/color brightness to, usually 8 bits, i.e. 0-255).
The calculation would thus look like this.
2560*1080*3*8 = the resulting size in bits. You can omit the last 8 to get the size in bytes, but only for an 8-bit display.
The resulting number you get is exactly 8100 KiB, or roughly 8MB, to store a frame. There is no more to storing a frame than that. Your GPU renders the frame (it might need some memory for that, but not 1000x the size of the frame itself, that's ridiculous) and stores it into a memory area known as a framebuffer, for the display to eventually take and put on the screen.
Assuming that the refresh rate for the display is 60Hz, and that you didn't overbuild your graphics card to display a bazillion lost frames for that, you need to display 60 frames a second at 8MB each. Now that is significant. You need 8x60MB/s for that, which is 480MB/s. For higher framerate (that's hopefully coupled with a display capable of driving that) you need higher bandwidth, and for higher resolution and/or higher bit depth, you'd need more memory to fit your frame. But it's not a lot, certainly not 8GB of video memory.
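To make that arithmetic concrete, here it is as a quick shell calculation (a sketch; without rounding the frame down to 8 MB first, the exact figure comes out around 497 MB/s).
WIDTH=2560; HEIGHT=1080; SUBPIXELS=3; BIT_DEPTH=8; REFRESH=60
frame_bytes=$(( WIDTH * HEIGHT * SUBPIXELS * BIT_DEPTH / 8 ))
echo "one frame: $(( frame_bytes / 1024 )) KiB"                        # 8100 KiB, roughly 8 MB
echo "at ${REFRESH} Hz: $(( frame_bytes * REFRESH / 1000000 )) MB/s"   # ~497 MB/s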
Question time for gamers: suppose you run your fancy game off the filthy iGPU in a laptop or whatever, with 8GB of shared memory in that system. Are you actually using all that shared general-purpose RAM for frames and "there's more to it" juicy game data? Where does the rest of the operating system's memory fit in such a case? Ahhh.. yeah, it doesn't. The iGPU magically doesn't need all that 8GB of memory you've just told me the dGPU totally needs.
I compared it to displaying regular frames, yes. After all that's what a game mostly is, a lot of potentially rapidly changing frames. I took the entire bandwidth and size of any unique frame into account, whereas the display of regular system tasks *could* potentially get away with less, since most of the frame is unchanging most of the time. I did not make that assumption. And rapidly changing frames is also why the bitrate on e.g. screen recordings matters so much. Lower bitrate means that you will be compromising quality in rapidly changing scenes. I've been bit by that before. For those cases it's better to have a huge source file recorded at a bitrate that allows for all these rapidly changing frames, then reduce the final size in post-processing.
I've even proven that driving a 2560x1080 display doesn't take oodles of memory because I actually set the timings for such a display in order for a Raspberry Pi to be able to drive it at that resolution. Conveniently the memory split for the overall system and the GPU respectively is also tunable, and the total shared memory is a relatively meager 1GB. I used to set it at 256MB because just like the aforementioned gamers, I thought that a display would require that much memory. After running into issues that were driver-related (seems like the VideoCore driver in Raspbian buster is kinda fuckulated atm, while it works fine in stretch) I ended up tweaking that a bit, to see what ended up working. 64MB memory to drive a 2560x1080 display? You got it! Because a single frame is only 8MB in size, and 64MB of video memory can easily fit that and a few spares just in case.
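Same arithmetic for that 64MB split, as a sanity check (a sketch of the math only, not a measurement of what the VideoCore driver actually allocates).
GPU_MEM_MB=64                          # the gpu_mem split I ended up using
frame_bytes=$(( 2560 * 1080 * 3 ))     # one 8-bit frame, ~8 MB
echo "frames that fit in the split: $(( GPU_MEM_MB * 1024 * 1024 / frame_bytes ))"   # 8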
I must've sucked all that data out of my ass though; I've only seen people build GPUs out of discrete components and gone down to the realms of manually setting display timings myself.
Interesting build log / documentary style video on building a GPU on your own: https://youtube.com/watch/...
Have fun!