Search - "such ram"
-
"Hey nephew, why doesn't the FB app work. It shows blank white boxes?"
- It can't connect or something? (I stopped using the FB app back in 2013.)
"What is this safe mode that appeared on my phone?!"
- I don't know. I don't hack my smartphone that much. Well, I actually do have a customised ROM. But stop! I'm pecking my keyboard most of the time.
"Which of my files should I delete?"
- Am I supposed to know?
"Where did my Microsoft Word Doc1.docx go?"
- It lets you choose the location before you hit save.
"What is 1MB?"
- Search these concepts on Google. (some of us did not have access to the Internet when we learned to do basic computer operations as curious kids.)
"What should I search?"
- ...
"My computer doesn't work.. My phone has a virus. Do you think this PC they are selling me has a good spec? Is this Video Card and RAM good?"
- I'm a programmer. I write code. I think algorithmically and solve programming problems efficiently. I analyse concepts such as abstraction, algorithms, data structures, encapsulation, resource management, security, software engineering, and web development. No, I will not fix your PC.
-
!(short rant)
Look, I understand online privacy is a concern and we should really be very much aware of what data we are giving to whom. But when does it turn from being aware into just being paranoid and a maniac about it? I mean okay, I know Facebook has access to your data including your WhatsApp chats (presumably), Google listens to your conversations and snoops on your mail and shit, Amazon advertises that you must have their spy system (read: Alexa) installed in your home, and there are numerous other cases. But in the end it really boils down to "everyone wants your data, but who do you trust your data with?"
For me, Facebook and the so-called social media sites are a strict no-no, but I use WhatsApp as my primary chatting application. I like to use Google for my searches because, yeah, it gives me more accurate search results compared to DDG since it has my search history. I use Gmail as my primary as well as work email because it is convenient, and an ad here and there doesn't bother me. Their spam filters, the easy accessibility options, the storage they offer - everything is much more convenient for me. I use Linux for my work related stuff (obviously) but I play my games on Windows. Alexa and products of that type are again a big no-no for me, but I regularly shop on Amazon, and unless I am searching for some weird ass shit (which, if you want to, do in incognito mode) I am fine with coming across some ads about things I searched for. Sometimes it reminds me of things I need to buy which I might have put off and later on forgot. I have an Amazon Prime account because Prime Video has some good shows on it. My primary web browser is Chrome because I simply love its developer tools and I have now gotten used to it. So unless Chrome is really hogging my RAM, in which case I switch over to Firefox for some of my tabs, I am okay with using Chrome. I have a Motorola phone with stock Android, which means all Google apps pre-installed. I use Hangouts, Google Keep, Google Maps (cannot live without it now), heck even Google Photos, but I also deny certain permissions to apps which I find fishy - if you are a game, you should not have access to my GPS. I live in India where we have Aadhaar cards (like the social security number in the USA), where the government has our fingerprints and all our data, because every damn thing now needs to be linked with your Aadhaar, otherwise your service will be terminated. Your mobile number, your investment policies, your income tax, heck even your marriage certificate needs to be linked with your Aadhaar card. Here I don't have any option but to give in because somehow "it's in the interest of the nation". Not surprisingly, it recently came to light that you can get your hands on anyone's Aadhaar details, including their fingerprints, for just ₹50 ($1). Fuck that shit.
tl;dr
There are and should always be exceptions when it comes to privacy, because when you give the other person your data, it sometimes makes your life much easier. On the other hand, people/services asking for your data with the sole purpose of infiltrating your private life while not providing any usefulness should just be boycotted. It all boils down to what extent you wish to share your data (ranging from literally installing a spying device in your house to them knowing that I want to understand how Spring Security works) and how much you trust the service with your data. Example being: I just shared most of my private data in this rant with a group of unknown people and I am okay with it, because I know I can trust devRant with my posts (unlike Facebook).
-
1. I wish that people start taking back their device ownership. Right to repair is an extremely important thing. Like that Nexus 6P that I've recently repaired by jamming another battery into it, now it's at 110-ish% health according to AccuBattery. And it cost me.. €10 or so? All the while if I wasn't able to get in there, it would've been a €120 paperweight (and that's not even considering the €300-ish (? Someone please fill me in on that) price it retailed at back in 2015 when it was a flagship).
(edit the so many'th: according to https://express.co.uk/life-style/... the base model was apparently £449 at release, haven't been able to verify it though.. point is, a paperweight at such prices would've been quite a bummer, I mean for me it was even one given that it failed a mere few months after purchase for €120.. €40/m for a phone ain't nothing :/)
Right to repair is an extremely important thing, and the ability to do so shouldn't ever be impeded. Users should become able again to service the devices that they own.
2. I wish that people start caring about their privacy again. Google and Facebook and the likes are large companies, but at the end of the day, that's all they are. Large companies. And they're hungry for your data, not because they're selling it, rather because they're collecting it to an extent which they shouldn't. Over at DDG (https://spreadprivacy.com/duckduckg...) they explain a very much viable alternative revenue model pretty well. Additionally, there's several tools which you can use to limit the amount of data that's being collected about you. These include but are not limited to Firefox, NoScript, ad blockers (I personally use uBlock), a trustworthy VPN (ideally one of your own), and Tor.
3. I wish that software would become less inefficient. It really pains me to see that applications with functionality that could be implemented in a couple of MB at most come at a size of several hundreds of MB. 1% efficiency, even the inefficient as fuck tungsten light bulbs weren't that awful!!! Imagine what could be done with all the hardware we have available nowadays, if every piece of software would be around 80% efficient as is a common norm in electronics. Just looking at Linux which is still in many ways convoluted, modern desktops with a couple hundred MB of RAM usage? You've got it! So why can't OS's like Windows (although I have to say, huge improvements have been made there over the last few years) and browsers like Firefox and Chrome be more like that? I really don't understand.
There are several more wishes I have of course, but those are the most important ones.. hopefully I'll be able to see at least one of them come true during my life.
-
Today, I was told to investigate why the software doesn't work on "some" computers. I had no previous experience with that particular software, but I just had to run some tests... easy, right? As soon as I ran the software, my computer crashed (I literally had to restart the PC). I asked my colleagues if I did something wrong, but the setup seemed OK.
Later, in a random discussion about the software, I found out it does "a little memory allocation". I opened the performance tab in Task Manager and ran the software again. In an instant, the RAM went from 1.3GB to 7.66GB (my PC has 8GB of RAM).
In an attempt to find out how such a monstrosity was created, I discovered the developer that made the software had 16GB of RAM on his PC.
I have found something that eats RAM more than Chrome... brace yourselves.
-
I have this little hobby project going on for a while now, and I thought it's worth sharing. Now at first blush this might seem like just another screenshot with neofetch.. but this thing has quite the story to tell. This laptop is no less than 17 years old.
So, a Compaq nx7010, a business laptop from 2004. It has had plenty of software and hardware mods alike. Let's start with the software.
It's running run-of-the-mill Debian 9, with a custom kernel. The reason why it's running that version of Debian is because of bugs in the network driver (ipw2200) in Debian 10, causing it to disconnect after a day or so. Less of an issue in Debian 9, and seemingly fixed by upgrading the kernel to a custom one. And the kernel is actually one of the things where you can save heaps of space when you do it yourself. The kernel package itself is 8.4MB for this one. The headers are 7.4MB. The stock kernels on the other hand (4.19 at downstream revisions 9, 10 and 13) took up a whole GB of space combined. That is how much I've been able to remove, even from headless systems. The stock kernels are incredibly bloated for what they are.
Other than that, most of the data storage is done through NFS over WiFi, which is actually faster than what is inside this laptop (a CF card which I will get to later).
Now let's talk hardware. And at age 17, you can imagine that it has seen quite a bit of maintenance there. The easiest mod is probably the flash mod. These old laptops use IDE for storage rather than SATA. Now the nice thing about IDE is that it actually lives on to this very day, in CF cards. The pinout is exactly the same. So you can use passive IDE-CF adapters and plug in a CF card. Easy!
The next thing I want to talk about is the battery. And um.. why that one is a bad idea to mod. Finding replacements for such old hardware.. good luck with that. So your other option is something called recelling, where you disassemble the battery and, well, replace the cells. The problem is that those battery packs are built like tanks and the disassembly will likely result in a broken battery housing (which you'll still need). Also the controllers inside those battery packs are either too smart or too stupid to play nicely with new cells. On that laptop at least, the new cells still had a perceived capacity of the old ones, while obviously the voltage on the cells themselves didn't change at all. The laptop thought the batteries were done for, despite still being chock full of juice. Then I tried to recalibrate them in the BIOS and fried the battery controller. Do not try to recell the battery, unless you have a spare already. The controllers and battery housings are complete and utter dogshit.
Next up is the display backlight. Originally this laptop used to use a CCFL backlight, which is a tiny tube that is driven at around 2000 volts. To its controller go either 7, 6, 4 or 3 wires, which are all related and I will get to. Signs of it dying are redshift, and eventually it going out until you close the lid and open it up again. The reason for it is that the voltage required to keep that CCFL "excited" rises over time, beyond what the controller can do.
So, 7-pin configuration is 2x VCC (12V), 2x enable (on or off), 1x adjust (analog brightness), and 2x ground. 6-pin gets rid of 1 enable line. Those are the configurations you'll find in CCFL. Then came LED lighting which required much less power to run. So the 4-pin configuration gets rid of a VCC and a ground line. And finally you have the 3-pin configuration which gets rid of the adjust line, and you can just short it to the enable line.
There are some other mods but I'm running out of characters. Why am I telling you all this? The reason is that this laptop doesn't feel any different to use than the ThinkPad x220 and IdeaPad Y700 I have on my desk (with 6c12t, 32G of RAM, ~1TB of SSDs and 2TB HDDs). A hefty setup compared to a very dated one, yet they feel the same. It can do web browsing, I can chat on Telegram with it, and I can do programming on it. So, if you're looking for a hobby project, maybe some kind of restrictions on your hardware to spark that creativity that makes code better, I can highly recommend it. I think I'm almost done with this project, and it was heaps of fun :D
-
Storytime
A story about an Android TVbox which decided to become an iPad
Several years ago we bought an Android tv-box.
It served me and my family well for several years.
Specs are not that important in this story, but there they are:
Android 4.4
1GB RAM
Amlogic quad-core 1.4GHz
8GB storage
This device served us well - online TV, browsing, music, file sharing and so on. But recently the cheap Chinese memory decided to take a break and the ROM got damaged. Because of that, the device wouldn't boot. The only option was to take it apart, "short circuit" certain legs on the memory chip to make it boot from an SD card, and install new firmware. After such an operation the tv-box worked well again.
However, the memory glitched again and again, and this procedure was repeated for several months.
But that is not what this story is about.
One day the memory went completely crazy and there was no way to install new firmware on it. It just hung during the install. (BTW, it was the official firmware for this device.)
But after countless attempts it finally worked! It installed the firmware and booted into launcher and connected to WiFi!
But now comes the most interesting part.
It was not Android anymore.
It decided to become an iPad.
My dad logged in to his Google account via the tv-box and got an email that somebody had connected from our IP via an iPad (we don't have an iPad), using the Safari browser! The stock browser is not Safari.....
"Ok, nvm, crazy glitch." - we thought.
But the preinstalled Play Market wouldn't launch, because it told us we were trying to connect from an iPad.
And the Google Chrome page suggested downloading Chrome for iPad.
And everything was acting like it was an iPad.
OK, downloaded iTunes, why not??? ._.
Tried to install Elixir for Android via an APK from a flash drive, but then the memory glitched one more time, everything went black, and the tv-box had a damaged ROM again...
After that we decided not to torment it anymore...
That's it. Poor Android tv-box that all its life dreamed of becoming an iPad. Rest in peace.
-
Once, it really hit me hard. The father of my brother's wife once told me that I'm not fit for IT in general. He thinks that I have pseudo-knowledge of IT and programming.
He just works part-time at home as a "computer scientist" and sells routers, PCs and such to some private customers. Before that he used FileMaker and said that he had already coded his own CRM with it.
When he said that, it really made me sad. But after we talked I looked back at what I had already achieved:
1. I build custom PCs with case mods and hard-tube watercooling for myself and friends
2. I can program in HTML5, CSS3 and PHP
3. I raised a community with over 60 people in it. We got 2 dedicated Linux root servers (i7-6700K, 64GB RAM, SSD)
4. I manage the Linux servers on my own, with VoIP, mail, web, MySQL and game servers
5. I built up a complete community solution with game groups, a forum, a tournament system and a lot of custom scripts.
6. Now I'm almost finished learning the C++ basics and am starting to learn GUI/UX programming.
7. Next thing I'm gonna learn is JavaScript (browser) and Java, so I can complete my web skills and also code Java desktop apps and Java game plugins (don't rant, JavaScript is not the same as Java, I know 😉)
So I thought to myself, "maybe in the eyes of others I'm not a computer scientist, but then at least I'm on the way to becoming one".
But please don't be a douche (the father) and prejudge me before you know what I can already do and what I've achieved.
Just because you're selling computer parts and installing them doesn't mean that you are a computer scientist, or that you get to tell me that I'm not 😉
In IT you're the smith of your own merit!
-
I tried to convince my boss that using 3D rendering to display information on a webpage is an unnecessary luxury.
The web browser would hang if the user has an average PC and there is too much data to render.
This product is aimed at the average Joe, but he argues that computers in foreign countries are high-end devices ONLY.
Such bullshit.
I asked what happens if someone with a low-spec laptop tries to view the webpage.
He said we will set minimum spec requirements for using the website.
Are you fucking kidding me?! RAM and graphics requirements for a webpage?!
My instinct says that the thing I'm working on will probably end up as a waste of time.
But I'd probably learn some cool three.js tricks.
-
I love Linux, but its community can be so full of incompetent assholes..
Just now I asked in Freenode ##linux how to get the process ID of the currently running process in bash. I got my answer - it's a shell built-in called "$$".
Then people started to nitpick some more - why do you need it? How is that different from an exit? - to which my response was.. well, I know the whole idea behind exit codes, and I'd use them whenever possible, in all defined behavior that allows my program to terminate itself whenever it can. This pidfile however would be used to kill the process and provide diagnostic information whenever the program enters undefined behavior - a segfault, in C language terms. Scenarios in which I don't have full control over the script's behavior anymore, such as the system entering an unworkable state where it stalled, still had some binaries in RAM but the rootfs became unwritable, such as now - very helpfully, thanks HP! - when my laptop likely overheated and shat itself. I issued sudo reboot on it, but even that wouldn't go through anymore due to the /sbin/poweroff binary becoming inaccessible too. I had to do a hard power cycle.. one of the few times in which I'm thankful to HP for actually causing shit like this, lol.
Point is, that undefined behavior is what I'm trying to mitigate against. I certainly can't let any files other than diagnostics remain in nonvolatile storage like that, especially when their state should be predictable in order to ensure good operation (like files expressing whether the script is already running or not, i.e. lock files).
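Not bash, but the same pidfile idea sketched in Java just for illustration (the file path is a made-up example, and ProcessHandle/Files.writeString assume Java 11+ - treat it as a sketch, not the actual script):

    import java.nio.file.Files;
    import java.nio.file.Path;

    // Record our PID at startup so a post-mortem (or an external watchdog)
    // can still identify the process after it enters undefined behaviour
    // and can no longer clean up after itself.
    public class PidFile {
        public static void main(String[] args) throws Exception {
            long pid = ProcessHandle.current().pid();     // current process ID (Java 9+)
            Path pidFile = Path.of("/tmp/myscript.pid");  // hypothetical location
            Files.writeString(pidFile, Long.toString(pid));

            // On a clean exit we remove the file ourselves; after a crash or a
            // hard power cycle the stale file remains as diagnostic evidence.
            Runtime.getRuntime().addShutdownHook(new Thread(() -> {
                try { Files.deleteIfExists(pidFile); } catch (Exception ignored) {}
            }));

            // ... actual work would go here ...
        }
    }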
Back to that IRC chat. Aside from the answer, I got ridicule from people who probably don't even know how to properly compile a kernel. Ubuntu users, overconfident scum. Sometimes I feel like I should ask questions in channels like #archlinux only, where such incompetence gets ridiculed on its own.
-
So, today I set up Linux for my SO's father, who is already over 70 and wants to try it. However, he doesn't want Linux on his main PC for now, rather on the old one so that he can take his time to get familiar, which is a reasonable plan.
But holy crap, what a machine! Intel Core2 Duo 4400, 2 GB DDR2(!) RAM, 250 GB IDE(!) HDD, DVD RW drive. Graphics, sound and LAN integrated on the mobo chipset. It's half a miracle that it doesn't run on steam. The machine had been delivered with Vista and has always been painfully slow.
It doesn't even support booting from USB, but I had prepared a DVD just in case. Surprise: it booted from DVD without issues and with full HW support!
Partitioned and installed, deleted Vista in the process (felt good). I went with the full blown Mint 20 Cinnamon edition because XFCE isn't as beautiful. Also, having XFCE now and then Cinnamon looking different on the other PC would be confusing.
Installation took some time, but worked. Cinnamon's RAM usage is at 750 MB idle, and at 1.1 GB with Firefox started. Once the PC is booted, it runs pretty OK with reduced swappiness and noatime on all file systems, plus unnecessary startup applications disabled. Updates took long, but ran through successfully. Installed LibreOffice and some small games, Firefox got uBlock Origin, Youtube worked OOTB.
That PC somehow had escaped disposal several times - and now has a proper OS for the first time in its miserable existence. It runs so much better than it ever has. Just wow, a "big" Linux desktop from 2020 blows a contemporary Vista out of the water on such an old machine!
-
What kind of person doesn't install Windows 10 for a free pre-installation of Candy Crush Soda Saga thrown into the mix? I really enjoy it when my operating system comes preloaded with bullshit. It's almost as if I'm losing the right to choose what I want installed on my operating system. It's really enjoyable when Candy Crush Soda Saga appears in the background in Task Manager despite me never opening this """game""". I find it amazing that after building such a powerful computer I can know that my fast 16GB of RAM is being used to keep bloatware running in the background. Every night I dream of the people who buy new computers with a fresh copy of Windows 10 pre-installed, finding a copy of Candy Crush Soda Saga already waiting to be played! The joy and tears that must come to such a person's eyes to know that Bill Gates was kind enough to bless the world with every middle-aged person's favourite game, Candy Crush Soda Saga, as the first app that appears on their start menu. The thoughts running through every developer's mind at Microsoft as they pre-load a copy of Candy Crush Soda Saga onto every copy of Windows 10. They must really feel alive and definitely would not consider doing anything else for a living but copying the files of Candy Crush Soda Saga onto official Windows 10 installations. The rush of blood in their minds as they know that thousands, if not millions, of users from around the world will open their brand new computer for the first time to see that King managed to bribe Microsoft, with more money than you'll ever get your hands on, into making them add a free copy of Candy Crush Soda Saga to their computer. As thousands of those users move their mouse over this work of art, right click it and press uninstall without a second of doubt in their mind, rendering King's investment a waste of time, money and effort. This is a story we will tell for generations and generations in the future: of how the world's most popular operating system was not preloaded with a free copy of McAfee, but instead a copy of Candy Crush Soda Saga for the entire world to rejoice. Good day to you all.
-
Anyone care to explain why programs nowadays use so much bloody RAM? We went to the moon on what amounted to a bunch of potatoes wired to each other, Linux (a whole bloody OS) with a graphical interface consumes only a couple hundred MB of RAM, but my IDE needs 1+GB?
Seriously, unless you're handling very large amounts of data (like a high-res image) or doing some insanely crazy math, I doubt there's any need for such high usage. I get it, 8/16GB is commonplace, but that doesn't mean more should be used for shits and giggles...
-
Fucking shit, this university's website is so damn slow! Basically every semester, every student needs to enroll in certain classes on the university website.
But the infrastructure is not enough to handle such a big amount of students: we have approx. 7000 students enrolling at the same fucking time.
And here I am, unable to enroll in any class at all this semester. Fuck, such a waste of time. This has always been a thing since they digitalized the enrollment system.
I don't want this to happen again. The students are always the victims since the system cannot handle the requests. Now, as a dev, I want to propose something better to optimize the server, and I have some connections to get past the bureaucracy. I am going to do some brainstorming and I will need some solutions.
Here is some data I gathered from my university's infrastructure division while I was mad:
1. The server is a simple local server forwarded to the Internet.
2. The server uses Windows Server 2007.
3. The web server is Microsoft IIS.
4. The website is built using ASP.NET.
5. The connection is not SSL encrypted (yes, it fucking uses plain HTTP).
6. Hardware spec (not confirmed officially, I got this information from my professor):
- Core i5 4460
- 4 GB RAM
- 1 Gbit NIC
I will summon some experts here and I hope you'll want to help me (us all) out.
-
After a few weeks of being insanely busy, I decided to log onto Steam and maybe relax with a few people and play some games. I enjoy playing a few sandbox games and do freelance development for those games (anywhere from a simple script to a full-on server setup) on the side. It just so happened that I had an 'urgent' request from one of my old staff members from an old community I used to own. This staff member decided to run his own community after I sold mine off since I didn't have the passion anymore to deal with the community on a daily basis.
O: Owner (Former staff member/friend)
D: Other Dev
O: Hey, I need urgent help man! Got a few things developed for my server, and now the server won't stay stable and crashes randomly. I really need help, my developer can't figure it out.
Me: Uhm, sure. Just remember, if it's small I'll do it for free since you're an old friend, but if it's a bigger issue or needs a full recode or whatever, you're gonna have to pay. Another option is, I tell you what's wrong and you can have your developer fix it.
O: Sounds good, I'll give you owner access to everything so you can check it out.
Me: Sounds good
*An hour passes by*
O: Sorry it took so long, had to deal with some crap. *Insert credentials, etc*
Me: Ok, give me a few minutes to do some basic tests. What was that new feature or whatever you added?
O: *Explains long feature, and where it's located*
Me: *Begins to review the files* *Internal rage wondering what fucking developer could code such trash* *Tests a few methods, and watches CPU/RAM and an internal graph for usage*
Me: Who coded this module?
O: My developer.
Me: *Calm tone, with a mix of some anger* So, you know what, I'm just gonna do some simple math for ya. You're running 33 ticks a second for the server, with an average of about 40ish players. 33x60 = 1980 cycles a minute; now let's multiply that by the 40 players on average: you have 79,200 cycles per minute, or nearly 4.8 million fucking cycles an hour (if you maxed the server at 64 players, it's going to run an amazing fucking 7.6 million cycles an hour, like holy fuck). You're also running a MySQLite query every cycle while transferring useless data to the server; you're clusterfucking the server and overloading it for no fucking reason, and that's why you're crashing it. Another question, who the fuck wrote the security of this? I can literally send commands to the server with this insecure method and delete all of your files... If you actually want your fucking server stable and secure, I'm gonna have to recode this entire module to reduce your developer's clusterfuck of 4.8 million cycles to about 400 every hour... it's gonna be $50.
D: *Angered* You're wrong, this is the best way to do it, I did stress testing! *Insert other defensive comments* You're just a shitty developer (This one got me)
Me: *Calm* You're calling me a shitty developer? You're the person that doesn't understand a timer. I get that you're new to this world, but reading the wiki or even using the game's forums would've ripped this code to shreds, and you to shreds. You're not even a developer, cause most of this is so disorganized it looks like you copy and pasted it. *Gets angered here and starts some light screaming* You're wasting CPU usage, the game can't use more than 1 physical core, and after a quick test, your stupid 'amazing' module is using about 40% of the CPU. You need to fucking realize the 40ish average players use less than this... THEY SHOULD BE MORE INTENSIVE THAN YOUR CODE, NOT THE OPPOSITE.
O: Hey don't be rude to Venom, he's an amazing coder. You're still new, you don't know as much as him. Ok, I'll pay you the money to get it recoded.
Me: Sounds good. *Angered tone* Also, you, developer boy: learn to listen to feedback and maybe learn to improve your shitty code. Cause you'll never go anywhere if you don't even understand how bad this garbage is, and that you can't even use the fucking wiki for this game. The only fucking way you're gonna improve is to use some of my suggestions.
D: *Leaves call without saying anything*
TL;DR: Shitty developer ran some shitty XP system code for a game nearly 4.8 million times an hour (on average), or just above 7.6 million times an hour (if maxed), plus a MySQLite query each time, when it could've been done with about 400 queries an hour at most (a rough sketch of that batched approach is below). Tried calling me a shitty developer, and got sorta yelled at while I was trying to keep calm.
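For illustration only, a minimal sketch of that batched approach in Java (the names, interval and storage call are all hypothetical, and the actual recode presumably happened in the game's own scripting language, not Java): buffer XP changes in memory and flush them on a timer, instead of hitting the database every tick for every player.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    // Instead of one DB query per player per tick (33 ticks/s * 40 players,
    // i.e. millions of queries per hour), buffer XP deltas in memory and
    // write them out in batches - a few hundred writes per hour at most.
    public class XpBatcher {
        private final Map<String, Long> pendingXp = new ConcurrentHashMap<>();
        private final ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();

        public void start() {
            scheduler.scheduleAtFixedRate(this::flush, 5, 5, TimeUnit.MINUTES);
        }

        // Called from game logic whenever a player earns XP - memory only, no DB.
        public void addXp(String playerId, long amount) {
            pendingXp.merge(playerId, amount, Long::sum);
        }

        private void flush() {
            for (String playerId : pendingXp.keySet()) {
                Long delta = pendingXp.remove(playerId);  // atomically take the buffered delta
                if (delta != null) {
                    saveToDatabase(playerId, delta);      // one batched write per active player
                }
            }
        }

        private void saveToDatabase(String playerId, long delta) {
            // placeholder for the actual persistence call
        }
    }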
Still pissed he tried calling me a shitty developer...
-
Must nearly every recently-made piece of software be terrible?
Firefox runs terribly slowly on a four-core 1.6GHz processor when given eight (8) gigabytes of RAM. Discord's user interface is awfully slow and uses unnecessary animations. Google's stuff is just falling apart; a toaster notification regarding MRO stock was recently pushed such that some markup elements of this notification were visible in the notification, the download links which are generated by Google Drive have sometimes returned error 404, and Google's software is overall sluggish and somewhat unstable. Today, an Android phone failed to update the Google Drive application... and failed to return a meaningful error message. Comprehensive manuals appear to be increasingly often not provided. Microsoft began to digest Windows after Windows XP was released.
Laziness is not virtuous.
Every computer program should be written such that it performs well on reasonably terrible hardware... and kept simple. The UNIX philosophy is woefully underappreciated.
-
Stranger Things Fan: Have you seen Stranger Things? It's really good. You NEEEEED to watch it. Now.
Firefox Quantum Fan: Have you used Firefox Quantum? It's safe, fast, and uses less RAM. I don't know why it has such a low market share. You neeeeeeeed to use it! NOW.
-
Just like GMO foods have to be labelled as such in some regions, Electron apps should be labelled too so I don't accidentally contract RAM-herpes
-
Why the fuck is Android Studio such an unoptimized POS?
It sucks up resources like a sponge.
And Gradle just pushes the cores to the max.
The emulator is a memory-hogging POS too.
I had to buy a new laptop just to run it faster.
I had an AMD A8 4500M with 8GB RAM, running Ubuntu.
Now I have a 15-inch MacBook Pro, and it can still ramp up the fans.
FFS, get your shit together, JetBrains.
-
I wish higher technical schools with computer science as one of their topics (such as the one I attend) would introduce people to how RAM works in detail earlier than in fourth grade; it would really help some of my classmates.
-
This is a guide for technology noobies who want to buy a laptop but have no idea what the specs mean.
1. Brand
If you like Apple, and love their !sleek design, go to the nearest Apple store and tell them "I want to buy one. Recommendations?"
If you don't like Apple, well, buy anything that fits you. Read more below.
2. Size
There are 11~15 inches, weight is 850g ~ 2+kg. Very many options. Buy whatever you like.
//Fun part coming
3. CPU
This is the power of the brain.
For example,
Pentium is Elementary Schoolers
i3 is Middle Schoolers
i5 is High Schoolers
i7 is University People
Dual-core is 2 people
Quad-core is 4 people
Quiz! What is i5 Dual-core?
A) 2 High Schoolers.
Easy peasy, right?
Now if you have a smartphone and ONLY use Messaging, Phone, and Whatsapp (lol), you can buy Pentium laptops.
If not, I recommend at least i3
Also, there are numbers after the CPU name, like i3-6100.
The 6 means 6th generation.
If the number is bigger, it is a more recent generation.
Think of 6xxx as Stone Age people,
7xxx as Bronze Age people,
8xxx as Iron Age people,
and so on.
4. RAM
This is the size of the desk.
There are 4GB, 8GB, 16GB, 32GB, and so on.
Think of 4GB as a small desk where you can only put one book.
8GB as a desk where you can put a laptop with a keyboard and a mouse.
16GB as a normal-sized desk where you can put some books, a laptop, and food.
32GB as a boss-sized desk.
And so on.
When you do multitasking, and the desk is too small...
You don't feel comfortable right?
It is good when there is plenty of space.
Same with RAM.
But when the desk becomes larger, it gets more expensive, so buy one at an affordable price.
If you watch some YouTube videos in Chrome and write some documents in Office, buy at least 8GB. 16GB is recommended.
5. HDD/SSD
You take stuff such as books and your laptop out of the basket (HDD/SSD) and put it on your desk (RAM).
There are two kinds of baskets.
The super big one: because it is so big, it is bulky and it's hard to get stuff out of it, but it is cheap. (HDD)
Then there is a bit smaller one, expensive compared to the HDD, called an SSD. This basket is right next to you, and it is super easy to get stuff out of it. The opening time is faster as well.
SSDs used to be expensive, but as time goes on they get bigger and cheaper. So most laptops have SSDs these days.
There are 128GB, 256GB, 512GB, 1024GB (=1TB), and so on. You can buy what you want. I recommend 256GB for normal use.
Game guy? At least 512GB.
6. Graphics
It is the eyesight.
Most laptops don't have a dedicated graphics card; the graphics come with the CPU. Intel CPUs have CPU + graphics, but the graphics powered by Intel aren't that good.
But NVIDIA graphics cards are great. Recommended for gamers. They are a bit more expensive, though.
So TL;DR
Buying a laptop is
- Pick the person and the person's clothes (brand and design)
- Pick the space for the person to stay (RAM, SSD/HDD)
- Pick how smart they are (CPU)
- Pick how many (Core)
- Pick the generation (6xxx, 7xxx ....)
- Pick their eyesight (graphics)
And that's pretty much it.
Super easy to buy a laptop right?
If you have suggestions or questions, make sure to leave a comment, upvote this rant, and share it with your friends!
-
!Rant
The Eve V is a community-developed Windows tablet which will be released in October this year.
The best specs you can get are a 6th Gen i7, 16 GB RAM and a 500 GB SSD for around 1400$.
Some guy on the forum wants 32 GB RAM in such a slim and small tablet. He also wants a 7th Gen Intel CPU and a 1 TB SSD in it. He's ready to pay 1000$ more for his special edition.
The Eve V is already finished with tooling and they are waiting for the screens to arrive. They already have all the other parts.
Dude, you're crazy - those specs just for VMs? What are you running? 10x WinShitBox?
He's insulting everyone on the forum with a different opinion and says retooling would be so easy.
That guy is a real douchebag and doesn't know shit. If you built in different parts like that, you would need to remodel the whole tablet. You can stick your 1000$ in your ass; retooling would cost more than 10'000$.
If it's so easy, you shithead, then do it yourself, and don't just say it's possible and tell other people they should do it! 😂😂
-
This is a true incident.
I fried the motherboard of my new Windows 10 Home ASUS UX303UA laptop - with 8 fucking gigs of RAM, a 1TB HDD and a dedicated Nvidia graphics card with its own video memory - just by trying to repeat what I love to do, which is:
Install and play Crysis on EA Origin paid channel
And
Install and program on Linux VM using Virtualbox
And
Listen to music
I am so fucking scared now that I am not going to repeat it again.
Fuck the fear of using such machines.
-
Man, wk89 is awesome... bringing back a lot of memories. The one thing that really stands out to me though is the software.
I see a lot of rants about people shocked that TurboC is still in use or that other DOS programs are still in production. A lot of bad can be said here, but I think often it's a case of: we truly don't build things like we did in the good old days.
What those devs accomplished with such limited resources is phenomenal and the fact that we still haven't managed to replicate the feel and usability of it says a lot, not to mention just how fucking stable most of it was.
My favourite games are all DOS based, my most favourite of all time Sherlock is 103kb in size. When I started coding games I made a clone of it and to this day I am still trying to figure out what sorcery is in the algorithm that generates/solves puzzles that makes it so fast and memory efficient. I must have tried 100+ ways and can't even come close. NB! If you know you can hint but don't tell me. Solving this is a matter of personal pride.
Where those games really stand out is when you get into the graphics processing - the solutions they came up with to render sprites, maps and trick your eyes into seeing detail with only 4-16 colours is nothing short of genius. Also take a second to consider that taking a screen shot of the game is larger than the entire game itself and let that sink in...
I think the dramatic increase in storage, processing power and RAM over the last decade is making us shit developers - all of us. Just take one look at Chrome, Skype or anything else mainline really, and it's easy to see we no longer give a rat's ass about memory anywhere except our monthly AWS/GCE bill.
We don't have to be creative or even mindful about anything but the most significant memory leaks in order to get our software to run nowadays. We also don't have constraints on distributing it; fast deliverability is rewarded over quality software. It's only expected to stay in production 3-4 years anyway.
Those guys were the true "rockstars" and "ninja" developers, and if you can't acknowledge that you can take ya React app and shovit.
-
So recently I had an argument with gamers about the memory required in a graphics card. The guy suggested the 8GB model of.. idk, I forgot the model of the GPU already, some Nvidia crap.
I argued against that: well, why does memory size matter so much? I know that it takes bandwidth to generate and store a frame, and I know how much size and bandwidth that is. It's a fairly simple calculation - you take your horizontal and vertical resolution (e.g. 2560x1080, which I'll go with for the rest of the rant) times the number of subpixels (so red, green and blue) times the bit depth (i.e. the number of values you can set the subpixel/color brightness to, usually 8 bits, i.e. 0-255).
The calculation would thus look like this.
2560*1080*3*8 = the resulting size in bits. You can omit the last 8 to get the size in bytes, but only for an 8-bit display.
The resulting number you get is exactly 8100 KiB or roughly 8MB to store a frame. There is no more to storing a frame than that. Your GPU renders the frame (might need some memory for that but not 1000x the amount of the frame itself, that's ridiculous), stores it into a memory area known as a framebuffer, for the display to eventually actually take it to put it on the screen.
Assuming that the refresh rate for the display is 60Hz, and that you didn't overbuild your graphics card to display a bazillion lost frames for that, you need to display 60 frames a second at 8MB each. Now that is significant. You need 8x60MB/s for that, which is 480MB/s. For higher framerate (that's hopefully coupled with a display capable of driving that) you need higher bandwidth, and for higher resolution and/or higher bit depth, you'd need more memory to fit your frame. But it's not a lot, certainly not 8GB of video memory.
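For anyone who wants to double-check, here is that arithmetic written out as a tiny Java snippet (plain math, no GPU API involved):

    // Back-of-the-envelope framebuffer math: the size of one 2560x1080 frame
    // at 8 bits per subpixel (R, G, B), and the bandwidth needed at 60 Hz.
    public class FramebufferMath {
        public static void main(String[] args) {
            long width = 2560, height = 1080;
            long subpixels = 3;        // red, green and blue
            long bitsPerSubpixel = 8;  // brightness values 0-255 per channel

            long bitsPerFrame  = width * height * subpixels * bitsPerSubpixel;
            long bytesPerFrame = bitsPerFrame / 8;       // 8,294,400 bytes
            long kibPerFrame   = bytesPerFrame / 1024;   // exactly 8100 KiB, roughly 8 MB

            long bytesPerSecond = bytesPerFrame * 60;    // 60 frames per second

            System.out.println(kibPerFrame + " KiB per frame");
            System.out.println(bytesPerSecond / 1_000_000 + " MB/s at 60 Hz"); // ~480-500 MB/s
        }
    }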
Question time for gamers: suppose you run your fancy game on an iGPU in a laptop or whatever, with 8GB of memory in that system you're resorting to running off the filthy iGPU on. Are you actually using all that shared general-purpose RAM for frames and "there's more to it" juicy game data? Where does the rest of the operating system's memory fit in such a case? Ahhh.. yeah, it doesn't. The iGPU magically doesn't use all that 8GB of memory you've just told me the dGPU totally needs.
I compared it to displaying regular frames, yes. After all that's what a game mostly is, a lot of potentially rapidly changing frames. I took the entire bandwidth and size of any unique frame into account, whereas the display of regular system tasks *could* potentially get away with less, since most of the frame is unchanging most of the time. I did not make that assumption. And rapidly changing frames is also why the bitrate on e.g. screen recordings matters so much. Lower bitrate means that you will be compromising quality in rapidly changing scenes. I've been bit by that before. For those cases it's better to have a huge source file recorded at a bitrate that allows for all these rapidly changing frames, then reduce the final size in post-processing.
I've even proven that driving a 2560x1080 display doesn't take oodles of memory because I actually set the timings for such a display in order for a Raspberry Pi to be able to drive it at that resolution. Conveniently the memory split for the overall system and the GPU respectively is also tunable, and the total shared memory is a relatively meager 1GB. I used to set it at 256MB because just like the aforementioned gamers, I thought that a display would require that much memory. After running into issues that were driver-related (seems like the VideoCore driver in Raspbian buster is kinda fuckulated atm, while it works fine in stretch) I ended up tweaking that a bit, to see what ended up working. 64MB memory to drive a 2560x1080 display? You got it! Because a single frame is only 8MB in size, and 64MB of video memory can easily fit that and a few spares just in case.
I must've sucked all that data out of my ass though; I've only ever seen people build GPUs out of discrete components and gone down to the realm of manually setting display timings.
Interesting build log / documentary style video on building a GPU on your own: https://youtube.com/watch/...
Have fun!
-
Error message on my 4GB RAM "smart" phone, running a "smart" application.
'Oh no ! Something went wrong. Re-try'
Whatttttttt!!!
WHY DOES A "SMART" APPLICATION GIVE SUCH A STUPID ERROR MESSAGE.
IT IS MY PHONE, PLEASE PROVIDE A HELPFUL ERROR MESSAGE, WHY IS IT SO HARD.
I JUST HATE SOFTWARE THAT TREATS THE END USERS AS STUPID ANIMALS.
-
ACPI YOU FUCKING CUNT
STOP IMPOSING SUCH SHITTY STANDARDS THAT REQUIRE AN OPERATING SYSTEM TO SCAN THE RAM FOR SOMETHING AND TO INTERPRET BYTECODE BY ITSELF!
JUST BECAUSE YOU WERE MADE TO UNIFY THINGS DOESN'T MEAN THAT YOU SHOULDN'T BE REDESIGNED FROM SCRATCH, YOU FUCKING MOTHERFUCKING BALLS-BUSTING SHITTY STANDARD ^1
ALSO, PLEASE FFS DOCUMENT YOUR SHIT-KNÖDEL WELL, PLEASE. WOULD MAKE IT AT LEAST BEARABLE
^1 I realized I didn't use enough swear words
-
So for a while I have wanted to build a Raspberry Pi cluster. In the spirit of Shia LaBeouf, I got started last Saturday.
I had two Pis lying around, so I figured I'd run some experiments before I invested in a lot of hardware. After about a day I had turned the two Pis into a shared cluster, when disaster struck....
I had completely ignored the fact that you cannot run x86 (32- or 64-bit) binaries on an ARM processor (I know... I'm a Java developer). So when I booted my service and the load balancer, I found that nothing worked. Pretty bummed out, I quit the project.
Later that day I found a crazy guy who had bought a batch of 400 small form factor PSUs (300W), and internally I laughed at him a little. I mean, who's gonna sell 300W irregular power supplies? Then, just as I was about to go to bed, I found this guy selling from a batch of CPU-onboard motherboards for 10 bucks each, and everything clicked!
I did some quick calculations and decided I could probably gather enough cash to get: 10 motherboards, 10 2GB RAM DIMMs, 10 SATA disks, 14 PSUs (in case some fail) and some misc hardware for networking and such.
So... Long story short, I am going to build a cluster computer. The first version is going to have 10 nodes and I am waiting for delivery right now!
-
This always gets me:
Developers complaining that their 4 year old / cheap ass computer is slow.
Get. A. New. One.
It's not that hard.
Here, let me do one for you:
https://computeruniverse.net/en/...
I just went to a site that delivers across Europe, and selected a cheap laptop with a decent CPU and SSD. Short on RAM, sure, and without a Windows License. But you can buy RAM for an additional 50$, and that brings you to a total of 550€, delivery included. And it will WORK. And it will be fast.
It's too expensive?
No, not exactly. Wherever you are in the world, if you can code decently, well enough to have the right to complain about development tools, you are eligible for at least 10$ per hour as a freelancer across the globe. I've had such opportunities offered to me by many organizations, especially non-profit ones that need cheap employees. I was actually offered more, but let's stick to 10$ per hour.
So that's 1600$ per month. Enough to buy 3 such laptops. Oh, taxes, I forgot. So you get 2 laptops. Wait! You need food and everything else. Well if you're in a country where that offer actually makes sense, then it's likely that you can live off of 400$ per month quite well. Maybe 800$ if you need to pay rent.
So that's roughly 1 month of work for a laptop that will make you not waste time on waiting for stuff.
Sweet! 1 Month! What does it get me?
Well assuming that you have no laptop, it gets you A JOB that pays you 1600$ per month.
But if you DO have a laptop, you can sell it for cheap, and benefit from the following:
1. Boot-up time from 30-60 seconds to 10 seconds.
2. Installing software - from 1 minute to 10 seconds.
3. Opening a browser - from 10 seconds to 1 second.
4. Opening an advanced text editor (Atom, VS.Code) - from 10 seconds to 1 second.
5. Searching for a file on your entire hard drive - from 1 hour to 2 minutes.
....
You get the point. Waiting is reduced by several times.
So how much do you really wait when coding?
Well are you compiling? Are you opening a new project and the IDE needs to re-index the files? Are you opening programs like a terminal emulator, browser and such? Are you using virtual machines for dev environments?
Well, all of these processes become several times faster. Depending on how often you do them, you'll be saving yourself from 1 hour per day up to 4 hours per day (my case, where an HDD would be just out of the question).
How much is that time worth? At least 10$ per day. If you're working 20 days per month, 240 days per year, that's a total of 2400$. And over the 2-year lifetime of that crappy laptop, that's 4800$ saved. And that's with hugely conservative numbers. Nobody pays 10$ per hour any more, except if you've just started in the industry. I know because I've been there.
Please, for all that's sacred to you, justify right here, right now, HOW THE FUCK can you not afford to get that 8GB of RAM, that cheap ass SSD for 100$, or even a brand new laptop (hey! it's even portable and has FHD graphics on it!) for 550$.
That's why every time I hear someone who is a professional developer complain that they don't have money for a decent machine, I have to ask: why the fuck are you wasting yours and everyone else's time?!
-
This is the best example of Google not giving a fuck about their own guidelines.
They always ram their expectation that your apps fit the guidelines 100% down your throat, but then they don't give a fuck about their guidelines in their own software.
They use a ListView here in Google Contacts. It's completely outdated for a large amount of data, such as my 200 contacts. They literally push you not to use outdated techniques such as ListViews in your app: "use RecyclerView, our completely new solution, instead - ListViews are very, very bad in performance."
I KNOW THIS SOUNDS PICKY, BUT THIS IS JUST AN EXAMPLE!!! THEY DON'T CARE ABOUT THEIR OWN GUIDELINES IN EVERY WAY. THE OTHER BEST EXAMPLE IS THE GOOGLE PLAY STORE: BAD PERFORMANCE, 100%. BUT AT LEAST IT HAS STUTTERING ANIMATIONS.
-
It's 2022 and Firefox still doesn't allow deactivating video caching to disk.
When playing videos from some sites like the Internet Archive, it writes several hundred megabytes to the disk, which causes wear on flash storage in the long term. This is the same reason cited for the use of jsonlz4 instead of plain JSON. The caching of videos to disk even happens when deactivating the normal browsing cache (about:config property "browser.cache.disk.enable").
I get the benefit of media caching, but I'd prefer Firefox not to write gigabytes to my SSD each time I watch a somewhat long video. There is actually the about:config property "browser.privatebrowsing.forceMediaMemoryCache", but as the name implies, it is only for private browsing. The RAM is much more suitable for this purpose, and modern computers have, unlike computers from a decade ago, RAM in abundance, which is intended precisely for such a purpose.
The caching of video (and audio) to disk is completely unnecessary as of 2022. It was useful over a decade ago, back when an average computer had 4 GB of RAM and a spinning hard disk (HDD). Now, computers commonly have 16 GB RAM and a solid-state drive (SSD), which makes media caching on disk obsolete, and even detrimental due to wear. HDDs do not wear down much from writing, since it just alters magnetic fields. HDDs mostly wear down from the spinning and random access, whereas SSDs do wear down from writing. Since media caching mostly involves sequential access, HDDs don't mind being used for that. But it is detrimental to the life span of flash memory, and especially hurts live USB drives (USB drives with an operating system) due to their smaller size.
If I watch a one-hour HD video, I do not wish 5 GB to be written to my SSD for nothing. The nonstandard LZ4 format "mozLZ4" for storing sessions was also introduced with the argument of reducing disk writes to flash memory, but video caching causes multiple times as much writing as that.
The property "media.cache_size" in about:config does not help much. Setting it to zero or a low value causes stuttering playback. Setting it to any higher value does not reduce writes to disk, since it apparently just rotates caching within that space, and a lower value means that it just rotates writing more often in a smaller space. Setting a lower value should not cause more wear due to wear levelling, but also does not reduce wear compared to a higher value, since still roughly the same amount of data is written to disk.
Media caching also applies to audio, but that is far less in size than video. Still, deactivating it without having to use private browsing should not be denied to the user.
The fact that this can not be deactivated is a shame for Firefox.
-
A few days ago I decided to install Windows 7 on a VM (bad idea as it turned out). All fine and dandy and I ran Windows Update a few times to get it at least as up-to-date as it'll get.
I noticed that out of the 4GB RAM I had allocated, an svchost process responsible for the updates was gobbling up all the available memory, just leaving 82MB for everything else. The process itself was as you might imagine consuming over 3GB RAM just for itself. That's how an OS should work right after installation, I'm sure you'll agree.
So I complained about it. Haven't used Windows anywhere for a while so I wasn't used anymore to this level of efficiency. Disk activity went through the roof, though to be fair the underlying disk wasn't an SSD (qcow2 on ZFS on a spinning drive). RAM consumption is something I already covered. CPU temperature shot up to 95C.
So as any idiot would do, I disabled the service related to that process (the svchost process for wuauserv) and the problem went away. But I complained of course, saying that such amazing system utilization metrics wasn't something I expected. I mean for 4GB allocated, having as much as 82MB usable to get stuff done with! 95C on the CPU, on a lot of chips that's the junction temperature! Absolutely beautiful.
When I complained I heard that I had to replace the thermal grease. I do that twice a year. I wrote a custom fan driver for my system that works absolutely great. It was obviously shit. I must be a horrible sysadmin for solving a problem by eliminating the cause, and companies hiring me must be ashamed of themselves. My hardware must be shit (that's a common one with Windows users) despite being a business laptop and the guest system being a VM. Oh and I'm an idiot of course for complaining about such amazing system metrics in Windows.
I love Windows and its community...
-
Dell makes such awful machines to use with Ubuntu. Even though it was an officially Ubuntu-preinstalled machine, it has so many issues, and my work is suffering because of this machine, which cost me 70k PKR. It has 8GB RAM, a 500GB HDD and a 4th gen Core i3 processor.
I'm suffering from WiFi getting disconnected from time to time, and I couldn't find help on the Ubuntu forums nor on the official Dell site.
I guess both suck pretty bad.
I will at least never buy a Dell machine again, nor one with this stupid Ubuntu OS.
-
RethinkDB is such a ridiculous, overengineered piece of software - the BIGGEST BULLSHIT I HAVE EVER UNFORTUNATELY USED.
Does anyone even use this total shit????
This shit eats RAM for just one CRUD operation as if you had opened 10,000 Google Chrome tabs. Who the fuck thought that kind of technology was a good idea?
Yes, it IS very fast - a real-time database. But you'd have to have a multi-million dollar supercomputer to be able to handle as much data as a relational database can....
-
I know Electron apps sometimes tend to be slow and consume a lot of RAM, but Jesus Christ, Microsoft Teams is such garbage - it consumes a whole CPU core just for itself. My laptop fans start whirring, and after half an hour of MS Teams sitting idle in the background, dmesg starts telling me the CPU temperature has reached a threshold and is now thermal throttling. :(((((((
-
someone on discord asked me "why do you code for such shit computers? they can't even play games and you can only do one thing at a time, just program for new computers"
because if i'm gonna suffer in the name of curiosity i'm gonna make shit for other people to look at thx
back to suffering from a 1KB RAM limit
-
What are some good tiny/mini/micro computers for a homelab?
My requirements would be
- x86_64
- 8GB RAM or upgradable to such
- upgradable SSD
- can install Linux distros on it
-
I fucking hate Foglight!
Fucking piece of over-bloated shit software that can't do anything right!
This pile of Harkonnen vomit is such a source of stress and frustration.
Can you believe that a monitoring tool that needs to monitor 550 agents (hosts, DBs, webservices, whatever) needs a whopping 20+GB of RAM and 4 vCPUs just to receive metrics and send them to the main server - just for being a middleman???
Fucking piece of shit.
And worst of all, due to some fuckery at the mgmt-customer relationship level, they want to install MORE of this shit.
Fucking shit.
Quest Software - never engage with them. You'll lose as much money as with Dynatrace but be far worse off.
-
So I recently finished a rewrite of a website that processes donations for nonprofits. Once it was complete, I would migrate all the data from the old system to the new system. This involved iterating through every transaction in the database and making a cURL request to the new system's API. A rough calculation yielded 16 hours of migration time.
The first hour or two of the migration (where it was creating users) was fine, no issues. But once it got to the transaction part, the API server would start using more and more RAM. Eventually (30 minutes), it would start doing OOMs and the such. For a while, I just assumed the issue was a lack of RAM so I upgraded the server to 16 GB of RAM.
Running the script again, it would approach the 7 GiB mark and be maxing out all 8 CPUs. At this point, I assumed there was a memory leak somewhere and the garbage collector was doing it's best to free up anything it could find. I scanned my code time and time again, but there was no place I was storing any strong references to anything!
At this point, I just sort of gave up. Every 30 minutes, I would restart the server to fix the RAM and CPU issue. And all was fine. But then there was this one time where I tried to kill it, but I go the error: "fork failed: resource temporarily unavailable". Up until this point, I believed this was simply a lack of memory...but none of my SWAP was in use! And I had 4 GiB of cached stuff!
Now this made me really confused. So I did one search on the Internet and apparently this can be caused by many things: a lack of file descriptors or even too many threads. So I did some digging, and apparently my app was using over 31 thousands threads!!!!! WTF!
I did some more digging, and as it turns out, I never called close() on my network objects, thus leaking ~30 new "worker" threads per iteration of the migration script. Thanks Java, if only finalize() were utilized properly.1
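The post doesn't say which HTTP client the migration script used, so treat this as a minimal sketch under that assumption (Apache HttpClient 4.x, with a hypothetical pushTransaction() helper): the fix is to reuse one client for the whole run and let try-with-resources call close() on every response, instead of leaking a fresh client and its worker threads on each iteration.

```java
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

public class MigrationClient {
    // One client for the whole migration; each client owns its own
    // connection-manager threads, so never build one per transaction.
    private static final CloseableHttpClient CLIENT = HttpClients.createDefault();

    static String pushTransaction(String apiUrl) throws Exception {
        HttpPost post = new HttpPost(apiUrl);
        // try-with-resources guarantees close() even when the call throws,
        // which is exactly what the script above forgot to do.
        try (CloseableHttpResponse response = CLIENT.execute(post)) {
            return EntityUtils.toString(response.getEntity());
        }
    }
}
```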
I have this beautiful Sony Vaio C1XS ultraportable sitting around, but I can't turn it on because it uses a non-standard RAM stick Sony only made in the late '90s... Such a pretty device3
-
Hey guys! I want to purchase a new phone. I owned an Asus ZenFone 2 and was happy with it. I want an Android phone with at least 4 GB of RAM and a decent chipset. I am not interested in the Samsung S8, Google Pixel or the like. I was looking into the OnePlus 3 or 5, or the LG G6. Could you provide some suggestions?9
-
Let's increase the RAM in my Acer laptop. 1. It took me a week online to find out how many slots there are. 2. Then I tried finding a manual with pictures for how to do it. No such thing. A technical manual from Acer? No such thing. I had to find a forum with one comment about my model.
There is no access panel over the RAM slots. I had to unscrew absolutely everything because they're located on the underside of the motherboard.4
No no no, that's not what I came to Linux for.
I opened a couple of things: one browser with < 10 tabs, Sublime and Skype, and it starts fucking hanging.
I have 8 GB of RAM and a Core i3.
I'm using Ubuntu 16 LTS. It's such a shame6
I'm a beginner learning Java; I've been beating around the bush on the internet for the past decade. Here is my understanding so far. Take a bottle of water: the bottle can be considered the CLASS and the water in it the objects (atoms); objects may be of the same kind, or they may differ in some properties. Another way of looking at it: HumanBeing is a class, and male and female people are objects of the HumanBeing class. Here again, the objects may differ in properties such as gender, age and body parts. Zoo might be a class, with animals, elephants, tigers and others as objects; the same kinds of properties apply in the Zoo class too: male, female, body parts, age, eating habits, crawlers, four-legged, two-legged, flying, water animals, mammals, herbivores, carnivores... whatever. This is my understanding so far. Corrections are always welcome; I'll be happy if my answer gets improved, so comment below.
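A minimal Java sketch of that analogy, with made-up names and properties purely for illustration: one class describes the structure, and each object fills in its own property values.

```java
// HumanBeing is the CLASS; 'a' and 'b' below are two objects of it.
public class HumanBeing {
    String name;
    String gender;
    int age;

    HumanBeing(String name, String gender, int age) {
        this.name = name;
        this.gender = gender;
        this.age = age;
    }

    public static void main(String[] args) {
        HumanBeing a = new HumanBeing("Asha", "female", 28); // one object
        HumanBeing b = new HumanBeing("Ram", "male", 35);    // another object, same class
        System.out.println(a.name + " and " + b.name + " differ only in their property values.");
    }
}
```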
And at a basic level:
Start by learning about input and output devices.
Then, memory-wise, there is the cache (quick access), RAM (temporary, runtime memory) and the hard disk (permanent memory), all inside the machine. To illustrate these memory types from experience: I'm writing this answer on my phone over mobile data. If I suddenly switch the phone off and back on, the cache is what gives instant access to things like navigation and the network; my Quora answer will be lost, because it was sitting in RAM before the switch-off; but my Quora app, my gallery and the rest live on permanent internal storage (hard disks, generally, on a PC) and won't be affected. So who manages all these commands, inputs and outputs? Software: Windows, macOS/iOS, or Android on mobile. These are the managers of the computer's components for the different OSes.
Java is a high-level language, whereas computers understand only binary, i.e. low-level code made of 0s and 1s. A machine understands only sequences like 00101, 1110000101, 0010, 1100 (call these A, B, C, D in binary); numbers, lowercase letters and all other symbols end up encoded as 0s and 1s too. Java source code is converted to bytecode, and the program we write is handed to the JVM (Java Virtual Machine), which acts as the interpreter; that is not how C works.
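A tiny sketch of the source-to-bytecode-to-JVM flow just described; the file and class names are arbitrary.

```java
// Roughly the flow described above:
//   Hello.java --(javac)--> Hello.class (bytecode) --(JVM)--> runs on any OS
// You can inspect the bytecode yourself with: javap -c Hello
public class Hello {
    public static void main(String[] args) {
        System.out.println("Hello, bytecode!");
    }
}
```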
Let us C…
Do comment. Thank you6 -
During my short tenure as the lead mobile developer for a logistics company, I had to manage my stacks between native Android applications in Java and native iOS apps.
Back then, Swift was barely coming into version 3, and as such the transition was not trustworthy enough for me to discard Obj-C. So I went with Obj-C and kept my knowledge of Swift in my back pocket. It was not difficult, since I had always liked Obj-C for some reason. The language was what made me click with pointers and understand them well enough to feel more comfortable with C, as Obj-C is a strict superset of it. It was enjoyable, really, and making apps for iOS made me appreciate the ecosystem that much better and realize the level of dedication the engineering team at Apple put into their compilation toolchain. It was my first exposure to ARC (Automatic Reference Counting) as a "form" of garbage collection, per se. The tooling in particular was nice; normally with Xcode you have a 50/50 chance of it being great or shit. For me it was a mixture of both, really, but the number of crashes and unexpected behaviors was FAR lower than what I had on Android back when we still used Eclipse, and even when we started using Android Studio.
Developing iOS apps was also what made me see why iOS apps have that distinctive shine and why their phones require less memory (RAM). It was a pleasant experience.
The whole ordeal also left me with a bad taste for Android development. Don't get me wrong, I love my Android phones. But I firmly believe that unless you pay top dollar for an Android manufacturer such as Samsung, Motorola or LG, you will have lag galore. And man..... everyone who tried to prove me wrong always had to make excuses later on (no, your $200-$300 Android device just didn't cut it, my dude).
It really sucks sometimes for Android development. I want to know what Google got so wrong that their decisions drove people to design other tools such as React Native, Cordova, Ionic, PhoneGap, Titanium, Xamarin (which is shit imo), Codename One and many others. With iOS I never considered going for anything other than native, since the API just seemed so well designed and far superior to me from an architectural point of view.
Fast forward to 2018 (almost 2019), and Google has been talking about Flutter for a while, making it seem like they are finally fixing how they want people to design apps.
You see. I firmly believe that tech stacks work in 2 ways:
1. People love a stack so much that they start to develop cool ADDITIONS to it (see the awesome-ios repo) to expand on the standard libraries.
2. People start to FIX a stack because the implementation is broken, lacking in functionality, or hard to use by itself: see OkHttp, legit all the Square libs, ButterKnife, etc etc etc and etc.
From this I can conclude 2 things: people love developing for iOS because the ecosystem is nice and dev-friendly, and people like developing for Android in spite of how Google manages their API. Seriously, Android is a great OS, and having apps that work awesomely in spite of how hard it is to create applications for said platform just shows a level of love and dedication that is unmatched.
This is why I find it hard, and even mean, to call out one product over the other. Despite the morals of the two leading companies inferred from my post, the developers are what makes the situation better or worse.
So just fuck it and develop and use for what you want.
Honorable mention to PHP and the PHP developer community, which is a mixture of fixing and adding, in spite of the amount of hatred such coolness gets from a lot of peeps :P
Oh and I got a couple of mobile contracts in the way, this is why I made this post.
And I still hate developing for Android even though I love Java.3 -
I got a very low power Netbook lately for basically no money.
I thought about using it for some server monitoring / server access via ssh console.
Which Linux distros would you recommend for such a use case? I've tried things like CoreOS and Debian (LXDE) so far but wasn't really satisfied with either option: neither could display the battery capacity, and Debian didn't detect the Intel WiFi.
The netbook has 512 MB of RAM, which should be fine for a lightweight GUI and more than enough for an SSH connection 😅
Thanks a lot for the recommendations :)12 -
Tried to install GitLab on my RPi3. Not gonna try that again. I didn't really get it running and lost quite some time installing, configuring and removing it.
Afterwards someone told me that I should have at least 3.5 GB of RAM if I want to self-host it with a goodish user experience. That might have been the reason why it was so awful, but holy fuck, that's a lot of memory. I wonder if that's RoR's fault or simply because GitLab is such a huge piece of software 🤔
I'll try to install gitgud tomorrow and see how that goes 🙏8 -
I have one question to everyone:
I am basically a full stack developer who works with cloud technologies for platform development. For the past 5-6 months I have been working on a product which uses machine learning algorithms to generate metadata from video.
One of the algorithms uses TensorFlow to predict the locale from an image. It takes an image of ~500 KB and around 15 seconds to predict the 5 most likely locales from a pre-trained model. Now, when I send more than 100 requests to the code concurrently, it stops working and TensorFlow throws some error. I am using a 32-core vCPU machine with 120 GB of RAM. When I ask the decision scientists on my team, they say that the processing load is high, a lot of calculation is happening behind the scenes, and it requires a GPU.
As far as I understand, a GPU makes sense for training, but for prediction or testing I don't think we need such heavy infra. Please help me understand if I am wrong.
PS: all the decision scientists on the team are basically dumb fucks, and they always have one answer: use a GPU.8
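One general way to approach the "do we need a GPU?" question is to first stop 100+ concurrent requests from fighting over the same cores: queue them behind a bounded worker pool sized to the CPU count and measure throughput again. The sketch below shows that idea in Java for consistency with the rest of this page; the class and method names are hypothetical, and the actual service here is presumably Python/TensorFlow, where the same back-pressure idea applies.

```java
import java.util.concurrent.*;

public class BoundedPredictor {
    // One worker per core keeps the CPU saturated without thrashing it.
    private static final int WORKERS = Runtime.getRuntime().availableProcessors();

    private final ExecutorService pool = new ThreadPoolExecutor(
            WORKERS, WORKERS,
            0L, TimeUnit.MILLISECONDS,
            new ArrayBlockingQueue<>(200),              // queue the excess requests
            new ThreadPoolExecutor.CallerRunsPolicy()); // back-pressure when the queue is full

    public Future<String> submit(byte[] image) {
        return pool.submit(() -> predictLocales(image));
    }

    // Placeholder for the actual pre-trained model call.
    private String predictLocales(byte[] image) {
        return "locale-guess";
    }
}
```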
UWP sucks. I don't wanna hurt y'all's feelings, but it's time to face the truth:
+ Sandboxed
+ Fewer job offers
+ Development more complicated than a web app
+ Microsoft doesn't create the perfect hardware to make sure our apps reach more consumers (the Pro X is a failure)
+ Poorly optimized
Poorly optimized?
The Windows 10 optimization is a joke; I have tested it on all my Surface devices: Laptop, Pro and Book. They claim UWP apps consume less RAM, but when you use them alongside Electron and Win32 apps, everything feels choppy and laggy. I mean, WTF?
UWP was made to optimize for low-spec SoCs such as ARM-based ones, yet my laptop running a Core i5 + GPU still lags??
I'm sorry, but this is just sad. I'm moving back to Win32. Sooner or later WinRT support will end.
And Microsoft will improve the Win32 API6
nano or IDLE, depending on need.
nano is the best terminal text editor by far, as I don't wanna have to learn a new command line and 2 control modes just to type in ffffffffffucking vim, and it's just powerful enough to do what is needed without extra crap on top.
IDLE is super-lightweight, has a somewhat-handy debugger if I need to see what's up when my code interacts with modules or some such, takes up very little RAM and is open-source. Works exactly as needed and no more. -
Ok so.
You know you have to deal with annoying things when you take on a guard-duty role, and yes, we signed up for it because of the moolah.
However, you also want to do this with a reliable and robust monitoring and alerting system that you can depend on! And no, I am not going to advertise a product for this... What I will tell you is which one to avoid.
Meet Quest "Foglight"... It does EVERYTHING! It monitors, it alerts, it does trend watching, it does fancy-schmancy graphics, it does reporting, it is very extendable... WAUW, right? Right?
Well, maybe if you were stuck somewhere in 2005-2010... But this fucklight falls short on EVERYTHING.
Today, I got called up at 3:30 in the morning (I am typing this after the incident) because this shit of a system does "High Availability" by basically letting the FMS servers suck each other's jaggons and hoping one of them somehow responds. It's a sort of keepalived thing, but built on proprietary Java tech..
Oh, yes, it's written in Java and... yes... Java 6.
This means that, effectively, we are running RHEL 5 machines (yes, RHEL 5!!!), because is anything more modern in place? Nope.
I have no idea anymore what I am ranting about. I'm tired. I'm tired of this shit, tired of getting called up just because some dude and a sales representative sucked each other's jaggons and have been pushing a shit solution on the federal government for almost a decade now.
Fuck Foglight
Fuck Quest Software, because did you really think you would get enterprise-level support for an enterprise product that you paid enterprise euros for? You are so naive, how cute...
And consequently: fuck Dell, and good job, Dell... for purchasing Quest Software, messing around with it, and then dumping it back on the market... Srsly Dell, you were like me when I had that hot-ass chick as a girlfriend who later turned out to be too crazy to justifiably tolerate compared to her hotness. Dump it like it's Trump.
Oh, and, wauw! Foglight graced us with a successful startup after... what... 6 restarts? In 2 hours... With 12 CPUs and 128 GB of RAM and... oh fuck this, you don't deserve such resources.4
Hi devRant. Wanna rant some shit about my company. First, some good parts. I work in a company with 600+ employees. It's one of the best companies in my region. They provide you with any kind of sweets (cookies, coffee, tea, etc.) and any hardware you need for your work (additional monitor, more RAM, SSDs, processor, graphics card, whatever), just about everything you need to make your work faster/more comfortable. Then, we have regular reviews (every 6 months), which raise your salary by $0.75 to $1.50 per hour. (I live in a poor country, where $15 per hour makes you more solvent than 70% of people, so a $100-200 increase every half year is quite a good raise.)
The raise resulting from a review depends on how satisfied the team leader and project manager are with my work. And here the interesting part starts (i.e. where the shit comes in).
1) Seniority level in our company depends on the salary you have. That's right. It does not depend on your skill, except when you're applying to a vacancy. So if you say you're a senior dev and prove it during the interview, you'll get a senior's salary. This is fine if you just want money, but not if you love programming (like me), for the reasons below.
2) You don't need lots of programming experience to be a team leader. You can even be a junior team leader (but thank god, on research projects only). You start by leading research projects and then move to billable ones if the director of the research department is satisfied with your leadership skills.
As a consequence, our seniors are dumb AF. This pisses me off the most. Not all of them; I would say half of them are real pros, but the rest suck at programming (for a senior). They are around junior/middle level.
I can understand if a guy has a $15 rate but still remains a junior dev. That's fine. But hell no, he is treated as a middle because his rate is $10+ now! And his opinion has priority over middles and juniors. Not that juniors have a lot of good thoughts, but sometimes they do.
I'm lucky to be working on a small project for now, so I'm the only dev and, so to speak, my own TL. But my colleague has this kind of senior team leader who is dumb AF. They work on an ASP.NET Core project, and the senior does not even know how to properly write generic constraints in C#. Seriously.
Just look at this shit. Instead of
class MyClass<T> where T : class { }
he does this:
abstract class EnsureClass { }
class MyClass<T> where T : EnsureClass { }
He writes an empty abstract class and forces other classes to inherit from it (thus wasting their one chance to inherit something useful), just to ensure that the generic T is a class. What the FUCK is wrong with you, dude?! You're a senior dev and you don't even know the language you're coding in.
And this shit is all over the company. Every monkey that has enough skill just not to get fired and enough patience to work 4-5 years becomes a senior! No-fucking-body cares about or reviews your skill growth. The whole review is the department director asking the TL and PM questions like "How is this guy doing? Is he OK or should we fire him?" That's the whole review. If the TL does not like you, he can leave a bad review and the company will put you on a trial period. If you confront the TL during this period, pack your suitcase. I personally know of two such cases: a good, skilled guy just couldn't find a common language with his TL and got fired. And the cherry on top is that they don't care about the fired dev's side of the story; they only listen to the reviewer. This is just absurd and it boils me over.
That's all I wanted to say. Thanks for your attention. -
I have a mini laptop (an i-ball CompBook Excelance). I want to add an extra monitor to it for coding purposes. Can I do it? Someone advised me not to, saying a monitor would require more power than this laptop can provide and would result in a short circuit. Is it true that I can't add a monitor to this kind of mini laptop? If it's possible, then tell me what kind of monitor to add and the procedure.
Also, it has 2 GB of SDRAM. Can I run Android Studio on that? Or can I add more RAM to it?
I know I have asked too many questions here! But please help me out, guys, because I think this is the platform where I'll get answers to all my queries. The people I'm surrounded by aren't the kind I can ask such questions. Please help!
Thanks in advance!1 -
I really can't find a good, light open-source ecommerce solution that doesn't require WordPress or some other bloated framework.
I have a small company where I just work as a microelectronics/programming teacher, and I want an automated solution where people can order and pay for preconfigured kits.
I usually use Nginx with Node.js. I had a look at Reaction Commerce; however, it requires 1.5 GB of RAM as of now (I have a 512 MB RAM server), and I don't see how a few visitors would justify such an overpowered solution.
How do other developers build ecommerce solutions without using bloated software? As of now I'm considering just creating a solution myself with a template engine and an API.2
Have you, at any point in time, had over 200 tabs open? Why the fuck is such a low number such an edge case that neither Firefuck nor Chrome can handle it without leaking RAM (5 MB/sec and 100 MB/sec respectively)?11