We got a server with two Intel Xeon E5520 processors, each with 4 cores (8 threads) at 2.27GHz.
Also the server has 36GB of internal memory.
What do we do with it? We play Solitaire 😎
-
Boss: Hey build me a server.
Me: OK. How much storage?
Boss: IDK.
Me: How many processors?
Boss: IDK.
Me: How much RAM?
Boss: IDK.
Me: 1U or 2U or bigger?
Boss: IDK
Me: What’s it for?
Boss: [Program]
Me: How many concurrent connections?
Boss: IDK.
Me: Budget?
Boss: IDK.
Me: *eye twitch* oooooookkkkkk
-
#include <string>
using std::string;

string excuses[] = {
"it's not a bug it's a feature",
"it worked on my machine",
"i tested it and it worked",
"its production ready",
"your browser must be caching the old content",
"that error means it was successful",
"the client fucked it up",
"the systems crashed and the code got lost" ,
"this code wont go into the final version",
"It's a compiler issue",
"it's only a minor issue",
"this will take two weeks max",
"my code is flawless must be someone else's mistake",
"it worked a minute ago",
"that was not in the original specification",
"i will fix this",
"I was told to stop working on that when something important came up",
"You must have the wrong version",
"that's way beyond my pay grade",
"that's just an unlucky coincidence",
"i saw the new guy screw around with the systems",
"our servers must've been hacked",
"i wasn't given enough time",
"its the designers fault",
"it probably won't happen again",
"your expectations were unrealistic",
"everything's great on my end",
"that's not my code",
"it's a hardware problem",
"it's a firewall issue",
"it's a character encoding issue",
"a third party API isn't responding",
"that was only supposed to be a placeholder",
"The third party documentation is wrong",
"that was just a temporary fix.",
"We outsourced that months ago.","
"that value is only wrong half of the time.",
"the person responsible for that does not work here anymore",
"That was literally a one in a million error",
"our servers couldn't handle the traffic the app was receiving",
"your machines processors must be too slow",
"your pc is too outdated",
"that is a known issue with the programming language",
"it would take too much time and resources to rebuild from scratch",
"this is historically grown",
"users will hardly notice that",
"i will fix it" };11 -
Hey everyone,
During some backend improvements to the devRant infrastructure, some of our async queue processors (SQS) stopped working, which caused many notifs to not go out or stop working. Unfortunately our alerting didn't pick up on this since there were still queues being processed (just not specific ones) and some aspects of notifs were still working. Big apologies for this issue!
It is now resolved, and while very delayed, no notifications were lost and all were processed after the queue processors started up again. Sorry for the bulk notifs, but we wanted to make sure all that were supposed to go out went out.
Additional alerting will be put in place to prevent this from happening again.
Thanks for your patience!
-
Larry Tesler, a computer scientist who created the terms "cut," "copy," and "paste," has passed away at the age of 74 (17 Feb 2020).
In 1973, Tesler took a job at the Xerox Palo Alto Research Center (PARC) where he worked until 1980. Xerox PARC is famously known for developing the mouse-driven graphical user interface and during his time at the lab Tesler worked with Tim Mott to create a word processor called Gypsy that is best known for coining the terms "cut," "copy," and "paste".
In addition to "cut," "copy," and "paste" terminologies, Tesler was also an advocate for an approach to UI design known as modeless computing. It ensures that user actions remain consistent throughout an operating system's various functions and apps. When they've opened a word processor, for instance, users now just automatically assume that hitting any of the alphanumeric keys on their keyboard will result in that character showing up on-screen at the cursor's insertion point. But there was a time when word processors could be switched between multiple modes where typing on the keyboard would either add characters to a document or alternately allow functional commands to be entered.10 -
Woah, did you know some of the older ARM processors could execute Java bytecode directly on the hardware (via ARM's Jazelle extension)?
-
Here are the reasons why I don't like IPv6.
Now I'll be honest, I hate IPv6 with all my heart. So I'm not supporting it until inevitably it becomes the de facto standard of the internet. In home networks on the other hand.. huehue...
The main reason why I hate it is because it looks in every way overengineered. Or rather, poorly engineered. IPv4 has 32 bits' worth of addresses, which translates to about 4 billion. IPv6 on the other hand has 128 bits' worth of addresses.. which translates to.. some obscenely huge number that I don't even want to start translating.
That's the problem. It's too big. Anyone who's worked on the internet for any amount of time knows that the internet on this planet will likely not exceed a number of machines worth more than 1 or 2 extra bits (8.6B and 17.2B addresses respectively). Now of course 33 or 34 bits in total is unwieldy, it doesn't go well with electronics. From 32 you essentially have to go up to 64 straight away. That's why 64-bit processors are.. well, 64 bits. The memory grew larger than the 4GB that a 32-bit processor could support, so that's what happened.
The internet could've grown that way too. Heck it probably could've become 64 bits in total of which 34 are assigned to the internet and the remaining bits are for whatever purposes large IP consumers would like to use the remainder for.
Whoever designed IPv6 however.. nope! Let's give everyone a /64 range, and give them quite literally an IP pool far, FAR larger than the entire current internet. What's the fucking point!?
The IPv6 standard is far larger than it should've been. It should've been 64 bits instead of 128, and it should've been separated differently. What were they thinking? A bazillion colonized planets' internetworks that would join the main internet as well? Yeah that's clearly something that the internet will develop into. The internet which is effectively just a big network that everyone leases and controls a little bit of. Just like a home network but scaled up. Imagine or even just look at the engineering challenges that interplanetary communications present. That is not going to be feasible for connecting multiple planets' internets. You can engineer however you want but you can't engineer around the hard limit of light speed. Besides, are our satellites internet-connected? Well yes but try using one. And those whizz only a few hundred kilometres above sea level. The latency involved makes it barely usable. Imagine communicating to the ISS, the moon or Mars. That is not going to happen at an internet scale. Not even close. And those are only the closest celestial objects out there.
So why was IPv6 engineered with hundreds of years of development and likely at least a stage 4 civilization in mind? No idea. Future-proofing or poor engineering? I honestly don't know. But as a stage 0 or maybe stage 1 person, I don't think that I or civilization for that matter is ready for a 128-bit internet. And we aren't even close to needing so many bits.
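Just to put numbers on those bits, here's a quick back-of-the-envelope sketch (C++; nothing IPv6-specific, just powers of two):

#include <cstdio>
#include <cmath>

int main() {
    // Address counts for the widths discussed above.
    std::printf("32-bit (IPv4):  %.2e addresses\n", std::pow(2.0, 32));  // ~4.29e9
    std::printf("64-bit:         %.2e addresses\n", std::pow(2.0, 64));  // ~1.84e19
    std::printf("128-bit (IPv6): %.2e addresses\n", std::pow(2.0, 128)); // ~3.40e38
}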
Going back to 64-bit processors and memory. We passed 32-bit address width about a decade ago. But even now, we're only at about twice that size on average. We're not even close to saturating 64-bit address width, and that will likely take at least a few hundred years as well. I'd say that's more than sufficient. The internet should've really become a 64-bit internet too.
-
Functional Programming. Because Moore's Law has moved from making processors faster to multiplying cores, and we may eventually have to code on machines that have 1024 cores or more. Mutable state will cause all kinds of hell in those scenarios. We already have problems with it when we have like 2-3 different threads.
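The kind of hell in question, as a minimal C++ sketch: two threads bumping shared mutable state. The atomic version is correct; swap in a plain int and the count comes out wrong on most runs.

#include <atomic>
#include <iostream>
#include <thread>

int main() {
    std::atomic<int> counter{0};  // with a plain int, ++ is load/add/store and the threads race
    auto work = [&] { for (int i = 0; i < 1000000; ++i) ++counter; };
    std::thread a(work), b(work);
    a.join();
    b.join();
    std::cout << counter << "\n";  // always 2000000 with the atomic; rarely so without
}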
-
Age 19, got a government sponsored chance to go to India to study. Was called to study for Law. But didn't like it. Decided I wanted to change to Computer Science cause that's what I was interested in. Went to India and applied for the computer science course but not law, despite my parents wanting me to do law because hey, a lawyer's job is good status in society.
Got a spot in BCA (Bachelor of Computer Application). Totally new to programming. Started with C. Was freaked out by all the new things. Variables, comments, preprocessor directives. All was new to me. Although the lecturer tried her best, I couldn't understand her well because of the language barrier. It was a mixture of Hindi and English.
Luckily she gave me a book to read, Let Us C. That book helped me a ton. I realized I really liked programming. When summer holiday came I taught myself C++. Then next summer Java. Then Android. Then some Web Development. That was last summer. But I kinda settled in Android and did some projects in it. Right now I am about to sit for my final exam. Then I will try my best to get an Internship or a job.
-
I attended a data science meetup recently. There were many suits walking around the corridors because of some startup night taking place at the same time.
After some time a guy appeared in front of me, telling me he was afraid at first that this was a meetup for suits only. Until he saw all the dev and rock stickers on my notebook. He was relieved that there was at least some nerd around.
He asked what I was doing so I told him about my startup doing optimization of heat generation plants, yada yada. I asked him back.
He replied, "Well, I'm also part of some small startup. Among the things we develop processors and stuff. It's called Intel."
Well dude, that was nicely played. I had a lot of fun that evening.
-
!rant
A rather long (it's 8 hrs long to be precise) story
So I just finished an amazing homework assignment. The goal was to open a new shell on Linux using a C program. We were asked to follow instructions from http://phrack.org/issues/49/14.html . However the instructions given were for 32-bit processors and we had to do the same for 64-bit machines. In a nutshell we had to write a 64-bit shellcode and use the buffer-overflow technique to change the return address of the function to our shellcode.
I was able to write my own shellcode within 1hr and was able to confirm that it's working by compiling with nasm and all. Also the "show-off-dev" inside me told me to execute "/bin/bash" instead of "/bin/sh"(which everyone else was going to do). After my assembly code was properly executing shellcode, I was excited to put it in my C code.
For that, I needed opcodes of assembly code in a string. Following again the "show-off-dev" inside me, I wrote a shell script which would extract the exact opcodes out of objdump output. After this I put it in my C code, call my friend and tell him that "hell yeah bro, I did it. Pretty sure sir is gonna give me full marks etc etc etc". I compiled the code and BOOM, IT SEGFAULTS RIGHT IN FRONT OF MY FRIEND. Worst, friend had copied a "/bin/sh" code from shellstorm and already had it working.
Really burned my ego. I sat continuously for 8 hrs in front of my laptop and didn't talk to anyone. I was continuously debugging the code for 8 hrs. Just a few minutes ago, I noticed that the shellcode string I was putting in my C code was 2 bytes shorter than the actual code length. WHAT THE F. I ran objdump manually and copied the opcodes one by one into the string (like a noob) and VOILA ! IT WORKED !!!
TURNS OUT I DIDN'T CUT THE LAST COLUMN OF OPCODES IN MY SHELL SCRIPT. I FIXED THAT AND IT WORKED !!
THE SINGLE SHITTY NUMBER MADE ME STRUGGLE 8 HRS OF MY LIFE !! SMH
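(For anyone curious, the classic Phrack-style test harness looks roughly like this. The bytes below are harmless placeholders, not real shellcode, and on a modern NX-enabled system you'd need to build with gcc -z execstack -fno-stack-protector for it to run at all, which is another popular source of mystery segfaults.)

#include <stdio.h>

/* Placeholder bytes only: paste your objdump-extracted opcodes here.
   If the string is even one byte short (say, a cut last column), the
   CPU runs off the end into garbage and you segfault. */
unsigned char code[] = "\x90\x90\x90\xc3"; /* nop; nop; nop; ret */

int main(void) {
    int (*fn)(void) = (int (*)(void))code; /* treat the buffer as a function */
    fn();
    printf("returned cleanly\n");
    return 0;
}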
Lessons learnt :
1)Never have such an ego that makes you think you're perfect, cuz you're retarded not perfect
2)Examine your scripts properly before using them
3)Never, I repeat NEVER!! brag about your code before compiling and testing it.
That's it!
If you've read this long story, you might as well press the "++" button.
-
Our college has PCs with Pentium Core 2 Duo processors and 1 GB RAM. We are made to code Java on Windows using default Notepad and cmd. There's nothing more infuriating than that.
Me: Ma'am, can we use any IDE for our mini project or finals?
She: No kid, you can't just use that. This is code you have to write it.
Me: Wut?
-
So... GDPR.
And the deadline.
And I have no idea what to do.
What does it mean for one-man indie projects? Data protection officers? Companies? Controllers? Processors? EU employees? Argh.
Look, please, EU. Not everyone can afford to hire an entire team for this, when their current team is literally one person.
Yes, the GDPR is probably a step in the right direction, but I think I'll just stop collecting the data altogether.
(All data I collect is just user settings stored in a database, nothing more.)
Can someone point me in the right direction?
-
Most kids just want to code. So they see "Computer Science" and think "How to be a hacker in 6 weeks". Then they face some super simple algebra and freak out, eventually flunking out with the excuse that "uni only presents overtly theoretical shit nobody ever uses in real life".
They could hardly be more wrong, of course. Ignore calculus and complexity theory and you will max out on efficiency soon enough. Skip operating systems, compilers and language theory and you can only ever aspire to be a script kiddie.
You can't become a "data scientist" without statistics. And you can never grow to be even a mediocre one without solid basic research and physics training.
Heck, I've optimized literal millions of dollars out of cloud expenses by choosing the best processors for my stack, and weeks later got myself schooled (on devRant, of all places!) over my ignorance of their inner workings. And I have a MSc degree. Learning never stops.
So, to improve the CS experience in uni? Tear down students' expectations, and boil out the "I just wanna code!" kiddies to boot camps. Some of them will be back to learn the science. The rest will peak at age 33.
-
So processors have Moore's Law, and I'm starting to think Web Development has one too.
"There shalt always be a new, better, framework, that would have saved weeks of time, but only after you've hit the point of no return in a project"
Anyone else know the feeling of "damn...I may as well just rewrite everything...."
-
So today I started editing my CV because I want to apply for an internship in the UK; and because I've already got a Dutch CV, I just opened that document and started working from there.
*selects Dutch part*
*presses ctrl + /*
Comment out the Dutch parts! 🤦
-
So I can see everyone thinks CS should be taught differently this week.
Based on all of the ways we could change it, something no one seems to be mentioning much is security.
Everyone has many ways of learning logical processes and understanding how they work with programming, but for every line of code taught, read or otherwise learnt you should also be taught how to make it less vulnerable (as nothing is invulnerable on the internet).
Every language has its exploits and pitfalls and ways of overflowing, but how you handle these issues or prevent them occurring should be more important than syntactically correct code. The tools today are 100000x better than when I started with notepad.exe, CMD and Netscape.
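To make that concrete, here's the textbook C pitfall and its bounded fix. The point isn't the tool, it's the habit of bounding every read:

#include <stdio.h>

int main(void) {
    char name[16];
    /* gets(name);  the textbook overflow: no bounds check at all,
       which is why it was removed from the C standard entirely */
    if (fgets(name, sizeof name, stdin)) {  /* bounded: at most 15 chars + NUL */
        printf("hello, %s", name);
    }
    return 0;
}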
Also CS shouldn't be focused on tools and languages as such, seeing as new versions and ideas come out quicker than CS courses change, but should be more focused on the means of coming to logical decisions and always questioning why or how something is the way it is, and how to improve it.
Tl;dr
Just my two cents.
-
I fucking HATE all those extremely high level abstractions, IT IS ALMOST IMPOSSIBLE to find anything low level, especially for ARM... IT CAN'T BE THAT HARD TO JUST FUCKING FIND SOMETHING THAT DOES NOT USE 100000 HEADER FILES, and stupid large frameworks. I feel like everyone is fucking retarded, I want to learn the real stuff, but everything is bloated with high level stuff, and some kind of cult that gets horny from using extremely easy bullcrap, that completely takes away the interesting parts of processors and embedded systems. I'VE been searching for days to FIND SOMETHING FUCKING USEFUL, even a MOTHERFUCKING 'LOW LEVEL' book GOES AND USES A BILLION HEADER FILES, and STUPID IDEs from which you learn absolutely nothing. IF I wanted to do nothing and learn nothing I WOULD USE ARDUINO IDE, but no I won't, I want to learn something, and I don't have access to university or anything, and it literally is impossible to find anything useful, every idiot uses libraries for everything, and builds their crap on frameworks as large as Mount Everest.. Fuck me, why can't this be different ?
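For what it's worth, the register-level style I'm after needs no framework at all. A minimal sketch; the base address and register layout below are made up for illustration, the real ones come from your chip's reference manual:

#include <stdint.h>

/* Hypothetical memory-mapped GPIO block: substitute your SoC's addresses. */
#define GPIO_BASE 0x40020000u
#define GPIO_DIR  (*(volatile uint32_t *)(GPIO_BASE + 0x00))
#define GPIO_OUT  (*(volatile uint32_t *)(GPIO_BASE + 0x04))

int main(void) {
    GPIO_DIR |= (1u << 5);                          /* pin 5 as output */
    for (;;) {
        GPIO_OUT ^= (1u << 5);                      /* toggle the pin */
        for (volatile int i = 0; i < 100000; ++i) { /* crude busy-wait delay */
        }
    }
}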
-
Didn't have to fix anything (so far). But took my dev laptop to show all the projects I've been working on to the dad of my girlfriend and he showed me his projects.
He's not mainly a dev, but an electrical engineer. He designs his own PCB boards (with 8-bit processors on them, as "that's more than enough power to do almost anything!") and then programs that stuff with BASIC (he writes his own firmware for it).
He also creates desktop application to get data off the devices using Delphi.
Love that guy and have a shitton of respect for him!
-
I now understand why we have multi-core processors. So that a process that dun shit the bed doesn't hog the whole fucking CPU! Of course at the expense of "yeah our shitty software can hog the CPU no problem, there's now several cores anyway". Hardware solving the crap that software presents, yet again.
-
> "Just use power saving mode, bro! It will extend the life of your non-replaceable battery!"
Of course I bought a smartphone with powerful processors just to limit their performance for the sake of delaying the expiry of its non-replaceable battery.
-
Random fact #1
AMD (Advanced Micro Devices) was producing Intel 8080 clones (the AMD Am9080) before developing its own CPUs. Originally they were produced without an Intel license: the clone was developed based on photographs of the Intel 8080 itself and of its logic diagrams. These processors were much cheaper than the original model. Later AMD and Intel reached an agreement and the Am9080 was fully licensed, making AMD an official second-source vendor.
And yeah, a few years later we got a war between those two giants. Remember when in the mid-2000s AMD almost beat Intel's market share?
Bonus Fact: there has been an AMD logo on Ferrari Formula 1 cars since 2002 (look at the front wing)
-
What a new year's start..
"Kernel memory leaking Intel processor design flaw forces Linux, Windows redesign"
"Crucially, these updates to both Linux and Windows will incur a performance hit on Intel products. The effects are still being benchmarked, however we're looking at a ballpark figure of five to 30 per cent slow down"
"It is understood the bug is present in modern Intel processors produced in the past decade. It allows normal user programs – from database applications to JavaScript in web browsers – to discern to some extent the layout or contents of protected kernel memory areas."
"The fix is to separate the kernel's memory completely from user processes using what's called Kernel Page Table Isolation, or KPTI. At one point, Forcefully Unmap Complete Kernel With Interrupt Trampolines, aka FUCKWIT, was mulled by the Linux kernel team, giving you an idea of how annoying this has been for the developers."
>How can this security hole be abused?
"At worst, the hole could be abused by programs and logged-in users to read the contents of the kernel's memory."
https://theregister.co.uk/2018/01/...
-
As we're all going about our various Easter, Passover, et al., family celebrations, I have the perfect solution to help train your families to stop asking you for help with mundane computer stuff:
Every time someone asks you to do/fix something, give them a full talk about what is going on in their computer around that system.
Don't forget you can talk about lots of things too:
- concurrency
- TCP IP / socket networking
- multi-threaded programs vs. single threads
- RISC vs. CISC processors
- Why Linux is better than Windows or Mac
- algorithms
- logarithmic runtime
- teach them how to convert between hex, binary, and base ten
Really pour it on too. Soon they'll either figure out that you are a highly-skilled individual who is not their personal geek squad, or they'll be too afraid of a big lecture to ask for help.
Works with my in-laws like a charm.
-
Me: "It's a balance between three things: you either optimize for computation, memory use, or programming effort. Computers don't have a infinitely fast processors with an infinite amount of memory."
Coworker: "Did anybody tell Java?"
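That three-way tradeoff in miniature, as a C++ sketch: spend memory (a cache) to buy back computation, at the cost of a little extra programming effort.

#include <cstdint>
#include <unordered_map>

// Naive: nearly no memory, exponential computation.
std::uint64_t fib(unsigned n) {
    return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

// Memoized: a table of results (memory) buys linear time.
std::uint64_t fib_memo(unsigned n) {
    static std::unordered_map<unsigned, std::uint64_t> cache;
    if (n < 2) return n;
    if (auto it = cache.find(n); it != cache.end()) return it->second;
    return cache[n] = fib_memo(n - 1) + fib_memo(n - 2);
}
-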
I wish the Congress would run all legislation by a team of programmers. Regardless of political leanings one thing is indisputable: We are very keen when it comes to finding bugs in a piece of code -- especially if we didn't write said code!
After all: What is the law if not code for people instead of processors?
-
A developer might think "now that computers have more RAM and an abundantly strong CPU, I am free to create resource-hungry inefficient software!"
This sets a dangerous precedent.
Computers can only get faster if the software stays efficient while the processors get faster and the RAM increases.
If computers get more powerful but software also gets more bloated and less efficient, it defeats the performance benefit.
Also, software must be efficient to extend the battery time on portable devices.
Jody Bruchon video: https://youtube.com/watch/...
-
Personal project: I design and build single-board computers with old processors like Z80, 6502 etc when I'm not being too lazy. A few run CP/M. One that's been more interesting in terms of digging deeper has been an 80C188, for which I've written a BIOS (despite the chip's built-in peripherals and interrupts being at non-standard addresses) mostly in C, which it can use to boot DOS from an image file on an SD card (bit-banged off the UART chip with FatFs). (Yes it's slow, but so is a 5.25" floppy.)
Work: My first project at my current job. Not particularly exciting compared to some stuff on here, but it got me into making useful contributions to the open-source CRM we used at the time. Was building a basic extension to deal with duplicated organisation names. So I learned CiviCRM fairly deeply, a bit of Drupal, a bit of PHP. It's a shame we don't use that system any more, the community was cool.
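On the SD-card side, the nice thing about FatFs is how little of it you touch once the low-level glue is wired up. A minimal read sketch, assuming the diskio layer (disk_initialize/disk_read) is already hooked to the bit-banged SD transport; the image filename is made up:

#include "ff.h"   /* FatFs public API */

static FATFS fs;
static FIL img;
static BYTE sector[512];
static UINT br;

/* Mount the card and read the first sector of a DOS disk image. */
FRESULT boot_read(void) {
    FRESULT rc = f_mount(&fs, "", 1);        /* 1 = mount the volume now */
    if (rc != FR_OK) return rc;
    rc = f_open(&img, "DOS.IMG", FA_READ);   /* hypothetical image name */
    if (rc != FR_OK) return rc;
    return f_read(&img, sector, sizeof sector, &br);
}
-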
Got an assignment in school to make an easy project in C for embedded real-time processors with a free complexity level (it was really early in the course and many had never programmed before).
Since I've been working a few years in development, I decided to create my own transmitter and receiver for my own protocol between processors (we had just spent a week understanding how to use existing protocols, but I made my own).
The protocol used only 1 line to communicate, half-duplex, self-adjusting the syncing frequency during transmission. I managed to transmit data at up to 1 kbps after tweaking it a bit (the only holdback was the processor's clock frequency).
Then I got the feedback from our teacher, which basically said:
"Your protocol looks like any other protocol out there. Have you considered using an UART?"
Like yeah, I see the car you built there looks like any other car out there, have you considered using a Volvo instead?
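For the curious, the general shape of a bit-banged transmit loop looks something like this. A generic software-timed sketch, not the protocol from the assignment; line_write() and bit_delay() are board-specific stubs you'd supply:

extern void line_write(int level);  /* drive the single GPIO line */
extern void bit_delay(void);        /* wait one bit period */

/* Transmit one byte over a single line, LSB first,
   framed by a start bit and a stop bit. */
void tx_byte(unsigned char b) {
    line_write(0);                 /* start bit pulls the idle-high line low */
    bit_delay();
    for (int i = 0; i < 8; ++i) {
        line_write((b >> i) & 1);  /* data bits, LSB first */
        bit_delay();
    }
    line_write(1);                 /* stop bit / return to idle */
    bit_delay();
}
-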
x86 or x64. Well, from what I understand, there's no fucking x in front of 64. x86 refers to the instruction sets of the 8086 lineage of processor architectures, not a bit width. Am I justified in this? Is "x64" willful mislabeling?
-
PayPal = GayPal
PHASE 1
1. I create my personal gaypal account
2. I use my real data
3. Try to link my debit card, denied
4. Call gaypal support via international phone number
5. Guy asks me for my full name, email, phone number, debit card, street address, all confirmed and verified
6. Finally i can add my card
PHASE 2
7. Now the account is temporarily limited and in review, for absolutely no fucking reason, need 3 days for it to be done
8. Five (5) days later, still limited, I can't deposit or withdraw money
9. Call gaypal support again via phone number, burn my phone bill
10. Guy tells me to wait for 3 days and he'll resolve it
PHASE 3
11. One (1) day later (and not 3), I wake up from a yellow account to a red account where my account is now permanently limited WITHOUT ANY FUCKING REASON WHY
12. They blocked my card and forever blocked my name from using gaypal
13. I contact them on twitter to tell me what their fucking problem is and they tell me this:
"Hi there, thank you for being so patient while your conversation was being escalated to me. I understand from your messages that your PayPal account has been permanently limited, I appreciate this can be concerning. Sometimes PayPal makes the decision to end a relationship with a customer if we believe there has been a violation of our terms of service or if a customer's business or business practices pose a high risk to PayPal or the PayPal community. This type of decision isn’t something we do lightly, and I can assure you that we fully review all factors of an account before making this type of decision. While I appreciate that you don’t agree with the outcome, this is something that would have been fully reviewed and we would be unable to change it. If there are funds on your balance, they can be held for up to 180 days from when you received your most recent payment. This is to reduce the impact of any disputes or chargebacks being filed against you. After this point, you will then receive an email with more information on accessing your balance.
As you can appreciate, I would not be able to share the exact reason why the account was permanently limited as I cannot provide any account-specific information on Twitter for security reasons. Also, we may not be able to share additional information with you as our reviews are based on confidential criteria, and we have no obligation to disclose the details of our risk management or security procedures or our confidential information to you. As you can no longer use our services, I recommend researching payment processors you can use going forward. I apologise for any inconvenience caused."
PHASE 4
14. I see they basically replied in context of "fuck you and suck my fucking dick". So I reply aggressively:
"That seems like you're a fraudulent company robbing people. The fact that you can't tell me what exactly have i broken for your terms of service, means you're hiding something, because i haven't broken anything. I have NOT violated your terms of service. Prove to me that i have. Your words and confidentially means nothing. CALL MY NUMBER and talk to me privately and explain to me what the problem is. Go 1 on 1 with the account owner and lets talk
You have no right to block my financial statements for 180 days WITHOUT A REASON. I am NOT going to wait 6 months to get my money out
Had i done something wrong or violated your terms of service, I would admit it and not bother trying to get my account back. But knowing i did nothing wrong AND STILL GOT BLOCKED, i will not back down without getting my money out or a reason what the problem is.
Do you understand?"
15. They reply:
"I regret that we're unable to provide you with the answer you're looking for with this. As no additional information can be provided on this topic, any additional questions pertaining to this issue would yield no further responses. Thank you for your time, and I wish you the best of luck in utilizing another payment processor."
16. ARE YOU FUCKING KIDDING ME? I AM BLOCKED FOR NO FUCKING REASON, THEY TOOK MY MONEY AND DONT GIVE A FUCK TO ANSWER WHY THEY DID THAT?
HOW CAN I FILE A LAWSUIT AGAINST THIS FRAUDULENT CORPORATION?
-
!dev
Guys, we need to talk raw performance for a second.
Fair disclaimer - if you are for some reason an Intel worker, you may feel offended.
I have one fucking question.
What's the point of fucking ultra-low-power-extreme-potato CPUs like Intel Atoms?
Okay, okay. Power usage. Sure. So that's one.
Now tell me, why in the fucking world anyone would prefer to wait 5-10 times more for same action to happen while indeed consuming also 5-10 times less power?
Can't you just tune down a "big" core and call it a day? It would be around.. a fuckton faster. I have my i7-7820HK CPU, and if I dial it back to 1.2GHz, my WINDOWS machine with a lot of background tasks works fucking faster than an Atom-powered freaking LUBUNTU that has only Firefox open.
Tested: i7-7820HK vs Atom x5-Z8350.
Opening a new tab and navigating to Google took under 1 second on my i7 machine, and the Atom took almost 1.5 seconds. While having a higher clock (turbo boost).
Guys, the 7820HK dialled down to 1.2 GHz; 0.81 V.
Seriously.
I felt everything was lagging. but OS was much more responsive than atom machine...
What the fuck, Intel. It's pointless. I think I'm not only one who would gladly pay a little bit more for such difference.
i7 had clear disadvantages here, linux vs windows, clear background vs quite a few processes in background, and it had higher f***ng clock speed.
TL;DR
Intel Atom processors use less power but waste a lot of time, while a little more power used on a bigger CPU would complete the task faster; thus Atoms are just plain pointless garbage.
PS.
Tested in frustration at work; apparently they bought 3 craptops for presentations or some shit like that, and they have mental problems because the cheapest shit on the market is more shitty than they anticipated ;-;
fucking seriously ;-;
-
I like model railways. I also like embedded electronics.
I therefore enjoy combining the two in my free time quite a bit - putting ARM processors on trains and getting them to do cool stuff. I'm also happy to dish out electronics advice to other model railway guys, a lot of whom are older and have literally no clue (but will create stunningly realistic scenes that are a million miles away from my lowly efforts.)
But bloody hell - is it hard to do so without being drowned out by incompetent sods who think they know it all because... reasons. I think I may just stop attempting to help beyond this point. This is the latest nugget I've had to contend with - I guess he's heard about skin effect, but since DCC works at around 8kHz and using anything more than 1.5mm core cable is ridiculous for a model railway, even that is complete baloney.
Other electrical nuggets I've heard from this group include "only washing machines run on AC, everything else is DC", "the colour of the wire matters otherwise it could short circuit", and "driving your old trains with a DCC signal will make them run better, because it's more modern". 😬
-
I'm freaking done trying to get Linux on my machine. I've tried every distro with many different versions of the kernel and I always run into the same problem on my desktop.
The computer super-stutters for 2 seconds-ish, then freezes.
I've spent DAYS looking into this issue trying to find something. The worst part is that it can happen 5 minutes after I boot or 5 hours later. At first I thought it was Compton. Then I thought I installed Arch wrong. Maybe an update to the BIOS? How about downloading updated microcode? Maybe this obscure bug with AMD processors and setting power idle to typical? Nothing. I'm now behind on my school work because of the massive amount of time I've spent getting this fixed. It works just fine on my laptop, but it doesn't work on the machine I built to code with. I'm done. Give me Force Lightning, a red lightsaber, and call me a Sith baby because I'm joining the dark side. Here I come Windows.
For those who are wondering my setup:
Ryzen 7 1700
Rx 480
Asus x-370 prime
16 gb Corsair RAM
And no, Windows has never had this bug.
-
A few weeks back I used Typora because I didn't find any word processors. Now I can't write without Typora.
-
When someone asks me a technical question about something I'm interested in:
"It works like this ... Oh i also should explain you how processors work ... Anyway, when a bufferoverflow arises ... And thats how crypthography works ... and so does blockchain work ... anf thats why bitcoin is causing way too many stress on the power grid."7 -
So true ...
"Many developers on a schedule aren't making efforts to write clean and efficient code, relying on idiot-proof languages (and consequently more RAM and faster processors) to make up for their malaise."2 -
Users running Linux on laptops with Intel processors should avoid Linux Kernel 5.19.12 due to an error that might physically harm the display. Fortunately, kernel 5.19.13 has already fixed the issue. Versions 6.0 and 6.1 have also begun rolling out with many significant changes.
-
If Apple computers have cores (in their processors), where are the seeds and stems?
Welp this sounds a lot more stupid after having typed it and read it, but I'm still posting it :D
-
Last year, we had a computer architecture class where we studied the architecture of processors: RISC, CISC, SIMD... The teacher was a nice person but didn't have much knowledge in the field. I read some of the Patterson & Hennessy book (Computer Organization and Design, iirc) and learned how to use OpenMP and MPI, and then in the last lab we were required to optimize matrix multiplication using 4 threads in OpenMP. The best students managed a 4x speedup at best, meanwhile I got a 16x speedup and showed the teacher how fast it was. She was really impressed lol
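The assignment in miniature, as a sketch of the OpenMP version. The speedup beyond the thread count comes from the cache-friendly i-k-j loop order; C is assumed zeroed by the caller:

#include <omp.h>

/* N x N multiply, outer loop split across 4 threads. */
void matmul(int n, const double *A, const double *B, double *C) {
    #pragma omp parallel for num_threads(4)
    for (int i = 0; i < n; ++i)
        for (int k = 0; k < n; ++k) {
            double a = A[i * n + k];
            for (int j = 0; j < n; ++j)
                C[i * n + j] += a * B[k * n + j]; /* streams B's row through cache */
        }
}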
-
Me waiting for my neural network model to finish fitting. Omg, what do I need? A computer the size of the Enigma machine just FILLED with graphics processors? And my validation accuracy rate is falling as I wait. Imma cry!
-
I am so sad. I've been having problems with Linux installs on my desktop since I built it. It just hangs at random times and the journals don't mention any problems. I finally caught a lead and it turns out it's a bug with the microcode of the Ryzen processors. There was a possible workaround, but it didn't work for me.
Guess I'm just going to have to use Windows exclusively on my desktop. I hope for a fix but the bug has been around for a year. :(
-
The number of scripters and 'data scientists' that call themselves developers will increase, the true art of development will become sidelined and the world's code will become progressively more bloated and inefficient as the rift between hardware and software widens to an echoey chasm.
Then quantum processors will come along, requiring new logic, languages and practices, and once again the true developers will rise up and pave the way for a bunch of entitled, know-it-all and self-promoting QuarkaScripters to come along decades later and pretend like they invented programming.
-
Someone once told me the following:
Processors, as good as they are, will always make mistakes.
When processors (i3, i5, i7, ...) are tested before distribution, they are categorized by the number of mistakes they make. An i7 7700K, for example, which makes very few mistakes is labeled as a 'Type A', while another i7 7700K with the exact same name and specs makes a few more mistakes and is labeled as 'Type B'.
All the 'Type A' processors are used and sold in business class laptops and workstations while the 'Type B' ones are sold to consumers.
After some research I couldn't find anything on it on the internet.
Anyone know if this is true or straight up bullshit?
-
A friend of mine said that Intel Pentium and i3 are the same, that all laptops with Intel's logo have the same processor, and you just have to view the info of yours...
-
Battery life worth some sloppy seconds is part of all mobile devices nowadays, mainly because it's standard by now to charge all your devices in your dedicated charging room, stacked with millions of chargers, where you connect thousands of devices before you go to sleep. (don't forget to put your smart pillow on charge too)
Having a day or two's worth of battery life in a laptop with normal use, or a phone that can easily power through heavy usage for 3-4 days or more, is really just so rare.
I can see how all mobile processors jumped multiple thousands of generations in power consumption, but that doesn't help if companies just put in a thin layer of battery to actually power them.
I am so glad I am finally again able to have both a laptop and a mobile phone that don't force me to charge all the time or carry around my huge battery packs.
A full day of my new phone gets me only down to 75-80%, and I really started appreciating again how just a slightly thicker phone can make such a huge change.
-
!rant
Last night I found two Athlon XP processors in my house. They're a 2400 and a 2700, but I don't know if they work because I don't have other parts to test them.
Nothing too interesting, I just thought it was kinda cool.
-
The AMD song, to the tune of Sam Riegel's DnD Beyond jingle:
You got the perfect casing
Its drive bays and supplies
But you need something to run your stuff
Cause you're late for that deadline
You click open a web page
You've heard about Phoronix test suite
And now you see a red company rise
In a field of blue and green
It's AMD! (AMD)
Yeah! AMD Radeon!
Yeah! AMD! (AMD)
Yeah! AMD Radeon!
You've got your motherboard
You've got your processors
And you've got Socket AM4!
It's AMD (AMD)
AMD (AMD)
AMD Radeon
-
It never ceases to amaze me just how big the 64-bit memory space is. It's so unrealistically big that on contemporary processors you can't even address the middle, and the size of that dead spot (the number of high bits that must all be the same in a valid address: 16, on today's 48-bit-canonical x86-64 parts) is barely worth mentioning.
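Worked out, that "dead spot" (the non-canonical hole) is almost the whole space. A quick C++ sketch using the 48-bit canonical bounds:

#include <cstdint>
#include <cstdio>

int main() {
    // 48-bit canonical addressing: bits 63..47 must all equal bit 47.
    std::uint64_t low_top   = 0x00007FFFFFFFFFFFULL; // last valid "low half" address
    std::uint64_t high_base = 0xFFFF800000000000ULL; // first valid "high half" address
    std::uint64_t hole = high_base - low_top - 1;    // = 2^64 - 2^48
    std::printf("non-canonical hole: %llu addresses (~99.998%% of the space)\n",
                (unsigned long long)hole);
}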
-
While teaching theory is actually good, it doesn't mean that there is no room for any practical education either. Students needs to be exposed to modern programming languages like Python, Ruby while at the same time be trained in the pioneers of programming like C, C++, Java. It is only then would they be able to make informed decisions on who they really want to be. If you had one practical lab session on C and Java and then the rest of the semester about HTML, students would end up moving away from programming.
Concepts like programming and networking should be included, whereas ancient technologies like programming old microprocessors (80386, 80486, etc.) should be excluded. Who programs 386 and 486 microprocessors anymore? While understanding how microprocessors and other low-level components in computer systems work is very essential, doing practicals on them isn't really a good use of students' time, energy or effort.
-
Anyone hear about EMIB from Intel? It seems like it might be a game changer for getting workstation power into Ultrabook form factors. They worked with a team at AMD to engineer it, and even made it sound like they're using it to combine Core i processors with AMD graphics.
I ask because there has apparently been news about it since August but I've only just heard about it today.
https://anandtech.com/show/12003/...
-
What should I do to practice being a "good coder" vs a "code Googler" who slaps other people's code into the site just because "it's enough to get the damn thing working"?
I feel really overwhelmed with all that I've learned thus far. At this point I feel like my knowledge of websites has width with no depth.
I've been messing around with html/css/js for a while and played with plenty of other languages, pre-processors, frameworks, etc. I never went to school for programming and have done work for small businesses independently for some time. Most of what I know comes from Codecademy, Treehouse and similar sites. I can refer to Google on a lot of things but I feel like there are habits that I should be implementing so I don't have to re-do things later. I love the A Book Apart series but I still feel like it's missing the foundational knowledge that I'm looking for.
After all of the time I've spent going through courses I feel like my experiences have given me solutions to build a few things and now I'm just jamming those solutions onto whatever I can until something I like comes on to the browser.
It's really easy to sit down and bang my head against the keyboard until something comes out that looks the way I want it to. However, I know there is way more going on that could help me make better decisions. I just feel like I'm missing something. Maybe it's experience, or maybe it's just the lack of camaraderie from working alone and not being able to approach problems with a team.
I hate pulling up my css file and feeling like it's rubbish, and feeling like I don't completely understand things like flex, or display, or position. I've been pushing at this for a while but I don't think I've found a resource that has really made me feel like I'm anywhere close to being a competent coder.
There are tons of watch and learn and do type classes that show you how to make stuff, but I guess what I want to know now is why we make it that way.
At some point do you just sit down and read MDN start to finish?
I wonder sometimes if my brain has been reprogrammed because I grew up in a Google world and don't actually have to solve anything for myself. I read about a guy who locked himself away for hours with books on code and he just sat there and wrote his code on paper until he was confident that he was getting it right.
-
Build my own phone and support the Zerophone project by writing code.
Seriously, what the fuck is going on with the development of major companies' smartphones? Every year all there is are larger displays, better and more cameras, faster processors and some more 'AI' thrown into the mix.
What the heck am I supposed to do with a phone costing multiple hundreds of euros but locked down with an OS spying on you? The processing power available is hardly ever used because most people just use apps like Instagram, WhatsApp or other messaging services.
I get why larger screens are useful but at some point it gets ridiculous.
Better cameras are useful to some degree as well but there's a limit to it.
If you really want to get into photographing then please buy an actual camera.
Another aspect I'd of course like to talk about is privacy. It's hardly existent on iOS or Android smartphones with Google services. Of course one can install different ROMs like Lineage OS, but if I already pay multiple hundreds for a device then I'd prefer it working for and not against me.
And dare you break a single part of your phone. You can't really repair it yourself anymore and one can't even change its battery. Most people either have it repaired or just buy a new one and throw it away. There is so much electronic waste, very difficult and expensive to dispose of, just buried in the ground somewhere.
Summing up: I don't really know where the development of smartphones is heading. A phone is a device you carry around with you almost everyday so I'd like it to be tailored to me and not spy on me.
I hope the Librem phone will be a success and other open source phone projects will gain more attention. I want a phone I can repair myself and tailor the software running on it to my needs. I'd like to write messages, listen to music, make calls, run a WiFi hot-spot on the phone and maybe play some tiny games on it once in a while.
-
So apparently the Android emulator only works well on Intel processors.
And there I was trying to be the good guy and buying AMD.
-
Got in a somewhat heated discussion earlier and wanted to get some more input...
A friend of mine has a community site for a game, and is running ads to pay for the hosting costs etc... He has recently changed ad providers and now the ads have become more profitable but also a lot more obtrusive...
I suggested perhaps looking into getting something like Coinhive, mining Monero coins with your users' browsers... He was really averse to it, but I think that it can be a viable alternative to ads, as long as you allow your users to not participate and don't go all out with their processors but throttle it to say 5% or so...
Anyhow, he wouldn't have it, and I was wondering if I was alone in thinking I'd rather have some coins mined using my processor than see ads, especially if it's not at full speed, and with consent (and not on mobile).
-
Regarding processors, 2019 is going to be interesting:
Arm vs Intel vs AMD
https://youtube.com/watch/...
-
!rant
https://github.com/rohitshetty/...
I am a young dev trying my hand at different stuff.
So I would appreciate any criticism or comments that would allow me to learn more :) or good practices I can follow.
Here is one project where I tried to create a structured frameworkish way to write mqtt processors.
Mqtt processors are standalone apps that process mqtt requests that have to be acted upon (like adding sensor data to a db sent from a sensor node, reading from the db, turning some gpio on or off if the app is on an embedded device like a raspi), etc.
This project creates a structure where you can just focus on writing subscribed topic listeners in a clean, neat way. (Hopefully)
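The general shape I mean by "structured", as a hedged sketch of a topic-to-handler registry; the names here are made up for illustration, not the repo's actual API:

#include <functional>
#include <map>
#include <string>

using Handler = std::function<void(const std::string& payload)>;
std::map<std::string, Handler> processors; // subscribed topic -> processor

void on(const std::string& topic, Handler h) { processors[topic] = h; }

void dispatch(const std::string& topic, const std::string& payload) {
    auto it = processors.find(topic);
    if (it != processors.end())
        it->second(payload);  // run the matching processor
}

// Usage: on("sensors/temp", [](const std::string& p) { /* write p to the db */ });
-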
As always, IE and Edge holding back progress. Dying to use CSS variables but can't because of these 2 wank browsers. Once this is fully supported it should wipe out the need for pre-processors (IMHO). Not that I feel the need to use pre-processors anyway.
-
Programming is like religion.
- you are making somebody obey your will
- you have a limited amount of resources (churches, processors, RAM, holy items)
- when you torture it, usually it will comply, but not in the way you are expecting
- smashing a nail with a hammer is not a good idea (you can run out of resources again. Getting a new CPU is as costly as getting a new believer)
So, who's in for the new Church of Unhandled Exception?
-
Here is a weird fact I have been thinking about this evening:
The Helio X20 was the only mainstream ARM processor that had 10 CPU cores. It was first introduced in 2015, however no ARM processors with such a high core count have been used since..
Nowadays smartphone processors have `8` cores max 🤔🧐
I guess 8 cores is the reasonable limit for smartphones. Must have something to do with the cost-to-performance factor.
-
I tend to be OS agnostic, but I hate the way that Microsoft treats yesterday's hardware like garbage. I was an unfortunate owner of a Surface RT. I also currently own 3 machines with older i7 processors that are not supported by Windows 11.
-
The human brain (also animal brains, even ants') is incredibly complex. Each neuron is now supposedly its own processor. So a human brain is a complex network of billions of processors, not just threshold variables. This means that to simulate an organic brain sufficiently will take a huge computer system with billions of parallel processors. Now, I don't know if the sophistication of a computer processor is represented in each cell. So this may not be equivalent to billions of Pentium cores, for instance. However, it still presents a huge challenge for AI, as it exists now, to replicate. My thoughts are that silicon-based AI will take a different approach that leverages how computers work. My guess is that current neural net models are not a good match for this unknown AI. Will it inherently exhibit pattern matching like an organic brain? Or will it be a different kind of consciousness altogether? Will we even realize it is self-aware? Will my Roomba plan to kill my pet for my attention? What are some other models being employed in AI research?
-
I'm sure y'all have heard about Apple fucking with their phones' processors.
I have an iPhone, and I'm totally OK with it. The Apple community is complaining about it. The fact is that it's Apple's choice for the UX, and it's not a problem.
-
Why do developers prefer macOS or Linux over Windows, even though Windows can run almost all programs? Windows can provide great speed with newer processors and SSDs.
I find Windows to be more interactive and simple to use. What do you guys think?
-
Do any of you have the compulsion to micro-optimize every bit of code that you write? How do you deal with it?
I'm not just talking about algorithmic optimizations, but the real nitty gritty stuff. I'm talking about using bit fiddling to avoid if statements where speculative processors might make mispredictions. Anything that might make a program compile to fewer machine instructions or avoid extra stack frame overhead.
This all started a year ago when I took a systems programming course at my university, and started learning C and C++. But I find myself doing this in the wrong places. Who cares if this trivial program that I wrote runs in 1.2 or 0.6 seconds? My future employers won't care if my code is 10% more efficient when it takes four times as long to write.
It's gotten to the point that I can't bring myself to use languages like Python because I don't know how they're implemented under the hood and can't predict how the different ways I could write a function will affect performance. How do I bring myself to trust that the compilers (or interpreters), and the programmers that wrote them, will be sufficiently optimal, and just move on? 😩
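For the record, the kind of trick I mean: branchless min via bit fiddling, straight out of the classic Bit Twiddling Hacks collection:

#include <cstdio>

// -(x < y) is all-ones when x < y, so the XOR mask selects x;
// otherwise the mask is 0 and y survives. No branch to mispredict,
// and usually no win either, since compilers already emit cmov here.
int bmin(int x, int y) { return y ^ ((x ^ y) & -(int)(x < y)); }

int main() { std::printf("%d\n", bmin(3, 7)); } // prints 3
-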
The Surface RT failed because of the lack of apps available. At least that’s what I heard.
Why didn't Microsoft make an x86 compatibility emulator like Apple did when they were moving away from the PowerPC architecture?
Sure, x86 apps would be slower, but if they distributed the ARM version of Windows as well, and made it available for the Raspberry Pi and all sorts of devices, I feel that would be a huge drive for ARM-based processors.
The DirectX, Windows Forms, etc. libraries could be recompiled by Microsoft, which would make graphically intensive programs run faster too. Did Microsoft just not think of a compatibility layer? Or is there some obvious reason I'm missing?
-
https://remotelyawesomejobs.com/job...
Looking at the "ideally you should" sections doesn't make me feel this is a junior position:
You are passionate about making data useful to the lay-person, there is no data you couldn’t derive visual meaning from. D3.js is your go to and Canvas is your friend.
You build components in your sleep, rock solid and performant using atomic design patterns.
You bend CSS processors to your will or throw them out and code by hand.
-
I finally almost have enough money to buy myself a MacBook (to get into iOS development) but now I know if I buy one it will be outdated as soon as the ARM processors come out...
So I guess I'll keep my trusty Ubuntu laptop for the next year or two.
-
!Rant
Hey guys, who's using those new Intel U-series processors? What are your thoughts on them? What kind of OS do you use and what work do you do with them? Are they reliable when using Docker or VMs?
I'm asking cuz most reviews show that they are really bad compared to even older processors. I know that they are only dual core and use less power, but are they reliable for dev jobs?
-
I have this instructor at the moment, and I've had this instructor before, but this semester is almost intolerable because of him. He is good with processors and knows the history of how computers came to be pretty well, mostly because he lived through it, but for the 2nd year in a row he is teaching how to create games. This class is mandatory. We are creating games using HTML5 and JavaScript. He refuses to give any game engine a chance. He gives inconsistent grades (i.e. we did everything right but got 17/30), so we go to his office, sit there for about 45 minutes watching him struggle to operate a computer and nitpick our code. He asks us what certain things in our code do, but not as a teacher-student questionnaire; he just plain doesn't know what any of it does. Then after the shenanigans, you see your grade updated a few days later and he gives you maybe 5 points back, so you go back until you get the grade you deserve. It's a mess. This is my last semester with him and I've mapped out my last year at the uni to make sure I DON'T take any classes with him.
-
I have a nice laptop already. Yet I see this and want it. It is about what I paid for my current lappy. The nice part is more RAM, more cores, and a 2060 gfx. Right now I am running a 1650 gfx.
I just don't have a "good" reason to get this. I am glad lappy prices have finally gotten near normal again.
Anybody have experience with the newer AMD processors? Worth it?
https://amazon.com/ASUS-IPS-Type-Ge...
-
TIL vanilla Java has the facilities to do (some form of) arbitrary compile-time metaprogramming via annotation processors.
-
So, some data needed to be prepared during the summer, and the various departments' elected data processors were given access to a Google spreadsheet they will need to fill with some basic data IT needs. Simple, straightforward data entry, with nothing private nor confidential. Just another divide-and-conquer-style large amount of data to enter & organise, that's all.
Today, I received a new comment notification as the owner of the spreadsheet. You can imagine my surprise when I saw that, for some f*cked up reason, one of the guys just wrote the super-admin username & pw for one of the main data systems we use in a freaking comment in the spreadsheet... WTF...
Oh, and also, juuust in case, he also wrote the pin code that is normally required to pass through the device-check when you log-in as a super-admin from an unknown device and/or location.
Fortunately I could catch it in time, but this just ruined half of my day.
I am supposedly on freaking annual leave. Ha Ha. Ha.
-
Why do we still use floating-point numbers? Why not use fixed-point?
Floating-point has precision errors, and each language seems to show a different level of error despite all running on the same processor (mostly down to how each language rounds and prints results, since the hardware math is IEEE 754 everywhere).
Fixed-point numbers don't have precision issues (unless you get way too big, but then you have another problem), and while they might be a bit slower, I don't think there is enough of a difference in speed to justify the (imho) stupid, continued use of floating-point numbers.
Did you know some (low power) processors don't have a floating-point unit? That effectively makes it pointless to use floating-point; it offers no advantage over fixed-point.
Please, use a type like Decimal, or suggest that your language of choice adds support for it, if it doesn't yet.
There's no need to suffer from floating-point accuracy issues.
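The fixed-point idea in its simplest form, as a C++ sketch: store money as integer cents, and the classic 0.1 + 0.2 surprise disappears.

#include <cstdint>
#include <cstdio>

int main() {
    std::int64_t a = 10, b = 20;   // 0.10 and 0.20, kept as cents
    std::int64_t sum = a + b;      // exactly 30 cents, no rounding anywhere
    std::printf("fixed: %lld.%02lld\n",
                (long long)(sum / 100), (long long)(sum % 100)); // 0.30
    std::printf("float: %.17f\n", 0.1 + 0.2); // 0.30000000000000004
}
-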
There seem to be a lot of 10-series laptops with 6th gen processors, but not many with 7th gen processors. It'd be nice to have refreshed hardware in one machine.
-
Surely to God there is a way to write simple code on an Android 10 phone without a computer. My Moto G7 Super has 3GB RAM and an 8-core processor.
The UI will suck but shoot me already as I can't use a computer right now. The major problem is file access as the languages I have used are run in the cloud.
Any advice is welcome. At this point I am agnostic re language.
Any suggestions?