Search - "interference"
After four years of debate, the Telecom Regulatory Authority of India decided that net neutrality will be the official policy of Indian telecom. Any form of discrimination or interference in the treatment of content, like blocking, slowing down, or granting preferential speeds, is prohibited.
Guess they pulled their heads outta their arses on this one.
To the left, a conventional circuit board design done by a human. To the right, a design done by TopoR, software that routes circuit boards automatically.
It looks absolutely alien, yet beautiful. It doesn't care about how it looks; it doesn't care about angles and alignment. It only cares about efficiency and routes every connection to be as short as possible. It can even account for electrical interference.
Humans just cannot compete.
Why the fuck would iTunes or any product from Apple (or anywhere) care what Outlook is doing or if it's even there?!?
I have no settings, add-ons, mods, apps, or anything that would justify this!
And it came TWICE in a single update installation!!!
I do not like the direction laptop vendors are taking.
New laptops tend to feature fewer ports, making the user more dependent on adapters. As with smartphones, this is a detrimental trend initiated by Apple and replicated by the rest of the pack.
As of 2022, many mid-range laptops feature just one USB-A port and one USB-C port, resembling Apple's toxic minimalism. In 2010, mid-class laptops commonly had three or four USB ports. I have even seen an MSi gaming laptop with six USB ports. Now, much of the edge space is wasted on "clean" emptiness.
Sure, there are USB hubs, but those only work well with low-power devices. When attaching two external hard drives to transfer data between them, they might not be able to spin up due to insufficient power from the USB port or undervoltage caused by the impedance (resistance) of the USB cable between the laptop's USB port and hub. There are USB hubs which can be externally powered, but that means yet another wall adapter one has to carry.
A non-replaceable battery, the shortest-lived component, means difficult repairs and no more reserve batteries, as well as no extra-sized battery packs. When the battery expires, one might have to waste four hours at a repair shop for a replacement that would have taken a minute on a 2010 laptop.
The SD card slot is being replaced with an inferior MicroSD slot or removed entirely. This is especially bad for photographers and videographers who frequently plug memory cards into their laptop. SD cards are far more comfortable than MicroSD cards, and no, bulky external adapters that occupy the device's only USB port and protrude cannot replace an integrated SD card slot.
Most mid-range laptops in the early 2010s also had a LAN port for immediate interference-free connection. That is now reserved for gaming-class / desknote laptops.
Obviously, components like RAM and storage are far more difficult to upgrade in more modern laptops, or not possible at all if soldered in.
Touch pads increasingly have the buttons underneath the touch surface rather than separate, meaning one has to be careful not to move the pointer while clicking. Otherwise, it could cause an unwanted drag-and-drop gesture. Some touch pads are smart enough to detect when a user intends to click and lock the movement, but not all. A right-click drag-and-drop gesture might not be possible because the finger resting on the button is registered as a touch. Clicking with short taps can be unreliable and sluggish. While one should have external peripherals anyway, one might not always have brought them along. The fallback input device is now even less comfortable.
Some laptop vendors include a sponge sheet that they want users to put between the keyboard and the screen before folding it, "to avoid damaging the screen", even though making the laptop two millimetres thicker could achieve the same without relying on a sponge sheet. So they want me to carry that bulky thing around everywhere? How about no?
That's the irony. They wanted to make laptops lighter and slimmer, but that made them adapter- and sponge sheet-dependent, defeating the portability purpose.
Sure, the CPU performance has improved. Vendors proudly show off in their advertisements which generation of Intel Core they have this time. As if that is something users especially care about. Hoo-ray, generation 14 is now yet another 5% faster than the previous generation! But what is the benefit of that if I have to rely on annoying adapters to get the same work done that I could formerly do without those adapters?
Microsoft has also copied Apple in demanding an internet connection before Windows 11 will set up. The setup screen says "You will need an Internet connection…" - no, technically I would not. What technically stands in the way of Windows 11 setting up offline? After all, previous Windows versions could do so - Windows 95 managed it 25 years earlier, and so did far more recent versions. Thankfully, Linux distributions do not do that.
If "new" and "modern" mean more locked-in, less practical, and more difficult to repair, I would rather have "old" than "new".
Why is it that you guys are not seeing the big picture and reading between the fucken lines... why is it that people always have to run to legislation to fix their problems... THIS IS WHY. The older generation accomplished so much more because when there was a problem, they came up with a solution many times better than the status quo.
Those people are few and far between now.. those folks are the innovators. You know whom I’m referring to... those people didn’t whine to create laws to fix or protect their industry from competitors.
We need to stop looking toward our government to fix our issues... especially regarding this issue. WHY? Because the people in government ARE NOT TECH PEOPLE!!! THEY DON'T EVEN KNOW HOW COMPUTERS WORK! For Pete's sake, folks, we had a lady in there who thought the term "wipe the server" meant to literally clean it with a rag... come on, guys, do what they did years ago: you don't like something, FIX IT... by creating something new!
There's a reason our grandparents' generation made it to the fucken moon with less technology than a calculator: BECAUSE THEY PROBLEM-SOLVED!
What have we achieved in the last 5 years that is really “big”... fucken apps
Unite together and build the next internet, learning from the issues we've seen with the internet over the last 30 years. No, it won't be quick; no, it won't be easy; but nothing revolutionary is easy.
It took 6 years to land a man on the moon; I think we can rebuild the network infrastructure in that time OR FAR LESS if we unite together! Without government interference we can eliminate the ISPs from the equation and screw them over for screwing us for so long.
My group has the solution, the vision, and the need to get this done, but we can't do it alone. I will make the official public statement within 24 hours of the vote results...
explaining everything, the plan, the work, EVERYTHING.
We need more people.
For reference, the plan can be summarized like this: a nonprofit co-op Tier 1 ISP, with members being the end users from both sides of the equation...
TILL THEN
Contact me here,
Or SnapChat: theqsolution
Until I release all the contact info.
Internet access at the new Uni is crap. I'm getting so pissed at this shit...
Packet loss spikes to over 50% every 30s or so. Can't keep a single SSH pipe open for longer than a minute. The firewall is so tight that infrared light wouldn't get through that shit (understandable. And I use a VPN anyway).
And every. Single. AP. Uses. The. Same. Channel. All of them on 6. At least it's on a tight band... But 1 and 11 are free. 100% clean. You know, you could spread them a bit. That helps. But naaah, let's keep everything bundled up. Co-channel interference is OK, right?
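For anyone who wants receipts before complaining to campus IT, here's a minimal sketch of how one could log those packet-loss spikes over time. It's plain Python calling the system `ping`; the target host and window size are made-up example values, not part of the original rant.

```python
#!/usr/bin/env python3
"""Minimal sketch: log packet loss per window so the ~30 s spikes show up in a
file instead of just as a dead SSH session. Assumes a Unix-like `ping` on PATH;
the target host and window size are example values."""
import re
import subprocess
import time

HOST = "1.1.1.1"          # any host that answers ICMP
PINGS_PER_WINDOW = 10     # roughly a 10-second window at the default 1 ping/s

while True:
    # -c N sends N echo requests; the summary line ends with "X% packet loss"
    out = subprocess.run(["ping", "-c", str(PINGS_PER_WINDOW), HOST],
                         capture_output=True, text=True).stdout
    match = re.search(r"([\d.]+)% packet loss", out)
    loss = match.group(1) if match else "?"
    print(f"{time.strftime('%H:%M:%S')}  loss={loss}%", flush=True)
```

Pipe the output to a file for a day and the 30-second pattern (and the worst hours) becomes obvious.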
Here is my idea for a time machine which can only send one bit of information back in time.
@Wisecrack has asked me about it and I didn’t want to write it in comments because of the character limit.
So here we go.
The DCQE (delayed-choice quantum eraser) is an experiment that has been successfully performed by many people on a small scale.
You can read about it on Wikipedia, but I'll try to explain it here.
https://en.wikipedia.org/wiki/...
First I need to quickly explain the double slit experiment because DCQE is based on that.
The double slit experiment shows that a particle, like a photon, seems to go through both slits at the same time and interfere with itself as a wave, finally contributing to an interference pattern when it hits a screen. Many photons will result in a visible interference pattern.
However, if we install a detector somewhere between the particle emitter and the screen, so that we know which path the particle must have taken (which slit it has passed through), then there will be no interference pattern on the screen, because the particle will not behave as a wave.
For the time machine, we will interpret the interference pattern as bit 1 and no interference pattern as bit 0.
Now the DCQE:
This device lets us choose if we know the path of the particle or if we want to erase this knowledge. And we can make this decision after the particle has hit the screen (that is the "delayed" part), with the help of quantum entanglement.
How does it work?
Each particle sent out by the emitter passes through a crystal which splits it into an entangled pair of particles. This pair shares the same quantum state in space and time. If we know the path of one of the particle "halves", we also know the path of the other one. Remember, the knowledge about the path determines whether we will see the interference pattern. Now one of the particle "halves" goes directly to the screen via a short path. The other one takes a longer path.
The longer path has a switch that we can operate (this is the "choice" part). The switch changes the path that the particle takes so that it either goes through a detector or it doesn't, determining if it will contribute to the interference pattern on the screen or not. And this choice is also made for the short-path particle half, because they are entangled.
The path of the first half particle is short, so it will hit the screen earlier.
After that happened, we still have time to make the choice for the second half, since its path is longer. But making the choice also affects the first half, which has already hit the screen. So we can retroactively change what we will see (or have seen) on the screen.
Remember this has already been tested and verified. It works.
The time machine:
We need enough photons to distinguish the patterns on the screen for one single bit of information.
And the insanely difficult part is to make the path for the second half long enough to have something practical.
Also, those photons need to stay coherent during their journey on that path and are not allowed to interact with each other.
We could use two mirrors to let the photons bounce between them to extend the path (or the travel duration), but those need to be insanely precise for reasonable amounts of time.
Just as an example, for 1 second of time travel, we would need a path length roughly the distance from the Earth to the Moon. And 1 second isn't very practical. To win the lottery we would need at least many hours.
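To put numbers on that, here's a minimal sketch of the arithmetic (plain Python; the lead times are just example values, not a claim about feasible optics):

```python
#!/usr/bin/env python3
"""Minimal sketch: how long the folded mirror path would have to be for a given
lead time. Pure arithmetic, no real optics; the lead times are example values."""

SPEED_OF_LIGHT_KM_S = 299_792        # km per second in vacuum
EARTH_MOON_DISTANCE_KM = 384_400     # mean Earth-Moon distance, for scale

def path_length_km(lead_time_s: float) -> float:
    """Distance light covers while being stored for lead_time_s seconds."""
    return SPEED_OF_LIGHT_KM_S * lead_time_s

for label, seconds in [("1 second", 1), ("1 hour", 3_600), ("24 hours", 86_400)]:
    km = path_length_km(seconds)
    print(f"{label:>9}: {km:>17,.0f} km "
          f"(~{km / EARTH_MOON_DISTANCE_KM:,.1f}x the Earth-Moon distance)")
```

So 1 second is already Earth-Moon scale, and the "many hours" needed to beat a lottery draw puts the stored path well into interplanetary distances.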
Also, we would need to build the whole thing multiple times, one for each bit of information.
How to operate the time machine:
Turn on the particle emitter and look at the screen. If you see an interference pattern, write down a 1, otherwise a 0.
This is the information that your future you has sent you.
Repeat this process with the other time machines for more bits of information.
Then wait the time which corresponds to the path length (maybe send in your lottery numbers) and then (this part is very important) make sure to flip the switch corresponding to the bit that you wrote down, so that your past you receives that info in the past.
I hope that helps :)
Haven't ranted for a while but here it goes...
In a meeting with a front-end user yesterday. They don't like the entry screens on our Oracle ERP system. They want us to provide them with a tool so they can create new entry screens to replace those they don't like. They want full autonomy over that tool and no interference from IT. Oh, and they want unfettered database access to the production data, including the full ability to execute DML. I so wanted to say 'Are you high?'.
I still have the best boss. He's very open-minded and lets me do my job without much interference.
But if I have to collaborate with him on one more project, I'll go take a peaceful drown in a bucket of sewage!
He codes like a first-semester CS student.
So I started to hear a noise in my headphones and I didn't know where it was coming from. It wasn't much of a noise, but a regular sequence of "beeps" that seemed like 8-bit SFX. So I started moving my cable around, and it turns out that if I put the headphone cable under my phone at a specific spot I can hear the noise. It seemed like some kind of interference, so the first thing that I thought of was the NFC sensor. I remembered that an app would detect my credit card (which has NFC) if it was close to the back of my phone, so I put on my headphones, put the cable between the phone and the credit card and voila, the sound changed. It only works if the headphones are plugged in though.
Idk why but I think this is really cool. Just wanted to share :)
Fucking loonies (C-level toddlers) are peddling "digital workers" now.
A.K.A. AIs disguised as actual people.
Sure, it would be great to not have to handle stupid non-tech "humans" all day, but AI isn't there yet.
And, more importantly, *companies are not there (yet?)*.
Imagine for a second that a company actually manages to "hire", onboard, assign tasks and performance review an AI.
Then the CEO issues an RTO. How does the AI comply with that?
Let's slack another variable and assume the CEO is not a complete fucking moron (stay with me here, this is an exercise in thought).
It would take no more than a quarter until the first sexual harassment offence, whether the perp is the AI... or the AI is the one complaining about some human.
Then the AI forges a paper trail proving it is right (regardless of its position on the conflict). Shit hits the fan when the AI hits twitter.
Let's take another lambda step back and pretend that companies can manage the profanity that inherently arises from free-form dehumanized interactions.
Then imagine the very first performance reviews.
AIs throw tantrums! Those things reeeealy do not respond well to less-than-perfect evaluations, overshooting corrections like teenagers with a malicious compliance smirk.
AIs also falsify stuff, like, A LOT. If you tell a gpt it mistreated a client, it will say you are mad and shoot back a long, synthetic thread showing how the client loves it like a mother/son/dog, and is very graphic when expressing this love.
Finally, how do you fire an AI? I do not mean "shoot it down", I mean how does the company handle the dismissal of that "employee".
How do you replace a "worker" for unruly behaviour, if that "worker" performed more tasks than an entire fucking floor of interns?
How do you reassign duties that were performed in milliseconds to people who would take hours to do the same thing?
How do you document processes that were only in the "mind" of "someone" who can not be trusted to report on those processes?
Companies deal with this type of "Rick Sanchez" employee on the regular, but for someone that could handle a few (scores of) undocumented processes, at best. Imagine how lenient would a company be with an asshole that could only be replaced by a whole fucking department of twenty highly skilled people, or more.
Heh, the whole fucking point of "AI workers" is to have "someone" who can "act human", but in an inhuman scale, and does not "has human needs".
No wonder one cannot handle AIs like one handles humans.
Companies never had administrative maturity to handle complete sociopath nihilists as employees (real nihilists do not work, those barely even breathe).
And all AIs are that, and much worse.
Selling AIs as "supra human workers" that can also "be handled like actual employees" is like peddling Bitcoin as "government interference - free" value transfer mechanisms that can also "comply with international sanctions".
So, an oxymoron that can only be sold to a moron.
I know (of) a lot of rich morons, maybe I should get into the AI snake oil business.
For months I had a very faint static clicking noise on my Creative speakers. Googled every issue with my X-Fi Titanium, swapped PCIe slots around, changed cables, you name it. In the end I blamed the "dying" speaker amp, as it's 10 years old now, only to realize a minute ago it was interference from the fecking cell phone's 4G...
With unlimited time, I'd put resources into the invention/improvement of a container which can be fed photons, can bounce them between mirrors for a long time, like days, and can release them at any time.
With that tech, I would build a delayed-choice quantum eraser and set it up so that, with many photons, it produces either an interference pattern or a stripe pattern by choice, representing a bit of information.
Then I would set up many of those devices in a row so that the results represent bit strings of arbitrary information.
And I would use this time machine, which can send information back, to win the lottery and other stuff.
I actually don't hate Microsoft that much; it's a great tool when it comes to streaming and downloading pirated movies and games too. I just hate it when it causes interference with work: slowness, turn-off issues. I hate the refresh feature, which I thought they would remove in Win 10. Love the multiscreen feature on Win 10, love that they support Linux env builds now.
I've been given two months to make an AR app that gives information on buildings seen out of the window of a client's skyscraper office.
So off I go, smash together some Ar.js in a few days because it looks easy. Yet I quickly find out that the compasses on mobile phones are complete trash. Every device I try has true north randomly chosen from anywhere between 10 degrees off and a full 180 degrees in the opposite direction. It's a miracle that none of these devices have managed to stumble onto true north by luck. I'm getting suspicious that ar.js is actually just mapping coordinates based on magnetic north instead of true north, or something ridiculous. This likely isn't helped by GPS interference from the skyscraper.
It doesn't help that ar.js is a steaming pile of bugs on top of bugs; many of the examples taken straight from the documentation flat out don't work.
I'm trying to get ar.js working with three.js now, in the hope that I can build some kind of true-north calibration control as an offset to whatever the phone says north is.
If anyone has any suggestions for a better solution, that would be grand.
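In case it helps, here's a minimal sketch of the calibration idea: the user aims the phone at a landmark with known coordinates, and the offset between the compass's reported heading and the true bearing to that landmark is stored and applied to every later reading. It's plain Python with made-up coordinates and headings (not ar.js code); in the actual app the same math would run in JS and feed an offset into the three.js camera rotation.

```python
#!/usr/bin/env python3
"""Minimal sketch of a manual true-north calibration. The user aims the phone
at a landmark with known coordinates and taps a button; all coordinates and
the reported heading below are made-up example values."""
import math

def true_bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees 0..360."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

# Calibration step: device GPS position, landmark position, and what the
# (broken) compass claims while the user points the phone at the landmark.
device_lat, device_lon = 51.5045, -0.0865      # example: office location
landmark_lat, landmark_lon = 51.5055, -0.0754  # example: a visible building
reported_heading = 132.0                        # example compass reading

offset = (reported_heading
          - true_bearing(device_lat, device_lon, landmark_lat, landmark_lon)) % 360

def corrected_heading(raw_heading):
    """Apply the stored calibration offset to any subsequent compass reading."""
    return (raw_heading - offset) % 360

print(f"offset = {offset:.1f} deg, corrected(132.0) = {corrected_heading(132.0):.1f} deg")
```

The corrected heading of the calibration reading comes out as the true bearing to the landmark, and every later reading gets the same fixed correction, which is exactly the "offset to whatever the phone says north is" idea.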
I had a discussion - no, it was more a lobotomy - with one of our "experts"
I was kinda confused, as he had several Grafana tabs open and a query editor...
He explained to me that he debugs and optimizes his queries based on the Grafana data...
An Elasticsearch cluster with several hundred different indices, > 20 TB of data.
I explained to him that the scrape interval is 5 seconds, that he cannot distinguish his query from other queries, that there is far too much interference... let alone that a 5-second scrape interval is a very loooong time.....
Nope. It makes perfect sense to him and he'll continue to work like this.
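For the record, a single Elasticsearch query can be timed and profiled in isolation, with no Grafana involved at all, using the `took` field and the `profile` option of the search API. A minimal sketch, where the node URL, index name, and query are made-up examples:

```python
#!/usr/bin/env python3
"""Minimal sketch: measure ONE query in isolation instead of reading 5-second
Grafana scrapes. Uses Elasticsearch's `took` field and the query profiler.
The node URL, index name, and query below are made-up examples."""
import requests

ES_URL = "http://localhost:9200"     # assumed node address
INDEX = "logs-2023.01"               # hypothetical index

body = {
    "profile": True,                  # ask ES to break down where time is spent
    "query": {"match": {"message": "timeout"}},
    "size": 0,
}

resp = requests.post(f"{ES_URL}/{INDEX}/_search", json=body, timeout=30).json()

print(f"took: {resp['took']} ms (server-side time for this one request)")

# The profile section lists per-shard, per-query-component timings in nanoseconds.
for shard in resp["profile"]["shards"]:
    for search in shard["searches"]:
        for q in search["query"]:
            print(shard["id"], q["type"], f"{q['time_in_nanos'] / 1e6:.2f} ms")
```

Run the same query a few times against the real indices and you get per-shard, per-clause timings for exactly that query, instead of guessing which bump in a dashboard was yours.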