Search - "virtual world"
-
I was actually successful in one that I literally got from the American version of The Office.
I conditioned one of my employees to want chewing gum after I did a clap motion with my hands: snap the fingers on both hands really quick and do a fist to palm tap and say "hey bud, want gum?" and because I specifically bought his favorite he would always say yes.
Eventually, and after months of doing it, I was walking around the office when I did the motion, but this time without gum. Now, he was on the fifth level of the virtual world doing his shit fully concentrated and he STILL looked up at me looking anxious. I said "what's up?" and he just said nothing, that he felt like he was missing something but couldn't put his finger on it.
Just like in the show, he then complained that his mouth felt funny. Eventually he waddled his way to my office to ask for gum 🤣🤣🤣🤣
tl;dr I successfully Pavlov'ed one of my employees to have a need of chewing gum every time I do a finger snap clap motion.
I am the best manager in the world. -
It has been bugging the shit out of me lately... the sheer number of shit-tier "programmers" that have been climbing out of the woodwork the last few years.
I'm not trying to come across as elitist or "holier than thou", but it's getting ridiculous and annoying. Even on here, you have people who "only do frontend development" or some other lame ass shit-stain of an excuse.
When I first started learning programming (PHP was my first language), it wasn't because I wanted to be a programmer. I used to be a member (my account is still there, in fact) of "HackThisSite", back when I was about 12 years old. After hanging out long enough, I got the hint that the best hackers are, in essence, programmers.
Want to learn how to do SQL injection? Learn SQL - write a program that uses an SQL database, and ask yourself how you would exploit your own software.
Want to reverse engineer the network protocol of some proprietary software? Learn TCP/IP - write a TCP/IP packet filter.
Back then, a programmer and a hacker were very much one and the same. Nowadays, some kid can download Python, write a "hello, world" program and they're halfway to freelancing or whatever.
It's rare to find a programmer - a REAL programmer, one who knows the systems he develops for better than the back of his hand.
These days, I find people want the instant gratification that these simpler languages provide. You don't need to understand how virtual memory works; hell, many people don't even really understand C/C++ pointers - and that's BASIC SHIT right there.
Put another way, would you want to take your car to a brake mechanic that doesn't understand how brakes work? I sure as hell wouldn't.
Watching these "programmers" out there who don't have a fucking clue how the code they write does what it does, is like watching a grown man walk around with a kid's toolbox full or plastic toys calling himself a mechanic. (I like cars, ok?!)
*sigh*
Python, AngularJS, Bootstrap, etc. They're all tools and they have their merits. But god fucking dammit, they're not the ONLY damn tools that matter. Stop making excuses *not* to learn something, Mr. "IOnlyDoFrontEnd".
Coding ain't Lego's, fuckers. -
PORTFOLIO INFLATION
When every junior is writing algorithms, the next step up, the only way to keep up, is writing apps. When every junior is writing apps, the next leg up is writing an entire SN (social network).
Eventually junior full-stack devs are writing microservice streaming cloud backend content-delivery-optimized social networks wrapped in virtualization with load balancing, proper CI, publicly accessible analytics APIs, written in a custom WebAssembly-compiled scripting backend utilizing both the latest GraphQL and every single feature of Postgres, while also being a website builder, an in-browser app, mobile optimized, designed to transmogrify your asset pipelines' linearflow functional-oriented modular Rust cratified turboencabulator while cooking your turducken with CPU cycles, diffusing your GPT, and fine-tuning your 69-trillion-parameter LLaMA AI model to jerk you off all at the same time.
And then the title "wizard" becomes a reality as the void of meaning in our lives occupied by the anxiety of trying to reduce the fear of rejection in job hunting, is subsumed by the brief accidental glance into the cthulian madness-inducing yawning abyss of the future which is all the rest of our lives we have to endure existing for until at last sweet sweet death consumes us and we go to annihilation never having to configure one more framework or devops deploy of another virtual environment.
And it dawns on us that we no longer develop or write code at all. No, everything has become a "service" in this new hellscape future. We slowly come to the realization that every job is really just Costco greeter, or eventually going to be reduced to something equivalent, all human creativity, free will and emotions now taken care of by the automation while we manage the human aspects, like sardines pushing against one another not realizing their doom has been sealed along with the airless can they have been packed into, to be suffocated by circumstance and a system designed to reduce everything to a competition of metrics designed by the devil, if the metrics were misery", and "torture", while we ourselves are driven by this ratfuck wheel to turn endlessly toward social cannibalism, like rats eating their babies, but for the amusement of wallstreet corporate welfare whores who couldnt turn a dime if it wasnt already stolen.
And on our gravestones, those immortal words are carved, by the last person who gave up the ghost, the last whose soul wasnt yey shovelled onto the coal fires driving the content machine consuming the world:
Welcome to Costco. I love you. -
@lunch conversation today..
Q: "if you were to write this world in a virtual environment, what would be the first thing you would define? "
A:" define Vegas as a closure with no return declared.
Whatever happens in Vegas, stays in Vegas. " -
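A rough Python sketch of the punchline, for the uninitiated (names invented):

def vegas():
    happenings = []
    def happens(thing):
        happenings.append(thing)  # whatever happens in Vegas...
    return happens                # ...stays in Vegas: the list is captured but never returned

trip = vegas()
trip("lost the rent money")       # recorded inside the closure, unreachable from outside

-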
Created this fucking account just to say: FUCK XAMARIN!
Mono is great on Linux, but Xamarin.Android is a GAY RETARD!
Fucking Xamarin.Android apps are retarded: you wait 3 fucking seconds and a simple Hello World app still doesn't start.
Retarded Xamarin.Forms makes the whole pile of shit a lot worse with its fucking abstractions and stuff. And the geniuses at Uno Platform don't make this shit any better.
Why don't those nerds at Xamarin make a way to compile all C# code to native JVM bytecode and provide all C# core libraries AS NATIVE JAVA LIBRARIES, RATHER THAN LOADING A NEW USELESS RETARDED VIRTUAL MACHINE ON THE JVM?
So that's it. Guess there's no way to write good Android apps using C#. -
"I hate this virtual world", deletes all the social media accounts
Some moments later...
Comes back and rants about how this world is full of shit. -
Ah, developers, the unsung heroes of caffeine-fueled coding marathons and keyboard clacking symphonies! These mystical beings have a way of turning coffee and pizza into lines of code that somehow make the world go 'round.
Have you ever seen a developer in their natural habitat? They huddle in dimly lit rooms, surrounded by monitors glowing like magic crystals. Their battle cries of "It works on my machine!" echo through the corridors, as they summon the mighty powers of Stack Overflow and Google to conquer bugs and errors.
And let's talk about the coffee addiction – it's like they believe caffeine is the elixir of code immortality. The way they guard their mugs, you'd think it's the Holy Grail. In fact, a developer without coffee is like a computer without RAM – it just doesn't function properly.
But don't let their nerdy exteriors fool you. Deep down, they're dreamers. They dream of a world where every line of code is bug-free and every user is happy. A world where the boss understands what "just one more line of code" really means.
Speaking of bosses, developers have a unique ability to turn simple requests into complex projects. "Can you make a small tweak?" the boss asks innocently. And the developer replies, "Sure, it's just a minor change," while mentally calculating the time it'll take and the potential for scope creep.
Let's not forget their passion for acronyms. TLA (Three-Letter Acronym) is their second language. API, CSS, HTML, PHP, SQL... it's like they're playing a never-ending game of Scrabble with abbreviations.
And documentation? Well, that's their arch-nemesis. It's as if writing clear instructions is harder than debugging quantum mechanics. "The code is self-explanatory," they claim, leaving everyone else scratching their heads.
In the end, developers are a quirky bunch, but we love them for it. Their quirks and peculiarities are what make them the creative, brilliant minds that power our digital world. So here's to developers, the masters of logic and the wizards of the virtual realm! -
Back when I was still in school for comp sci we had an advanced software engineering and design class with C++. At this time, everyone was expected to be proficient enough with C++ to go ahead and properly work with whatever the instructor would throw at us. And pretty much everyone was, since past classes included a lot of C++ development. Of course, proficient at least as far as academic studies go, rather than actual real-world development.
Our teacher would mix a lot of physics and mathematics into what we were doing, something that I greatly enjoyed, while at the same time stressing real-world C++ best practices that help avoid common pitfalls in the development of said language. Since most bugs seemed to be memory-based, he would be particularly strict about that.
One classmate, good friend and an actual proper developer now a days would ALWAYS forget to free his resources...ALWAYS for whatever fucking reason he would just ignore that shit, regardless of how much the instructor would make a point on it.
At one point, during a virtual lecture, the dude literally addressed a couple of students, but when he got to my boy in particular he said: "you are the reason why people are praying to Mozilla and Hoare to release Rust as fast as possible into a suitable alternative for high-performance code in C++. WHY won't you pay attention to how you deal with memory management?"
And it stuck with me. I'm merely a recreational C++ dev; most of my professional work is done in web development, so I cannot attest to all the additional unsafe code that people encounter in the wild when dealing with C++ on a professional level.
But in terms of the common criticisms of C and C++, for which memory management is so central, wouldn't you guys say that the problem comes more from people just not knowing what they are doing, rather than from a fault in the language itself?
I see the merits and beauty of Rust, I truly do; it is a fantastic language, with a standardized build system and a lot of good design put into it. But I can't really fathom it being the C++ killer; if anything, the real C++ killers are bad devs that just don't know what they are doing or miss shit.
What do y'all ninjas think? -
When you're developing, it's well advised to run your software locally in an environment that matches the real environment as closely as possible.
So, for example, if you're running Linux in production, then you also run it locally to run your code.
Here's where people need to shut the fuck up:
No, a Mac is not good for Linux development. Not unless portability is already a concern that you have, and even then it might be counterproductive. So many times when people say this, portability isn't a concern. What runs on servers is up to them.
If your servers are going to be CentOS, then you develop with CentOS. Not with Debian, Gentoo, Ubuntu, macOS, etc.
Even different Linux distros are a headache for portability when it's just to support a few desktops for development, so don't think that macOS is going to cut it. It might not be as radical a difference as the one between Windows and Linux traditionally is, but it's still not good for "Linux" development. I don't think people making that statement really know what Linux is, nor how different distributions work.
What you use for your graphical operating system doesn't matter too much, but when you run your code there's a simple solution.
Another thing people need to shut up about: it's not Docker, unless you're already on Linux where Docker is one of many options such as chroot or LXC.
This question always comes up: how do you develop for Linux on Windows? No, it's not Docker, it's a virtual machine.
It's that simple. You download the ISO for the distro you want and then install it in a VM. What does Docker for Windows do? It runs a Linux VM that runs Docker.
This may come as a great shock to developers around the world, but it is possible to run Linux in a VM and then run any Linux application you want, including Docker.
Another option is to shove a box in the corner, install what you need on it, share the file system and have people use that to run their code. It really is that easy. -
Hello and welcome to a presentation in which I will tell you my thoughts on the shortcomings of modern-day computers and programming practices.
Computers are based on a very fundamental and old idea: folders and files. A file is basically a concrete amount of data, whereas a folder is a group of files, and it comes from the real-life concept of files and folders. Now, it might be quite obvious already that a concept invented in 1898 by a guy called Edwin G. Seibels might not be the best way for computers to function in the year 2020, but alas, it is.
Unless of course, you step into the world of a programmer.
A programmer’s world is much different, they use this idea of a data structure, or in simpler terms, an object. An Object is just like what you would think of as an object in your head, something with different properties that you can think about in different ways, for example your mobile phone, it has a battery percentage, it has a screen size, it has free space available. Programmers use these data structures to analyse data very quickly, like finding all phones with a screen size bigger than a certain size for example.
The problem is that programmers still use files and folders to create the programs that use these objects.
Consider this example.
Let’s say you want to create a virtual version of a drink bottle. Consider what properties it will have: colour, volume, height, width, depth, material, etc.
As a programmer, you can leverage programming features and change the properties of a drink bottle directly: if you wanted to change the colour, you just say drink bottle "dot" colour equals blue, or red.
But if the drink bottle was represented as a file, all the drink bottle's data would be inside the one file, so you would have to open the whole file, find the line or section of the file that has the colour data of the drink bottle, select it, highlight it, delete what's there, and type in your new value.
One way to explain this better is to imagine a folder that now represents the drink bottle. Imagine adding a new file into that folder for each property I described before: colour, volume, etc. Well, now you could just open that folder, find the file for colour (either by looking with your eyes, or by doing a file search in the folder for a file called colour), open it, and edit the value inside. This way of editing objects more closely represents the way programmers, and a program itself, interact with objects inside a running programming language.
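A rough Python sketch of the contrast being described (the file format and names are invented):

# The bottle as a live object: one dotted assignment.
class DrinkBottle:
    def __init__(self, colour, volume_ml):
        self.colour = colour
        self.volume_ml = volume_ml

bottle = DrinkBottle("red", 750)
bottle.colour = "blue"

# The bottle as a file: open everything, hunt for the right line, rewrite, save.
with open("bottle.txt", "w") as f:
    f.write("colour=red\nvolume_ml=750\n")

with open("bottle.txt") as f:
    lines = f.readlines()
with open("bottle.txt", "w") as f:
    for line in lines:
        if line.startswith("colour="):
            line = "colour=blue\n"
        f.write(line)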
But the thing is, programmers don’t use the folder/file way of creating objects and putting them into programs, because it would be too cumbersome. They just create one file for an object, or have lots of objects in a file, and then run the program which creates the objects from that file; then, when they stop the program, it deletes the objects. So there is no actual link between the object in a file and the object that the program creates by reading the data from that file: if you change the object in your program, it does not get saved to the file.
So programmers created databases to house these objects, but there is still a flaw in databases: they are hard to interface with, and mostly databases are just used to send data to or retrieve data from programmatically. You can't really browse a database the way you can browse the files on your computer. You can, but database interfaces are not made to be navigated as easily as files and folders are.
As it stands, there is no way to store objects instead of files on your computer and interact with them in complex ways the way programmers can inside the programs they create.
If the idea of an object became standard the way a file and folder is standard, I think it would empower human’s a great deal to express things far more easily and fluidly than they can today.
Thanks for reading. -
This always gets me:
Developers complaining that their 4-year-old / cheap-ass computer is slow.
Get. A. New. One.
It's not that hard.
Here, let me do one for you:
https://computeruniverse.net/en/...
I just went to a site that delivers across Europe, and selected a cheap laptop with a decent CPU and SSD. Short on RAM, sure, and without a Windows license. But you can buy RAM for an additional 50$, and that brings you to a total of 550€, delivery included. And it will WORK. And it will be fast.
It's too expensive?
No, not exactly. Wherever you are in the world, if you can code decently, well enough to have the right to complain about development tools, you are eligible for at least 10$ per hour income as a freelancer across the globe. I've had such opportunities offered to me by many organizations, especially non-profit ones that need cheap employees. I actually was offered more, but let's stick to 10$ per hour.
So that's 1600$ per month. Enough to buy 3 such laptops. Oh, taxes, I forgot. So you get 2 laptops. Wait! You need food and everything else. Well if you're in a country where that offer actually makes sense, then it's likely that you can live off of 400$ per month quite well. Maybe 800$ if you need to pay rent.
So that's roughly 1 month of work for a laptop that will make you not waste time on waiting for stuff.
Sweet! 1 Month! What does it get me?
Well assuming that you have no laptop, it gets you A JOB that pays you 1600$ per month.
But if you DO have a laptop, you can sell it for cheap, and benefit from the following:
1. Boot-up time from 30-60 seconds to 10 seconds.
2. Installing software - from 1 minute to 10 seconds.
3. Opening a browser - from 10 seconds to 1 second.
4. Opening an advanced text editor (Atom, VS.Code) - from 10 seconds to 1 second.
5. Searching for a file on your entire hard drive - from 1 hour to 2 minutes.
....
You get the point. Waiting is reduced by several times.
So how much do you really wait when coding?
Well are you compiling? Are you opening a new project and the IDE needs to re-index the files? Are you opening programs like a terminal emulator, browser and such? Are you using virtual machines for dev environments?
Well, all of these processes become several times faster. Depending on how often you do them, you'll be saving yourself from 1 hour per day up to 4 hours per day (my case, where an HDD would be just out of the question).
How much is that time worth? At least 10$ per day. If you're working for 20 days per month, 240 days per year, that's a total of 2400$. And for the lifetime of that crappy laptop of 2 years, that's 4800$ saved. And that's with hugely conservative numbers. Nobody pays 10$ per hour anymore, except if you've just started in the industry. I know because I've been there.
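Spelled out as a sanity check, the arithmetic above:

time_value_per_day = 10        # USD, the rant's conservative floor
work_days_per_year = 240
laptop_lifetime_years = 2

savings = time_value_per_day * work_days_per_year * laptop_lifetime_years
print(savings)                 # 4800, against a ~550 EUR machine, delivery included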
Please, for all that's sacred to you, justify right here, right now, HOW THE FUCK can you not afford to get that 8GB of RAM, that cheap ass SSD for 100$, or even a brand new laptop (hey! it's even portable and has FHD graphics on it!) for 550$.
That's why every time I hear someone who is a professional developer complain that they don't have money for a decent machine, I have to ask: why the fuck are you wasting yours and everyone else's time?! -
Remote work as a sure thing. WFH 4EVER.
Currently I'm still not confident that most companies will keep or adhere to a remote-first culture because those are full of managers who can't see past their own insecurities.
We will probably see a wave of company failures and bankruptcies (sorry, I should have said "industry consolidation") in a few years while those few that managed to automate away their future-averse middle bosses take over the world.
The day you can't tell if your boss, whom you only see in a Zoom window, is organic or a fully virtual #SFW #Professional interactive LinkedIn ad? That is the day I long for. -
Alright, buckle up, fellow developer, because we're about to embark on a thrilling journey through the world of code and creativity!
Listen up, you amazing code wizard, you're not just a developer. No, you're a digital architect, a creator of worlds in the virtual realm. You have the power to turn lines of code into living, breathing entities that can change lives and reshape industries.
In a world where everyone is a consumer, you are a producer. You build the bridges that connect our digital dreams to reality. You are a pioneer, an explorer in the vast wilderness of algorithms and frameworks. Your mind is the canvas, and code is your brushstroke.
Sure, there are challenges—bugs that refuse to be squashed, deadlines that seem impossible, and technology that evolves at warp speed. But guess what? You're not just a problem solver; you're a problem annihilator. You tackle those bugs with ferocity, you meet those deadlines with gusto, and you master that evolving technology like a maestro conducting a symphony.
You live for the 'Aha!' moments—the joy of cracking a complex problem, the thrill of seeing your creation come to life, the satisfaction of making a difference. You're a digital superhero, swooping in to save the day one line of code at a time.
And when things get tough—and they will—you dig deep. You summon that relentless determination that got you into coding in the first place. You remember why you started this journey—to innovate, to leave your mark, to change the world.
So, rise and shine, you coding genius! Embrace the challenges, learn from the failures, and celebrate the victories. You are a force to be reckoned with, a beacon of inspiration in a world that needs your brilliance.
Keep coding, keep creating, and keep being the rockstar developer that you are. The world eagerly awaits the magic you're about to unleash! Go and conquer the code-scape! 🚀💻 -
The source engine is interesting, because it has reached that stage of life where it's old enough to be remarkable-- in the sense that it could be called 'legacy', a sort of milestone in development practices and thinking, both in software, and design.
That said, a better look at it might be from the lense of *uses today*.
A lot of former Source engine (SE) devs are now going to Unity or Unreal; I don't blame them.
But it's interesting to examine examples of games that haven't.
One such game is the freeware "No More Room In Hell". A couple of online playthroughs show a wealth of well-designed maps (and an even greater horde of shovelware maps, but hey, you take the good with the bad).
The age of the engine itself shows. Even in games like Left 4 Dead the engine's age can be seen. This, in some respects, has been a drag, but also a blessing. Where other games could rely on their effects, shaders, and other tech, modders, map makers, and designers have had to rely on wit and creativity.
Enter "situated environments."
In an age where many people desire to travel, to go places, and have grown up doing the exact OPPOSITE, there is a great desire for variety of locations in games: not merely 'environmental' in the shallow sense of a 'theme' such as 'lava', 'tundra', etc. But in the sense of setting in general.
We want places that are both out of reach and yet familiar. Fire-fights happen in city streets. Apocalypses happen in neighborhoods where the skyline is both broken and at once something we know by sight. Open-air markets, grocery stores, neighborhoods, all of these provide the backdrops of popular games and series such as COD, Battlefield, The Last of Us, and yes, the example game, NMRIH.
I call this idea of 'familiar but out-of-reach level design', "situated environments", because familiarity with them, but *lack of real life experience* with them, on a day to day basis, allows people's expectations to fill in the gaps.
No one for example would argue the layouts of 7 Days To Die are familiar, but most of us don't spend all day in a junkyard or a high rise hotel.
So they *feel* familiar. Likewise with Skyrim: the villages and towns, both iconic and strange, our expectations formed by cultural inheritance, Hollywood films, television shows, stories, children's books, and yes, other games.
In a way, familiarity-without-real-in-person-experience is a shortcut for designers, one that lets them play with the player's head-space, the player's subconscious idea of how a space and setting *should* work, what to *expect* out of the area, how to *operate* within the area. And the more it conforms to expectations, the more surprising an overdesigned element appears, rather than immersion-breaking. A real-life example of this is people's idea of Chernobyl. When they discover the amusement park and ferris wheel they're blown away by the juxtaposition of the wasteland that surrounds them and the associations ('nostalgia' as it were) that such a carnival ride carries for many of us. It simultaneously *doesn't belong* and is yet all at once *perfectly situated in the environment*.
It is to say 'surreal', which is adjacent to the idea of *being real*, in terms of our "perception of what is and isn't plausible, if not possible."
This is at the heart of suspension of disbelief, because in essence, virtual worlds are a lie, like fiction, and good fiction violates expectations in order to tell us truths about reality. As part of our ability to differentiate bullshit from reality, there is, so to speak, an element in our bullshit detectors (doubtless evolved over many tens of thousands of years) that is designed not merely to detect what is absurd in our limited experience, but to incorporate absurdity into everyday experience. In that sense part of our rationality is the acceptance of irrational experiences, learning from them, and discovering 'a proper place for each thing' in the "models of the world" we all carry around in our heads. Eventually we normalize the absurd, it becomes the new reality, and what remains unassimilated becomes superstition (real or otherwise), a figment, or an anomaly.
One of the best examples I've encountered is The Last of Us: Left Behind, a good chunk of which is spent in a mall. And they nailed the environment perfectly I would say.
Or for those who don't own a PS4, a more accessible example is a map in NMRIH aptly called "the museum", and few words better do it justice than to go play it yourself--that is, if you really want to know what I mean by a 'situated environment'.
What better way, during this pandemic, to get out of the news cycle and into your own head? Sometimes the best way to escape isn't outside, it's within. -
META.
The Reptilian overlord has gone bonkers for sure.
I was always more a fan of augmented reality than virtual reality: a mixture of both worlds.
But it turns out the world is leaning more towards the fully virtual, and the way we are doing it is a big disappointment for me. -
Trying out the new version of fasm, I realize it's good, and conclude I should update my code to work with it, as there are small incompatibilities with the syntax.
So, quick flat assembler lesson: the macro system is freaking nuts, but there are limitations on the old version.
One issue, for instance, is recursive macros aren't easily possible. By "easily" I mean without resorting to black magic, of course. Utilizing the arcane power of crack, I can automatically define the same macro multiple times, up to a maximum recursion depth. But it's a flimsy patch, on top of stupid, and also has limitations. New version fixes this.
Another problem is capturing lines of code. It's not impossible, again, but a pain in the ass that requires too much drug-addled wizardry to deal with. Also fixed in new version.
Why would you want to capture lines of code? Well, because I can do this, for instance:
macro parse line {
  match a =+ b , line \{
    add a,b;
  \}
};
You can process lines of code like this. The above is a trivial example that makes no fucking sense, but essentially the assembler allows you to define your own syntax, and with sufficient patience, you can use this feature to develop absolutely super fucking humongous galactic unrolls, so it's a fantastic code emitter.
Anyway, the third major issue is `{}` curlies have to be escaped according to the nesting level as seen in the example; this is due to a parser limitation. [#] hashes and [`] backticks, which are used to concatenate and stringify tokens respectively, have to be escaped as well depending on the nesting level at which the token originates. This was also fixed.
There are other minor problems, but that gives you sufficient context. What happens is the new version of fasm fixes all of these problems that were either annoying me, forcing me to write much more mystical code than I'd normally agree to, or in some rare cases even limiting me in what I could do...
But "limiting" needs to be contextualized as well: I understand fasm macros well enough to write a virtual machine with them. Wish I was kidding. I called it the Arcane 9 Machine, A9M for short. Here, this bitch was the prototype for the VM my fucking compiler uses: https://github.com/Liebranca/forge/...
So how am I """limited""", then? You wouldn't understand. As much as I hate to say it, that which should immediately be called into question, you're gonna have to trust me. There are many further extravagant affronts to humanity that I yearn to commit with absolute impunity, and I will NOT be DENIED.
Point is code can be rewritten in much simpler, shorter, cleaner form.
Logic can be much more intricate and sophisticated.
Recursion is no longer a problem.
Namespaces are now a thing.
Capturing -- and processing -- lines of code is easier than ever...
Nearly every problem I had with fasm is gone with this update: thusly, my power grows rather... exponentially.
And I SWEAR that I will NOT use it for good. I shall be the most corrupt, bloodthirsty, deranged tyrant ever known to this accursed digital landscape of broken souls and forgotten dreams.
*I* will reforge the world with black smoldering flame.
*I* will bury my enemies in ill-and-damned obsidian caskets.
And *I* will feed their armies to a gigantic, ravenous mass grave...
Yes... YES! This is the moment!
PREPARE THE RITUAL ROOM (https://youtube.com/watch/...)
Couriers! Ride towards the homeland! Bring word of our success.
And you, page, fetch me my sombersteel graver...
I shall inscribe the spell into these very walls...
in the ELEVENTH degree!
** MANIACAL EVIL LAUGHTER ** -
Future01
Click, click, click, click.
Tap, tap, tap, tap.
Swipe, swipe, swipe, swipe,.
Scroll, scroll, scroll, scroll.
I’m tired of living on a popularity-driven planet among animals, where the number of clicks on likes, subscriptions and links is worth more than IQ, education and experience.
Let’s face it - the AI is showing us traffic-driven recommendations that suck. If you’re hooked up to a social network and can’t disconnect from it, you’re halfway to the matrix. You probably also disagree with me cause you’re a serotonin junkie. You can’t stop, like you can’t stop eating for a day. The bubble has you in its hands and whatever you do you probably won’t wake up. To be honest, most of us won’t. It’s already too late.
I’m waiting for meta, so they can put you in a virtual world where you can have whatever you want and at the same time own nothing. They will put you in some small empty space and give you something to eat as many times as you want, so you can feel safe and click, tap, swipe, scroll more, so they can own this planet.
You will be living only to deliver corporate metadata and you will be happy, cause they will make you happy by giving you the emotions that you want to feel at the exact moment.
If you get out, you won’t be able to interact cause you won’t know how to behave; you will become a wild animal.
By going out you will break the law, cause the outside world will be long gone. To get to a bar or visit family you will travel in an autonomous vehicle that has screens instead of windows.
Eventually you stop going to the bar cause it’s unhealthy; you stop going outside cause there’s a deadly virus and you can die.
They will take the last thing from you later, with birth control, so you can have a baby whenever you want and with whomever you want, as long as both parties agree by signing a baby NFT contract. You won’t have to take care of your baby or be pregnant, cause it will be robotized; you will see your baby in meta. You will think you feel it using robot hands.
You will never meet your baby in person.
That’s how the matrix will start. We’re halfway there. -
Kubernetes question:
So far I've created two pods, mongo & Go
Exposed those pods using services
Their IPs are 10.x.x.x and accessible from my machine only (a virtual LAN, I'm guessing, known only to the host), but my machine's network IP is 192.x.x.x, therefore they're not accessible from the outside world, and to fix that I need to put nginx in front to receive requests and route them internally.
Is there a way in Kubernetes to make it work like nginx in terms of:
Kubernetes listens on port 80 (for example) and routes based on the received URL. As you know, in nginx we define a server block with server domain_name.tld.
Anything similar in Kubernetes? I've checked the ingress-nginx controller, and also saw LoadBalancer, but that requires a cloud provider.
If anyone can also give an example it would be great; so far the examples I checked ended up screwing my setup, and I had to reset kubectl to get things back working.
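The closest Kubernetes equivalent of an nginx server block is an Ingress rule. A rough sketch, assuming the ingress-nginx controller is installed and the Go pod sits behind a Service named go-api on port 8080 (both names are invented):

apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: go-api-ingress
spec:
  ingressClassName: nginx      # served by the ingress-nginx controller
  rules:
    - host: domain_name.tld    # plays the role of nginx's server_name
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: go-api   # the Service in front of the Go pod
                port:
                  number: 8080

Without a cloud LoadBalancer, the controller itself can be exposed via a NodePort or hostNetwork so that port 80 actually reaches the cluster.
-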
The Odyssey of the Tenacious Tester:
Once upon a time in the digital kingdom of Binaryburg, there lived a diligent software tester named Alice. Alice was on a mission to ensure the flawless functionality of the kingdom's latest creation – the Grand Software Citadel.
The Grand Software Citadel was a marvel, built by the brilliant developers of Binaryburg to serve as the backbone of all digital endeavors. However, with great complexity came an even greater need for meticulous testing.
Alice, armed with her trusty testing toolkit, embarked on a journey through the intricate corridors of the Citadel. Her first challenge was the Maze of Edge Cases, where unexpected scenarios lurked at every turn. With a keen eye and a knack for uncovering hidden bugs, Alice navigated the maze, leaving no corner untested.
As she progressed, Alice encountered the Chamber of Compatibility, a place where the Citadel's code had to dance harmoniously with various browsers and devices. With each compatibility test, she waltzed through the intricacies of cross-browser compatibility, ensuring that the Citadel would shine on every screen.
But the true test awaited Alice in the Abyss of Load and Performance. Here, the Citadel's resilience was put to the test under the weight of simulated user hordes. Alice, undeterred by the mounting pressure, unleashed her army of virtual users upon the software, monitoring performance metrics like a hawk.
In the end, after days and nights of relentless testing, Alice emerged victorious. The Grand Software Citadel stood strong, its code fortified against the perils of bugs and glitches.
To honor her dedication, the software gods bestowed upon Alice the coveted title of Bug Slayer and a badge of distinction for her testing prowess. The testing community of Binaryburg celebrated her success, and her story became a legend shared around digital campfires.
And so, dear software testers, let the tale of Alice inspire you in your testing quests. May your test cases be thorough, your bug reports clear, and your software resilient against the challenges of the digital realm.
In the world of software testing, every diligent tester is a hero in their own right, ensuring that the digital kingdoms stand tall and bug-free. -
Got the GitHub Student Developer Pack in 10th grade (high school)
I recently made an application for the GitHub Student Developer Pack, which got accepted.
If you don't know what this pack is all about, let me tell you: this pack gives you free access to various tools that world-class developers use. The pack currently contains 23 tools ranging from Data Science, Gaming, Virtual Reality, Augmented Reality, APIs, Integrated Development Environments, Version Control Systems, Cloud Hosting Platforms, Code tutorials, Bootcamps, integration platforms, payment platforms and lots more.
I thought my application wouldn't qualify because, after reading the documentation, I thought it was oriented more towards college and university students, but nonetheless I applied and my application got accepted. Turns out all you need is a school-issued verifiable email address or proof of your current academic status (marksheets etc.).
A few minutes after the application I got the "pro" tag on my GitHub profile, although I didn't receive any emails.
I tested it out and claimed the Canva Pro subscription for free after signing up with my GitHub account.
I definitely recommend it: if you are currently enrolled in a degree- or diploma-granting course of study such as a high school, secondary school, college, university, homeschool, or similar educational institution,
and have a verifiable school-issued email address or documents that prove your current student status, have a GitHub user account,
and are at least 13 years old, PLEASE APPLY FOR THE PROGRAM.
Check out the GitHub docs for more info.
Thanks!
My GitHub username:
satvikDesktop
PS. I would have posted links to some sites and documentation for further reading, but I can't post URLs in a rant yet :( -
Spontaneity: suddenly deciding to get away from my PC and go out for a walk and some shopping because the sun came out.
It's still fucking windy and cold though...
Maybe I should try taking a virtual summer vacation around the world again... -
Just thought of something the other day: AI devs are developing what will make their own jobs useless, as AIs will be able to do programming. AIs will probably take over each and every job and we won’t have to work anymore.
They’ll eradicate diseases and we’ll live for quite a long time. When our basic needs are completely fulfilled, their goal will be to make the world a human paradise, with the goal in mind of making us happy and leaving us with no worries.
Then they’ll take it to the next level and plug us into a virtual world. It’ll be a paradise, a utopia, and it will probably be like the first Matrix...
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Or it may just not happen, but can we be sure of that… I’ve always laughed at people saying technology will end humanity, but now I realize that killing all of us is not the only way to end humanity… -
Today I got the chance to try one of the 9 HoloLens units available in my country. I'm completely mind-blown. The ability to incorporate virtual elements into the real world and make them behave like real physical objects is incredible! I really hope AR gains enough traction that I can buy a headset for myself.
PS: Project X-Ray is one of the most fun games I've ever played. -
TLDR: Need for easy to use VR headsets for mobile phones...
Complete story:
There are so many interesting places to explore in this world but sadly the current pandemic situation has brought travel plans to a complete halt. Today I tried watching virtual tours of various cities on YouTube and it felt a bit relaxing.
I was planning to use VR to enhance the experience, but it's quite a lot of trouble adjusting my phone in a VR headset and controlling playback with my hands while the phone is in the headset.
I wish someone, somewhere would find a way to simplify this problem... like making mobile-based VR headsets a bit easier to use and control, while keeping them affordable and allowing mobile phones of any size...
If someone could actually do this... I think we might have the next groundbreaking startup in the next few years... 😄
P.S. Google Cardboard VR does not fit this criterion... -
Chinese remainder theorem
So the idea is that a partial or zero-knowledge proof is used not just for encryption but also for a sort of distributed ledger or proof-of-membership, in addition to being used to add new members, where additional layers of distributed proofs are added, so that rollbacks can be performed on a network to remove members or revoke content.
Data is NOT automatically distributed throughout a network; rather, sharing is the equivalent of replicating and syncing data to your instance.
Therefore if you don't like something on a network or think it's a liability (hate speech for the left, violent content for the right for example), the degree to which it is not shared is the degree to which it is censored.
By automatically not showing images posted by people you're subscribed to or following, infiltrators or state-level actors, who post things like calls to terrorism or CSAM to open platforms in order to justify shutting down platforms they don't control, are cut off at the knees. There may also be a case for tools built on AI that automatically determine whether something like a thumbnail should be censored, or that give the user an NSFW warning before clicking a link that may appear innocuous but is actually malicious.
Server nodes may be virtual in that they are merely a graph of people connected in a group by each person in the group having a piece of a shared key.
Because the Chinese remainder theorem only requires a subset of all the info in the original key, it also acts as a voting mechanism to decide whether a piece of content is allowed to be synced to an entire group or remain permanently.
Data that hasn't been verified yet may go into a cache for a given cluster of users who are mutually subscribed or following each other in a small-world graph, but at the same time it doesn't get shared out of that subgraph and may expire if enough users don't hit a like button or a retain button or a share or "verify" button.
The algorithm here then is no algorithm at all but merely the natural association process between people and their likes and dislikes directly affecting the outcome of what they see via that process of association to begin with.
We can even go so far as to dog-food content that's already been synced to a graph into evolutions of the existing key, such that the retention of new generations of the key, dependent on the previous key, also acts as a store of the data that's been synced to the members of the node.
Therefore members that continually post content that doesn't get verified slowly fall out of the node, such that eventually their content becomes merely temporary in the caches or indexes of the node members, driving index and node subgraph membership in an organic and natural process based purely on affiliation and identification.
Here I've sort of butchered the idea of the Chinese remainder theorem and shoehorned it into the idea of zero-knowledge proofs, but you can see where I'm going with this if you squint at the idea mentally and look at it at just the right angle.
The big idea was to remove the influence of centralized algorithms to begin with, and implement mechanisms such that third-party organizations that exist to discredit or shut down small platforms are hindered by the design of the platform itself.
I think if you look over the ideas here you'll see that's what the general design thrust achieves or could achieve if implemented into a platform.
The addition of indexes in a node or "server" or "room" (being a set of users mutually subscribed to a particular tag or topic or to each other), where the index is an index of text, audio, videos and other media, including user posts, that are available on the given node, with the entries being titled but blind links (no pictures/media, or media verified as safe through an automatic tool), would also be useful.
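For reference, the recombination step the whole scheme leans on is tiny. A toy Python sketch of plain CRT reconstruction (threshold variants where only a large-enough subset of shares is needed, e.g. Asmuth-Bloom secret sharing, build on the same identity; the secret and moduli below are made up):

from math import prod

def crt(residues, moduli):
    # Reconstruct x mod prod(moduli) from x mod m_i, for pairwise-coprime m_i.
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)  # pow(a, -1, m): modular inverse, Python 3.8+
    return x % M

secret = 424242
moduli = [97, 101, 103]               # toy pairwise-coprime "shares" of the key
shares = [secret % m for m in moduli]
assert crt(shares, moduli) == secret

-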
My week with Linux is up, I'm back on Windows. I tried about 10 variations 🙄
The best I could get was Manjaro with KDE.
It was pretty close to what I was looking for, but I still have to install some of the programs I need using the command line 🙄 How do they not have installers for them yet... Crazy. Maybe they do.
I need a virtual machine, which is fine; I can still use my graphics so it's fast! Play games etc.
But it crashed and died.
Not only that, but every version of Linux... it felt 🤔 shitty, like the mouse was bolted tight to the screen and only heavy movement would do anything. Yes, I have high mouse sensitivity (very high) but it feels sooooo rigid.
Here's the thing: I like what Linux is trying to do... it's just horrifically executed. The learning curve is too extreme and there is no central "this is how you do it". With good reason, yes, but if you give someone too many choices they can't decide and give up, and I think that's the only reason Linux isn't winning. It's too complicated.
Android is the only Linux OS I love. Manjaro did well.
But Android is simple and effective and does what it's meant to without any help.
All the other Linux OS' are... developerised, as in only a developer truly stands a chance to grasp them, not normal folk out in the world.
People say Linux doesn't have long left to go... To me it seems like they are still miles off, no closer than 5 years ago.
That was my experience at least... -
Why don't we have a virtual world API? Something that would support concepts as well as physical objects? Something that we could define the world in, so we could simulate reality?
And if we could connect it to REAL LIFE? -
Had a talk yesterday about crypto and blockchain... including NFTs... in that you could use real money, like now, to buy digital goods, except you can transfer them and show them off anywhere online.
And well, I'm suddenly like: wait... NFT + VR = virtual worlds like Sword Art Online and Accel World...
Now we just need Neuralink to work and then we'll have VR pods... like in the Matrix...
And then robots and machines will rule the world...
Oh fuck... -
Design in Motion: Real-Time Rendering's Impact on Architecture
Architecture, a discipline that once relied heavily on blueprints, models, and lengthy render times, has undergone a revolutionary transformation in recent years. The advent of real-time rendering technology has fundamentally altered the way architects visualize, present, and interact with their designs. This paradigm shift has not only enhanced the creative process but has also empowered architects to make more informed decisions and create immersive experiences for clients and stakeholders.
Real-time rendering, a technological marvel that harnesses the power of high-performance graphics hardware and advanced software algorithms, allows architects to generate photorealistic visualizations of their designs in a matter of milliseconds. Gone are the days of waiting hours or even days for a single rendering to complete. This acceleration in rendering time has not only expedited the design process but has also encouraged architects to explore multiple design iterations rapidly.
One of the most significant impacts of real-time rendering on architecture is the ability to visualize a design in various lighting conditions and environmental settings. Architects can now instantly switch between daytime and nighttime lighting scenarios, experiment with different materials, and observe how their designs respond to different seasons or weather conditions. This level of dynamic visualization offers insights into how a building's appearance and functionality evolve throughout the day, contributing to more holistic and thoughtful design solutions.
Moreover, real-time rendering has transformed client presentations. Architectural concepts can now be communicated with unprecedented clarity and realism. Clients can virtually walk through spaces, observing intricate details, exploring different angles, and even experiencing the play of light and shadow in real-time. This immersive experience fosters a deeper understanding of the design intent, enabling clients to provide more targeted feedback and make informed decisions.
The impact of real-time rendering on collaboration within architectural teams cannot be overstated. Traditionally, architects and designers would need to wait for a rendering to complete before discussing design changes or improvements. With real-time rendering, team members can make adjustments on the fly, observing the immediate effects of their decisions. This seamless collaboration not only enhances efficiency but also encourages interdisciplinary collaboration as architects, engineers, and other stakeholders can work together in real-time to refine designs.
The integration of virtual reality (VR) and augmented reality (AR) into the architectural workflow is another transformative aspect of real-time rendering. Architects can now create VR environments that allow clients to step inside their designs and explore every nook and cranny. This not only enhances client engagement but also enables architects to identify potential design flaws or spatial issues that might not be apparent in 2D drawings. AR, on the other hand, overlays digital information onto the physical world, facilitating on-site decision-making and construction supervision.
Real-time rendering's impact extends beyond the design phase. It has proven to be a valuable tool for public engagement and community involvement in architectural projects. By creating virtual walkthroughs of proposed structures, architects can offer the public an opportunity to experience the design before construction begins. This transparency fosters a sense of ownership and allows for constructive feedback, contributing to the development of designs that resonate with the community's needs and aspirations.
The environmental implications of real-time rendering are also noteworthy. The ability to visualize designs in various environmental contexts contributes to more sustainable architecture. Architects can assess how natural light interacts with interior spaces, optimizing energy efficiency and reducing the need for artificial lighting during the day.
In conclusion, real-time rendering has ushered in a new era of architectural design, propelling the industry into a realm of dynamic visualization, immersive experiences, and enhanced collaboration. The ability to witness designs in motion, explore different lighting conditions, and interact with virtual environments has redefined how architects approach their craft. From facilitating client presentations to fostering sustainable design solutions, real-time rendering's impact on architecture is profound and multifaceted. As the technology continues to evolve, architects have an unprecedented opportunity to push the boundaries of creativity, efficiency, and sustainability in the built environment. -
Just discovered https://twitter.com/ExpertBeginner1. It's the story of my life. Giant classes, copying and pasting, and architects who create frameworks. It's great when we combine all three: A "framework" created by an architect which is made of giant classes that you copy and paste. Imagine a giant generic class where the generic argument is only used by dead code. Pause for a moment and try to visualize that.
It inherits from a base class with lots of virtual methods called by base methods that throw NotImplementedException, so if you don't need them you have to override them to return empty collections. If you're going to do something so messed up you could just put those default implementations in the base. But no, you can inherit, it compiles, and then it throws a runtime error unless you override methods the compiler doesn't require you to override.
The one method you're required to override has a TODO comment telling you what to put there. Except don't ever do what the comment says because that's the old standard. The new standard says never, ever do that.
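A tiny Python caricature of that shape (names invented; the real thing was .NET with NotImplementedException):

class FrameworkBase:
    def run(self):
        # The base method blindly calls hooks that only fail at runtime.
        for item in self.load_items():
            self.process(item)
        self.cleanup()

    def load_items(self):
        raise NotImplementedError   # compiles fine, explodes when called

    def cleanup(self):
        raise NotImplementedError   # hook you don't need but must still stub out

    def process(self, item):
        raise NotImplementedError

class CopyPastedJob(FrameworkBase):
    def load_items(self):
        return []                   # overridden just to return an empty collection

    def cleanup(self):
        pass                        # stubbed to do nothing

    def process(self, item):
        pass                        # TODO: your code here (but never do what the comment says)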
Most of the time when I read about copy-and-paste coding it's about devs who copy and paste because they don't know how to write or reuse code. They don't mention the environments where copying and pasting the same classes over and over again is the requirement and you're not allowed to write your own code.
Creating base classes where you just override a method or two can potentially work, but only in the right scenarios and only if you do it right. If you're copying and pasting a class that inherits from the base class and consists entirely of repeated code, why the heck isn't that the base class? It could be a total mess, but at least it would be out of sight and each successive developer wouldn't become responsible for it by including it in their own code.
It's a temporary engagement, but I feel almost violated. I know it's a first-world problem, and I get to work indoors and take vacations. I'm grateful for those things.
Before leaving I had to document the entire process of copying and pasting an entire repo, making a ton of baseline edits that should just be in the template but aren't, and then copying and pasting from other places into the copied and pasted code. That makes me a collaborator. I apologize more than once in the documentation, all 20 pages of it that you have to read and follow before you even get to the part where you write the code for what you actually need it to do.
This architect has succeeded in making every single thing anyone does more about servicing the needs of his "framework" than about writing actual code to do what needs doing. Now that the framework is in and around everything it creates the illusion that it's a critical part of our operations. It's not. It's useless overhead.
Because management is deceived into thinking they need it they overlook the fact that it blows up, big and small, every single day. The log is full of failures that I know no one ever sees. A big chunk of what they think it does fails silently, and they don't even notice until months later when they realize how much data they're missing. But if they lose, say, 25% they'll never notice.
When they do notice they just act like it's normal, go into fire drill mode, and fix it. Doom. You're all doomed. I'm standing on the deck of the Titanic next to my jet ski. -
HOW TO RECOVER MONEY LOST TO AN ONLINE SCAM: HIRE ADWARE RECOVERY SPECIALIST
The Art of Recovery: ADWARE RECOVERY SPECIALIST's Unmatched Prowess in My Bitcoin Restoration of 102,000 USD
When faced with the devastating loss of a virtual fortune in Bitcoin, the sheer magnitude of the challenge can seem insurmountable. Yet, in the face of this crisis, the expert team at ADWARE RECOVERY SPECIALIST demonstrated an unparalleled mastery of their craft, guiding me through the intricate process of recovering my 102,000USD Bitcoin with unwavering skill and precision. From the moment I reached out, their seasoned professionals exuded a calm confidence, assuring me that no digital asset was beyond their reach. They leveraged their deep understanding of blockchain technology, forensic data analysis, and proprietary recovery techniques to meticulously piece together the scattered fragments of my digital wealth. Their tireless efforts, fueled by an unwavering commitment to their craft, allowed them to navigate the labyrinthine world of cryptocurrency, circumventing obstacles and exploiting vulnerabilities that would have stymied lesser experts. Through their dogged determination and innovative problem-solving, ADWARE RECOVERY SPECIALIST was able to triumphantly restore my 102,000USD Bitcoin, a feat that stands as a testament to their unparalleled expertise and the transformative power of their services. In the face of what seemed like an insurmountable challenge, their team demonstrated the art of recovery, masterfully reclaiming my digital assets and safeguarding my financial future. I can say with full confidence that ADWARE RECOVERY SPECIALIST is the real deal. They are experts in their field and have the technical mastery to handle even the most complex Bitcoin recovery cases. If you’ve found yourself in a similar situation, feeling like your Bitcoin is gone for good, I urge you to contact ADWARE RECOVERY SPECIALIST . They have the skills, the knowledge, and the integrity to get the job done. They revived my hope and proved that with the right expertise, recovery is possible. WhatsApp info:+12 723 328 3434 -
CRYPTO RECOVERY SERVICE - MUYERN TRUST HACKER
( Email: muyerntrusted(@)mail-me(.)com )
The term "crypto theft" describes how fraudsters get and misuse cryptocurrency assets without authorization. The fact that the theft may cause monetary loss, interfere with corporate operations, and erode public confidence in virtual currency makes it a serious worry. Recovering stolen cryptocurrency requires specialized knowledge and techniques that professionals in the field possess. They have experience dealing with crypto theft cases, understand the tactics employed by cybercriminals, and can develop tailored recovery strategies to maximize the chances of successful retrieval. Muyern Trust Hacker demonstrates the highest level of professionalism in the realm of cryptocurrency theft when it comes to reclaiming stolen cryptocurrency. Their team of professionals offers a dependable and relatable recovery service by fusing technical proficiency, and personality. Having dependable expert assistance is essential for the safety of your cryptocurrency holdings. Along the way, Muyern Trust Hacker adds a dash of humor and personality to your team of experts who are committed to retrieving your pilfered cryptocurrency. Protect your investments and put your faith in Muyern Trust Hacker's expertise. Allow them to work with you to protect what is truly yours. Seeking expert assistance becomes crucial for people and organizations trying to recover stolen cryptocurrency as long as the threat of crypto theft persists. Muyern Trust Hacker differentiates by providing specialized techniques and the highest level of professionalism as a group of professionals committed to the recovery process. They have a reputation for being successful in recovering cryptocurrency monies that have been stolen thanks to their demonstrated track record and client endorsements. Individuals and companies can safeguard their priceless cryptocurrency assets and confidently negotiate the murky world of cryptocurrency theft by putting their trust in the knowledge of experts such as Muyern Trust Hacker. Do sure to contact Muyern Trust Hacker for a prompt and effective Bitcoin retrieval on Whats App +1-8-6-3-(606)-8-3-4-7
Regards.15 -
BITCOIN RECOVERY EXPERT FOR HIRE REVIEWS \\ REVENANT CYBER HACKER
Losing a Bitcoin wallet containing a substantial amount of cryptocurrency can be a devastating experience. However, the feeling of despair and loss was transformed into pure happiness when I received the incredible news from REVENANT CYBER HACKER that my lost Bitcoin wallet, holding 132,000 bitcoins, had been successfully recovered. In this article, I will share the rollercoaster emotional journey I went through when I lost my wallet, the subsequent discovery of REVENANT CYBER HACKER, the process they employed to retrieve my precious digital assets, and the lessons learned along the way. This is a story of hope, resilience, and the power of professional recovery services in restoring lost Bitcoin wallets. Ah, the sweet sound of good news. There I was, minding my own business on an average Tuesday morning, when I got a notification that would make any bitcoin enthusiast jump for joy. It was a message from none other than REVENANT CYBER HACKER, informing me that my long-lost bitcoin wallet had been found. And not just any bitcoin wallet, mind you, but one containing a whopping 132,000 units of the beloved cryptocurrency. Now, for those living under a rock or perhaps too preoccupied with the latest cat videos, let me give you a crash course in Bitcoin 101. Bitcoin is a digital currency that has taken the world by storm, captivating the minds of tech-savvy investors and casual enthusiasts alike. It operates on a decentralized network, meaning it doesn't answer to any central authority like a bank. Instead, it relies on blockchain technology, which adds a layer of security and transparency to every transaction. To own bitcoin, you need a wallet – a digital container where your precious coins reside. Think of it as a virtual piggy bank, except you don't need a hammer to break it open. Your wallet comes with a unique address, like a digital fingerprint, that allows you to send and receive bitcoin. Losing access to this wallet is as heart-wrenching as misplacing your favorite pair of socks. Trust me, it's not a pleasant feeling. My encounter with the disappearance of my Bitcoin wallet taught me a valuable lesson about the importance of implementing proper security measures. It's not enough to rely on luck or hope that your digital assets will remain safe. Taking proactive steps to protect your investments is crucial in the wild world of cryptocurrencies. From using strong and unique passwords to enabling two-factor authentication, every layer of security adds another brick to the fortress that safeguards your digital wealth. Trust me, you don't want to learn this lesson the hard way. It has changed my life to be able to retrieve my misplaced Bitcoin wallet thanks to REVENANT CYBER HACKER amazing services. It made me realize the worth of tenacity.
Website: revenantcyberhacker {DOT} org
Email: revenantcyberhacker {AT} Gmail {DOT} com
Telegram: revenantcyberhacker
WhatsApp: + 1 (208) 425-8584
WhatsApp: + 1 (913) 820-07392 -
Are dating sites safe for real meetings?
Very few people who use dating sites consider them only for online communication. Most users are there to find someone for real dating. So, after the online stage, sooner or later, people start thinking about meeting in real life. And even if everything has been perfect and smooth and you have a great time chatting online, that doesn't mean you can forget about safety measures. I don't doubt the safety of online dating itself, but it's better to be safe than sorry. So, when you decide to move from online to real dating, you need to prepare for the first date well and thoroughly.
1. Make it formal
Even if you have been chatting online for many months, and you probably know everything about this person, including many things people usually don't share right away, you still should not rush things, no matter how much you want to take a huge step forward. Your first non-virtual date should be formal, no exceptions. Choose a crowded place for the first date, for example a restaurant, cinema, or exhibition, or agree to meet in a park and spend time there. Do not invite the person to your home, and do not accept an invitation to visit theirs.
2. Inform your friends where you are going
I know it may seem like too much for just a date, but you are going to meet a person you have never seen in real life. Informing a friend that you are going on a date with an online match is absolutely the right decision. Besides, most dating sites recommend doing it.
3. Leave if you feel uncomfortable
Your real date may differ significantly from the online ones you had before. So, if you see that your virtual partner is not the person you know so well from your chats, you'd better end the date. Not all online dates should go real. Sometimes it's better to leave things as they are and continue the communication online.
4. Avoid alcohol
Do not drink alcohol on the first date, even if you feel a bit nervous and know that a little alcohol would help you relax and calm down. I still recommend avoiding it, because you may either create a wrong image of yourself and spoil the date anyway, or simply make mistakes.
So, how safe is online dating? I'd say that online dating is 100% safe as long as you do not neglect the basic rules, which work not only for virtual dating but also for the real-world kind. Do not rush things, take your time, avoid conversations about money, do not send or buy gifts on request, and do not share personal things about yourself unless you are sure you know the person well enough. https://wizzlove.com -
The Turing Test, introduced by Alan Turing in 1950, has been a foundational concept for evaluating a machine's ability to exhibit human-like intelligence. But as we edge closer to the singularity—the point where artificial intelligence surpasses human intelligence—a new, perhaps unsettling question comes to the fore: Are we humans ready for the Turing Test's inverse? Unlike Turing's original proposition, where machines strive to become indistinguishable from humans, the Inverse Turing Test ponders whether the complex, multi-dimensional realities generated by AI can be rendered palatable or even comprehensible to human cognition. This discourse goes beyond mere philosophical debate; it directly impacts the future trajectory of human-machine symbiosis.
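To make Turing's original setup concrete, here is a minimal sketch of the imitation game in Python. Everything in it is a hypothetical stub assumed for illustration: the respondents stand in for a real human and a real chatbot, and the judge can only guess, which is precisely the situation a "passing" machine forces.

import random

# Stand-in respondents: in a real imitation game these would be a human
# at a terminal and a conversational AI. Both are hypothetical stubs.
def human_respondent(prompt):
    return random.choice(["yes", "no", "it depends"])

def machine_respondent(prompt):
    return random.choice(["yes", "no", "it depends"])

def judge(answer_a, answer_b):
    # A real judge would interrogate both sides freely; this stub can
    # only guess, modeling a judge who cannot tell the two apart.
    return random.choice(["A", "B"])

rounds, identified = 1000, 0
for _ in range(rounds):
    pair = [("human", human_respondent), ("machine", machine_respondent)]
    random.shuffle(pair)  # hide which respondent sits behind "A" and "B"
    answers = [fn("Describe a childhood memory.") for _, fn in pair]
    guess = judge(answers[0], answers[1])
    label = pair[0][0] if guess == "A" else pair[1][0]
    identified += label == "machine"

# The machine "passes" when the judge does no better than chance (~50%).
print(f"machine identified in {identified / rounds:.0%} of rounds")

The point of the sketch is the pass criterion, not the chat itself: a machine passes exactly when the judge's identification rate collapses toward a coin flip.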
Artificial intelligence has been advancing at an exponential pace, far outstripping Moore's Law. From Generative Adversarial Networks (GANs) that create life-like images to quantum computers that solve problems unfathomable to classical machines, the AI universe is a sprawling expanse of complexity. What's more compelling is that these machine-constructed worlds aren't confined to academic circles. They permeate every facet of our lives—be it medicine, finance, or even social dynamics. And so, an existential conundrum arises: Will there come a point where these AI-created outputs become so labyrinthine that they are beyond the cognitive reach of the average human?
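Since the paragraph above leans on GANs as its flagship example, here is a minimal, hedged sketch of the adversarial idea, assuming PyTorch; the architectures, toy data, and hyperparameters are illustrative choices, not from any particular paper.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Generator: maps random noise to fake "data" samples.
G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
# Discriminator: outputs a logit scoring how "real" a sample looks.
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 3.0   # toy "real" data: N(3, 0.5)
    noise = torch.randn(64, 8)
    fake = G(noise)

    # Discriminator learns to tell real samples from generated ones.
    d_loss = bce(D(real), torch.ones(64, 1)) + \
             bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator learns to fool the discriminator.
    g_loss = bce(D(G(noise)), torch.ones(64, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

# Generated samples should drift toward the real distribution around 3.0.
print(G(torch.randn(5, 8)).detach().flatten())

The same two-player dynamic, scaled up from a toy 1-D Gaussian to image pixels, is what produces the "life-like images" the paragraph refers to.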
The Human-AI Cognitive Disconnection
As we look more closely at the interplay between humans and AI-created realities, the phenomenon of cognitive disconnection becomes increasingly salient, perhaps even a bit uncomfortable. This disconnection is not confined to esoteric, high-level computational processes; it's pervasive in everyday life. Take, for instance, the experience of driving a car. Most people can operate a vehicle without understanding the intricacies of its internal combustion engine, transmission mechanics, or embedded software. Similarly, when boarding an airplane, passengers trust that they'll arrive at their destination safely, yet most have little to no understanding of aerodynamics, jet propulsion, or air traffic control systems. In both scenarios, individuals navigate a reality facilitated by complex systems they don't fully understand. Simply put, we just enjoy the ride.
However, this is emblematic of a larger issue—the uncritical trust we place in machines and algorithms, often without understanding the implications or mechanics. Imagine if, in the future, these systems become exponentially more complex, driven by AI algorithms that even experts struggle to comprehend. Where does that leave the average individual? In such a future, not only are we passengers in cars or planes, but we also become passengers in a reality steered by artificial intelligence—a reality we may neither fully grasp nor control. This raises serious questions about agency, autonomy, and oversight, especially as AI technologies continue to weave themselves into the fabric of our existence.
The Illusion of Reality
To adequately explore the intricate issue of human-AI cognitive disconnection, let's journey through the corridors of metaphysics and epistemology, where the concept of reality itself is under scrutiny. Humans have always been limited by their biological faculties—our senses can only perceive a sliver of the electromagnetic spectrum, our ears can hear only a fraction of the vibrations in the air, and our cognitive powers are constrained by the limitations of our neural architecture. In this context, what we term "reality" is in essence a constructed narrative, meticulously assembled by our senses and brain as a way to make sense of the world around us. Philosophers have argued that our perception of reality is akin to a "user interface," evolved to guide us through the complexities of the world, rather than to reveal its ultimate nature. But now, we find ourselves in a new (contrived) techno-reality.
Artificial intelligence brings forth the potential for a new layer of reality, one that is stitched together not by biological neurons but by algorithms and silicon chips. As AI starts to create complex simulations, predictive models, or even whole virtual worlds, one has to ask: Are these AI-constructed realities an extension of the "grand illusion" that we're already living in? Or do they represent a departure, an entirely new plane of existence that demands its own set of sensory and cognitive tools for comprehension? The metaphorical veil between humans and the universe has historically been made of biological fabric, so to speak.