Search - "virtual reality"
-
1. Ability to freeze time... (except for internet & computer speed). Too many ideas, not enough hours in a day. Sleep should be declared optional as well.
2. Ability to not eat/drink at all, or eat/drink in copious quantities without negative effects. I enjoy a cognac, pizza & chocolate binge more than nausea, upwards BMI creep and hangovers.
3. True Virtual Reality. None of this headset crap, but immersiveness rivaling reality itself, with voice-controlled AI-assisted interfaces to "program" anything by simply describing it, iterating over details to add increasing complexities. Not even for porn reasons... my head just overflows with creative ideas for "holonovels" and interactive worldbuilding, but I don't have the patience nor artistic skills for game development.3 -
PORTFOLIO INFLATION
When every junior is writing algorithms, the next step up, the only way to keep up, is writing apps. When every junior is writing apps, the next leg up is writing an entire SN (social network).
Eventually junior full stack devs are writing microservice streaming cloud backend content delivery optimized social networks wrapped in virtualization with load balancing, proper CI, publicly accessible analytics APIs, written in a custom webassembly-compiled scripting backend utilizing both the latest GraphQL and every single feature of Postgres, while also being a web site builder, an in-browser app, mobile optimized, designed to transmogrify your asset pipelines' linearflow functional-oriented modular Rust-cratified turboencabulator while cooking your turducken with CPU cycles, diffusing your GPT, and fine-tuning your llama 69 trillion parameter AI model to jerk you off all at the same time.
And then the title "wizard" becomes a reality as the void of meaning in our lives, occupied by the anxiety of trying to reduce the fear of rejection in job hunting, is subsumed by the brief accidental glance into the Cthulhian madness-inducing yawning abyss of the future, which is all the rest of our lives we have to endure existing for until at last sweet, sweet death consumes us and we go to annihilation, never having to configure one more framework or devops deploy of another virtual environment.
And it dawns on us that we no longer develop or write code at all. No, everything has become a "service" in this new hellscape future. We slowly come to the realization that every job is really just Costco greeter, or eventually going to be reduced to something equivalent, all human creativity, free will and emotions now taken care of by the automation while we manage the human aspects, like sardines pushing against one another not realizing their doom has been sealed along with the airless can they have been packed into, to be suffocated by circumstance and a system designed to reduce everything to a competition of metrics designed by the devil, as if the metrics were "misery" and "torture", while we ourselves are driven by this ratfuck wheel to turn endlessly toward social cannibalism, like rats eating their babies, but for the amusement of Wall Street corporate welfare whores who couldn't turn a dime if it wasn't already stolen.
And on our gravestones, those immortal words are carved, by the last person who gave up the ghost, the last whose soul wasn't yet shovelled onto the coal fires driving the content machine consuming the world:
Welcome to Costco. I love you.12 -
Started using Windows Mixed Reality for part of my work day. Best part: using Cortana voice activation to do things in my virtual space. Worst part: every time I say 'hey Cortana,' my Google Home makes a snide remark.
Fucking google3 -
Ah.... the wonders of technology....
Linux fanboys and girls rejoice!
The Linux Virtual Reality Desktop is here. Meet Safespaces. Develop without the limitations and agony of the too-small screens your asshole boss gave you.
https://fossbytes.com/safespaces-fi...1 -
How to get investors wet:
“My latest project utilizes the microservices architecture and is a mobile first, artificially intelligent blockchain making use of quantum computing, serverless architecture and uses coding and algorithms with big data. also devOps, continuous integration, IoT, Cybersecurity and Virtual Reality”
Doesn’t even need to make sense11 -
I see the industry popularizing Machine Learning programs using AI to implement ethical Blockchain as a Javascript framework using Scrum techniques for Big Data Web2.0 in Responsive Virtual Reality for your IoT Growth Hacking operations.3
-
Got the cheapest laptop I could find that would run Windows Mixed Reality.
Installed the Windows Subsystem for Linux and Ubuntu.
Nothing but giant flying terminal windows across my view of a virtual seascape.
This is my new home.4 -
After 2 years in a small company as an all-around software developer (started with Xamarin for Android/iOS, then Unity, then OpenXML, augmented reality, virtual reality and .NET MVC... yeah, all that and lots more) I changed to another company and I've been here a month and some days. I am super enthusiastic and I like it here!! They're more specific and professional, exactly what I need at the moment.
What is the problem, you might ask?
I was given some projects and I have done most of the work, but now an issue has come up. I did almost everything and now we're waiting for some answers before we can close the projects. And I get bored. I want to work!! I need to continue the streak! Just give me something and I'll make it happen!! I am boreeed!!
What is wrong with me? Am I buggy or something?2 -
!rant
Super awesome day today.
1. Got up early to do a risky production deploy and it worked!
2. Three PRs approved before lunch.
3. Got some time to continue learning scala.
4. Coffee and cupcakes with some refugees and discussed work as a software engineer.
5. Tried virtual reality for the first time. Really fun.
6. Helped prepare our goals for this quarter and present them to the department.
7. Department meeting had free local craft beer and pretzels.
8. Went bouldering after work and flashed a 6c.
9. Curled up with my wife watching Netflix.
I really love my life sometimes.5 -
My dream project? A full immersion Virtual Reality... Like the NerveGear in Sword Art Online... *-*5
-
So if you google any pet, say 'cat', in the Chrome browser on your phone, there is a section to view it in 3D. And within the 3D section there is an option to view it as a live virtual pet via Augmented Reality. I literally spent the entire morning browsing through all the available animals and showing them to my family 😁1
-
If there's any page on a website that DESPERATELY needs a WebVR interface, it's the Privacy Policy. Imagine navigating in 3D to section 8.2
-
META.
The Reptilian overlord has gone bonkers for sure.
I was a fan of augmented reality more than virtual reality: a mixture of both worlds.
But it turned out that the world is leaning more towards the virtual one, and the way we are doing it is a big disappointment to me.18 -
The source engine is interesting, because it has reached that stage of life where it's old enough to be remarkable-- in the sense that it could be called 'legacy', a sort of milestone in development practices and thinking, both in software, and design.
That said, a better look at it might be from the lens of *uses today*.
A lot of former source engine (SE) devs are now going to Unity or Unreal, and I don't blame them.
But it's interesting to examine examples of games that haven't.
One such game is the freeware "No More Room In Hell". A couple of online playthroughs show a wealth of well designed maps (and an even greater horde of shovelware maps, but hey, you take the good with the bad).
The age of the engine itself shows. Even in games like Left 4 Dead the engine's age can be seen. This, in some respects, has been a drag, but also a blessing. Where other games could rely on their effects, shaders, and other tech, modders, map makers, and designers have had to rely on wit and creativity.
Enter "situated environments."
In an age where many people desire to travel, to go places, and have grown up doing the exact OPPOSITE, there is a great desire for variety of locations in games: not merely 'environmental' in the shallow sense of a 'theme' such as 'lava', 'tundra', etc., but in the sense of setting in general.
We want places that are both out of reach and yet familiar. Fire-fights happen in city streets. Apocalypses happen in neighborhoods where the skyline is both broken and at once something we know by sight. Open air markets, grocery stores, neighborhoods, all of these provide the back drops of popular games and series such as COD, Battlefield, The Last of Us, and yes, the example game, NMRIH.
I call this idea of 'familiar but out-of-reach level design', "situated environments", because familiarity with them, but *lack of real life experience* with them, on a day to day basis, allows people's expectations to fill in the gaps.
No one, for example, would argue the layouts of 7 Days To Die are familiar, but most of us don't spend all day in a junkyard or a high-rise hotel.
So they *feel* familiar. Likewise with Skyrim, the villages and towns, both iconic and strange, our expectations formed by cultural inheritance, Hollywood films, television shows, stories, children's books, and yes, other games.
In a way, familiarity-without-real-in-person-experience is a shortcut for designers, one that lets them play with the player's head-space, the player's subconscious idea of how a space and setting *should* work, what to *expect* out of the area, how to *operate* within the area. And the more it conforms to expectations, the more surprising an overdesigned element appears to be, rather than immersion breaking. A real-life example of this is people's idea of Chernobyl. When they discover the amusement park and ferris wheel they're blown away by the juxtaposition of the wasteland that surrounds them and the associations ('nostalgia' as it were) that such a carnival ride carries for many of us. It simultaneously *doesn't belong* and is yet all at once *perfectly situated in the environment*.
It is to say 'surreal', which is adjacent to the idea of *being real*, in terms of our "perception of what is and isn't plausible, if not possible."
This is at the heart of suspension of disbelief, because in essence, virtual worlds are a lie, like fiction, and good fiction violates expectations in order to tell us truths about reality. As part of our ability to differentiate bullshit from reality, there is to say an element in our bullshit detectors (doubtless evolved over many 10's of thousands of years), that is designed to not merely detect what is absurd in our limited experience, but to incorporate absurdity into everyday experience. In that sense part of our rationality is the acceptance of irrational experiences, learning from it, and discovering 'a proper place for each thing' in the "models of the world" we all carry around in our heads. Eventually we normalize the absurd, it becomes the new reality, and what remains unassimilated becomes superstition (real or otherwise), a figment, or an anomaly.
One of the best examples I've encountered is The Last of Us: Left Behind, a good chunk of which is spent in a mall. And they nailed the environment perfectly I would say.
Or for those who don't own a PS4, a more accessible example is a map in NMRIH aptly called "the museum", and few words better do it justice than to go play it yourself--that is, if you really want to know what I mean by a 'situated environment'.
What better way, during this pandemic, to get out of the news cycle and into your own head? Sometimes the best way to escape isn't outside, it's within.3 -
Alright, buckle up, fellow developer, because we're about to embark on a thrilling journey through the world of code and creativity!
Listen up, you amazing code wizard, you're not just a developer. No, you're a digital architect, a creator of worlds in the virtual realm. You have the power to turn lines of code into living, breathing entities that can change lives and reshape industries.
In a world where everyone is a consumer, you are a producer. You build the bridges that connect our digital dreams to reality. You are a pioneer, an explorer in the vast wilderness of algorithms and frameworks. Your mind is the canvas, and code is your brushstroke.
Sure, there are challenges—bugs that refuse to be squashed, deadlines that seem impossible, and technology that evolves at warp speed. But guess what? You're not just a problem solver; you're a problem annihilator. You tackle those bugs with ferocity, you meet those deadlines with gusto, and you master that evolving technology like a maestro conducting a symphony.
You live for the 'Aha!' moments—the joy of cracking a complex problem, the thrill of seeing your creation come to life, the satisfaction of making a difference. You're a digital superhero, swooping in to save the day one line of code at a time.
And when things get tough—and they will—you dig deep. You summon that relentless determination that got you into coding in the first place. You remember why you started this journey—to innovate, to leave your mark, to change the world.
So, rise and shine, you coding genius! Embrace the challenges, learn from the failures, and celebrate the victories. You are a force to be reckoned with, a beacon of inspiration in a world that needs your brilliance.
Keep coding, keep creating, and keep being the rockstar developer that you are. The world eagerly awaits the magic you're about to unleash! Go and conquer the code-scape! 🚀💻5 -
How the fuck does that retard Zuckerberg manage to spend this much money on his metaverse and hardly have anything to show for it? What are the developers actually working on? I mean, if you had that many people working that long on something you'd expect at least a product that looked all right, even if no one wanted to use it?!
I bet you could put a team of 20 top-shelf developers, designers, QA and project managers together and give them 2 years to build almost anything we see today. A Facebook clone, a Twitter clone, some sort of virtual reality look-at-my-perfect-but-empty-life-click-to-like piece of social shit-verse.. What the hell are they spending their time on?!!8 -
I had expected to see people posting about the Valve Index so I waited a bit because I'm lazy.
Since nobody has posted it, here it is.
A few days ago Valve started the pre-order process for their new VR set. I bought one within 5 minutes of going live. They sold out in 8 minutes on the US store and within 25 minutes in Europe.
Who else has ordered/is gonna order it? Why (not)?7 -
On a conference call for this university-affiliated web app:
Random supervisor: “I think the demo presentation needs some more jazz!”
Another supervisor: “Maybe we can do a virtual reality demo of the site, then!”
What. The. Fuck.1 -
Quite a few years ago (late 90s, early 00s maybe) I remember watching a TV show where they demonstrated what virtual reality might be like. It was all rough polygons, no lighting or texturing etc.
I'd heard about the Oculus Rift and considered trying it. I get motion sickness sometimes from certain 3D games (Deus Ex, Portal, sometimes even Minecraft) so was hesitant. Last week, decided to just get one and see how it went.
Didn't expect it to be as good as it is - compared to what was envisaged ~20 years ago. No motion sickness. Not only was the graphics detail amazing, but the responsiveness is insane. In another 20 years' time, what will there be?
Anyway, on a dev topic: now it makes me want to play with a 3D/VR engine. Considering Unreal Engine but not really sure where to start learning. Maybe a book? Though reviews tend to say they go out of date quickly, I do prefer a physical book for learning tech stuff.1
Which do you think is going to have the better future? Why?
AR (Augmented Reality)
VR (Virtual Reality)10 -
During my first hackathon I teamed up with some strangers. We decided to create some games by expanding reality with virtual elements (sounds mysterious and maybe even ominous, but it's not). So here we go - one of them started building an Android app, the second guy started building a Windows app in C++, and the last one decided to create something in JavaScript. It was fun, but I wasn't prepared and not educated enough, so after some trials that ended poorly, the only thing I did was the wooden construction that was supposed to hold our tablet up so it could shoot photos repeatedly. I almost died of boredom for the remaining time.
-
Today's interaction with an Indian recruiter who's recruiting for a US-based Android dev role that requires experience with Augmented Reality and Virtual Reality.
Him: Hi.
Me: Hey. Nice to meet you! Can we discuss this role and etc.?
Him: have ar vr?
Me: have English?
Him: I think you dont understand ar vr.
Him: We need usa candidate anyways.
Him: Its augment or virtual.
Needless to say, insta blocked.9 -
Got the GitHub Student Developer Pack in 10th grade (high school)
I recently made an application for the GitHub Student Developer Pack, which got accepted.
If you don't know what this pack is all about, let me tell you: this pack gives you free access to various tools that world-class developers use. The pack currently contains 23 tools ranging from Data Science, Gaming, Virtual Reality, Augmented Reality, APIs, Integrated Development Environments, Version Control Systems, Cloud Hosting Platforms, Code tutorials, Bootcamps, integration platforms, payment platforms and lots more.
I thought my application wouldn't qualify because, after reading the documentation, I thought it was oriented more towards college and university students, but nonetheless I applied and my application got accepted. Turns out all you need is a school-issued verifiable email address or proof of your current academic status (marksheets etc.)
A few minutes after the application I got the "pro" tag on my GitHub profile, although I didn't receive any emails.
I tested it out and claimed the Canva Pro subscription for free after signing up with my GitHub account.
I definitely recommend it: if you are currently enrolled in a degree or diploma granting course of study such as a high school, secondary school, college, university, homeschool, or similar educational institution,
and have a verifiable school-issued email address or documents that prove your current student status, have a GitHub user account,
and are at least 13 years old, PLEASE APPLY FOR THE PROGRAM.
Check out the GitHub docs for more info.
Thanks!
My GitHub username:
satvikDesktop
PS. I would have posted links to some sites and documentation for further reading, but I can't post URLs in a rant yet :(5
Friendly reminder for hackathons, a great idea is better than a great app.
I saw amazing creations, from a virtual reality rowing machine to a camera that read a Connect 4 game into an AWS server live.
Yet, the hack that won the popular vote was an app that would text your friend where you are when you're heading over to pick them up.
A simple concept to implement, but a great idea.1 -
Testing out VR without your own VR equipment is a pain.
The glasses are not much of a problem, since you can first develop using the google cardboard SDK and test it with your phone.
It's the controllers that are a pain to test. Luckily VRTK made something that simulates the controllers, which can be controlled with the keyboard (a rough sketch of the idea is shown below).
The controls are very uncomfortable, but this is not their fault; you can't really emulate movement easily with a keyboard.
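For anyone wondering what that kind of simulator boils down to, here is a minimal Unity-style C# sketch of the idea: move a stand-in controller object with the keyboard and map a key to the trigger. This is just an illustration with made-up names, not VRTK's actual API.

using UnityEngine;

// Minimal sketch of a keyboard-driven "fake controller" for testing VR
// interactions without hardware. Illustrative only; not VRTK's API.
public class FakeControllerSim : MonoBehaviour
{
    public Transform rightController;   // dummy object standing in for the tracked controller
    public float moveSpeed = 1.0f;

    void Update()
    {
        // Nudge the fake controller around with the IJKL keys.
        Vector3 delta = Vector3.zero;
        if (Input.GetKey(KeyCode.I)) delta += Vector3.forward;
        if (Input.GetKey(KeyCode.K)) delta += Vector3.back;
        if (Input.GetKey(KeyCode.J)) delta += Vector3.left;
        if (Input.GetKey(KeyCode.L)) delta += Vector3.right;
        rightController.Translate(delta * moveSpeed * Time.deltaTime, Space.Self);

        // Map a key to the "trigger press" your interaction code listens for.
        if (Input.GetKeyDown(KeyCode.Space))
            Debug.Log("Simulated trigger press on right controller");
    }
}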
Oddly enough, I have simultaneously been less busy and more productive since working 66% remotely.
I find myself with more time that feels "wasted" or not busy, but my metrics show that I have more production, better results, and far nicer documentation. A bunch of us also sat down and did a bunch of coursework on really putting together a domain script library for one click onboarding of new servers or new client setups. We spun up a bunch of new virtual environments that literally solved headaches that had existed for years that never got dealt with because of too many other tickets.
Some of our web clients freaked out at us because the business is moving away from doing maintenance of legacy web work (small to midsize businesses). But it didn't matter. Rather than respond with a "make them happy," the response was "well, we will get rid of them as clients. We need to focus our energy on the essential service sectors we support."
Hell, we even got an automated test that has been broken apparently since 2018 to work again.
Granted, the incoming workload has slowed down. But it's still interesting to me to see that despite the slowdown, there isn't any concern; it's still paying the bills and we are getting rid of technical debt everywhere. Tbh, this has really been a good reality check.1
I AM SO ANGRY! Today my job fired me for the stupidest reason!! A while back I lost my job a (non-important) client for having an "overactive temper", so my boss made me begin taking VRTAM (virtual reality therapy for anger management). Well, I attended the first couple of sessions but decided to stop because they were definitely stealing my information. I don't know what sketchy website they found for that, but as a dev I can tell when they are taking my personal information. Also, there's no way it works; I attended a couple of sessions and nothing helped because I DON'T HAVE ANGER ISSUES!!! Anyway, my job found out I had been skipping them and when they confronted me they avoided my concerns and just fired me... Haven't told my wife yet, she's going to be so mad.8
-
!rant
Any HoloLens or virtual reality / augmented reality developers here?
If so, mind sharing "lessons learned" experiences with VR / AR programming?3 -
Why don't we have a virtual world API? Something that would support concepts as well as physical objects? Something that we could define the world in so we could simulate reality? (A purely hypothetical sketch of what that might look like is below.)
And if we could connect it to REAL LIFE?2 -
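As a thought experiment only: such an API might pair a scene of typed entities with a layer of abstract concepts and a hook back into real-world sensors. Everything in this C# sketch is invented for the sake of argument; no such library exists as far as I know.

// Purely hypothetical sketch of what a "virtual world API" could look like.
// All type names here are made up for illustration.
public interface IWorldEntity
{
    string Id { get; }
    (double X, double Y, double Z) Position { get; set; }
}

// Abstract concepts (ownership, weather, "danger") live alongside physical objects.
public interface IConcept
{
    string Name { get; }
    double Intensity { get; set; }   // e.g. how "dangerous" a region currently is
}

public interface IVirtualWorld
{
    IWorldEntity Spawn(string kind, (double X, double Y, double Z) at);
    void Attach(IWorldEntity entity, IConcept concept);   // tag an object with a concept
    void Step(double deltaSeconds);                       // advance the simulation
    // The "connect it to REAL LIFE" part: mirror a sensor reading onto a virtual entity.
    void BindToSensor(IWorldEntity entity, string sensorUri);
}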
[CONCEITED RANT]
I'm frustrated that I'm better than 99% of the programmers I have ever worked with.
Yes, it might sound so conceited.
I work mainly with the C#/.NET ecosystem as a fullstack dev (so also SQL, backend, frontend etc.), but I'm also forced to use that abhorrent horror that is JS and Angular.
I write readable code, I write easy code that works and rarely, RARELY causes any problem. The only fancy stuff I do is using new language features that come with new C# versions, which in the latest versions are mostly syntactic sugar to make code shorter, more readable and easier.
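To be concrete about the kind of syntactic sugar I mean, here are a few of the newer C# features I lean on; nothing exotic, just less boilerplate (the Customer type and the names in the snippet are made up for the example):

// Records: one line instead of a class full of boilerplate properties and equality.
public record Customer(string Name, string? Email);

public static class SugarExamples
{
    // Expression-bodied method plus a switch expression with property patterns
    // instead of a chain of if/else and casts.
    public static string Describe(object o) =>
        o switch
        {
            Customer { Email: null or "" } c => $"{c.Name} has no email",
            Customer c => $"{c.Name} <{c.Email}>",
            null => "nothing",
            _ => o.ToString() ?? "unknown"
        };

    public static void Demo()
    {
        // Target-typed new: the type is already on the left, no need to repeat it.
        Customer c = new("Ada", "ada@example.com");

        // Null-coalescing assignment: assign only if the value is currently null.
        string? note = null;
        note ??= "no notes";

        System.Console.WriteLine(Describe(c) + " / " + note);
    }
}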
The people I have worked with (lots of them) mostly try to overdo, overengineer and overcomplicate code, subdividing it into methods when not needed, fragmenting the code and adding tons of variables.
People have only needed me to explain my code when the codebase was huge (200K+ lines, mostly written by me), so they don't have to spend hours understanding what's going on, or, if the customer requested a new technology, to explain that technology so they don't have to study it (which is perfectly understandable). (For example, it happened that I was forced to use the DevExpress package because they wanted to port a huge application from .NET 4.5 to .NET 8, and rewriting the whole DevExpress logic had a HUGE impact on costs, so I explained it thoroughly and supported them during development because they didn't know DevExpress.)
I don't write genius code or clever tricks and patterns. My code works, doesn't create memory leaks or slowness, and mostly passes unit tests on the first run. Of course I also introduce bugs and everything, but that's part of the process.
The point is that other people make unreadable code, and when their code gets passed around you hear rising chaos, people cursing "WTF does this even mean, why did he put that here, what the heck is this even supposed to do", you get the drill. And this happens when I read everyone else's code too.
But the opposite doesn't happen. My code is often readable because I only do code triple backflips on personal projects, where I don't have to explain anything to anyone and I can learn new things and new coding styles.
Instead, people want to impress at work, and this results in unintelligible, chaotic code, full of bugs, that people can't read. They want to mix in the coolest technologies because they feel their virtual penis growing as they show off that they are the latest bleeding-edge technology experts and all.
They want to experiment on business code at the expense of all the other poor devils who will have to manage it.
Heck, I even worked with a few Microsoft MVPs.
Those are deadly. They're superfast code-throughput people that combine a lot of stuff.
Then they leave the problems to you once they're gone.
This MVP guy, on a big project for digital paperwork acquisition for a big company (the huge project I got called to work on, which consisted of a backend and a frontend web portal), pushed at all costs to put in the middle another CDN web project and another Identity Server project, to do both caching with the CDN "to make it faster" and SSO (single sign-on) with the identity server.
We had to do gruesome work to deal with the browser's poor caching management, and when he left, the SSO server started to loop after authentication at random intervals, and I had to spend days debugging the nasty stuff he had put in.
People definitely can't code, except me.
They have this "first of the class" syndrome that goes beyond what their skill allows, and they try to do code backflips when they can't even do code pushups, to put it in physical-exercise terms.
And most people are like this. They will deny it and won't admit it; they believe they're good at it, but in reality they aren't.
There are some geniuses out there who write revolutionary code and maybe need to write horrible code to do amazing stuff, and that's ok. And there are also a few people like me, with whom you can work and produce great stuff.
I found one colleague like this and we had an $800,000 (yes, 800k) project in .NET technology, which consisted of the renewal of 56 web services, 3 web portals and 2 WinForms applications for our country's main railway transport system. The two of us worked on it, with a PM from the railway company.
It was estimated at 14 months of work, we took 11, and everything worked wonders. We had a ton of fun doing it because their PM was also a cool guy, and we did an awesome project and the codebase was a jewel. The only thing you can't grasp just by reading the code is how railway systems work, and that's the only difficult thing.
Sigh, these people are making me sick of this job11
Feature not a bug...
My work laptop has started rebooting almost every night.
It's not clear why, but I sort of think of it as a feature now.
I have an ultra-wide monitor, plus another wide next to that one, and a bunch of virtual desktops.
I often think "OK, everything is where it is, that's good", but in reality, with a bazillion things open across all the desktops and screens, sometimes when I come back the next day... it's actually just a lot of mess / overhead to pick up where I was.
Sometimes I think we introduce a lot of complexity to solve a problem and ... actually it's just more complexity if you're not already 8 layers deep.5 -
Design in Motion: Real-Time Rendering's Impact on Architecture
Architecture, a discipline that once relied heavily on blueprints, models, and lengthy render times, has undergone a revolutionary transformation in recent years. The advent of real-time rendering technology has fundamentally altered the way architects visualize, present, and interact with their designs. This paradigm shift has not only enhanced the creative process but has also empowered architects to make more informed decisions and create immersive experiences for clients and stakeholders.
Real-time rendering, a technological marvel that harnesses the power of high-performance graphics hardware and advanced software algorithms, allows architects to generate photorealistic visualizations of their designs in a matter of milliseconds. Gone are the days of waiting hours or even days for a single rendering to complete. This acceleration in rendering time has not only expedited the design process but has also encouraged architects to explore multiple design iterations rapidly.
One of the most significant impacts of real-time rendering on architecture is the ability to visualize a design in various lighting conditions and environmental settings. Architects can now instantly switch between daytime and nighttime lighting scenarios, experiment with different materials, and observe how their designs respond to different seasons or weather conditions. This level of dynamic visualization offers insights into how a building's appearance and functionality evolve throughout the day, contributing to more holistic and thoughtful design solutions.
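Conceptually, such an instantaneous day-to-night switch is not magic: in a typical real-time engine it can amount to re-aiming and re-tinting a single directional light, with the entire scene updating on the next frame. The following toy sketch in Unity-style C# illustrates the principle; it assumes a single directional "sun" light in the scene and is not how any particular architectural visualization tool implements the feature.

using UnityEngine;

// Toy day/night switch: rotate and tint one directional light and the
// whole scene's lighting changes in real time.
public class DayNightSwitch : MonoBehaviour
{
    public Light sun;                      // the scene's directional light (assumed to exist)

    public void SetTimeOfDay(float hour)   // 0..24
    {
        // Sun elevation: -90 degrees at midnight, +90 degrees at noon.
        float elevation = (hour / 24f) * 360f - 90f;
        sun.transform.rotation = Quaternion.Euler(elevation, 30f, 0f);

        // Warmer, dimmer light near sunrise/sunset; cool and bright at midday.
        float daylight = Mathf.Clamp01(Mathf.Sin((hour - 6f) / 12f * Mathf.PI));
        sun.intensity = Mathf.Lerp(0.05f, 1.2f, daylight);
        sun.color = Color.Lerp(new Color(1f, 0.5f, 0.3f), Color.white, daylight);
    }
}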
Moreover, real-time rendering has transformed client presentations. Architectural concepts can now be communicated with unprecedented clarity and realism. Clients can virtually walk through spaces, observing intricate details, exploring different angles, and even experiencing the play of light and shadow in real-time. This immersive experience fosters a deeper understanding of the design intent, enabling clients to provide more targeted feedback and make informed decisions.
The impact of real-time rendering on collaboration within architectural teams cannot be overstated. Traditionally, architects and designers would need to wait for a rendering to complete before discussing design changes or improvements. With real-time rendering, team members can make adjustments on the fly, observing the immediate effects of their decisions. This seamless collaboration not only enhances efficiency but also encourages interdisciplinary collaboration as architects, engineers, and other stakeholders can work together in real-time to refine designs.
The integration of virtual reality (VR) and augmented reality (AR) into the architectural workflow is another transformative aspect of real-time rendering. Architects can now create VR environments that allow clients to step inside their designs and explore every nook and cranny. This not only enhances client engagement but also enables architects to identify potential design flaws or spatial issues that might not be apparent in 2D drawings. AR, on the other hand, overlays digital information onto the physical world, facilitating on-site decision-making and construction supervision.
Real-time rendering's impact extends beyond the design phase. It has proven to be a valuable tool for public engagement and community involvement in architectural projects. By creating virtual walkthroughs of proposed structures, architects can offer the public an opportunity to experience the design before construction begins. This transparency fosters a sense of ownership and allows for constructive feedback, contributing to the development of designs that resonate with the community's needs and aspirations.
The environmental implications of real-time rendering are also noteworthy. The ability to visualize designs in various environmental contexts contributes to more sustainable architecture. Architects can assess how natural light interacts with interior spaces, optimizing energy efficiency and reducing the need for artificial lighting during the day.
In conclusion, real-time rendering has ushered in a new era of architectural design, propelling the industry into a realm of dynamic visualization, immersive experiences, and enhanced collaboration. The ability to witness designs in motion, explore different lighting conditions, and interact with virtual environments has redefined how architects approach their craft. From facilitating client presentations to fostering sustainable design solutions, real-time rendering's impact on architecture is profound and multifaceted. As the technology continues to evolve, architects have an unprecedented opportunity to push the boundaries of creativity, efficiency, and sustainability in the built environment. -
The Turing Test, introduced by Alan Turing in 1950, has been a foundational concept for evaluating a machine's ability to exhibit human-like intelligence. But as we edge closer to the singularity—the point where artificial intelligence surpasses human intelligence—a new, perhaps unsettling question comes to the fore: Are we humans ready for the Turing Test's inverse? Unlike Turing's original proposition where machines strive to become indistinguishable from humans, the Inverse Turing Test ponders whether the complex, multi-dimensional realities generated by AI can be rendered palatable or even comprehensible to human cognition. This discourse goes beyond mere philosophical debate; it directly impacts the future trajectory of human-machine symbiosis.
Artificial intelligence has been advancing at an exponential pace, far outstripping Moore's Law. From Generative Adversarial Networks (GANs) that create life-like images to quantum computing that solve problems unfathomable to classical computers, the AI universe is a sprawling expanse of complexity. What's more compelling is that these machine-constructed worlds aren't confined to academic circles. They permeate every facet of our lives—be it medicine, finance, or even social dynamics. And so, an existential conundrum arises: Will there come a point where these AI-created outputs become so labyrinthine that they are beyond the cognitive reach of the average human?
The Human-AI Cognitive Disconnection
As we look closer into the interplay between humans and AI-created realities, the phenomenon of cognitive disconnection becomes increasingly salient, perhaps even a bit uncomfortable. This disconnection is not confined to esoteric, high-level computational processes; it's pervasive in our everyday life. Take, for instance, the experience of driving a car. Most people can operate a vehicle without understanding the intricacies of its internal combustion engine, transmission mechanics, or even its embedded software. Similarly, when boarding an airplane, passengers trust that they'll arrive at their destination safely, yet most have little to no understanding of aerodynamics, jet propulsion, or air traffic control systems. In both scenarios, individuals navigate a reality facilitated by complex systems they don't fully understand. Simply put, we just enjoy the ride.
However, this is emblematic of a larger issue—the uncritical trust we place in machines and algorithms, often without understanding the implications or mechanics. Imagine if, in the future, these systems become exponentially more complex, driven by AI algorithms that even experts struggle to comprehend. Where does that leave the average individual? In such a future, not only are we passengers in cars or planes, but we also become passengers in a reality steered by artificial intelligence—a reality we may neither fully grasp nor control. This raises serious questions about agency, autonomy, and oversight, especially as AI technologies continue to weave themselves into the fabric of our existence.
The Illusion of Reality
To adequately explore the intricate issue of human-AI cognitive disconnection, let's journey through the corridors of metaphysics and epistemology, where the concept of reality itself is under scrutiny. Humans have always been limited by their biological faculties—our senses can only perceive a sliver of the electromagnetic spectrum, our ears can hear only a fraction of the vibrations in the air, and our cognitive powers are constrained by the limitations of our neural architecture. In this context, what we term "reality" is in essence a constructed narrative, meticulously assembled by our senses and brain as a way to make sense of the world around us. Philosophers have argued that our perception of reality is akin to a "user interface," evolved to guide us through the complexities of the world, rather than to reveal its ultimate nature. But now, we find ourselves in a new (contrived) techno-reality.
Artificial intelligence brings forth the potential for a new layer of reality, one that is stitched together not by biological neurons but by algorithms and silicon chips. As AI starts to create complex simulations, predictive models, or even whole virtual worlds, one has to ask: Are these AI-constructed realities an extension of the "grand illusion" that we're already living in? Or do they represent a departure, an entirely new plane of existence that demands its own set of sensory and cognitive tools for comprehension? The metaphorical veil between humans and the universe has historically been made of biological fabric, so to speak.7 -
I keep getting surer that we live in a virtual reality: this time it was the clock change, and while compiling the software an error popped up and the bad weather returned.