Search - "minority report"
-
tl;dr Do you think we will move away from editing raw source code any time soon? Will IDEs or other interfaces let us change code through a graphical representation, or even by voice?
---
One thing I found funny watching Westworld is how they depicted the "programming" - it is more like swiping on a smartphone, maybe a bit like Tom Cruise's investigations in Minority Report, or giving certain commands and keywords by voice.
There was one quote from Uncle Bob's "Clean Code" I could never find again, where he said something along the lines that back in the seventies or eighties they thought they would soon raise programming languages to such a high level that we would use natural-language interfaces - and look at us now, still the same "if"s.
So I feel uncomfortable without my shell, and having tried a graphical programming language once (LabVIEW), it seemed clumsy to me at best. But maybe there are a lot of web devs here, and it seems that with their frameworks you might be able to abstract away a lot of the pesky systems programming... So do you feel like moving to some new shiny programming experience, or do you think it will stay the same for decades more, since the computer is that stupid machine where you have to spell it out instruction by instruction anyway?
-
Smart contact lenses and the appropriate software. It would be the ultimate AR experience. I have no idea how to produce them, as they would need to be super high resolution, lag-free, completely wirelessly powered and connected, safe to use and to wear, and usable 24/7.
My current concept is an ultrabook-sized block that can be carried around in a backpack.
Oh and wireless handoff ...
meaning everything I grab and throw in your general direction becomes available to you, kind of like they do it in Avatar. This should also work with PCs, tablets and everything else.
Speaking of grabbing, you would also need some kind of Minority Report glove so every bit of hand movement can be tracked precisely - but probably a bit more elegant, meaning only small stickers on the back of your hand.
Did I mention that sharing stuff should enable working together on the same object in real-time?
Also this system should integrate seamlessly with a smart environment, meaning looking at the light, opening its context menu and changing its brightness or colour should be no effort at all.
And of course all of it should be open source, highly scalable and either hosted on public infrastructure (funded by taxes or something) or by each individual for themselves to protect their privacy.
So who is with me?
-
Currently working on my new app idea that will revolutionise modern politics. Never again will you have to worry about your opinions affecting somebody else. My new forum will require Racial Detection™ where you will have the opportunity to voice anonymous concerns and see the points of view of different skin colours. Then you can truly pander to your desired demographic.
We're calling it: Minority Report