I keep thinking: if there were one universal, efficient programming language that could handle every kind of task, that would be cool... Maybe it would cause a programming-language singularity !!!

  • 5
    Maybe if there would be a single most efficient Race and Religion of People, there would be singularity and World Peace.
  • 8
    Kill all the current developers and teach the next generation only 1 language, that's the only solution.
  • 5
    @theabbie Yeah lol :)... There wouldn't be much tension over the compatibility of different tech stacks/technologies if there were only one language... Though I think in the far future it may be a language that an AI generates/creates ...
  • 2
    Why, that's what C++ tried to do - and it sucks.
  • 0
    @Fast-Nop Yeah... The way development is progressing, optimizing huge numbers of lines of code is something that has to be tackled ....
    I think even Python attempted the same, but it's slow ... Also, in dev its scope is mostly the backend... A single language for frontend/backend/Android would be cool.... I don't think there have been any known attempts at that...
  • 4
    Have you seen Rust though
  • 2
    @12bitfloat Nope ... I've to try it though!
  • 5
    @Surajv It's really amazing. As close to a "universal" programming language as you can find
  • 0
    But we already have lisp.
  • 0
    @12bitfloat Rust is designed as a competitor to C, and it actually sucks at that role.
  • 0
    @Fast-Nop Where exactly does it suck at competing with C? It has basically the same performance and allows for the same low-level hardware access C does. It's also really portable, granted not as portable as C but still
  • 2
    @12bitfloat It sucks because the standard library is anaemic; you need crates for everything, except there's no standardisation and you have no guarantee that crates from today will still be maintained in 5 years.

    It's dependency hell like NPM, which is no wonder because Rust came from a browser vendor where people think this is normal. And where the software product itself is short-lived because nobody uses 5 year old browsers.

    The language itself isn't standardised either, and there's only one compiler vendor.

    On top of that, Rust is a puzzle language where you spend more time bending your architecture to Rust than actually coding. Even trivial shit like doubly linked lists or a data buffer shared between an interrupt and the application requires advanced Rust knowledge.

    Or you just say "fuck this shit" and resort to "unsafe" wherever Rust gets in the way, negating the intended advantages as you go.

    The only reason Rust code is "safe" is because it doesn't let you do anything.
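For readers unfamiliar with the "puzzle language" complaint: a doubly linked list really is awkward in safe Rust, because each node needs shared ownership plus a non-owning back-pointer. A minimal illustrative sketch (my own example, not from any commenter):

```rust
use std::cell::RefCell;
use std::rc::{Rc, Weak};

// A doubly linked node in safe Rust: shared ownership (Rc),
// interior mutability (RefCell), and a non-owning back-pointer
// (Weak) so the two directions don't form a leaking cycle.
struct Node {
    value: i32,
    next: Option<Rc<RefCell<Node>>>,
    prev: Option<Weak<RefCell<Node>>>,
}

fn main() {
    let first = Rc::new(RefCell::new(Node { value: 1, next: None, prev: None }));
    let second = Rc::new(RefCell::new(Node { value: 2, next: None, prev: None }));

    // Wiring up just two nodes already needs explicit borrows.
    first.borrow_mut().next = Some(Rc::clone(&second));
    second.borrow_mut().prev = Some(Rc::downgrade(&first));

    // Walk backward through the Weak pointer, and forward again.
    let back = second.borrow().prev.as_ref().unwrap().upgrade().unwrap();
    let fwd = first.borrow().next.as_ref().unwrap().clone();
    println!("{} {}", back.borrow().value, fwd.borrow().value); // prints "1 2"
}
```

In C the same structure is two plain pointers per node; whether the extra ceremony is a cost or a safety feature is exactly what the thread argues about.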
  • 1
    Looking at history, it's just not feasible. C++ tried it, Java tried it, and JS is currently the latest runner.
  • 4
    @Fast-Nop First of all, the small std lib is because the Rust team doesn't want to cram bad code in there just for the sake of having it like PHP does (or did)
    Still, it's UNIVERSES ahead of what C has, which, let's be honest, isn't even much of a stdlib and more a random set of more or less useful functions

    Further, it's true that Rust has a lively, NPM-like ecosystem, but I don't get your problem with it. You don't have to depend on any of it; you can code everything yourself ad-hoc style just like in C.
    And even when a library gets abandoned your code doesn't just stop working, it may not get any new features but you still have what you signed up for

    The fact that Rust doesn't have a spec is way blown out of proportion. It's only been 5 years since the lang stabilized with 1.0 and a lot of features ARE specified to great extents in their respective RFCs. It probably will get a spec at some point but it takes time
  • 3
    @Fast-Nop The last point deserves an extra comment because it's just categorically wrong.

    Rust is a true systems language. All of its memory safety mechanisms (apart from range checks, which all of your C code should have as well!) happen entirely at compile time, resulting in the same code you would have written by hand in C.
    *But that's exactly the issue*: you *wouldn't* have written the same safe code in C, because you are human and you make mistakes.
    70% of all bugs at Microsoft and in Chrome are memory safety related. Almost ALL critical security vulnerabilities are memory safety related.

    That's why Rust only allows you to dereference raw pointers in unsafe blocks.
    And 99.9% of the time you don't need to do that.
    And even if you do, at least the unsafe block makes it apparent exactly where the code you need to audit is located
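The raw-pointer point fits in a couple of lines: creating a raw pointer is safe, and only dereferencing it requires an unsafe block (a minimal sketch of my own, not from the thread):

```rust
fn main() {
    let x: i32 = 42;
    let p: *const i32 = &x; // creating a raw pointer is safe

    // Dereferencing is not: the compiler can't prove p is valid,
    // so the deref has to live in an unsafe block you can grep for.
    let y = unsafe { *p };
    println!("{}", y); // prints 42
}
```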
  • 3
    @Fast-Nop RedoxOS is a unix-like OS written entirely in Rust. They didn't seem to have a problem with Rust holding them back, did they?
  • 0
    @12bitfloat If you don't see the problem with a language that has no ISO standard and only a single compiler vendor on the one hand and long-term projects on the other hand, I can only assume that you've never had to deal with long-term projects.

    That problem won't get better because without ISO standard, you can't tell the compiler to use the standard from 2020 when it's 2030. Python is a cautionary example why you don't rely on non-standardised quicksand.

    If you don't see a problem with an NPM like ecosystem and long term projects, I really can't help it. Oh you don't have to use... except you do have to because the standard library is anaemic.

    That's the main reasons why most devs think Rust is super hot while next to nobody is using it in reality except for toy projects. And yeah, Redox OS is a very nice toy project that nobody uses written in a nice toy language.

    Btw - C11 has thread support directly in the language, so the standard library isn't what it was in C99.
  • 2
    @Fast-Nop The difference between Rust and Python is that the Rust team actually has some common sense and doesn't introduce breaking changes. Outside of editions, there is no backwards-breaking change, ever.
    Would a spec be good? Sure. But C++ has a spec which is almost useless because all compilers do fuckshit that is not in the spec.

    If anything, calling Rust a toy language proves that you live in a low-level bubble with zero experience outside of it.
    You only know C so C must be the be all end all of programming languages

    Also kinda ironic since "C has multithreading in the std lib now" is the worst defense you could have possibly made when calling another language a "toy" lmao
    Rust has always had multithreading in the stdlib including efficient mpsc channels and all sorts of other sync primitives. Because that's what you would expect from a systems language worth anything

    Rust also prevents data races at compile time. Guess C isn't toy enough for something like that, huh?
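The channels and threads referred to above are indeed in Rust's std (`std::sync::mpsc`, `std::thread`). A minimal sketch with made-up values:

```rust
use std::sync::mpsc;
use std::thread;

fn main() {
    let (tx, rx) = mpsc::channel();

    // Spawn a worker that sends values back over the channel.
    // tx is moved into the thread; ownership rules forbid the
    // kind of accidental sharing that causes data races.
    let handle = thread::spawn(move || {
        for i in 0..3 {
            tx.send(i).unwrap();
        }
    });

    // The receiving iterator ends when the sender is dropped.
    let sum: i32 = rx.iter().sum();
    handle.join().unwrap();
    println!("{}", sum); // prints 3
}
```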
  • 0
    @12bitfloat C11 is from 2011, that's nine years. Oh, Rust has had it for one year longer, back when it was completely unusable - big deal.

    The only reason why Rust prevents data races is the same why Rust is safe: because stuff like lockless access to shared buffers (extremely common in ring buffers) is moved to the "unsafe, we don't care" area anyway.

    And yeah, betting on the Rust team as only "vendor" having common sense sounds like a great basis for industrial projects.

    But I'm sure you'll have fun when you update a project from five years ago and shit won't even build anymore.

    And no, threading isn't to be expected in a systems language because such a language will be used to IMPLEMENT threading in the first place.
  • 0
    There is a shitton of universal languages suitable for a lot of different tasks. Some optimized for speed, some for maintainability.
    You can even choose any combination of paradigms and syntaxes - and chances are good that there is a reasonably performant universal language which has been designed with that set in mind.

    So what are you missing?
  • 5
    @Fast-Nop @12bitfloat This discussion is exactly why a universal programming language is impossible.

    I'm on the Rust-evangelists side. I've written backends (using Rocket.rs) and API clients (Reqwest) with it, networking proxies (Tokio), file conversion tools, web assembly frontends, a desktop application (Azul).

    It can fucking handle everything, and its type system is at 95%-Haskell-level S-tier perfection.

    But full-featured type systems (and in Rust's case ownership/borrowing/lifetimes/etc) do come with overhead. They warn you about risky code, or even make risky code impossible.

    Sometimes you don't want risky code to be impossible. Sometimes, race conditions aren't a big deal -- like when you want a backend up and running TODAY, and it will handle 10 requests per hour.

    I mean, I'm helping my wife to learn coding. Sure, Rust is perfection in my eyes -- but throwing someone in that pit is just mean. So of course I installed PyCharm & sent her a bunch of Python tutorials!
  • 3
    @12bitfloat @Fast-Nop Also, both C & C++ have the same exact problem as Rust:

    A high barrier to entry. For JS/Python, you ONLY need some logic and syntax. For PHP, and even more so for Java, you need a bunch of object-oriented knowledge ("Why the fuck is my code in a class? What the fuck is a constructor? WTF WHY DO YOU KEEP TALKING ABOUT DEPENDENCY INJECTION?")

    For C, someone must also have a deep understanding about memory management, filesystems, low-level networking, and complicated CS topics. I'm not going to explain what a Mutex is in dev bootcamp day 1.

    Rust circumvents the C problem of memory management with a rock solid type system & ownership concept -- which is just as much of a steep learning curve as learning manual memory management.

    Having that strict type system does come with a bunch of advantages in my opinion -- but no matter your preference, neither language is exactly suitable for:

    * Teaching coding 101

    * Quick prototyping

    * Hiring a cheap team
  • 2
    You are talking about ARM assembler aren't you..
  • 0
    @Nanos RISC-V ASM 🤔
  • 1
    @Fast-Nop I see you are not a follower of the way of the C++. Brother Stroustrup loves all. lol
  • 3
    Lol, started thinking of programming languages like religions:

    C is the Jews

    C++ is Christians

    C# is Mormons

    Javascript is Heavens Gate cult
  • 1

    ARM is for Satanists, or Jediism.. ?
  • 1

    It has the most beautiful instruction set I've ever seen in a CPU.

    Some years ago I got to play with an early RISC-like machine, an HP Apollo Domain Series 10,000 with an A88K CPU - talk about a huge CPU board, and it had 4 of them.
  • 0
    @Fast-Nop Ruby is used in production, Go is used in production, JavaScript, Python, PHP, Perl. All used in production

    So I don't see the problem with adopting Rust. It's not made by a company, it's fully open source. The "Rust team" is just the governing body having the last say.

    How is that any different from the C or C++ committee? It's not, and at the end of the day you'll always have to trust someone somewhere not to fuck shit up, no matter if that's a committee or a core team

    Maybe I'm too naive to give this trust to the people behind Rust but I don't think so. They have proven themselves to do the right thing and advance the language in a great direction while being extremely careful about preserving backwards compatibility
  • 1
    @Nanos Java is Scientology, Assembler is atheism.
  • 2
    @12bitfloat With C/C++, you don't have just one compiler, you have several. Therefore, you have several teams arguing. This also prevents people from declaring stuff standard just because it's easier for their specific compiler to implement.

    I won't trust a language that locks me to any single team and a single compiler vendor, no matter their track record. No second source == no, thank you.

    And then you have ISO standards which mean that even e.g. current GCC 10 can compile ANSI C code from 30 years ago when I give -std=c89 as compiler option. Similar for C++98.

    But without an ISO standard and compiler-switch support for it, a future compiler version may be required because of bugfixes, yet still fail to build today's code even if everything is archived completely locally.

    Also, the ecosystem with anaemic std lib and hundreds of competing crates is a lottery in future maintenance.

    Besides, I'm not really a fan of puzzle languages.
  • 1
    So new languages only have a chance to become usable for you if there is enough dispute between users to make multiple compilers emerge?
  • 0
    @Oktokolo That's a pretty odd way to put it, which makes it a loaded question. The only correct way to deal with loaded questions is refusing to answer.
  • 1
    @Fast-Nop Just the Rust teams combined, excluding all other contributors, consist of over 100 people, and they argue non stop.
    Which async/await syntax to choose was a months long debate with thousands of people chipping in.
    So in fact, I'd say this way of doing things is better than C or C++, because design by committee doesn't work. If you don't agree, then why aren't you using C++?

    As to implementations, having one official implementation is always gonna be better than having many non-portable ones.
    Competition is only advantageous when you don't have a completely open source project in the first place. It's a waste of energy building a competitor when you could just make the official implementation better

    Also stop with this dumb argument over the std lib. You're defending C. You have no ground to stand on whatsoever talking about a standard library
  • 1
    @12bitfloat I disagree. Having several implementations bound by a common standard is way better, not to mention that this makes them portable.

    Which harks back to the problem that Rust doesn't have a standard and thus doesn't have an implementation - instead, whatever the current compiler code happens to be, that's the "spec".

    And no, competition e.g. between GCC and LLVM is totally healthy although both are OSS, plus that there are a lot of other compilers in industrial use.
  • 0
    @12bitfloat I think I already mentioned C11, and I won't explain it again. If you compare 2020's Rust to 1990's C, that's your problem.
  • 0
    @Fast-Nop Does C11 have logging, UTF-8 strings (owned and sliced, growable), growable vectors, deques, lists, hashmaps, btrees, error handling stuff, futures, composable iterators, panic handling stuff including backtraces, file path parsing/handling, reference-counted smart pointers, mpsc channels, threads, barriers?
    No, it doesn't. Whether it's C99 or C11, they have the bare essentials

    Compare the entirety of the C std lib (https://en.cppreference.com/w/c/... ) with that of Rust (https://doc.rust-lang.org/std/ ) and then tell me which one is anemic
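As a concrete illustration of the std-library claim, here is a word count using only std types - growable vector, hash map, and a composable iterator chain (the sample data is made up):

```rust
use std::collections::HashMap;

fn main() {
    // Growable vectors, hash maps, and composable iterators all
    // ship in Rust's std - no external crates involved.
    let words = vec!["a", "b", "a", "c", "a", "b"]; // made-up sample data

    let mut counts: HashMap<&str, u32> = HashMap::new();
    for w in words {
        *counts.entry(w).or_insert(0) += 1;
    }

    // Find the most frequent word with a plain iterator chain.
    let (word, n) = counts.iter().max_by_key(|(_, n)| **n).unwrap();
    println!("{} appears {} times", word, n); // prints "a appears 3 times"
}
```

In C99/C11 each of these containers would be hand-rolled or pulled from a third-party library, which is the asymmetry the comment is pointing at.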
  • 1
    Okay, let me rephrase: You would only use languages where the compiler-building community is wasting resources by maintaining two different compilers instead of working together on one?
  • 0
    @12bitfloat Come on, stuff like growable vectors is trivial. Anyway, my issues remain, and I don't see Rust taking over any time soon. Since the discussion is turning in circles, I'm dropping out.

    @Oktokolo Same kind of loading, just re-worded. I won't even reply to a third loaded question.
  • 0
    Standards are great, except when they are wrong..

    For example, September 27, 2012 is represented as 2012-09-27.


    Why do we have to represent "September" with the number "09" ?

    Are we running out of memory / storage that this is so important..

    And wouldn't it be useful to include what day it is as well.. "Thursday"..

    > Monday 27 September 2012

    Now, isn't that easier to read for everyone..
  • 0
    No problem, you actually were pretty clear on both topics already.
  • 0
    Yes, it clearly is much more readable for everyone...
    who speaks english.
    The I in ISO stands for "international".

    Current most-spoken language on this planet:
    Mandarin Chinese with 918 million speakers.
    Would you like to have "2012年9月27日" (Google thinks they use that format) instead of "September 27, 2012"?
    I mean, it would still be pretty readable as they use our digits. ;)
  • 1
    You cannot mention C++ without mentioning the major libraries and ecosystems. They don't exist in isolation.

    Boost (the testbed for standardization)

    Qt (which has tons of objects that cover most of the objects that went into the stl)

    Also, C++17/20 is way further along than C++11.
  • 2

    Get out with your imperial ISO dates!

    Metric time FTW. And none of this Unix timestamp bullshit either.

    435 petaseconds since big bang. 2 petaseconds since the dinosaurs roamed the planet. Heh, it feels like yesterday.
  • 1
    @Demolishun C++ is a foundation. Which is important... but I'd argue that a PyQt app will be finished faster than a C++ Qt app.

    Sometimes you want a concrete foundation with a prefab wooden house on top.

    Other times you just want to pitch an ugly tent, and write an electron app.
  • 1
    @bittersweet I have run into issues with the Python ecosystem when freezing code (making exes). I am currently converting some apps I wrote in Python to C++ due to this problem. While I like Python, the problems with freezing and the lack of libraries being updated to Python 3 have made me rethink using Python for desktop apps.
  • 2
    @Demolishun Yeah, building a neighborhood with wooden houses is faster, until there's a forest fire or hurricane.
  • 0

    Are all numbers English ?

    I don't like any date where its either:

    02 04, which could mean the 2nd of April, or the 4th of February.

    I'm reminded my uncle knew Mandarin..

    I'd be quite happy to write dates in that, if it meant less confusion !
  • 0

    Are there 10 seconds in a metric minute, and 10 minutes to the metric hour, and 10 hours to the metric day ?

    And 10 metric days to a week ?

    We'd still run into issues though, like 04 02 possibly being 02 04, or if there was finger trouble, maybe someone meant 03 02..

    But if you include the day name and the name of the month, think of it as a parity check..
  • 0
    If it starts with a year, the numbers are pretty unambiguous.
    But you can also just memorize that ISO dates are written like other numbers: most significant digits first.
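The sortability point can be demonstrated directly: because the most significant field comes first, ordinary string comparison orders ISO dates chronologically. A tiny sketch (dates chosen arbitrarily):

```rust
fn main() {
    // ISO 8601 orders fields most significant first, so plain
    // lexicographic string sorting is also chronological sorting.
    let mut dates = vec!["2020-10-08", "2012-09-27", "2020-01-15"];
    dates.sort();
    println!("{:?}", dates); // prints ["2012-09-27", "2020-01-15", "2020-10-08"]
}
```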
  • 0

    Only works if everyone sticks to the standard, and it doesn't include redundant data to help with error correction.

    I'm reminded of archaeological data issues too..

    Which reminds me of the time a company ordered a new lift, units of course.

    I did point out in a management meeting that, as the new unit of measurement was cm and the old unit was inches, and the records of the previous lift did not include which units they were measured in, someone was going to make the mistake of ordering the new lift in cm without actually checking that they weren't just putting in the number of inches from the previous lift..

    And lo and behold, when the new lifts arrived and were unboxed, the engineer looked very puzzled as he asked whether we had ordered them for midgets..
  • 0
    You could at least include something like:

    Yxxxx Mxx Dxx, so folk know it's Year, Month, Day.

    But if you are going to include text in your datafield, why not go the whole hog and just include the entire name of the day and month too !
  • 1
    I'm reminded of when a brilliant engineer, who could never find anyone else to work on projects with, was complaining about this issue.

    So I said I'd work with them..

    Our first task: to build some video thing.

    But before then, we had to agree on standards..

    They wanted color.

    I wanted Black & White to save on bandwidth and reduce stuttering issues on low bandwidth devices like mobile phones.. (Since not everyone lives in Utopia with unlimited high speed internet..)

    They wouldn't change their mind from colour, and quit working with me !

    So, who was right, me or them. :-)

    See, standards are not so easy to agree on..
  • 2
    @Nanos Why the redundancy? You have a value and a unit.

    The only true SI time unit we have is the second. The only true SI prefixes we have are kilo, mega, giga, etc.

    Minutes, hours, days, weeks, all irrelevant.

    A lunch break takes 1 or 2 ks (~17-34 minutes).

    A human might focus on a single task for about 10 ks (2.7 hours) before needing a break.

    A work period might last 25 ks (7h) per 100 ks (1.15d). A series of consecutive work periods could be 0.5 Ms (5.7d), after which you have 0.2 Ms (2.3d) completely off work.

    You celebrate your birth once every 30 Megaseconds, or once per 0.1 Gigaseconds if you're not that social. Not because the planet completed a circle around a star, just because it seems like a good frequency to have people you don't like that much in your house.

    Also, instead of using 1970 or the Big Bang, both of which are impractical, we should consider "now" to be roughly the 400th Gigasecond -- using 12,000 years ago, approximately the start of human history, as the epoch.
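The kilosecond arithmetic above checks out; a quick sketch of the conversions, using the durations from the comment:

```rust
fn main() {
    // The comment's proposed metric-time units, converted back
    // to familiar hours and days for a sanity check.
    let work_period_s = 25_000.0_f64; // "25 ks" work period
    let day_s = 100_000.0_f64;        // the proposed "100 ks" day
    let year_s = 30.0e6_f64;          // "30 Ms" between birthdays

    println!("{:.1} h", work_period_s / 3_600.0); // ~6.9 h
    println!("{:.2} d", day_s / 86_400.0);        // ~1.16 d
    println!("{:.0} d", year_s / 86_400.0);       // ~347 d, close to one calendar year
}
```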
  • 0
    That is brilliant. Laughed hard.
    But seriously: I would expect that to work pretty well after an accommodation time of 30 Ms.
  • 0
    Yes, standards only work if people use them exclusively.

    While transitioning from one standard to another, it obviously is important to make sure that everyone involved knows what they are dealing with.
    The cm/inch fuckup was definitely foreseeable, and thank you for warning the management in advance (not your fault that they did not route the message to the one actually placing the order).
    By the way: the next fuckup will be when switching from cm to mm, which seems to be the real standard when it comes to machining stuff...

    The beauty of current short date formats is their space-efficiency while still being exceptionally easy to parse for humans _and_ machines alike.
    ISO did not invent that format. They just picked the only one of them that also supports easy sorting. Some countries already used it.
    "2020-10-08" does not need any language translation for almost the entire population. Add words and you hinder sortability for all and readability for most.
  • 1
    @Oktokolo The advantages of not coupling time to periodic events of our planet (day/night, seasons) is also that:

    1. It works for all timezones
    2. It is planet-neutral
    3. Having no new year's is culture-neutral
  • 0
    Yes. I really am of the opinion, that it would work pretty well.
    But the first reaction of the tech people is laughter regardless. And non-tech people will just outright call it "madness".
    So while it would be a good thing, it certainly will not happen.

    By the way: TAI for the win.
  • 0

    > Add words and you hinder sortability

    Is that why so many things are not alphabetically sorted these days !

    I thought we had enough CPU cycles these days to handle letters with ease.
  • 0
    Yes, that is why a lot of things aren't sorted alphabetically - their alphabetic order differs from the order people would want to sort them in.

    The beauty of a standard which leads to intrinsic sortability of values is that you do not need to define specific sort orders for each language.

    Being language-agnostic also most obviously is an especially beneficial trait for a format to be used in data interchange between international entities.
  • 0
    > readability for most.

    And yet:

    2020-10-08

    Could also mean "2020-08-10"

    It doesn't tell you what day of the week it is, which is really quite an important thing, since weekends can cause issues.

    I'm reminded what a pain all of this can be for folk..


    I'm reminded of a slightly similar issue when a place used / as separator characters between numbers, written by hand..


    1 / 10 / 38493 / 18

    Could easily be mistaken for 1110 / 38493 118

    As we had millions of such things, you quickly got to see where the majority of errors and issues were.

    Dates, numbers, US / rest-of-the-world confusions, etc. caused, well, not incalculable expense to sort out, but unaffordable..

    It's worth testing solutions to find out which work best in practice.
  • 0
    We had a lovely experiment once, how to save electricity with lights.

    Current solution = lightswitch.

    Test solution = no lightswitch..

    Amazingly, they found no lightswitch used more electricity with the lights being on all the time..

    We also had two different kind of usual lightswitches, my favourite, the audio one, whistle, and it comes on, but goes off after 5 minutes..

    My least favourite, the movement detector one, since unless you are right in front of it, it can't see you, and goes off after 5 minutes..

    This involves lots of walking to light switches, which are a long way away.. (Like one block..)

    And when it goes off, you can't see your way to turn it on again..

    Unless you carry a torch !

    So, why is everyone carrying a torch when we have lights.. (Well, actually, only me and one other person carried a torch, everyone else just used to bump into sharp things in the dark..)
  • 0

    But, when starting a new standard, you don't introduce one that could be confused with an already existing one !

    Take the example talked about, and this snippet:

    I used to live in the USA and of course, the default was M/D/YYYY

    7 years ago I moved to the United Kingdom and changed the setting as Andrew pointed out and my default is DD/MM/YYYY for all new spreadsheets I create.

    In the US right now, it's:

    10/08/2020

    In the UK it's:

    08/10/2020

    New ISO format would be:

    2020-10-08
    It's still confusing as to whether the 10 is the day or the month, unless you know the standard..
  • 0
    Think of it like hieroglyphics: unless you know what the symbols mean, you are lost.

    If instead it was:

    Thursday 8 October 2020
    You don't need to know the standard, its self evident what the day and month are.

    And if someone does get the day/month wrong, there is a greater chance someone will notice..

    Thursday 10 August 2020
    Since that is a Monday, not a Thursday.

    But since most people are not so concerned with what year it is, but more concerned what day it is, the new standard should be the other way around..


    I reckon that will be more energy efficient and cause our eyeballs to spend less time searching for the most common answer to our question, which is, what day, rather than what year.

    Of course, we should test these solutions in the real world to determine just which is the more efficient solution.

    They did that right ?

    Since my solution is based upon working with millions of records and mistakes folk make..
  • 0
    As you can see, just trying to get 2 intelligent people to agree on a date format is much harder than you might think, imagine trying to get a standard language..

    Even though in theory it should be simple.

    Maybe that is why new languages tend to be written by a single person, since there is no one to disagree with them that they are doing it wrong. :-)

    Even though, in theory, group effort should produce better results..

    I guess it could do, if only we could agree on the design !

    So, how do you get folk to agree ?
  • 1
    Actually, adding units to the datetime components isn't that bad.
    2020y-10m-09d reads a bit noisier than 2020-10-09, but it still sorts fine and would indeed ease communication with people from random-component-order countries.
    But y, m, and d are SI prefixes, and only m is an SI unit - and also the wrong one. So we need some other markers (keep them ASCII for maximum compatibility).
    Or just ask them if they want to add your variant to the set of formats. They already have year-dayOfYear, which also nobody uses, so maybe you have a chance.

    Those lightswitch stories are hilarious.
    Even if it looks like an automated guessing solution would work reasonably well, a switch is almost always better.
  • 0


    > But y, m, and d are SI prefixes but only m is a SI unit - and also the wrong one.

    How about "year" "month" "day", or are those already taken.. ?

    Or "yy" "mm" "dd" ?
  • 0

    > Actually, adding units to the datetime components isn't that bad.


    Hurray, some agreement about improvements !

    There is hope yet for the human race. :-)

    Now, if we could all just agree which side of the road to drive on..

    And how many pins to have on our plugs..
  • 1
    Yes there is hope for the human race.
    Only a few countries don't use ISO standards and SI units now.
    The rest might become economically irrelevant in the near future anyway (see Trump's trade war and Brexit).

    Proper driving side obviously is right. Technically both sides would work, but the overwhelming majority drives on the right side.

    Power plugs need at least two pins - one more if you need a protective conductor for safety reasons.
    But you could make a plug which is like the Europlug but has a third pin in the middle for the protective conductor. Then the Europlug would even stay compatible with the new standard.
    But I don't know whether the Europlug format can safely be upgraded to support currents above 2.5 A...
    By the way: grid voltage could be normalized to 240 V, 50 Hz sine per phase, with three phases delivered to buildings (so you can charge your car at decent speed if you want to).
  • 2
    @Oktokolo LOL, the economies of most of Europe would die if the USA takes a dive. If not large parts of the world. It is literally a house of cards. Be careful what you wish for.
  • 3
    @Demolishun We don't need the USA anymore because we're already at USB anyway.
  • 0

    FX [ Wonders what it will be like when we get to USR.. ]

    As long as we don't ever reach UXB..
  • 0
    @Nanos I worry we are fast tracked to USWTF
  • 0
    If it happens while the USA is _not_ economically irrelevant yet: Yes.
    If USA becomes economically irrelevant first: No.
  • 1
    @Fast-Nop Ultima Online is coded in C++, and it's also used with Unreal. It may not be the easiest language to work with, but it's still a big player in the industry. At least that's my take.
Add Comment