My preprocessor is just generalized kerning; the macros are variations on the one well-known proof of GK's Turing-completeness; the type system will probably be a Prolog reskin so simple the translator can be a finite-state machine; the type inference algorithm is the original HM algorithm, which I don't even need to change; the core language is the lambda calculus and nothing more; and the backend might just be Erlang itself, if my research confirms that extending LLVM until it consistently beats Erlang is unrealistic.

I invented nothing, I create nothing. All I do is plug circles into square holes and fill the gaps with play dough.

  • 2
    The fuck? 😂
  • 2
    @iiii Heavy impostor syndrome while designing my own programming language. I set out to make it because I don't like how every language tries to limit its type-level metaprogramming, and then, when that metaprogramming inevitably becomes Turing-complete anyway, lacks the proper tools to guide the type checker.

    I also wanted a syntax-level metaprogramming system that lets macros completely reinvent how the language works within some minimal boundaries, but, unlike Rust, without restricting a macro's range of effect. Generalized kerning is a Turing-complete system that works naturally on token trees, preserves a notion of distance, and is easy to namespace, so macros start from code that intentionally referenced them and extend their reach through explicit steps.
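
A minimal sketch of the pair-rewriting reading of generalized kerning described above: rules map a pair of adjacent tokens to a replacement sequence, and rewriting runs until no rule applies. The function name, rule format, and example rules are illustrative, not the actual macro system.

```python
# Generalized kerning as repeated rewriting of adjacent token pairs.
# `rules` maps (left, right) token pairs to replacement token lists.
def kern(tokens, rules, max_steps=1000):
    tokens = list(tokens)
    for _ in range(max_steps):
        for i in range(len(tokens) - 1):
            pair = (tokens[i], tokens[i + 1])
            if pair in rules:
                tokens[i:i + 2] = rules[pair]  # rewrite in place
                break
        else:
            return tokens  # no rule applied: normal form reached
    return tokens  # step budget exhausted (rules may not terminate)

# Example: an 'inc' marker bubbles rightward over unary digits and
# appends a digit when it reaches the end marker.
rules = {
    ("inc", "1"): ["1", "inc"],    # carry the marker across a digit
    ("inc", "end"): ["1", "end"],  # append a digit, consume the marker
}
print(kern(["inc", "1", "1", "end"], rules))  # -> ['1', '1', '1', 'end']
```

Since the rules only ever look at two adjacent tokens, "distance" and "reach" fall out naturally: a rewrite can only spread through explicit carrying rules like the one above.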

    The rest is mainly about minimalism, although I also have gripes about type inference engines that try too hard. If HM can't figure it out, neither can the human reader, so it should be explicit.
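
For context, the workhorse of HM inference is first-order unification. A toy sketch, with illustrative conventions (a string is a type variable, a tuple is a constructor applied to arguments; the occurs check is omitted for brevity):

```python
# Resolve a type through the substitution until it is no longer a bound variable.
def resolve(t, subst):
    while isinstance(t, str) and t in subst:
        t = subst[t]
    return t

# Unify two types, returning an extended substitution or raising on mismatch.
def unify(a, b, subst):
    a, b = resolve(a, subst), resolve(b, subst)
    if a == b:
        return subst
    if isinstance(a, str):               # a is a free variable: bind it
        return {**subst, a: b}           # (occurs check omitted)
    if isinstance(b, str):
        return {**subst, b: a}
    if a[0] != b[0] or len(a) != len(b):
        raise TypeError(f"cannot unify {a} with {b}")
    for x, y in zip(a[1:], b[1:]):       # unify arguments pairwise
        subst = unify(x, y, subst)
    return subst

# Applying the identity function:  unify  a -> a  with  int -> r.
s = unify(("fn", "a", "a"), ("fn", ("int",), "r"), {})
print(resolve("r", s))  # -> ('int',)
```

The point about explicitness: everything HM infers comes from mechanical steps like these, so if unification can't derive a type, a human scanning the same code has no extra information to derive it from either.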
  • 3
    @iiii It only struck me today that my final design is a big blob of other people's ideas, tied together with little patches and wrapped in pretty syntax. That's all.
  • 3
    @lbfalvy isn't that what almost anything any human does amounts to: a permutation of all the previous inputs?
  • 0
    Is this Markov-generated?