
My god why are all parser combinator libraries in Rust so bad??

Is it too much to ask to be able to parse a normal-ass programming language with error recovery???? That's like 99.99% of what people need a parsing library for, and somehow error recovery is always some weird afterthought bandaged on after the fact

For fuck's sake, I'm finally getting over the urge to reinvent the wheel all the time, but now all the libraries suck ass so I have to. Arrghh!

Comments
  • 3
    Hm, I have no idea but I think this might be one of the hardest things to do in computer science. Is it not?
  • 2
    @Lensflare If you want a perfect solution, maybe, but a hand-written-ish recursive descent parser is pretty easy. And error recovery of "skip all tokens until we find some terminator symbol" shouldn't be that hard either (rough sketch at the end of this comment)

    But even just that is so unnecessarily hard with these libraries. And waaaay too many traits and generic parameters

    trait bounds not satisfied

    trait bounds not satisfied

    trait bounds not satisfied

    trait bounds not satisfied

    trait bounds not satisfied

    Argh!!!!!!!
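
    Purely as illustration of that "skip until a terminator" idea, here's a minimal hand-rolled sketch of panic-mode recovery in a recursive descent parser. The TokenKind, Parser, and recover names are hypothetical, not from any crate:

    ```rust
    // Hypothetical sketch of "skip tokens until a terminator" recovery
    // in a hand-rolled recursive descent parser.

    #[derive(Debug, Clone)]
    enum TokenKind {
        Ident(String),
        Semicolon,
        Eof,
        // ...
    }

    struct Parser {
        tokens: Vec<TokenKind>,
        pos: usize,
        errors: Vec<String>,
    }

    impl Parser {
        fn peek(&self) -> TokenKind {
            self.tokens.get(self.pos).cloned().unwrap_or(TokenKind::Eof)
        }

        fn bump(&mut self) -> TokenKind {
            let tok = self.peek();
            if self.pos < self.tokens.len() {
                self.pos += 1;
            }
            tok
        }

        // Panic-mode recovery: record the error, then skip tokens until a
        // "safe" synchronization point (here: a semicolon or end of input),
        // so parsing can continue with the next statement.
        fn recover(&mut self, msg: &str) {
            self.errors.push(msg.to_string());
            while !matches!(self.peek(), TokenKind::Semicolon | TokenKind::Eof) {
                self.bump();
            }
            if matches!(self.peek(), TokenKind::Semicolon) {
                self.bump(); // consume the terminator and keep going
            }
        }
    }

    fn main() {
        // tokens for something like "foo bar ; next_stmt"
        let mut p = Parser {
            tokens: vec![
                TokenKind::Ident("foo".into()),
                TokenKind::Ident("bar".into()),
                TokenKind::Semicolon,
                TokenKind::Ident("next_stmt".into()),
            ],
            pos: 1, // pretend we already consumed "foo" and hit the bad token
            errors: Vec::new(),
        };
        p.recover("unexpected identifier `bar`");
        println!("errors: {:?}, resume at token {:?}", p.errors, p.peek());
    }
    ```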
  • 3
    So is Rust trading C++'s problem of being easy to fuck up the code for the problem of being hard to compile? Is this just shifting the complexity from a minefield to a brick wall?
  • 3
    @Demolishun Kinda. But this is more a problem of over-engineered libraries; it doesn't have to be this horrible. Think C++ template hell
  • 2
    @12bitfloat templates are fine until they aren't. The long error listings are annoying when it just doesn't want to compile anymore.
  • 4
    I started off with Chumsky but pretty quickly switched over to a manual parser; I think with all the functional-ish features of Rust, libraries struggle to justify their existence, so they try really hard to be zero-cost, which is just miserable beyond a certain level of complexity.
  • 4
    @lorentz the ultimate goal is to implement all the calculations at compile time so that you get your result without even running the code. :)
  • 2
    @lorentz Same. I don't know why it also ends up with ten billion trait bounds. Surely you can write a parser combinator library in a more imperative way
  • 3
    @Demolishun Same with Rust traits. Normally they're fine until someone does this shit
  • 4
    @12bitfloat Oh I saw that one! My friend in uni tried to offload some crypto stuff from Erlang into a Rust NIF, and this was the point where he decided to use C instead because learning everything there is to know about advanced Rust type system tricks wouldn't fit in the schedule.
  • 2
    @lorentz sounds like a wise friend to keep close.

    @lorentz of course you switched to a manual parser. You're that awesome.

    @Lensflare Hmm, kind of, but you want to have the original AST values and stuff. I think you'll regret it later if you do interpreting actions in the parser. Interpreters are hard enough without making exceptions for what is already parsed and what isn't. It's more comfortable to have one central place to do the evaluating, I guess (rough sketch below). Wrote many incomplete parsers and one completed one. Almost two 😁 I'm also in the middle of one, but it's stalled for months now. That's very bad; interpreters should be actively worked on. Doing it just a bit on the side is not the way to go for interpreters imho. It requires some dedication.
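
    Not anyone's actual code, just a toy illustration of "the parser builds AST values, evaluation happens in one central place afterwards", using a hypothetical Expr enum and eval function:

    ```rust
    // Hypothetical sketch: the parser only produces AST values like these,
    // and all the evaluating happens in one central function afterwards,
    // instead of being sprinkled through the parser.

    #[derive(Debug)]
    enum Expr {
        Num(f64),
        Add(Box<Expr>, Box<Expr>),
        Mul(Box<Expr>, Box<Expr>),
    }

    // One central place that walks the AST and evaluates it.
    fn eval(expr: &Expr) -> f64 {
        match expr {
            Expr::Num(n) => *n,
            Expr::Add(a, b) => eval(a) + eval(b),
            Expr::Mul(a, b) => eval(a) * eval(b),
        }
    }

    fn main() {
        // (1 + 2) * 3, as a parser would have produced it
        let ast = Expr::Mul(
            Box::new(Expr::Add(
                Box::new(Expr::Num(1.0)),
                Box::new(Expr::Num(2.0)),
            )),
            Box::new(Expr::Num(3.0)),
        );
        println!("{}", eval(&ast)); // 9
    }
    ```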
  • 1
    I ask again

    Wtf is a combinator and what are you using it for?
  • 1
    @Lensflare parsing? That's the most fundamental
  • 2
    @AvatarOfKaine "Parser combinators" is a code pattern of essentially small utilty functions you chain together to build a parser. So instead of using a parser generator like yacc you just write a few function
  • 2
    @12bitfloat I must have always written parsers the old-fashioned way, with string manipulation functions and sequential reading to generate token points based on identifying the start and end points of various statements, blocks, and literals...
  • 1
    @12bitfloat that sounds like it would take longer to learn than to do it the other way lol

    What else do you use this for?
  • 1
    @AvatarOfKaine What you're describing is pretty much what parser combinators allow you to do. But instead of having to do everything yourself, a library already has those convenience functions ready for you
  • 1
    @12bitfloat how complex is it to learn?
  • 1
    @AvatarOfKaine Depends. It's not *that* hard, but it has a bunch of footguns: e.g. you have to try alternatives in the right order to avoid failing on half-parsed rules (rough sketch below)
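
    Toy example of that ordering footgun (not any library's API, just a hypothetical parse_op helper): when one alternative is a prefix of another, the longer one has to be tried first or it never gets a chance to match:

    ```rust
    fn parse_op_wrong(input: &str) -> Option<(&str, &str)> {
        // Wrong order: "<" is tried before "<=", so "<=" gets matched as
        // "<" plus a stray "=" left in the remaining input.
        for op in ["<", "<="] {
            if let Some(rest) = input.strip_prefix(op) {
                return Some((op, rest));
            }
        }
        None
    }

    fn parse_op(input: &str) -> Option<(&str, &str)> {
        // Right order: longest alternative first.
        for op in ["<=", "<"] {
            if let Some(rest) = input.strip_prefix(op) {
                return Some((op, rest));
            }
        }
        None
    }

    fn main() {
        println!("{:?}", parse_op_wrong("<= 5")); // Some(("<", "= 5")) -- half-parsed rule
        println!("{:?}", parse_op("<= 5"));       // Some(("<=", " 5"))
    }
    ```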
  • 1
    @AvatarOfKaine Btw, the actual term for what you're describing is "recursive descent parsing" (https://en.wikipedia.org/wiki/... ). If you google for that you'll find a bunch of good stuff