I just found out that you can use the unary '+' operator to typecast a char to an int.
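[Editor's note: a minimal Java sketch of the promotion being described, assuming Java since the snippet below declares an `int`. Unary `+` triggers unary numeric promotion, which widens a `char` to `int`; the class and variable names are illustrative.]

```java
public class UnaryPlusDemo {
    public static void main(String[] args) {
        char c = 'A';

        // Unary plus applies unary numeric promotion: char widens to int.
        int code = +c;
        System.out.println(code);      // prints 65

        // The explicit C-style cast the commenter prefers does the same thing.
        int code2 = (int) c;
        System.out.println(code2);     // prints 65
    }
}
```

Note that this only works on a single `char`; applying unary `+` to a `String` (as in the joke snippet below, where the emoji sits in double quotes) is a compile error in Java, though in JavaScript `+"🤦🏻‍♀️"` would evaluate to `NaN`.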

  • 2
    `int facepalm = +"🤦🏻‍♀️"`

    Edit: to be fair, though, this is true in quite a few languages, so it isn’t surprising in the least. I do prefer explicit casting a la C: `(int) facepalm`
  • 0
    @Root Maybe. Tbh I only know two languages at the moment.