9

Stop fucking abusing “auto” everywhere.
Don’t use it if you’re going to forget what type it actually is or it’s not obvious from the context.
Nearly every function return is assigned to an auto variable, and these functions have stupid fucking names.

It’s all over the code.
Each time I see it, I need to hunt for the type definition in this large fucking codebase and the original developer doesn’t fucking know either.
Do whatever the fuck you want if you’re the only one who’ll ever touch this code. Otherwise, you need to learn how to develop as part of a group.

Comments
  • 6
    Do you actually need to hunt for it?
    Doesn't your IDE show you the inferred type?
  • 0
    What's the point of auto anyway?
    The only actual use I can think of is when you've got triply nested generics in a type specifier and don't want to type it out every time.
  • 2
    @metamourge try coming up with a cleaner way to loop through stuff than:

    @highlight
    for (auto piece_of_your_shit : your_shit);
  • 2
    auto is type inference, right?
    In all languages I know that support it, it's considered best practice to use it wherever you can.

    And I agree with that because your IDE should be able to tell you the inferred type whenever you want to know it.

    So, this should also apply to C++. Unless I'm missing something.
  • 2
    @Lensflare you're not missing anything. It's actually considered good modern practice to use auto or auto& instead of spelled-out types wherever the type can be inferred.
  • 1
    If you use auto to make your code work in a generic fashion, that's completely acceptable, but it most definitely does not permit you to be lazy. Don't use it where you don't need to. And you want me to rely on an IDE to tell me what the type is? When you know the type, why would you want to use auto?
    If this is about long names, I'd rather typedef than make the compiler do anything for me.

    But don't misunderstand me. I'm not against the use of auto. I'm just against its misuse.

    Also, good code doesn't need compilers, IDEs, etc. to make sense of it.
  • 3
    @r20408e122449d For local variables it's generally acceptable to use type inference

    I used Java for the last 10 years. I've written quite enough unnecessarily long type names, trust me
  • 1
    @metamourge I think that's precisely the main use case for it

    C# has a similar keyword (var), and IIRC Java lacks this feature, so you have to write overly verbose declarations if your type names get a bit longer
  • 1
    @LotsOfCaffeine Java also has the var keyword since Java 10 IIRC
  • 1
    @12bitfloat I looked it up and yeah they do

    That was in 2018, haven't done much or any java since to be fair
  • 0
    If the code is written so that that auto var could conceivably be one of a number of types, and still have everything else work correctly, then sure. Concepts will make this sort of usage much tighter.

    But if I see `for (auto i = 0; i < foo.size(); ++i)` in code review one more time...
  • 3
    @halfflat do you prefer

    for (decltype(foo.size()) i = 0; i < foo.size(); ++i)? 🤔
  • 2
    @halfflat what exactly bothers you with that auto in your example?

    I never understood the arguments against using type inference. Unless you are refusing to use an IDE or want to print code on paper for some reason.
  • 1
    type inference is not for brevity. It helps to DRY the code and is kind of an encapsulation feature because you are hiding implementation details that you don't care about. Why should you be forced to restate the name of the type that is returned from a function? You need to know what the object does. Not what its exact type name is.

    Brevity is just a bonus.
  • 1
    @Lensflare I think it's the opposite tbh. Not knowing the types is exactly the disadvantage of inference in pursuit of brevity
  • 0
    @iiii That's better, if a bit verbose! Otherwise use `std::size_t` if you know `foo` is a standard container, or use an adaptor class so you can use a range-based for.
  • 4
    @Lensflare Because it is a bug.

    What is i? It's deduced to int. What is foo.size() going to return, if it's a standard container? std::size_t.

    If foo, whatever it may be, has a size() that returns something larger than INT_MAX, we get undefined behaviour.

    If the loop were `for (auto i = 0u; i < foo.size(); ++i)` we'd get a loop that never terminates.

    The use of type deduction in non-generic code makes it less readable (what type is the variable? we have to search) and less reliable (we are avoiding type checks that would be tripped if the type of the rhs changed to something unexpected due to refactoring elsewhere).
  • 2
    @halfflat agreed. Range is better.
  • 1
    @halfflat ok, I see. But the problem here is not type inference but mixing different types. You'd have the same problem if someone wrote `int i = 0`.
  • 1
    @halfflat and about avoiding type checks:
    The types are checked equally for explicit types and inferred types.
    Here it is actually safer to use type inference.
    Suppose you have compatible or implicitly convertible types, like the int and std::size_t in your example.
    You assign something to a variable with an explicit type, and then the rhs changes its type after refactoring.
    Now you have a bug, because you are implicitly converting to a different type.
    If you had used type inference, the type would be the same as the rhs. Which is what you want.
  • 0
    @12bitfloat but you DO know the types exactly.
    What I was saying is not caring about the exact spelled out name of the type.
  • 2
    @Lensflare The difference between `int a =` and `auto a =` is that *you can see the type*. You don't have to pretend to be the compiler and work out what the type on the rhs is (it's not always trivial); it's there.

    Then when someone later writes `a < foo.size()`, it's much easier to see that it is an error.
  • 2
    @Lensflare that's why I proposed `decltype`, so that it would be refactoring-independent in the first place.
  • 2
    > Another language encourages type inference everywhere, so it should be okay to use auto everywhere in C++

    Ah, but what if your other language encouraged you to light yourself on fire? Would you do it?

    Auto should only be used in cases where the type can be deduced *visually, by a human* in the immediate vicinity, or where it is required by template code.

    Nothing more. It is NOT a type inference crutch. It is a powerful tool that shouldn't be abused.
  • 1
    @junon it's not just 'another language'

    I can name C#, Kotlin and Swift for sure. I'm pretty confident that there are many others that recommend to favor type inference in general.

    I remember when C# first introduced var. There was so much resistance from so many people. Arguing it's bad. Now there is a general consensus that it's good. All IDEs are showing hints to use it.

    So I think it's the typical "new things are bad because I'm not used to them or I don't understand them".

    Auto in C++ came long after var in C# so eventually it's a matter of time until it is considered good by default in C++, too.
  • 1
    I really hate to play this card because it's insulting and not a real argument. I'm sorry for that.

    But I have experienced this for decades in all the languages I was using professionally.

    Generics, nullability, tuples, type inference, ...
    It's introduced into a language and people think it's bad, it shouldn't be used, it's unclear, and so on.
    Later it becomes good practice. Every time.
  • 0
    @Lensflare generics are still horrible write-only code in some cases 🤣
  • 0
    @iiii You really are a Go fanboy, ain't ya?
  • 0
    @12bitfloat no. I just point out the flaws.
  • 0
    @iiii Except the ones in Go apparently
  • 0
    @12bitfloat in C++, in C#, in Java. Generics always turn into write-only code when they try to be too generic.
  • 0
    @iiii What do you even mean by that

    C++'s template hell is dumb, we can agree there, but anyone fluent in Java or C# has little problem reading generic code
  • 0
    @12bitfloat I'm not going to try to argue with you
  • 0
    @iiii I think we can agree to disagree
  • 1
    @12bitfloat no. Do not mention that dumb statement. We just disagree and that's it. We do not agree.
  • 0
    @iiii I agree 😉
  • 2
    @Lensflare I've been programming since I was 8. That's 20 years now. I've seen a thing or two, too. It's not that it's new and shiny. It's that auto breaks codebase readability and maintainability quickly. This isn't some dogmatic regurgitation that I read from a C++ Dan Abramov lookalike. It's real experience writing good code that I have to revisit years later, coming from maintaining open source for years, and dealing with hundreds of other codebases.

    So no, sorry, that argument doesn't work on me. Type inference breaks readability and hurts maintainability - at least, in C++ - just to gain a few seconds of typing time. It needs to be used sparingly and surgically else your code becomes very hard to read and understand.