Any decent limit value for a SQL query?

I am performing a search operation and I want to limit the scope.

I don't want to search the entire table.

  • 3
    It’s definitely 17. No further context is required. Always limit to 17.
  • 0
    @Lensflare 😂 i cannot tell if you are serious
  • 3
    @Afrographics what’s not serious about 17?
    17 is a perfectly serious number. Don’t ever use anything other than 17! It’s the optimal limit. Always.
  • 0
    It really depends on whether the columns are indexed properly so that queries don't take forever. If they're not, even a simple query could take ages. If they are, you could query a large number of records instantly.

    If I'm unsure of the state of the indices, I'd start with a small number as my limit and work my way up. If the indices aren't sufficient, you may be able to add them to improve performance.

    I am not a DBA, so there may be some magic like views or temporary tables that accomplishes the same thing without modifying the table.
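
    A minimal sketch of the "index it, then start with a small LIMIT" approach, using SQLite from Python (the table, column, and index names here are made up for illustration):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany("INSERT INTO items (name) VALUES (?)",
                     [(f"item{i}",) for i in range(10_000)])

    # An index on the searched column lets the query avoid a full table scan.
    conn.execute("CREATE INDEX idx_items_name ON items (name)")

    # Start with a small LIMIT; raise it once the timing looks acceptable.
    rows = conn.execute(
        "SELECT id, name FROM items WHERE name LIKE 'item1%' LIMIT 100"
    ).fetchall()
    print(len(rows))  # at most 100 rows come back, however big the table is
    ```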
  • 0
    If you don't want to search the entire table, don't search the entire table. If you do want to search the entire table, search the entire table.
  • 3
    Tell it to give up after it gets tired
  • 1
    Not the number of rows you're looking for, young padawan.

    Data database shoveling over network you must lookee at.


    Joke aside. Seriously.

    If you have a 4-byte integer ID column, a 2-byte short integer flag, and an ASCII varchar field with a length of 8...

    4 + 2 + 8 = 14 bytes per row, tops.

    No worries. Even at a 10 Mbit/s connection rate (1.25 MB/s) you would need *a lot* of rows for it to be noticeable.

    Now let's change it to a worse example:

    A 4-byte integer field and 64 varchar columns, UTF-8, with a max length of 256 chars.

    UTF-8: 4 bytes per character max.

    So 64 × 256 × 4 bytes = 65,536 bytes.

    Plus 4 bytes = 65,540 bytes.

    Roughly 64 kibibytes per row.

    So... 1,000 rows could sum up to 64,000 kibibytes in the absolute worst case. A total overestimation, yes: it's highly unlikely someone transmits only emoticons / hieroglyphs filling all 64 columns to 256 chars.

    But... Nevertheless. We're now talking about roughly 64 MB for 1000 rows.

    You absolutely don't want that to happen in the worst case.

    These calculations might seem silly and aren't entirely correct (they ignore packet overhead, etc.), but for a quick analysis of what could happen in the worst case... pretty neat.
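
    The back-of-the-envelope arithmetic above can be wrapped in a tiny Python helper (the function name and parameters are invented for illustration):

    ```python
    def worst_case_row_bytes(int_cols=0, short_cols=0,
                             varchar_cols=0, varchar_len=0, bytes_per_char=1):
        """Upper bound on one row's wire size, ignoring protocol overhead."""
        return (int_cols * 4 + short_cols * 2
                + varchar_cols * varchar_len * bytes_per_char)

    # First example: 4-byte ID + 2-byte flag + ASCII varchar(8)
    small = worst_case_row_bytes(int_cols=1, short_cols=1,
                                 varchar_cols=1, varchar_len=8)
    print(small)  # 14 bytes

    # Worse example: 4-byte ID + 64 UTF-8 varchar(256) columns
    big = worst_case_row_bytes(int_cols=1, varchar_cols=64,
                               varchar_len=256, bytes_per_char=4)
    print(big)                     # 65,540 bytes
    print(1000 * big / 1_000_000)  # ~65.5 MB for 1000 rows
    ```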
  • 2
    @cuddlyogre Yeah, but if the table is huge and often written to, indices may become a bottleneck :)

    Point is, it depends.