4

Stupid question

I wrote a script to scan a site for this TV I want to buy. At first, I just sent a GET request to the site every 30 min. After a while, I realized the TV keeps selling out: it gets restocked and is then sold out within 5 minutes.

Can I send a GET request to this site every 5 minutes without causing harm to the site? I really don’t want to mess this guy’s site up lol
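A minimal sketch of such a poller in Python, using only the standard library. The URL and the in-stock marker string are hypothetical placeholders; swap in the real product page and whatever text only appears when the item is purchasable:

```python
import random
import time
import urllib.request

URL = "https://example.com/tv-product-page"  # placeholder, not a real product page
IN_STOCK_MARKER = "add to cart"              # assumed: text that only appears when in stock

def looks_in_stock(html: str) -> bool:
    """Naive check: the marker string appears somewhere in the page."""
    return IN_STOCK_MARKER in html.lower()

def poll_once(url: str = URL) -> bool:
    """Fetch the page once and report whether it looks in stock."""
    req = urllib.request.Request(url, headers={"User-Agent": "stock-checker/0.1"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return looks_in_stock(resp.read().decode("utf-8", errors="replace"))

def watch(interval_s: float = 300.0, jitter_s: float = 30.0) -> None:
    """Poll every ~5 minutes, with jitter so requests aren't perfectly regular."""
    while True:
        try:
            if poll_once():
                print("Possibly in stock - go check!")
                return
        except OSError as exc:
            # log and retry instead of crashing the loop on a flaky request
            print("Request failed:", exc)
        time.sleep(interval_s + random.uniform(-jitter_s, jitter_s))
```

Calling `watch()` starts the loop; a failed request is logged and retried on the next tick rather than killing the script.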

Comments
  • 5
    Every 5 minutes should be absolutely okay!

    Anything that isn’t sub-second should be handled fine by a normal webserver
  • 1
    @jonas-w awesome, I’ll make sure I don’t do shorter than 1 second. Thanks!!
  • 6
    @DeepHotel Pick a random number between 2 and 5.

    Most request limiters are bound to requests per fixed timespan - e.g. N requests per IP per 60 seconds.

    Monotonously hammering a webserver isn't nice - it's less about the workload and more about how it "looks".

    If you were a firewall and you saw the same request every n seconds from the same IP... Yeah. You'd block that shit, cause that's fishy as fuck.
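The randomized gap that comment suggests is one call to `random.uniform`; a sketch, with the 2–5 bounds taken from this thread as defaults:

```python
import random

def polite_delay(min_s: float = 2.0, max_s: float = 5.0) -> float:
    """Pick a random delay so requests don't arrive on a fixed, suspicious-looking schedule."""
    return random.uniform(min_s, max_s)

# in the polling loop:
#   time.sleep(polite_delay())
```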
  • 1
    It’s convention to have a robots.txt at the root of the site to tell you exactly how frequently they will allow you to hit their site.
  • 1
    @tedge the robots.txt doesn't define the frequency
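For what it's worth, there is a non-standard `Crawl-delay` directive that some sites put in robots.txt, and Python's standard library can read it. A sketch that parses an inline robots.txt body, so it runs without touching the network (normally you'd point `set_url()` at the site's real robots.txt and call `read()`):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# parse() takes the robots.txt body as a list of lines
rp.parse([
    "User-agent: *",
    "Crawl-delay: 10",
])

print(rp.crawl_delay("*"))       # 10 (None if the directive is absent)
print(rp.can_fetch("*", "/tv"))  # True - nothing is disallowed here
```

Many sites omit `Crawl-delay` entirely, so treat a `None` result as "no stated policy", not as permission.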
  • 3
    Just think about it this way: does this site handle more than one user per 5 minutes? The answer is probably yes, and each user is probably making multiple requests per minute. So yeah, you should be fine. You're also probably only fetching the HTML and reading that, while a browser requests multiple resources per page load - probably 10 or so.
  • 0
    Ok sweet. So it seems like I should randomize the time between my requests (between 2 and 5 sec). Thanks everyone!
Add Comment