33
Vinka01
5y

My university distributes all worksheets over an online system, and to access a file you have to download it each time. So to get rid of all the annoying clicking in the browser, I programmed a service that logs onto the website, crawls through every folder, searches for new files and downloads the ones that don't exist on my computer yet. Kind of proud, as this is pretty much the first really useful program I've developed lol
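For anyone curious what such a crawler can look like, here is a minimal sketch. It is not the actual code from the post: the portal URL, login fields, and file extensions are made-up assumptions, and a real university portal may need CSRF tokens or single sign-on instead of a plain form login.

```python
# Minimal sketch of the one-way sync described above (Python, using the
# third-party packages requests and beautifulsoup4). All URLs, form field
# names, and extensions here are hypothetical placeholders.
import os
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

BASE_URL = "https://portal.example-university.edu"   # hypothetical portal
DOWNLOAD_DIR = "worksheets"
FILE_EXTS = (".pdf", ".docx", ".zip")

def login(session):
    # Hypothetical form-based login; the real site may use CSRF tokens or SSO.
    session.post(f"{BASE_URL}/login", data={"user": "me", "password": "secret"})

def crawl(session, url, visited=None):
    """Recursively walk folder pages and download files missing on disk."""
    visited = visited if visited is not None else set()
    if url in visited:
        return
    visited.add(url)

    soup = BeautifulSoup(session.get(url).text, "html.parser")
    for link in soup.find_all("a", href=True):
        href = urljoin(url, link["href"])
        if href.lower().endswith(FILE_EXTS):
            target = os.path.join(DOWNLOAD_DIR, os.path.basename(href))
            if not os.path.exists(target):            # only fetch new files
                with open(target, "wb") as f:
                    f.write(session.get(href).content)
        elif href.startswith(BASE_URL):               # treat other internal links as folders
            crawl(session, href, visited)

if __name__ == "__main__":
    os.makedirs(DOWNLOAD_DIR, exist_ok=True)
    with requests.Session() as s:
        login(s)
        crawl(s, f"{BASE_URL}/courses")
```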

Comments
  • 4
    So basically a one-way sync for that particular server, nice! Now make it two-way, with support for changing things on both sides while they are offline and so on, and you get a giant headache. :D
  • 4
    Nice! Now add a script that uploads the documents to your group chat for your colleagues. 🙂
  • 2
    Hey neat :)
  • 3
    Does it only crawl by name? If so, you should also watch out for documents that are replaced with newer versions under the same name.
  • 1
    What happened to git?
  • 2
    @Fabian Neat idea, but we only have read permission on those files :/
  • 2
    @Vitrox Usually the prof uploads the files once and never touches them again. Unfortunately, to check whether there is a new version I would have to perform a GET request for every single file, because that information (file size, upload date, last edited, etc.) isn't shown on the folder pages. With over 100 files I'm afraid of getting blocked by the spam protection. (One way around that is sketched below the thread.)
  • 1
    @Vinka01 That's awesome! If you want to improve it a bit, you could cache the names of the already downloaded files in a text file, so you only need to crawl the link text, collect all the strings and match them against your cache (the text file); a minimal version is sketched below the thread.

    You could also use regular expressions to create some kind of search functionality for your program.

    Nice job, I hope you make it better, just for fun & exp points! :D
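On the replaced-under-the-same-name point and the spam-protection worry: one possible compromise is to send lightweight, throttled HEAD requests and compare the reported size and date against a small local record, so nothing has to be re-downloaded just to check. This is only a sketch and assumes the portal answers HEAD requests with Content-Length and Last-Modified headers, which the thread does not confirm; the file name and delay are placeholders.

```python
# Hypothetical sketch: detect files replaced under the same name without
# re-downloading them, using throttled HEAD requests and a local JSON record.
import json
import os
import time

import requests

META_FILE = "file_meta.json"   # placeholder name for the local record

def load_meta():
    if os.path.exists(META_FILE):
        with open(META_FILE) as f:
            return json.load(f)
    return {}

def check_for_updates(session, file_urls, delay=2.0):
    """Return URLs whose reported size or modification date changed."""
    meta = load_meta()
    changed = []
    for url in file_urls:
        resp = session.head(url)
        current = {
            "length": resp.headers.get("Content-Length"),
            "modified": resp.headers.get("Last-Modified"),
        }
        if meta.get(url) != current:
            changed.append(url)
            meta[url] = current
        time.sleep(delay)          # throttle to stay below any rate limit
    with open(META_FILE, "w") as f:
        json.dump(meta, f, indent=2)
    return changed
```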
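And a minimal version of the filename cache and regex search suggested in the last comment, assuming one filename per line in a plain text file (downloaded.txt is just a placeholder name):

```python
# Sketch of the suggested cache: keep the names of already downloaded files in
# a text file so the crawler only needs the link text from each folder page.
import re

CACHE_FILE = "downloaded.txt"   # placeholder cache file

def load_cache():
    try:
        with open(CACHE_FILE) as f:
            return {line.strip() for line in f if line.strip()}
    except FileNotFoundError:
        return set()

def remember(name):
    with open(CACHE_FILE, "a") as f:
        f.write(name + "\n")

def search(pattern):
    """Regex search over the cached filenames, as the comment suggests."""
    rx = re.compile(pattern, re.IGNORECASE)
    return sorted(name for name in load_cache() if rx.search(name))

# Inside the crawl loop one would then check:
#   if name not in cache: download(...); remember(name)
```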