
I'm just dumping 10 GB of data remotely from a MySQL db, because my el cheapo VPS ran out of space

can you suggest a good book?
oh, actually I already found one, the title is "Prepare your fucking server/workspace properly if you want to play around with a lot of data"
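
In case it helps anyone in the same spot, a minimal sketch of what that can look like: check what's eating the disk, then stream the dump over SSH so the compressed file never touches the full VPS (user@vps and mydb are placeholders):

  # how bad is it, and how much of it is the database?
  df -h
  sudo du -sh /var/lib/mysql    # default MySQL datadir on most Linux setups

  # stream the dump off the box, compressed on the fly
  ssh user@vps "mysqldump --single-transaction mydb | gzip" > mydb.sql.gz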

Comments
  • 2
    oh and the best part: I had a Laravel project in dev mode on the site, so the "No space left on disk" error just throws all my env variables at you if you open the site... being dumb hurts, man... this is how all those "I'll do that later" tasks end
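
    For anyone wondering how that leak happens: Laravel's debug error page dumps the environment when debug mode is on, so on a public box the .env should contain (these are the standard Laravel keys):

      APP_ENV=production
      APP_DEBUG=false

    With APP_DEBUG=false you get a generic 500 page instead of a stack trace full of config and env values.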
  • 3
    A lil extra hint ...

    DO NOT PLAN.

    If you know a lot of data will be generated / stored, do yourself a favor:

    1. Create the database
    2. Minify / index the database
       - pick tight column types
       - move status flags (deleted / active / ...) into an association table you JOIN in
       - add indexes
       - ...
    3. Seed it with random data
    4. Test the fuck out of it, taking a worst-case analysis as the starting point. Write the actual queries if possible. (Rough sketch of steps 2-4 at the end of this comment.)
    5. Make a diagram with the DDL of the current state
    Repeat 3 times.

    Now compare iteration 1 with iteration 3. Think closely about why / what has changed.
    Generate the final design.

    Most of the time you'll build a naive approach first, then tune / tweak, and start to over-optimize... hence three times. It's time-consuming, but it usually gives me the best result ;)
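
    To make steps 2-4 concrete, a rough MySQL 8 sketch; orders / order_status and all the numbers are made up:

      -- step 2: tight types, status flags moved into an assoc table, index on the hot path
      CREATE TABLE order_status (
        id   TINYINT UNSIGNED PRIMARY KEY,
        name VARCHAR(20) NOT NULL              -- deleted / active / ...
      );
      INSERT INTO order_status VALUES (1,'active'),(2,'deleted'),(3,'archived');

      CREATE TABLE orders (
        id          BIGINT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
        customer_id BIGINT UNSIGNED NOT NULL,
        status_id   TINYINT UNSIGNED NOT NULL,
        created_at  DATETIME NOT NULL,
        INDEX idx_status_created (status_id, created_at)
      );

      -- step 3: seed a couple hundred thousand random rows (recursive CTE)
      SET SESSION cte_max_recursion_depth = 200000;
      INSERT INTO orders (customer_id, status_id, created_at)
      WITH RECURSIVE seq AS (SELECT 1 AS n UNION ALL SELECT n + 1 FROM seq WHERE n < 200000)
      SELECT FLOOR(RAND() * 100000) + 1,
             FLOOR(RAND() * 3) + 1,
             NOW() - INTERVAL FLOOR(RAND() * 365) DAY
      FROM seq;

      -- step 4: run the worst-case query and check what the optimizer actually does
      EXPLAIN SELECT COUNT(*)
      FROM orders o JOIN order_status s ON s.id = o.status_id
      WHERE s.name = 'active' AND o.created_at > NOW() - INTERVAL 30 DAY;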
  • 0
    @PublicByte No... Do not anticipate.

    Test.

    Especially in databases, anticipation is really hard. Sometimes you think you have it 100% right, till you test and realize you've hit, e.g., an optimizer corner case.
    Testing this way at least makes your anticipation more reliable over time.
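
    Classic example with the orders sketch above: you'd anticipate that the index on status_id always gets used, but once the seeded data makes one status cover, say, 40% of the rows, the optimizer can decide a full scan is cheaper, and only a test run shows it:

      EXPLAIN SELECT * FROM orders WHERE status_id = 2;
      -- type: ALL (full table scan) instead of ref, despite the index existing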
  • 1
    I don't know if it helps or not, but try compressing the dump by piping it to gzip or something like that
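
    Something like this, assuming mysqldump is available and the db is called mydb:

      # compressed dump; --single-transaction gives a consistent snapshot without locking InnoDB tables
      mysqldump --single-transaction mydb | gzip > mydb.sql.gz
      # and to restore later:
      gunzip < mydb.sql.gz | mysql mydb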
  • 1
    @shiv7071007 I wasn't sure about the syntax and didn't want to screw up my data, so I used MySQL Workbench and it worked just fine. Then I deleted about 9.9 gigs of historical data and I'm running fine now
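
    One caveat in case anyone repeats this: with InnoDB a plain DELETE doesn't give the space back to the OS, the freed pages are only reused internally. To actually shrink the files you need OPTIMIZE TABLE (with innodb_file_per_table enabled), and huge deletes are gentler in batches. Table, column and date here are made up:

      -- delete old rows in chunks so one giant transaction doesn't blow up the undo log
      DELETE FROM history WHERE created_at < '2021-01-01' LIMIT 10000;
      -- repeat until 0 rows are affected, then reclaim the disk space:
      OPTIMIZE TABLE history;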