I have to transfer about 2 GB of data from one remote server to another. Any suggestions?

My idea was to make multiple curl requests, compressing the data with gzcompress.

Preliminary testing shows that won't work, so now I'm considering putting the data in a file in our S3 bucket for the other server to download.
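
A minimal sketch of that S3 hand-off, assuming the AWS SDK for PHP (v3) is installed via Composer; the region, bucket name, and key below are placeholders, and $data stands in for the export that isn't in a physical file:

    <?php
    // Sketch only: upload the in-memory, gzcompressed payload to S3 so the other
    // server can download it. Region, bucket, and key names are placeholders.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client([
        'region'  => 'us-east-1',  // placeholder region
        'version' => 'latest',
    ]);

    $data = 'example payload';     // stands in for the ~2 GB in-memory export
    $compressed = gzcompress($data, 9);

    $s3->putObject([
        'Bucket' => 'example-transfer-bucket',                     // placeholder
        'Key'    => 'exports/payload-' . date('Ymd-His') . '.bin', // placeholder
        'Body'   => $compressed,
    ]);

For an object of this size, the SDK's multipart upload support (Aws\S3\MultipartUploader) may be more practical than a single putObject call, but the hand-off is the same either way: upload once, then have the other server download the object and gzuncompress it.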

Comments
  • 7
    Why not use rsync?
  • 1
The problem is that we can't write files to the server, and there's a strict post max size limit.
  • 5
    @jchw this. rsync is the way to go
  • 0
    Thanks guys. Never heard of it. Will do research and learn it.

    Will comment again on Monday to let you guys know how it goes.
  • 0
    @kimmax would it work with a data object that's not in a file?
  • 0
    @wolt but the data isn't in a physical file.
  • 0
    We don't have that access.
  • 0
    Stupid, I know.
  • 0
    We don't have file write permissions, so we can't export the data onto the server in order to transfer it.
  • 0
    Rsync.
  • 0
    @wolt actually it does: rsync is more robust (against issues like corruption, thanks to rolling checksums) and has built-in support for compression. Besides that, scp is an old program that predates modern alternatives like sftp and rsync, and as far as I can tell the only good reason to use it is when those aren't available. (A rough example invocation is sketched below the comments.)
  • 0
    If no other solution works for you, please avoid S3/Drive/Box and use Storj instead; it's free.
  • 0
    We're a big nonprofit. I have no say. We use AWS.
  • 0
    @wolt I would use -rCp as options
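
For completeness, the rsync route suggested in the comments could be scripted roughly like this. It is only a sketch: it assumes SSH access between the two servers and that the data has been written to a file (which, per the comments above, isn't possible here), and the host and paths are placeholders.

    <?php
    // Sketch of the rsync suggestion from the comments; assumes SSH access between
    // the servers and that the data exists as a file. Host and paths are placeholders.
    $source = '/tmp/export.bin';
    $target = 'deploy@destination.example.com:/data/import/export.bin';

    // -a preserves permissions and timestamps, -z compresses in transit.
    // The scp equivalent mentioned above would be: scp -rCp <source> <target>
    $cmd = sprintf('rsync -az %s %s', escapeshellarg($source), escapeshellarg($target));

    exec($cmd, $output, $exitCode);

    if ($exitCode !== 0) {
        error_log("rsync failed with exit code $exitCode");
    }

On repeat runs rsync only retransfers blocks that changed, which is what the rolling-checksum remark above refers to; scp with -rCp copies recursively, compresses in transit, and preserves timestamps, but has no way to resume a partial transfer.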