Use the `aws s3` CLI. The web console isn't designed for large copies.
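Something like this, with placeholder bucket names (the copy happens server-side between buckets, so the data never transits your machine):

```shell
# Recursively copy a prefix from one bucket to another
aws s3 cp --recursive s3://source-bucket/data/ s3://dest-bucket/data/

# Or copy a single large object
aws s3 cp s3://source-bucket/big.tar s3://dest-bucket/big.tar
```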
Use an EC2 instance to do that; you'll get the best speed. Use `screen` to run the command in the background, so that even if you close the SSH connection to the instance, your command keeps running.
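A minimal sketch of the `screen` approach (session name and bucket names are placeholders):

```shell
# Start the copy inside a detached screen session named "s3copy";
# it keeps running even if the SSH connection drops.
screen -dmS s3copy aws s3 sync s3://source-bucket/ s3://dest-bucket/

# Reattach later to check progress:
#   screen -r s3copy
```

`tmux` or `nohup` work just as well if `screen` isn't installed on the instance.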
I use it when I have to move data made up of many small files, totalling around 20 GB, to another bucket.
That's hacky. Rsync will allow resuming connections and detaching.
The AWS CLI handles all of that with continuations as well; there's no need to copy the data twice.
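For what it's worth, `aws s3 sync` only transfers objects that are missing or differ at the destination, so re-running the same command after an interruption effectively resumes the copy (bucket names are placeholders):

```shell
# Safe to re-run after a dropped connection: already-copied objects
# are skipped, and only the remainder is transferred.
aws s3 sync s3://source-bucket/data/ s3://dest-bucket/data/
```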
arvinds104: @shiv7071007 Same here: the files are small but total around 20 GB. In the end I made a tar out of them and moved that. Thanks 😊
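The tar approach is a good fit here, since one large object transfers far faster than thousands of small PUTs. A sketch with hypothetical paths (the final upload would be something like `aws s3 cp data.tar.gz s3://dest-bucket/`):

```shell
# Create a small sample tree, bundle it into one archive,
# and list the archive contents to verify.
mkdir -p data
printf 'hello' > data/a.txt
printf 'world' > data/b.txt
tar -czf data.tar.gz data/
tar -tzf data.tar.gz
```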