Hello,

I'm trying to upload a fairly large file (18GB or so) to my AWS S3
account via bin/hadoop fs -put ... s3://...
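 
For what it's worth, the command is basically this (the bucket and file
names below are made up, not the real ones):
 
  # placeholder paths, not my actual bucket/file
  bin/hadoop fs -put /local/data/bigfile.dat s3://mybucket/bigfile.dat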

It copies for a good 15 or 20 minutes and then eventually errors out
with a failed retry attempt (saying that it can't retry since it has
already written a certain number of bytes; sorry, I don't have the
original error message at the moment). Has anyone experienced anything
similar? Can anyone suggest a workaround or a way to specify retries?
Should I use another tool for uploading large files to S3?
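 
If it's just a matter of raising the retry settings, would something
along these lines be the right approach? I'm only guessing at
fs.s3.maxRetries / fs.s3.sleepTimeSeconds from core-default.xml and
haven't confirmed they cover the write path; paths are placeholders
again:
 
  # guessing at the retry knobs here; paths are placeholders
  bin/hadoop fs -D fs.s3.maxRetries=10 \
                -D fs.s3.sleepTimeSeconds=30 \
                -put /local/data/bigfile.dat s3://mybucket/bigfile.dat
 
(or the same properties set in the site config file).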

Thanks,
Ryan
