How often is he doing this? We have a customer who uploads geo data, a couple of gigs at a time. We have him bring the data to our office on a flash drive and we just upload it for him. That also proved the receiving server was limiting him to 10 Mb/s on the upload anyway. Now, whenever he has a lot of data to upload, he just brings it over to the office, and we keep those hours of uploading off the wireless network.

On 10/12/2014 11:05 AM, Ken Hohhof via Af wrote:
I have a customer who keeps asking for more upload speed because it takes too long to upload a bunch of files to his server at a datacenter. He thinks this should not cost much because he is just "bursting"; unfortunately, what he wants would require a dedicated link from the tower to his house. He uploads around 100 files when he finishes a project, it takes about 20 minutes, and apparently that's a problem. What I see while he is uploading is about a 50% duty cycle: a file uploads, then dead time, then another file. So I'm thinking he first needs to improve how efficiently he uses his Internet connection.

He says he is using drag and drop in Windows 7. I assume this means he is connecting with Remote Desktop and dragging and dropping within the RDP session from his local drive to a drive on the remote server.

Would I be right that RDP drag and drop is not an efficient way to transfer lots of files? (I've never done that myself.) What would be the best way?

Personally, I would just use FTP, maybe creating a tar archive first, but he is using Windows. If he needs security, there seem to be choices like SFTP, FTPS, or SCP. If HIPAA-level security is not required, vanilla FTP would avoid the encryption overhead. I found an article on how to set the number of concurrent connections in FileZilla to something like 10; would that keep the link 100% utilized? My other FTP client is WS_FTP, and I don't know if it can do concurrent file transfers.
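
For what it's worth, here is a rough sketch of the same idea in Python: several plain-FTP uploads kept in flight at once so the link doesn't go idle between files, which is roughly what the FileZilla concurrent-transfers setting does. The host name, credentials, and paths are placeholders, and it assumes the datacenter server accepts plain FTP.

import os
from ftplib import FTP
from concurrent.futures import ThreadPoolExecutor

HOST = "ftp.example.com"           # placeholder: the server at the datacenter
USER = "customer"                  # placeholder credentials
PASSWORD = "secret"
LOCAL_DIR = r"C:\projects\output"  # placeholder folder holding the ~100 files
REMOTE_DIR = "/incoming"

def upload_one(filename):
    # Each worker opens its own FTP connection and uploads one file in binary mode.
    with FTP(HOST) as ftp:
        ftp.login(USER, PASSWORD)
        ftp.cwd(REMOTE_DIR)
        with open(os.path.join(LOCAL_DIR, filename), "rb") as f:
            ftp.storbinary("STOR " + filename, f)
    return filename

if __name__ == "__main__":
    files = [f for f in os.listdir(LOCAL_DIR)
             if os.path.isfile(os.path.join(LOCAL_DIR, f))]
    # Keep up to 10 uploads running at the same time, like the FileZilla setting.
    with ThreadPoolExecutor(max_workers=10) as pool:
        for done in pool.map(upload_one, files):
            print("uploaded", done)

Whether 10 parallel streams actually fills the pipe depends on where the dead time comes from; if each file is small, the per-file connection setup is what you're hiding, and even 3 or 4 streams may be enough.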
