What's the most reliable way to transfer large files to a remote server in Java?

I'm building a Java application that lets our users load file lists and transfer the files to our server for video encoding. I've already built an API for managing the files before and after transfer, but I need to choose a good transfer protocol to actually move them.

Right now I'm leaning toward Apache Commons Net (see: http://commons.apache.org/net/ ) and using FTP to move files from the client machine to the server. From there, I'll use secure API calls to move the files where they need to go.
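For reference, a minimal Commons Net upload looks like the sketch below. The host, credentials, and paths are placeholders, not from the question; the `bytesRemaining` helper is a hypothetical addition showing the arithmetic behind resuming (check the remote size, skip that many local bytes, then use `appendFile` instead of `storeFile`).

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;
import org.apache.commons.net.ftp.FTPReply;

public class FtpUpload {

    // How many bytes still need uploading given what the server already has;
    // a resume would skip remoteSize bytes locally and call appendFile(...).
    static long bytesRemaining(long localSize, long remoteSize) {
        return Math.max(0, localSize - remoteSize);
    }

    public static void upload(String host, String user, String pass,
                              String localPath, String remoteName) throws IOException {
        FTPClient ftp = new FTPClient();
        try {
            ftp.connect(host);
            if (!FTPReply.isPositiveCompletion(ftp.getReplyCode())) {
                throw new IOException("FTP server refused connection");
            }
            if (!ftp.login(user, pass)) {
                throw new IOException("FTP login failed");
            }
            ftp.setFileType(FTP.BINARY_FILE_TYPE); // binary mode: required for video files
            ftp.enterLocalPassiveMode();           // friendlier to client-side firewalls
            try (InputStream in = new FileInputStream(localPath)) {
                if (!ftp.storeFile(remoteName, in)) {
                    throw new IOException("Upload failed: " + ftp.getReplyString());
                }
            }
            ftp.logout();
        } finally {
            if (ftp.isConnected()) {
                ftp.disconnect();
            }
        }
    }
}
```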

Is this the best route? Is there a better way to reliably transfer large (1 GB) files? Is there any way to resume broken uploads with this approach? I want to avoid a traditional HTTP POST request because it's unreliable and can't recover from an interrupted upload.

Thank you!

Solution

You didn't mention whether Amazon S3 is an option for your solution, but it provides native multipart upload support. The basic workflow is:

1. Initiate the upload and keep the upload ID from the response.
2. Upload the parts, concurrently and with retries as needed.
3. Use the upload ID to combine the parts into one file.
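Those three steps map onto the AWS SDK for Java (v1) roughly as sketched below. The bucket name, key, and file path are illustrative placeholders, not anything from the question.

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.CompleteMultipartUploadRequest;
import com.amazonaws.services.s3.model.InitiateMultipartUploadRequest;
import com.amazonaws.services.s3.model.PartETag;
import com.amazonaws.services.s3.model.UploadPartRequest;
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class S3MultipartSketch {

    // How many parts a file splits into at a given part size
    // (5 MiB is the S3 minimum for every part except the last).
    static long partCount(long fileSize, long partSize) {
        return (fileSize + partSize - 1) / partSize;
    }

    public static void main(String[] args) {
        // Placeholder bucket, key, and local path.
        String bucket = "my-upload-bucket";
        String key = "videos/input.mp4";
        File file = new File("/path/to/input.mp4");
        long partSize = 5L * 1024 * 1024;

        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // 1. Initiate the upload and keep the upload ID from the response.
        String uploadId = s3.initiateMultipartUpload(
                new InitiateMultipartUploadRequest(bucket, key)).getUploadId();

        // 2. Upload the parts; each part is independent, so a failed part
        //    can be retried (or sent concurrently) without starting over.
        List<PartETag> etags = new ArrayList<>();
        for (int partNum = 1; partNum <= partCount(file.length(), partSize); partNum++) {
            long offset = (partNum - 1) * partSize;
            UploadPartRequest req = new UploadPartRequest()
                    .withBucketName(bucket)
                    .withKey(key)
                    .withUploadId(uploadId)
                    .withPartNumber(partNum)
                    .withFile(file)
                    .withFileOffset(offset)
                    .withPartSize(Math.min(partSize, file.length() - offset));
            etags.add(s3.uploadPart(req).getPartETag());
        }

        // 3. Use the upload ID to combine the parts into one object.
        s3.completeMultipartUpload(
                new CompleteMultipartUploadRequest(bucket, key, uploadId, etags));
    }
}
```

In practice you rarely write this loop yourself: the SDK's higher-level `TransferManager` wraps the same flow, and `tm.upload(bucket, key, file)` switches to multipart and parallelizes the parts automatically for large files.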

Their SDK provides built-in file slicing and part uploading.

Even if S3 isn't the final destination, you can use it as an upload staging area and pull the files down into permanent storage whenever it's convenient.
