Question
I am downloading a large repository (about 300 MB) from GitHub. It takes 10-15 seconds when I download it from my browser. On the same machine, it takes 110-120 seconds when I use the code below. I am wondering whether I am doing something wrong. Please suggest how I can get the same speed (10-15 seconds) using Apache HttpClient. Or is there anything better than HttpClient?
Apache HttpClient: 4.5
Java: 8
Code that I used:
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import org.apache.commons.io.FileUtils;
import org.apache.commons.io.IOUtils;
import org.apache.http.HttpResponse;
import org.apache.http.client.ClientProtocolException;
import org.apache.http.client.ResponseHandler;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClientBuilder;
import org.apache.http.impl.client.LaxRedirectStrategy;
import org.apache.http.impl.conn.PoolingHttpClientConnectionManager;
public class Downloader {

    public File download(URL url, File dstFile) {
        PoolingHttpClientConnectionManager manager = new PoolingHttpClientConnectionManager();
        manager.setDefaultMaxPerRoute(20);
        manager.setMaxTotal(200);

        CloseableHttpClient httpclient = HttpClientBuilder.create()
                .setConnectionManager(manager)
                .build();
                // Second option: it also takes same time.
                // .setRedirectStrategy(new LaxRedirectStrategy())
                // .setMaxConnTotal(2 * 50)
                // .setMaxConnPerRoute(50)
                // .build();

        // CloseableHttpClient httpclient = HttpClients.custom()
        //         .setRedirectStrategy(new LaxRedirectStrategy()) // adds HTTP REDIRECT support to GET and POST methods
        //         .build();

        try {
            HttpGet get = new HttpGet(url.toURI()); // we're using GET but it could be via POST as well
            File downloaded = httpclient.execute(get, new FileDownloadResponseHandler(dstFile));
            return downloaded;
        } catch (Exception e) {
            throw new IllegalStateException(e);
        } finally {
            IOUtils.closeQuietly(httpclient);
        }
    }

    static class FileDownloadResponseHandler implements ResponseHandler<File> {

        private final File target;

        public FileDownloadResponseHandler(File target) {
            this.target = target;
        }

        @Override
        public File handleResponse(HttpResponse response) throws ClientProtocolException, IOException {
            InputStream source = response.getEntity().getContent();
            FileUtils.copyInputStreamToFile(source, this.target);
            return this.target;
        }
    }
}
Answer 1:
I had the same problem; the fix was implementing the suggestion from here: https://stackoverflow.com/a/35458078/9204142, i.e. disable the automatic content compression, either with disableContentCompression() on the client builder or with contentCompressionEnabled=false at the RequestConfig level.
After the fix, the download speed matched that of curl.
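For reference, here is a minimal sketch of how that fix could be applied to an HttpClient 4.5 builder. The factory class name below is made up for illustration; both variants mentioned above are shown:

import org.apache.http.client.config.RequestConfig;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClientBuilder;

public class NoCompressionClientFactory {

    // Variant 1: turn off the automatic "Accept-Encoding: gzip,deflate"
    // handling for the whole client.
    public static CloseableHttpClient clientWithoutCompression() {
        return HttpClientBuilder.create()
                .disableContentCompression()
                .build();
    }

    // Variant 2: keep the builder as-is and disable compression through a
    // default RequestConfig instead.
    public static CloseableHttpClient clientWithRequestConfig() {
        RequestConfig config = RequestConfig.custom()
                .setContentCompressionEnabled(false)
                .build();
        return HttpClientBuilder.create()
                .setDefaultRequestConfig(config)
                .build();
    }
}

Either client can then be passed the same HttpGet and FileDownloadResponseHandler as in the question.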
As I was using Apache Camel, the fix meant adding ?httpClient.contentCompressionEnabled=false to the endpoint URI.
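For illustration, a rough sketch of such a Camel endpoint (assuming the camel-http4 component, which forwards httpClient.* options to the underlying Apache HttpClient; the route name and target URL are placeholders):

import org.apache.camel.builder.RouteBuilder;

public class DownloadRoute extends RouteBuilder {
    @Override
    public void configure() {
        // httpClient.contentCompressionEnabled=false is passed through to the
        // RequestConfig of the Apache HttpClient backing this endpoint.
        from("direct:download")
            .to("https4://example.com/large-file.zip"
                + "?httpClient.contentCompressionEnabled=false");
    }
}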
Source: https://stackoverflow.com/questions/32239188/apache-http-client-slower-than-browser-download