I've been working with the AWS Java API for some time without many problems. Currently I'm using version 1.5.2 of the library.
When I iterate over the objects inside a folder with the following code:
AmazonS3 s3 = new AmazonS3Client(new PropertiesCredentials(
        MyClass.class.getResourceAsStream("AwsCredentials.properties")));
String s3Key = "folder1/folder2";
String bucketName = Constantes.S3_BUCKET;
String key = s3Key + "/input_chopped/";

ObjectListing current = s3.listObjects(new ListObjectsRequest()
        .withBucketName(bucketName)
        .withPrefix(key));

int contador = 0;
boolean siguiente = true;
while (siguiente) {
    siguiente &= current.isTruncated();
    contador += current.getObjectSummaries().size();
    for (S3ObjectSummary objectSummary : current.getObjectSummaries()) {
        S3Object object = s3.getObject(new GetObjectRequest(bucketName, objectSummary.getKey()));
        System.out.println(object.getKey());
    }
    current = s3.listNextBatchOfObjects(current);
}
Gist: https://gist.github.com/fgblanch/6038699
I'm getting the following exception:
INFO (AmazonHttpClient.java:358) - Unable to execute HTTP request: Timeout waiting for connection from pool
org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
at org.apache.http.impl.conn.PoolingClientConnectionManager.leaseConnection(PoolingClientConnectionManager.java:232)
at org.apache.http.impl.conn.PoolingClientConnectionManager$1.getConnection(PoolingClientConnectionManager.java:199)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:456)
at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805)
at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:784)
at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:315)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:199)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:2994)
at com.amazonaws.services.s3.AmazonS3Client.getObject(AmazonS3Client.java:918)
at com.madiva.segmentacion.tests.ListaS3.main(ListaS3.java:177)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
Caught an AmazonClientException, which means the client encountered a serious internal problem while trying to communicate with S3, such as not being able to access the network.
Error Message: Unable to execute HTTP request: Timeout waiting for connection from pool
Any idea how to avoid this error? It only happens in folders with a large number of objects; in this case there were 463 files inside. Thanks.
I've found that S3Object opens a connection for each object. These connections are not released even when the object is garbage collected, so you need to call object.close() to return the connection to the pool.
So the corrected code would be:
for (S3ObjectSummary objectSummary : current.getObjectSummaries()) {
    S3Object object = s3.getObject(new GetObjectRequest(bucketName, objectSummary.getKey()));
    System.out.println(object.getKey());
    // Release the underlying HTTP connection back to the pool.
    object.close();
}
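Side note, not part of the original answer: the key is already available from the listing, so getObject() is only needed if you actually read the object's content; and in more recent 1.x SDK versions S3Object implements Closeable, so try-with-resources releases the connection even if an exception is thrown. A minimal sketch, assuming such an SDK version:

for (S3ObjectSummary objectSummary : current.getObjectSummaries()) {
    // try-with-resources closes the S3Object (and its HTTP connection) automatically
    try (S3Object object = s3.getObject(new GetObjectRequest(bucketName, objectSummary.getKey()))) {
        System.out.println(object.getKey());
    } catch (IOException e) {
        // close() is declared to throw IOException
        e.printStackTrace();
    }
}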
Check whether HttpResponseHandler is closing connections from the pool. AmazonHttpClient has the parameter 'leaveHttpConnectionOpen', which indicates whether the connection should be kept open or closed.
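Not mentioned in the answer above, but a related knob: the size of the shared connection pool can be raised through ClientConfiguration (the default maximum is 50 connections). This only masks a leak rather than fixing it, but can help when many objects are legitimately held open at once. A minimal sketch, assuming SDK 1.x:

// Sketch only: enlarge the HTTP connection pool used by the client.
ClientConfiguration config = new ClientConfiguration()
        .withMaxConnections(100); // default is 50
AmazonS3 s3 = new AmazonS3Client(
        new PropertiesCredentials(MyClass.class.getResourceAsStream("AwsCredentials.properties")),
        config);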
Source: https://stackoverflow.com/questions/17782937/connectionpooltimeoutexception-when-iterating-objects-in-s3