I am frequently getting a 'Premature EOF' exception when reading a web page.
Here is the stack trace:
java.io.IOException: Premature EOF
You can use Apache Commons IO's FileUtils.copyURLToFile method:
http://commons.apache.org/io/api-release/org/apache/commons/io/FileUtils.html#copyURLToFile%28java.net.URL,%20java.io.File,%20int,%20int%29
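copyURLToFile streams the URL's contents straight into a file in one call. If you can't add the Commons IO dependency, the plain JDK can do much the same with java.nio — a minimal sketch (the file:// URL in the demo is only there so the example runs without network access):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class UrlDownload {
    // Roughly what FileUtils.copyURLToFile does: copy the URL's stream to a file.
    static void copyUrlToFile(URL source, Path target) throws IOException {
        try (InputStream in = source.openStream()) {
            Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
        }
    }

    public static void main(String[] args) throws IOException {
        // A local file:// URL stands in for the web page so the demo is offline.
        Path src = Files.createTempFile("src", ".txt");
        Files.write(src, "hello".getBytes("UTF-8"));
        Path dst = Files.createTempFile("dst", ".txt");
        copyUrlToFile(src.toUri().toURL(), dst);
        System.out.println(new String(Files.readAllBytes(dst), "UTF-8")); // prints "hello"
    }
}
```

Note that copyURLToFile also takes connection and read timeout parameters (see the linked Javadoc), which the plain-JDK version above does not replicate.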
This could be because the server is closing the connection. I have experienced the exact same issue with code that opened a connection, did some other processing, and only then tried to download the contents of the input stream. By the time it got to the stream, after spending a few seconds on other processing, the server had apparently closed the connection, resulting in IOException: Premature EOF. The solution was to always consume the contents of the stream immediately; otherwise you are leaving an HTTP connection open and idle, and eventually the server on the other end of the line will hang up on you.
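In code, the fix is simply to drain the stream into memory as soon as the connection is opened, and run the slow processing on the buffered copy afterwards. A sketch of that pattern (a file:// URL stands in for the web server so the example runs offline):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReadImmediately {
    // Drain the stream right away, so the server never sees an idle connection.
    static String fetch(URL url) throws IOException {
        try (InputStream in = url.openStream()) {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            return out.toString("UTF-8");
        }
    }

    public static void main(String[] args) throws IOException {
        Path p = Files.createTempFile("page", ".html");
        Files.write(p, "<html>ok</html>".getBytes("UTF-8"));
        String body = fetch(p.toUri().toURL()); // download first...
        // ...then do the slow processing on the in-memory copy, not the live stream.
        System.out.println(body.length());
    }
}
```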
You can also try setting the buffer size to 1. On its own this only helps slightly, but if you also wrap the read in retry logic, it should do the trick.
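The retry logic isn't shown above, but a simple version is just a loop that re-attempts the whole download a few times before giving up. A sketch, where the attempt count and the flaky task being retried are illustrative:

```java
import java.io.IOException;
import java.util.concurrent.Callable;

public class Retry {
    // Re-run the task up to maxAttempts times; rethrow the last failure.
    static <T> T withRetries(int maxAttempts, Callable<T> task) throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return task.call();
            } catch (IOException e) {
                last = e; // e.g. "Premature EOF" -- try the whole download again
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        // Simulated flaky download: fails twice, then succeeds.
        int[] calls = {0};
        String page = withRetries(3, () -> {
            if (++calls[0] < 3) throw new IOException("Premature EOF");
            return "page contents";
        });
        System.out.println(page + " after " + calls[0] + " attempts");
    }
}
```

Note that retrying only helps with transient failures; if the server closes the connection deterministically, fix the reading code instead.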
This may be because you are reading the content line by line, and the last line of the file may be missing a line terminator to signal the end of the line. Replace your while loop with this:
final int BUFFER_SIZE = 1024; // or some other size
char[] buffer = new char[BUFFER_SIZE];
int charsRead;
while ((charsRead = rd.read(buffer, 0, BUFFER_SIZE)) != -1) {
    sb.append(buffer, 0, charsRead);
}
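Dropped into the surrounding code, that loop reads fixed-size chunks instead of lines, so a missing terminator on the last line no longer matters. A self-contained version of the pattern (a StringReader stands in for the network reader so it runs offline):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

public class ChunkRead {
    // Read everything from rd in fixed-size chunks, trailing newline or not.
    static String readAll(Reader rd) throws IOException {
        final int BUFFER_SIZE = 1024;
        char[] buffer = new char[BUFFER_SIZE];
        StringBuilder sb = new StringBuilder();
        int charsRead;
        while ((charsRead = rd.read(buffer, 0, BUFFER_SIZE)) != -1) {
            sb.append(buffer, 0, charsRead);
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        // Note: no newline after the last line -- readAll still captures it.
        String content = "first line\nlast line without newline";
        System.out.println(readAll(new BufferedReader(new StringReader(content))));
    }
}
```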
StringBuilder sb = new StringBuilder();
try {
    URL url = new URL(address);
    InputStream is = url.openStream();
    InputStreamReader isr = new InputStreamReader(is);
    BufferedReader in = new BufferedReader(isr);
    String str;
    while ((str = in.readLine()) != null) {
        sb.append(str);
        sb.append("\n");
    }
    in.close();
    isr.close();
    is.close();
    return sb.toString();
} catch (Exception e) {
    //OMG....
}
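One more problem with that snippet: if anything inside the try block throws, the three close() calls are skipped and the connection leaks. On Java 7+, try-with-resources closes the reader (and the streams beneath it) on every path. A sketch of the same method rewritten that way (the address parameter and the file:// URL in the demo are illustrative):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;

public class FetchPage {
    static String fetch(String address) throws IOException {
        StringBuilder sb = new StringBuilder();
        // try-with-resources closes the reader even if readLine() throws.
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(new URL(address).openStream(), "UTF-8"))) {
            String str;
            while ((str = in.readLine()) != null) {
                sb.append(str).append("\n");
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        // A local file:// URL stands in for the web page so the demo is offline.
        Path p = Files.createTempFile("page", ".txt");
        Files.write(p, "one\ntwo".getBytes("UTF-8"));
        System.out.println(fetch(p.toUri().toURL().toString()));
    }
}
```

Letting the IOException propagate (instead of swallowing it in an empty catch) also means the caller can implement the retry logic suggested above.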