I am now using batch:
String query = \"INSERT INTO table (id, name, value) VALUES (?, ?, ?)\";
PreparedStatement ps = connection.prepareStatement(query);
I think this will do:
String query = "INSERT INTO table (id, name, value) VALUES ";
for (Record record : records) {
    query += "(" + record.id + ",'" + record.name + "'," + record.value + "),";
}
query = query.substring(0, query.length() - 1); // drop the trailing comma
PreparedStatement ps = connection.prepareStatement(query);
ps.executeUpdate();
This way you build a single multi-row INSERT and execute one query instead of one query per record.
executeBatch will give improved performance over executeUpdate, as long as autocommit is set to false:
connection.setAutoCommit(false);
PreparedStatement ps = connection.prepareStatement(query);
for (Record record : records) {
    // bind this record's values, matching the (?, ?, ?) placeholders
    ps.setInt(1, record.id);
    ps.setString(2, record.name);
    ps.setInt(3, record.value);
    ps.addBatch();
}
ps.executeBatch();
connection.commit();
You can face a serious performance issue if the number of items you want to insert is large. It is therefore safer to define a batch size and execute the batch each time that size is reached.
Something like the following example code should work. For the full story of how to use this code efficiently, please see this link.
private static void insertList2DB(List<String> list) {
    final int batchSize = 1000; // batch size is important
    Connection conn = getConnection();
    PreparedStatement ps = null;
    try {
        String sql = "INSERT INTO theTable (aColumn) VALUES (?)";
        ps = conn.prepareStatement(sql);
        int insertCount = 0;
        for (String item : list) {
            ps.setString(1, item);
            ps.addBatch();
            if (++insertCount % batchSize == 0) {
                ps.executeBatch(); // flush every batchSize rows
            }
        }
        ps.executeBatch(); // flush the remaining rows
    } catch (SQLException e) {
        e.printStackTrace();
        System.exit(1);
    } finally {
        try {
            if (ps != null) ps.close();
            if (conn != null) conn.close();
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }
}
If you are using Spring, JdbcTemplate.batchUpdate with a BatchPreparedStatementSetter handles the batching for you:
public void insertBatch(final List<Record> records) {
    String query = "INSERT INTO table (id, name, value) VALUES (?, ?, ?)";
    GenericDAO.getJdbcTemplate().batchUpdate(query, new BatchPreparedStatementSetter() {
        @Override
        public void setValues(PreparedStatement ps, int i) throws SQLException {
            Record record = records.get(i);
            ps.setInt(1, record.id);
            ps.setString(2, record.name);
            ps.setInt(3, record.value);
        }
        @Override
        public int getBatchSize() {
            return records.size();
        }
    });
}
First of all, with query string concatenation you not only lose the type conversion that the PreparedStatement setters give you, you also become vulnerable to malicious code being executed in the database (SQL injection).
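For example (a sketch only; the malicious name value and the connection variable are assumed for illustration, not taken from the question), concatenation puts the input straight into the SQL text, while a placeholder sends it as plain data:
// Assumed malicious input, for illustration only.
String name = "x'); DROP TABLE table; --";
// Concatenation: the input becomes part of the SQL statement itself.
String unsafe = "INSERT INTO table (id, name, value) VALUES (1, '" + name + "', 2)";
// Placeholder: the driver passes the input as a parameter value, never as SQL.
PreparedStatement safe = connection.prepareStatement(
        "INSERT INTO table (id, name, value) VALUES (?, ?, ?)");
safe.setInt(1, 1);
safe.setString(2, name);
safe.setInt(3, 2);
safe.executeUpdate();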
Second, PreparedStatements are precompiled and cached by the database itself, which already gives a very good performance improvement over plain Statements.
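To make the contrast concrete (a minimal sketch, assuming the same table, Record type and connection used above): with a plain Statement the database sees a brand-new SQL string for every row and has to parse it each time, while a single PreparedStatement is parsed once and only its parameter values change:
// Plain Statement: each row is a different SQL string, parsed from scratch every time.
Statement st = connection.createStatement();
for (Record record : records) {
    st.executeUpdate("INSERT INTO table (id, name, value) VALUES ("
            + record.id + ",'" + record.name + "'," + record.value + ")");
}
st.close();
// PreparedStatement: one statement, prepared once, re-executed with new parameter values.
PreparedStatement ps = connection.prepareStatement(
        "INSERT INTO table (id, name, value) VALUES (?, ?, ?)");
for (Record record : records) {
    ps.setInt(1, record.id);
    ps.setString(2, record.name);
    ps.setInt(3, record.value);
    ps.executeUpdate();
}
ps.close();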