I get the following Hibernate error. I am able to identify the function that causes the issue, but unfortunately there are several DB calls in that function, so I am unable to find the exact statement that causes the error.
I was facing this exception even though Hibernate itself was working well. When I tried to insert one record manually using pgAdmin, the issue became clear: the SQL INSERT returned 0 rows inserted. The cause was a trigger function that returned NULL; I only had to change it to return NEW, and the problem was solved.
Hope that helps anybody.
This can happen if you change something in a data set using a native SQL query while a persisted object for the same data set is still present in the session cache. Use session.evict(yourObject); to detach the stale instance.
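For example, a minimal sketch of that situation (the Post entity, the table, and the id value are assumptions for illustration; the native query also bumps the version column so the cached state goes stale):

Post post = session.get(Post.class, 1L);

// Change the same row behind Hibernate's back with a native query,
// incrementing the version column as well.
session.createNativeQuery(
    "update post set title = :title, version = version + 1 where id = :id")
.setParameter("title", "Changed outside the persistence context")
.setParameter("id", 1L)
.executeUpdate();

// Evict the now-stale cached instance; otherwise a later flush would
// issue an UPDATE with the old version, match 0 rows, and fail with
// the unexpected-row-count error.
session.evict(post);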
I got the same exception while deleting a record by an Id that did not exist at all. So check that the record you are updating/deleting actually exists in the DB.
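In JPA terms, a defensive lookup along these lines (just a sketch; the entity and the postId variable are assumptions) avoids issuing a DELETE that affects 0 rows:

Post post = entityManager.find(Post.class, postId);
if (post != null) {
    entityManager.remove(post);
}
// If post is null, the row is already gone; removing a stale reference
// would produce a DELETE that affects 0 rows and fails with this error.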
This can happen when triggers execute additional DML (data modification) statements that affect the row counts reported back to Hibernate. My solution was to add the following at the top of my trigger:
SET NOCOUNT ON;
In my case there was an issue with the database: one of the stored procedures was consuming all the CPU, causing high DB response times. Once it was killed, the issue was resolved.
Prior to Hibernate 5.4.1, the optimistic locking failure exceptions (e.g., StaleStateException or OptimisticLockException) didn't include the failing statement.

The HHH-12878 issue was created to improve Hibernate so that, when throwing an optimistic locking exception, the JDBC PreparedStatement implementation is logged as well:
if ( expectedRowCount > rowCount ) {
    throw new StaleStateException(
        "Batch update returned unexpected row count from update ["
        + batchPosition + "]; actual row count: " + rowCount
        + "; expected: " + expectedRowCount + "; statement executed: "
        + statement
    );
}
I created the BatchingOptimisticLockingTest in my High-Performance Java Persistence GitHub repository to demonstrate how the new behavior works.
First, we will define a Post entity with a @Version property, thereby enabling the implicit optimistic locking mechanism:
@Entity(name = "Post")
@Table(name = "post")
public class Post {

    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE)
    private Long id;

    private String title;

    @Version
    private short version;

    public Long getId() {
        return id;
    }

    public Post setId(Long id) {
        this.id = id;
        return this;
    }

    public String getTitle() {
        return title;
    }

    public Post setTitle(String title) {
        this.title = title;
        return this;
    }

    public short getVersion() {
        return version;
    }
}
We will enable JDBC batching using the following three configuration properties:
properties.put("hibernate.jdbc.batch_size", "5");
properties.put("hibernate.order_inserts", "true");
properties.put("hibernate.order_updates", "true");
We are going to create 3 Post entities:
doInJPA(entityManager -> {
    for (int i = 1; i <= 3; i++) {
        entityManager.persist(
            new Post()
                .setTitle(String.format("Post no. %d", i))
        );
    }
});
And Hibernate will execute a JDBC batch insert:
SELECT nextval ('hibernate_sequence')
SELECT nextval ('hibernate_sequence')
SELECT nextval ('hibernate_sequence')
Query: [
    INSERT INTO post (title, version, id)
    VALUES (?, ?, ?)
],
Params:[
    (Post no. 1, 0, 1),
    (Post no. 2, 0, 2),
    (Post no. 3, 0, 3)
]
So, we know that JDBC batching works just fine.
Now, let's replicate the optimistic locking issue:
doInJPA(entityManager -> {
    List<Post> posts = entityManager.createQuery("""
        select p
        from Post p
        """, Post.class)
    .getResultList();

    posts.forEach(
        post -> post.setTitle(
            post.getTitle() + " - 2nd edition"
        )
    );

    executeSync(
        () -> doInJPA(_entityManager -> {
            Post post = _entityManager.createQuery("""
                select p
                from Post p
                order by p.id
                """, Post.class)
            .setMaxResults(1)
            .getSingleResult();

            post.setTitle(post.getTitle() + " - corrected");
        })
    );
});
The first transaction selects all Post entities and modifies their title properties.

However, before the first EntityManager is flushed, we are going to execute a second transaction using the executeSync method.

The second transaction modifies the first Post, so its version is going to be incremented:
Query:[
    UPDATE
        post
    SET
        title = ?,
        version = ?
    WHERE
        id = ? AND
        version = ?
],
Params:[
    ('Post no. 1 - corrected', 1, 1, 0)
]
Now, when the first transaction tries to flush the EntityManager, we will get the OptimisticLockException:
Query:[
    UPDATE
        post
    SET
        title = ?,
        version = ?
    WHERE
        id = ? AND
        version = ?
],
Params:[
    ('Post no. 1 - 2nd edition', 1, 1, 0),
    ('Post no. 2 - 2nd edition', 1, 2, 0),
    ('Post no. 3 - 2nd edition', 1, 3, 0)
]
o.h.e.j.b.i.AbstractBatchImpl - HHH000010: On release of batch it still contained JDBC statements
o.h.e.j.b.i.BatchingBatch - HHH000315: Exception executing batch [
    org.hibernate.StaleStateException:
    Batch update returned unexpected row count from update [0];
    actual row count: 0;
    expected: 1;
    statement executed:
        PgPreparedStatement [
            update post set title='Post no. 3 - 2nd edition', version=1 where id=3 and version=0
        ]
],
SQL: update post set title=?, version=? where id=? and version=?
So, you need to upgrade to Hibernate 5.4.1 or newer to benefit from this improvement.
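Once the failing statement is visible, the usual remedy is to catch the optimistic locking failure and re-run the unit of work against fresh state. Here is a minimal sketch, not part of the original test (the retry loop and the id value are assumptions; also note that, depending on where the flush happens, the OptimisticLockException may arrive wrapped in a RollbackException):

int maxRetries = 3;
for (int attempt = 1; attempt <= maxRetries; attempt++) {
    try {
        doInJPA(entityManager -> {
            // Reload the entity so we work against the current version.
            Post post = entityManager.find(Post.class, 1L);
            post.setTitle(post.getTitle() + " - 2nd edition");
        });
        break; // success, stop retrying
    } catch (OptimisticLockException e) {
        if (attempt == maxRetries) {
            throw e; // give up after the last attempt
        }
        // Another transaction incremented the version; loop and retry.
    }
}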