Copied parent inserts children instead of updating them

帅比萌擦擦* submitted on 2021-01-05 12:48:18

Question


Description

To implement multithreading with Hibernate and JPA, I deep copy some of my entities. Those copies are used by the sessions to add, remove, or update the entities.

Problem

It worked fine so far, but I ran into an issue with parent/child relations. When I update my parent, its children are "always" inserted; they never receive any sort of update. This is pretty bad, because I get a "Duplicate Key" exception on the second parent-update iteration.

My flow currently looks like the following (a rough code sketch follows the list)...

  • Game update triggered
  • Deep copy the entities which are marked as "update"
  • Pass those deep-copied entities to the update thread (multithreaded environment)
  • Open a session and let the session update them
  • Wait for the next game update and repeat the cycle
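
A minimal sketch of that cycle, assuming hypothetical helpers (deepCopy and updateExecutor are placeholder names I introduced for illustration; they do not come from the original code):

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.stream.Collectors;

public class ChunkUpdateCycle {

    private final SessionFactory sessionFactory;
    private final ExecutorService updateExecutor = Executors.newSingleThreadExecutor();

    public ChunkUpdateCycle(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    /** Called once per game update with the chunks marked as "update". */
    public void onGameUpdate(List<Chunk> dirtyChunks) {
        // Deep copy on the main thread so the worker never touches live objects.
        List<Chunk> copies = dirtyChunks.stream()
                .map(this::deepCopy)
                .collect(Collectors.toList());

        updateExecutor.submit(() -> {
            try (Session session = sessionFactory.openSession()) {
                Transaction tx = session.beginTransaction();
                copies.forEach(session::update); // detached copies are reattached here
                tx.commit();
            }
        });
    }

    private Chunk deepCopy(Chunk chunk) {
        // Placeholder for whatever game-specific cloning mechanism is used.
        throw new UnsupportedOperationException("game-specific deep copy");
    }
}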

Parent-Child

These classes represent the parent/child relationship.

/**
 * A component which marks a {@link com.artemis.Entity} as a chunk and stores its most valuable information.
 */
@Entity
@Table(name = "chunk", uniqueConstraints = {@UniqueConstraint(columnNames={"x", "y"})}, indexes = {@Index(columnList = "x,y")})
@Access(value = AccessType.FIELD)
@SelectBeforeUpdate(false)
public class Chunk extends HibernateComponent{

    public int x;
    public int y;
    public Date createdOn;

    @OneToMany(fetch = FetchType.EAGER)
    @JoinTable(name = "chunk_identity", joinColumns = @JoinColumn(name = "identity_id"), inverseJoinColumns = @JoinColumn(name = "id"), inverseForeignKey = @ForeignKey(ConstraintMode.NO_CONSTRAINT))
    @Fetch(FetchMode.JOIN)
    @BatchSize(size = 50)
    public Set<Identity> inChunk = new LinkedHashSet<>();

    @Transient
    public Set<ChunkLoader> loadedBy = new LinkedHashSet<>();

    public Chunk() {}
    public Chunk(int x, int y, Date createdOn) {
        this.x = x;
        this.y = y;
        this.createdOn = createdOn;
    }
}

/**
 * Represents an ID of a {@link com.artemis.Entity}, which is unique for each entity and is usually the database id
 */
@Entity
@Table(name = "identity")
@Access(AccessType.FIELD)
@SQLInsert(sql = "insert into identity(tag, typeID, id) values(?,?,?) ON DUPLICATE KEY UPDATE id = VALUES(id), tag = values(tag), typeID = values(typeID)")
@SelectBeforeUpdate(value = false)
public class Identity extends Component {

    @Id public long id;
    public String tag;
    public String typeID;

    public Identity() {}
    public Identity(long id, String tag, String typeID) {
        this.id = id;
        this.tag = tag;
        this.typeID = typeID;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        var identity = (Identity) o;
        return id == identity.id;
    }

    @Override
    public int hashCode() {
        // Consistent with equals(), which compares only the id.
        return Objects.hash(id);
    }
}

Question

Any idea why my deep-cloned parent always inserts its children? And how could I prevent this while still using multithreading? (When I don't use cloned objects, a Hibernate-internal exception occurs.)

Answer 1:


I guess the select-before-update configuration is the problem. You are using Session.update, which, as far as I know (and as documented in the Javadocs), does not work with @SelectBeforeUpdate(value = false): Hibernate has no way to know whether the object already exists, so it always tries to insert it.
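
To illustrate the mechanism (a minimal sketch of mine, not from the original answer): merge() first selects the current database state and only then decides between UPDATE and INSERT, which update() on a detached copy cannot do once select-before-update is disabled. Note that the second answer below reports merge failing for this case too, so treat this purely as an illustration:

// Sketch: reattaching a detached deep copy via merge() instead of update().
// sessionFactory and detachedChunkCopy are assumed to exist in scope.
try (Session session = sessionFactory.openSession()) {
    Transaction tx = session.beginTransaction();
    Chunk managed = (Chunk) session.merge(detachedChunkCopy); // SELECT first, then UPDATE or INSERT
    tx.commit();
}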

I think this is a perfect use case for Blaze-Persistence Entity Views.

Blaze-Persistence is a query builder on top of JPA which supports many of the advanced DBMS features on top of the JPA model. I created Entity Views on top of it to allow easy mapping between JPA models and custom interface-defined models, something like Spring Data Projections on steroids. The idea is that you define your target structure the way you like and map attributes (getters) via JPQL expressions to the entity model. Since the attribute name is used as the default mapping, you mostly don't need explicit mappings, as 80% of the use cases are DTOs that are a subset of the entity model.

A projection with Entity Views could look as simple as the following

@EntityView(Chunk.class)
interface ChunkDto {
    @IdMapping
    Long getId();
    int getX();
    int getY();
    @Mapping(fetch = MULTISET) // This is a much more efficient fetch strategy
    Set<IdentityDto> getIdentities();
}
@EntityView(Identity.class)
interface IdentityDto {
    @IdMapping
    Long getId();
    String getTag();
    String getTypeID();
}

Querying is a matter of applying the entity view to a query, the simplest being just a query by id.

ChunkDto dto = entityViewManager.find(entityManager, ChunkDto.class, id);

But the Spring Data integration allows you to use it almost like Spring Data Projections: https://persistence.blazebit.com/documentation/entity-view/manual/en_US/index.html#spring-data-features

List<ChunkDto> findAll();
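
The repository behind that method could be shaped roughly as follows (my assumption; the original answer does not show it, and it relies on EntityViewRepository from the Blaze-Persistence Spring Data module):

import com.blazebit.persistence.spring.data.repository.EntityViewRepository;

// Hypothetical repository; findOne(Long) and findAll() are inherited from
// EntityViewRepository and return entity views instead of entities.
public interface ChunkDtoRepository extends EntityViewRepository<ChunkDto, Long> {
}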

You can also make use of updatable entity views, which reduce the amount of data fetched and flush back only the parts that you actually want to change:

@CreatableEntityView
@UpdatableEntityView
@EntityView(Chunk.class)
interface ChunkDto {
    @IdMapping
    Long getId();
    void setId(Long id);
    int getX();
    int getY();
    @Mapping(fetch = MULTISET) // This is a much more efficient fetch strategy
    Set<IdentityDto> getIdentities();
    default void addIdentity(String tag, String typeID) {
        IdentityDto dto = evm().create(IdentityDto.class);
        dto.setTag(tag);
        dto.setTypeID(typeID);
        getIdentities().add(dto);
    }
    EntityViewManager evm();
}
@CreatableEntityView
@UpdatableEntityView
@EntityView(Identity.class)
interface IdentityDto {
    @IdMapping
    Long getId();
    void setId(Long id);
    String getTag();
    void setTag(String tag);
    String getTypeID();
    void setTypeID(String typeID);
}

Now you can fetch that object and, after changing its state, flush it back to the database:

ChunkDto o = repository.findOne(123L);
o.addIdentity("my-tag", "my-type-id"); // addIdentity is declared on ChunkDto itself
repository.save(o);

And it will only flush back the new identity through an insert into the identity table, plus the association to the chunk through an insert into the join table, as you will see in the SQL. Blaze-Persistence Entity Views supports real dirty tracking, which allows it to flush updates without the need for selects, and it only flushes the state that really changed (i.e. like @DynamicUpdate).




Answer 2:


I ran some tests and noticed the following.

I iterate over the chunks and add new entities to them nearly every frame, while the update runs only once every minute; by then each chunk has many, many different new children and old, removed ones.

Even when I update/merge those on the main thread, Hibernate throws a duplicate-entry exception. I think this is related to the number of times we change those chunk children between updates. A child might get removed, added, removed, added and then stay; Hibernate tries to replay this behaviour and fails.

But I might be wrong: I added/removed different cascade settings and used merge instead of update, and they all had the same problem.

Solution

There's no real solution... a way to bypass the exception is to add custom @SQLInsert annotations that ignore the duplicate-key error (sketched below). It then works great on the main thread. It even seems to work with deep-cloned entities, even though only insert statements for the children ever appear, never any update or delete statements.
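
A sketch of what that could look like on the collection (my reconstruction, using MySQL's INSERT IGNORE in the spirit of the @SQLInsert already shown on Identity above; the column order must match the insert Hibernate generates for the join table, so verify it against your SQL logs):

// @SQLInsert on the collection overrides the join-table insert;
// "insert ignore" silently skips rows that would hit the duplicate key.
@OneToMany(fetch = FetchType.EAGER)
@JoinTable(name = "chunk_identity",
        joinColumns = @JoinColumn(name = "identity_id"),
        inverseJoinColumns = @JoinColumn(name = "id"))
@SQLInsert(sql = "insert ignore into chunk_identity(identity_id, id) values (?, ?)")
public Set<Identity> inChunk = new LinkedHashSet<>();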

Why? I think it works because the custom SQL query defines what should happen on a duplicate-key error: this way every parent inserts its children and simply overwrites the old values. Because each child belongs to exactly one parent, it works flawlessly here; it might cause issues in other kinds of relations.

This might also be solved by merging the updated deep-cloned object, or by replacing the original one with the updated deep-cloned object. Probably there's even some Hibernate persistence-context trick we missed here.



Source: https://stackoverflow.com/questions/64731215/copied-parent-inserts-childs-instead-of-updating-them
