UnableToExecuteStatementException: Batch entry was aborted. Call getNextException to see the cause

被撕碎了的回忆 2021-01-15 18:36

Using @SqlBatch to batch update the DB

@SqlBatch("")
@BatchChunkSize(INSERT_BATCH_SIZE)
void insert(@BindBean Iterator<         
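
For reference, a complete JDBI SQL Object batch-insert method in this style usually looks something like the sketch below; the SQL, table, bean type, and chunk size are illustrative assumptions rather than the original code (JDBI 2 package names shown; they differ in JDBI 3):

    import org.skife.jdbi.v2.sqlobject.BindBean;
    import org.skife.jdbi.v2.sqlobject.SqlBatch;
    import org.skife.jdbi.v2.sqlobject.customizers.BatchChunkSize;

    import java.util.Iterator;

    public interface MyBatchDao {
        int INSERT_BATCH_SIZE = 1000;

        // Each bean pulled from the iterator is bound to :id / :name; the statements
        // are executed in chunks of at most INSERT_BATCH_SIZE per batch.
        @SqlBatch("INSERT INTO my_table (id, name) VALUES (:id, :name)")
        @BatchChunkSize(INSERT_BATCH_SIZE)
        void insert(@BindBean Iterator<MyBean> beans);

        // Hypothetical value bean; @BindBean binds :id and :name from its getters.
        class MyBean {
            private final long id;
            private final String name;
            public MyBean(long id, String name) { this.id = id; this.name = name; }
            public long getId() { return id; }
            public String getName() { return name; }
        }
    }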


        
2 Answers
  • 2021-01-15 19:26

    I have just seen something similar with Hibernate and Postgres. As I understand it, the problem is in the PostgreSQL JDBC driver (org.postgresql.core.v3.QueryExecutorImpl).

    What I saw was:

    java.sql.BatchUpdateException: Batch entry 0 INSERT INTO myscheme.table_name (list,of,column,names,...) VALUES (9007199314139196, 'F', 27, 625, 625, 625, 625, 625, 28), (9007199314139198, 'T', 2395, 2369, 2369, 2369, 2369, 2369, 2389), ... was aborted.  Call getNextException to see the cause.
    

    Uninformative.

    I caught and printed the exception,

        } catch (JDBCConnectionException t) {
            System.out.println("================ {{{");
            // Walk the chained SQLExceptions: the real cause of a batch failure
            // is exposed through getNextException().
            SQLException current = t.getSQLException();
            while (current != null) {
                current.printStackTrace();
                current = current.getNextException();
            }
            System.out.println("================ }}}");
            throw t;
        }
    

    and got:

    Caused by: java.io.IOException: Tried to send an out-of-range integer as a 2-byte value: 33300
    

    Still confusing. 33300 is a very strange number. What is important is that it is a multiple of the number of columns: with the 9 columns in the INSERT above, 33300 bound values corresponds to 3700 rows in a single statement.

    The exception happens

    at org.postgresql.core.v3.QueryExecutorImpl.sendParse(QueryExecutorImpl.java:1329)
    

    which is

    pgStream.SendInteger2(params.getParameterCount()); // # of parameter types specified
    

    What does that mean?

    The total number of bound values, that is, the number of columns multiplied by the number of rows, must not exceed 32767 for a single INSERT statement, because the driver sends the parameter count as a 2-byte value.

    You can divide 32767 by the number of columns to get the maximum number of rows for a single SQL INSERT statement.
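
    For example, here is a minimal sketch of sizing the chunks so that a single multi-row INSERT never exceeds that limit; the column count, the element type R, and the insertChunk() helper are illustrative assumptions, not part of the original code:

        import java.util.List;

        public final class BatchSizer {

            // The driver writes the bind-parameter count as a 2-byte value,
            // so one statement can carry at most 32767 parameters.
            private static final int MAX_PARAMS_PER_STATEMENT = 32767;

            static <R> void insertAll(List<R> rows, int columnCount) {
                int maxRowsPerInsert = MAX_PARAMS_PER_STATEMENT / columnCount; // e.g. 3640 for 9 columns
                for (int from = 0; from < rows.size(); from += maxRowsPerInsert) {
                    int to = Math.min(from + maxRowsPerInsert, rows.size());
                    insertChunk(rows.subList(from, to)); // one INSERT ... VALUES (...),(...) per call
                }
            }

            static <R> void insertChunk(List<R> chunk) {
                // build and execute one multi-row INSERT for this chunk (omitted)
            }
        }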

  • 2021-01-15 19:31

    I found a similar issue importing from Aurora (MySQL) to Redshift using a DataPipeline CopyActivity. I was able to solve it by casting the incoming data to the proper target table types in the insertQuery like this:

    INSERT INTO my_table (id, bigint_col, timestamp_col) VALUES (?,cast(? as bigint),cast(? as timestamp))
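
    The same cast-in-the-INSERT pattern should also work from plain JDBC. The sketch below is only an illustration with assumed table, column, and value names; the point is that the values can be sent as text and the explicit casts make the target database convert them:

        import java.sql.Connection;
        import java.sql.PreparedStatement;
        import java.sql.SQLException;

        public final class CastingInsert {
            static void insertOne(Connection connection) throws SQLException {
                String sql = "INSERT INTO my_table (id, bigint_col, timestamp_col) "
                           + "VALUES (?, cast(? as bigint), cast(? as timestamp))";
                try (PreparedStatement ps = connection.prepareStatement(sql)) {
                    ps.setInt(1, 1);
                    // Values arrive as strings; the casts convert them on the server side.
                    ps.setString(2, "9007199314139196");
                    ps.setString(3, "2021-01-15 19:31:00");
                    ps.executeUpdate();
                }
            }
        }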
    