Sybase JConnect: ENABLE_BULK_LOAD usage

Asked by 野的像风 on 2020-12-16 08:10

Can anyone out there provide an example of bulk inserts via JConnect (with ENABLE_BULK_LOAD) to Sybase ASE?

I've scoured the internet and found nothing.

5 Answers
  • 2020-12-16 08:25

    Support for Batch Updates

    Batch updates allow a Statement object to submit multiple update commands as one unit (batch) to an underlying database for processing together.

    Note: To use batch updates, you must refresh the SQL scripts in the sp directory under your jConnect installation directory.

    See BatchUpdates.java in the sample (jConnect 4.x) and sample2 (jConnect 5.x) subdirectories for an example of using batch updates with Statement, PreparedStatement, and CallableStatement. jConnect also supports dynamic PreparedStatements in batch.
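
    As a rough illustration of what those samples do, here is a minimal sketch of a JDBC batch update. The connection URL, credentials, and table name t are placeholders, not taken from the docs above:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class StatementBatchExample {
        public static void main(String[] args) throws Exception {
            // Placeholder host/port/db/credentials -- substitute your own.
            Connection conn = DriverManager.getConnection(
                    "jdbc:sybase:Tds:localhost:5000/mydb", "user", "password");
            conn.setAutoCommit(false); // run the whole batch in one transaction

            Statement stmt = conn.createStatement();
            // Queue several commands; nothing is sent until executeBatch().
            stmt.addBatch("insert into t (id, name) values (1, 'a')");
            stmt.addBatch("insert into t (id, name) values (2, 'b')");
            stmt.addBatch("update t set name = 'c' where id = 1");

            int[] counts = stmt.executeBatch(); // submit the batch as one unit
            conn.commit();
            stmt.close();
            conn.close();
        }
    }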

    Reference:

    http://download.sybase.com/pdfdocs/jcg0420e/prjdbc.pdf

    http://manuals.sybase.com/onlinebooks/group-jcarc/jcg0520e/prjdbc/@ebt-link;hf=0;pt=7694?target=%25N%14_4440_START_RESTART_N%25#X


    Other Batch Update Resources

    http://java.sun.com/j2se/1.3/docs/guide/jdbc/spec2/jdbc2.1.frame6.html

    http://www.jguru.com/faq/view.jsp?EID=5079

  • 2020-12-16 08:36

    I don't know how to do this in Java directly, but you can bulk-load text files with the LOAD TABLE SQL statement. We did it with Sybase ASA over jConnect.
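
    For example, something along these lines should work. This is a sketch only: the table name, file path, and delimiter are placeholders, the file must be readable by the database server (not the client), and the available LOAD TABLE options vary by ASA version:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class LoadTableExample {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details for an ASA server.
            Connection conn = DriverManager.getConnection(
                    "jdbc:sybase:Tds:localhost:2638/mydb", "user", "password");
            Statement stmt = conn.createStatement();
            // LOAD TABLE reads a server-side file and bypasses row-by-row inserts.
            stmt.execute(
                    "LOAD TABLE my_table FROM '/data/my_table.csv' DELIMITED BY ','");
            stmt.close();
            conn.close();
        }
    }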

  • 2020-12-16 08:41

    To get the sample provided by Chris Kannon working, do not forget to disable auto-commit mode first:

    dbConn.setAutoCommit(false);
    

    And place the following line before dbConn.commit():

    pstmt.executeBatch();
    

    Otherwise this technique will only slow down the insertion.
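
    Putting it together, the critical ordering looks like this (a sketch reusing dbConn, pstmt, and valuesToInsert from Chris Kannon's sample; exception handling omitted):

    dbConn.setAutoCommit(false);      // 1. disable auto-commit before batching

    for (String[] val : valuesToInsert) {
        pstmt.setString(1, val[0]);
        pstmt.setString(2, val[1]);
        pstmt.setString(3, val[2]);
        pstmt.addBatch();             // 2. queue the row locally
    }

    pstmt.executeBatch();             // 3. send the whole batch in one shot
    dbConn.commit();                  // 4. commit only after the batch executes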

  • 2020-12-16 08:44

    I got in touch with one of the engineers at Sybase and they provided me with a code sample. So I get to answer my own question.

    Basically, here is a rundown, as the full code sample is pretty large... This assumes a lot of pre-initialized variables, but otherwise it would be a few hundred lines. Anyone interested should get the idea. This can yield up to 22K insertions a second in a perfect world (as per Sybase, anyway).

    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    import com.sybase.jdbcx.SybDriver;

    SybDriver sybDriver = (SybDriver) Class.forName("com.sybase.jdbc3.jdbc.SybDriver").newInstance();
    sybDriver.setVersion(com.sybase.jdbcx.SybDriver.VERSION_6);
    DriverManager.registerDriver(sybDriver);
    
    // DB props (after including normal login/password, etc.)
    props.put("ENABLE_BULK_LOAD","true");
    
    // open connection here for sybDriver, passing props
    
    dbConn.setAutoCommit(false);    
    String SQLString = "insert into batch_inserts (row_id, colname1, colname2)\n values (?,?,?) \n";
    
    PreparedStatement   pstmt;
    try
    {
       pstmt = dbConn.prepareStatement(SQLString);      
    }
    catch (SQLException sqle)
    {
       displaySQLEx("Couldn't prepare statement",sqle);
       return;
    }
    
    for (String[] val : valuesToInsert)
    {
       pstmt.setString(1, val[0]);  // row_id   varchar(30)
       pstmt.setString(2, val[1]);  // colname1 varchar(30)
       pstmt.setString(3, val[2]);  // colname2 varchar(30)
    
       try
       {
          pstmt.addBatch();
       }
       catch (SQLException sqle)
       {
          displaySQLEx("Failed to build batch",sqle);
          break;
       }
    }
    
    try {
       pstmt.executeBatch();
       dbConn.commit();
       pstmt.close();
    } catch (SQLException sqle) {
       //handle
    }
    
    try {
       if (dbConn != null)
          dbConn.close();
    } catch (Exception e) {
       //handle
    }
    
  • 2020-12-16 08:49

    After following most of the advice here, we didn't see any improvement over simply creating a massive string and sending it across in batches of ~100-1000 rows with a surrounding transaction. We got around: Big String Method [5000 rows in 500 batches]: 1716 ms = ~2914 rows per second (this is shit!).

    Our db is sitting on a virtual host with one CPU (i7 underneath) and the table schema is:

    CREATE TABLE archive_account_transactions
    (
        account_transaction_id INT,
        entered_by             INT,
        account_id             INT,
        transaction_type_id    INT,
        DATE                   DATETIME,
        product_id             INT,
        amount                 FLOAT,
        contract_id            INT NULL,
        note                   CHAR(255) NULL
    )
    

    with four indexes, on account_transaction_id (pk), account_id, DATE, and contract_id.

    Just thought I would post a few comments. First, we're connecting using:

    jdbc:sybase:Tds:40.1.1.2:5000/ikp?EnableBatchWorkaround=true;ENABLE_BULK_LOAD=true   
    

    We did also try the .addBatch syntax described above, but it was marginally slower than just using a Java StringBuilder to build the batch SQL manually and pushing it across in one execute statement (see the sketch below). Removing the column names from the insert statement gave us a surprisingly large performance boost; it seemed to be the only thing that actually affected performance. The ENABLE_BULK_LOAD parameter didn't seem to affect it at all, and neither did EnableBatchWorkaround; we also tried DYNAMIC_PREPARE=false, which sounded promising but didn't seem to do anything either.
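
    For reference, here is a rough sketch of the "big string" method described above. It assumes a java.sql.Statement stmt is in scope and that each element of preformattedRows (a hypothetical name) is already an escaped SQL values list covering every column of the table:

    // Build batches of plain INSERTs with StringBuilder and send each batch
    // in a single execute() -- one round trip per batch.
    final int BATCH_SIZE = 500;
    StringBuilder sql = new StringBuilder("begin tran\n");
    int queued = 0;
    for (String valuesList : preformattedRows) {
        // Column names omitted on purpose; that was the one change that
        // measurably helped in our tests.
        sql.append("insert archive_account_transactions values (")
           .append(valuesList).append(")\n");
        if (++queued == BATCH_SIZE) {
            sql.append("commit tran\n");
            stmt.execute(sql.toString());
            sql.setLength(0);
            sql.append("begin tran\n");
            queued = 0;
        }
    }
    if (queued > 0) {                 // flush the final partial batch
        sql.append("commit tran\n");
        stmt.execute(sql.toString());
    }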

    Any help getting these parameters actually functioning would be great! In other words, are there any tests we could run to verify that they are in effect? I'm still convinced that this performance isn't close to pushing the boundaries of Sybase, as MySQL out of the box does more like 16,000 rows per second using the same "big string" method with the same schema.

    Cheers, Rod
