Reading a huge volume of data from SQLite to SQL Server fails at pre-execute

南笙 · 2020-12-07 04:44

I have a huge (26 GB) SQLite database that I want to import into SQL Server with SSIS.

I have everything set up correctly. Some of the data flows work correctly, but others fail at the pre-execute phase.

1 Answer
  • 2020-12-07 05:38

    Step-by-step guide

    Since the error is thrown when reading from a large dataset, try reading the data in chunks. To achieve that, follow these steps:

    1. Declare two variables of type Int32: @[User::RowCount] and @[User::IncrementValue].
    2. Add an Execute SQL Task that executes a SELECT COUNT(*) command and stores the Result Set into the variable @[User::RowCount].
    3. Add a For Loop container that increments @[User::IncrementValue] on each iteration (a sketch of the loop expressions is given after the query expression below).
    4. Inside the For Loop container add a Data Flow Task.
    5. Inside the Data Flow Task add an ODBC Source and an OLE DB Destination.
    6. In the ODBC Source select the SQL Command option and write a SELECT * FROM TABLE query (to retrieve metadata only).
    7. Map the columns between source and destination.
    8. Go back to the Control Flow, click on the Data Flow Task and hit F4 to open the Properties window.
    9. In the Properties window go to Expressions and assign the following expression to the [ODBC Source].[SQLCommand] property (for more info refer to How to pass SSIS variables in ODBC SQLCommand expression?):

      "SELECT * FROM MYTABLE ORDER BY ID_COLUMN
      LIMIT 500000
      OFFSET " + (DT_WSTR,50)@[User::IncrementValue]"
      

    Where MYTABLE is the source table name and ID_COLUMN is your primary key or identity column.
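
    As a minimal sketch (assuming the hypothetical MYTABLE from the expression above and a 500,000-row chunk size to match its LIMIT), the Execute SQL Task in step 2 and the For Loop in step 3 could be configured along these lines:

      -- Execute SQL Task (step 2): Result Set "Single row" mapped to @[User::RowCount]
      SELECT COUNT(*) FROM MYTABLE;

      -- For Loop container (step 3) properties
      InitExpression:   @[User::IncrementValue] = 0
      EvalExpression:   @[User::IncrementValue] < @[User::RowCount]
      AssignExpression: @[User::IncrementValue] = @[User::IncrementValue] + 500000

    With this configuration the loop runs once per 500,000-row chunk, and each iteration's OFFSET skips the rows already loaded.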

    Control Flow Screenshot

    References

    • ODBC Source - SQL Server
    • How to pass SSIS variables in ODBC SQLCommand expression?
    • HOW TO USE SSIS ODBC SOURCE AND DIFFERENCE BETWEEN OLE DB AND ODBC?
    • SQLite Limit