How can we bulk insert data into SQL Server from the RODBC package without creating a text file?


Question


This question is an extension of How to quickly export data from R to SQL Server. Currently I am using the following code:

 # DB Handle  for config file #
   dbhandle <- odbcDriverConnect()

 # save the data in the table finally
   sqlSave(dbhandle, bp, "FACT_OP", append=TRUE, rownames=FALSE, verbose = verbose, fast = TRUE)
 # varTypes <-  c(Date="datetime", QueryDate = "datetime")
 # sqlSave(dbhandle, bp, "FACT_OP",  rownames=FALSE,verbose = TRUE, fast = TRUE, varTypes=varTypes)

 # DB handle close
  odbcClose(dbhandle)

I have also tried the approach below, which works beautifully, and I have gained significant speed as well.

 toSQL = data.frame(...);
 write.table(toSQL,"C:\\export\\filename.txt",quote=FALSE,sep=",",row.names=FALSE,col.names=FALSE,append=FALSE);
sqlQuery(channel,"BULK
            INSERT Yada.dbo.yada
            FROM '\\\\<server-that-SQL-server-can-see>\\export\\filename.txt'
            WITH
            (
            FIELDTERMINATOR = ',',
            ROWTERMINATOR = '\\n'
            )");

But my issue is that I cannot keep my data at rest between transactions (writing data to a file is not an option because of data security), so I am looking for a solution that bulk inserts directly from memory or from a cache. Thanks for the help.


Answer 1:


Good question - also useful in instances where the BULK INSERT permissions cannot be set up for whatever reason.

I threw together this poor man's solution a while back when I had enough data that sqlSave was too slow, but not enough to justify setting up BULK INSERT, so no data needs to be written to a file. The primary reason that sqlSave and parameterized queries are so slow for inserting data is that each row is inserted with a new INSERT statement. Having R write the INSERT statement manually bypasses this, as in my example below:

library(RODBC)
channel <- ...
dataTable <- ...relevant data...
numberOfThousands <- floor(nrow(dataTable)/1000)
extra <- nrow(dataTable)%%1000

thousandInsertQuery <- function(channel,dat,range){
  sqlQuery(channel,paste0("INSERT INTO Database.dbo.Responses (IDNum,State,Answer)
                                  VALUES "
                          ,paste0(
                            sapply(range,function(k) {
                              paste0("(",dat$IDNum[k],",'",
                                     dat$State[k],"','",
                                     gsub("'","''",dat$Answer[k],fixed=TRUE),"')")
                            })                                         
                            ,collapse=",")))
}

if(numberOfThousands)
  for(n in 1:numberOfThousands)
  {
    thousandInsertQuery(channel,dataTable,(1000*(n-1)+1):(1000*n))
  }
if(extra)
  thousandInsertQuery(channel,dataTable,(1000*numberOfThousands+1):(1000*numberOfThousands+extra))

SQL Server's INSERT statement with an explicit VALUES list only accepts up to 1,000 rows at a time, so this code breaks the insert up into chunks of 1,000 (much more efficient than one row at a time).

The thousandInsertQuery function will obviously have to be customized to handle whatever columns your data frame has - note also that there are single quotes around the character/factor columns and a gsub to handle any single quotes that might be in the character columns. Other than this, there are no safeguards against SQL injection attacks.
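For illustration only (the IDNum/State/Answer columns are the example schema from the code above; the data values are made up), here is roughly what the VALUES list built by that function looks like for a two-row chunk, including the doubled single quote produced by gsub:

# Hypothetical two-row data frame matching the example schema above
dat <- data.frame(IDNum = 1:2,
                  State = c("OH", "TX"),
                  Answer = c("Yes", "Don't know"),
                  stringsAsFactors = FALSE)

# Build the VALUES list the same way thousandInsertQuery does
paste0(sapply(1:2, function(k) {
  paste0("(", dat$IDNum[k], ",'",
         dat$State[k], "','",
         gsub("'", "''", dat$Answer[k], fixed = TRUE), "')")
}), collapse = ",")
# [1] "(1,'OH','Yes'),(2,'TX','Don''t know')"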




Answer 2:


Building on @jpd527's solution, which I found really worth digging into...

require(RODBC)
channel <- #connection parameters
dbPath <- # path to your table, database.table
data <- # the DF you have prepared for insertion, /!\ beware of column names and value types...

# Function to insert 1000 rows of data in one sqlQuery call, coming from
# any DF and into any database.table

insert1000Rows <- function(channel, dbPath, data, range){

    # Defines columns names for the database.table
    columns <- paste(names(data), collapse = ", ")

    # Initialize a string which will incorporate all 1000 rows of values
    values <- ""

    # Not very elegant, but appropriately builds the values (a, b, c...), (d, e, f...) into a string
    for (i in range) {
        for (j in 1:ncol(data)) {

            # First column
            if (j == 1) {

                if (i == min(range)) {
                    # First row, only "("
                    values <- paste0(values, "(")
                } else {
                    # Next rows, ",("
                    values <- paste0(values, ",(")
                }
            }

            # Value Handling
            values <- paste0(
                values

                # Handling NA values you want to insert as NULL values
                , ifelse(is.na(data[i, j])
                    , "null"

                    # Handling numeric values you want to insert as INT
                    , ifelse(is.numeric(data[i, j])
                        , data[i, j]

                        # Else handling as character to insert as VARCHAR
                        , paste0("'", data[i, j], "'")
                    )
                )
            )

            # Separator for columns
            if (j == ncol(data)) {

                # Last column, close parenthesis
                values <- paste0(values, ")")
            } else {

                # Other columns, add comma
                values <- paste0(values, ",")
            }
        }
    }

    # Once the string is built, insert it into SQL Server
    sqlQuery(channel,paste0("insert into ", dbPath, " (", columns, ") values ", values))
}

This insert1000Rows function is used in a loop in the next function, sqlInsertAll, for which you simply define which DF you want to insert into which database.table.

# Main function which uses the insert1000rows function in a loop
sqlInsertAll <- function(channel, dbPath, data) {
    numberOfThousands <- floor(nrow(data) / 1000)
    extra <- nrow(data) %% 1000
    if (numberOfThousands) {
        for(n in 1:numberOfThousands) {
            insert1000Rows(channel, dbPath, data, (1000 * (n - 1) + 1):(1000 * n))
            print(paste0(n, "/", numberOfThousands))
        }
    }
    if (extra) {
        insert1000Rows(channel, dbPath, data, (1000 * numberOfThousands + 1):(1000 * numberOfThousands + extra))
    }
}

With this, I am able to insert 250k rows of data in 5 minutes or so, whereas it took more than 24 hours using sqlSave from the RODBC package.
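As a usage sketch (the connection string, table path, and data frame names below are placeholders, not part of the original answer), the whole thing then boils down to a single call:

# Hypothetical usage: connection string, table path, and data frame are placeholders
require(RODBC)
channel <- odbcDriverConnect("driver={SQL Server};server=...;database=...;trusted_connection=true")
sqlInsertAll(channel, "myDatabase.dbo.myTable", myDataFrame)
odbcClose(channel)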




Answer 3:


What about using the DBI::dbWriteTable() function? An example is below (I am connecting my R code to an AWS RDS instance of MS SQL Express):

library(DBI)
library(RJDBC)
library(tidyverse)

# Specify where your driver lives
drv <- JDBC(
  "com.microsoft.sqlserver.jdbc.SQLServerDriver",
  "c:/R/SQL/sqljdbc42.jar") 

# Connect to AWS RDS instance
conn <- drv %>%
  dbConnect(
    host = "jdbc:sqlserver://xxx.ccgqenhjdi18.ap-southeast-2.rds.amazonaws.com",
    user = "xxx",
    password = "********",
    port = 1433,
    dbname= "qlik")

if(0) { # check what the conn object has access to
  queryResults <- conn %>%
    dbGetQuery("select * from information_schema.tables")
}

# Create test data
example_data <- data.frame(animal=c("dog", "cat", "sea cucumber", "sea urchin"),
                           feel=c("furry", "furry", "squishy", "spiny"),
                           weight=c(45, 8, 1.1, 0.8))
# Works in 20ms in my case
system.time(
  conn %>% dbWriteTable(
    "qlik.export.test",
    example_data
  )
)

# Let us see if we see the exported results
conn %>% dbGetQuery("select * FROM qlik.export.test")

# Let's clean the mess and force-close connection at the end of the process
conn %>% dbDisconnect()

It works pretty fast for small amounts of data transferred and seems rather elegant if you want a data.frame -> SQL table solution.
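If the target table already exists and you only want to add rows, dbWriteTable also takes append/overwrite arguments in most DBI backends; I have not verified the exact behaviour of the RJDBC method used here, so treat this as a sketch reusing the conn object from above:

# Hypothetical follow-up: append a few more rows to the table created above
more_data <- data.frame(animal = "starfish",
                        feel   = "leathery",
                        weight = 0.3)
conn %>% dbWriteTable("qlik.export.test", more_data, append = TRUE)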

Enjoy!



Source: https://stackoverflow.com/questions/37688685/how-can-we-bulk-insert-data-in-sqlserver-without-creating-a-text-file-from-rodbc
