I am new to the database connection capabilities of the dplyr package, but I am very interested in using it for an SQLite connection. I followed this tutorial and created an SQLite database.
In this newsgroup thread, Hadley explained the purpose of dplyr::copy_to(): it is intended to create temporary test tables. The email exchange ends by suggesting RMySQL::dbWriteTable() to append data to an existing table. The same applies to SQLite databases, as explained in the accepted answer above.
To append a data frame dtf that has the same column names as an existing database table, I used:
library(RMySQL)
DB <- dbConnect(MySQL(), user="username", host="localhost",
password="***", dbname="dbname")
dbWriteTable(DB, "tablename", dtf, append=TRUE, row.names = FALSE)
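For SQLite the same pattern works through RSQLite, which also implements the DBI dbWriteTable() generic with an append argument. A minimal self-contained sketch (the table and file names here are illustrative, not from the question):

```r
library(RSQLite)

# connect to an SQLite database (in-memory here so the sketch is self-contained)
con <- dbConnect(SQLite(), ":memory:")

# seed an existing table, then append a data frame with the same columns
dbWriteTable(con, "tablename", head(iris, 3))
dtf <- tail(iris, 3)
dbWriteTable(con, "tablename", dtf, append = TRUE, row.names = FALSE)

# the table now holds both chunks: 3 + 3 = 6 rows
dbGetQuery(con, "SELECT COUNT(*) AS n FROM tablename")

dbDisconnect(con)
```

For a file-backed database, replace ":memory:" with a path such as "my_db.sqlite3".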
The main reason I use dplyr to write to a database is that I do not want to switch between languages mid-code. For your question, I think the best solution is the db_insert_into() function from dplyr, as illustrated by StatSandwich.
You can perform SQL operations on a database/table created via dplyr, but you have to drop down to RSQLite/DBI calls and change how you create the database/table:
library(dplyr)
library(DBI)  # for dbSendQuery()
my_db <- src_sqlite("my_db.sqlite3", create = TRUE)
copy_to(my_db, iris, "my_table", temporary = FALSE)  # need to set temporary to FALSE
# grab the db connection from the object created by src_sqlite
# and issue the INSERT through it
res <- dbSendQuery(my_db$con,
  "INSERT INTO my_table VALUES (9.9, 9.9, 9.9, 9.9, 'new')")
dbClearResult(res)  # release the result set when done
No, you can do this all within dplyr.
library(dplyr)
my_db <- src_sqlite("my_db.sqlite3", create = TRUE)   # create src
copy_to(my_db, iris, "my_table", temporary = FALSE)   # create table
newdf <- iris                                         # create new data
db_insert_into(con = my_db$con, table = "my_table", values = newdf)  # insert into
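As a quick sanity check (a sketch that assumes the my_db src and my_table from the snippet above still exist in the session), you can read the table back through dplyr and confirm the insert landed:

```r
library(dplyr)

# point a dplyr tbl at the table created above
my_tbl <- tbl(my_db, "my_table")

# iris has 150 rows; after db_insert_into() the table should hold 300
my_tbl %>% summarise(n = n()) %>% collect()
```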