I have a large .sql file, created as a backup from a MySQL database (containing several tables), and I would like to search for elements within it from R.
Ideally, there would be a way to query its tables directly from R, without having to restore the whole database first.
Would this return what you seek? (Which, I think you will admit on review, is not yet particularly well described.)
require(RMySQL)
drv <- dbDriver("MySQL")
# connection details (dbname, user, password, host) will usually be needed;
# with no arguments, RMySQL typically falls back to the defaults in your
# MySQL option file (~/.my.cnf)
con <- dbConnect(drv)
dbListTables(con)
# Or
names(dbGetInfo(drv))
If these are just source code, then all you would need is readLines().
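A minimal sketch of that approach, treating the dump as plain text (the file name backup.sql is an assumption):

# read the dump as plain text and search it
lines <- readLines("backup.sql")
grep("CREATE TABLE", lines, value = TRUE)  # list the table definitions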
If you are looking for an R engine that can take SQL code and produce useful results, then the sqldf package may provide some help. It parses SQL code embedded in quoted strings and applies it either to data frame objects in memory or to disk-resident tables (or both). Its default driver for disk files is SQLite, but other drivers can be used.
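For instance, a small sketch of the in-memory case (the data frame here is invented for illustration):

library(sqldf)
# sqldf matches table names in the SQL to data frames in the workspace
df <- data.frame(name = c("a", "b", "c"), value = c(1, 2, 3))
sqldf("SELECT name, value FROM df WHERE value > 1")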
No can do, boss. For R to interpret your MySQL dump file, it would have to do a large part of what the DBMS itself does. That's a tall order, and infeasible in the general case.
My workaround so far (I am also a newbie with databases) is to export the database as .csv files from phpMyAdmin (you need to tick "Export tables as separate files" under the "Custom" export method), and then use read_csv() on the tables I want to work with.
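Something like this, assuming the export produced one file per table (the file name is invented):

library(readr)
# each exported table lands in its own .csv file
customers <- read_csv("customers.csv")
head(customers)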
It is not ideal, because I would like to export the database, work on it locally in R (writing functions that will also work when run against the online database), and only access the real database later, once I have done all my testing. But from the answers here, it seems the .sql export would not help with that anyway (?), and I would need to recreate the database locally...
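If you do go that route, the pattern would look roughly like this. It assumes a local MySQL server into which the dump has already been restored (e.g. with mysql -u root -p mydb < backup.sql); the database name and credentials are placeholders:

library(RMySQL)
# connect to the locally restored copy of the database
con <- dbConnect(MySQL(), dbname = "mydb", user = "root", password = "secret")
dbListTables(con)                                      # check the tables came through
dbGetQuery(con, "SELECT * FROM some_table LIMIT 10")   # query it as you would the online db
dbDisconnect(con)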