Question
I have a database table on a development server that is now fully populated, after I left it running with an import routine for a CSV file containing 1.4 million rows.
I ran the Database Publishing Wizard on the table, and now I have a 286 MB SQL script on my local machine. The problem is that I can't figure out how to run it. If I load it into SQL Server Management Studio Express, I get an alert window that says "The operation could not be completed".
Any ideas on how I can get this SQL script to run?
Answer 1:
Running something that large inside a single transaction is not a good idea. Therefore, I'd recommend breaking up the file into smaller, more manageable chunks.
Another option is to look at some of the other ways to import CSV data directly.
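One such option, as a minimal sketch, is a BULK INSERT statement that loads the CSV straight into the table; the table name, file path, and the header-row assumption below are placeholders:

BULK INSERT dbo.MyTable                    -- hypothetical target table
FROM 'C:\Temp\MyData.csv'                  -- hypothetical path, readable by the SQL Server service
WITH (
    FIELDTERMINATOR = ',',                 -- column delimiter
    ROWTERMINATOR = '\n',                  -- row delimiter
    FIRSTROW = 2                           -- skip a header row, if the file has one
);

Keep in mind that the path is opened by the SQL Server service itself, so the file has to be on (or reachable from) the server, not just your local machine.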
Answer 2:
Use the sqlcmd tool to execute the file:
sqlcmd -S myServer\instanceName -i C:\myScript.sql
If you get an unexplained "script error" with large SQL files (> 100 MB) that contain many INSERT statements, add a GO batch separator (on its own line) before each INSERT INTO in the file; this keeps each batch, and therefore each transaction, small.
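As a rough illustration (the table and values here are made up), the edited script ends up looking like this, with GO alone on its line so that sqlcmd and SSMS treat it as a batch separator:

GO
INSERT INTO dbo.MyTable (Id, Name) VALUES (1, 'first row');
GO
INSERT INTO dbo.MyTable (Id, Name) VALUES (2, 'second row');
GO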
Answer 3:
This tool on CodePlex, Big SQL Script File Runner, will run a script file of any size, with logging and a GUI.
Answer 4:
Adding to Gulzar Nazim's answer:
If you still get a failure, try specifying the codepage of your SQL file with the -f option:
sqlcmd -S myServer\instanceName -d databaseName -i C:\myScript.sql -f 65001
I was trying to import a .dump file from SQLite (UTF-8 by default), and sqlcmd kept throwing an error after it hit the first special character. -f 65001 (the UTF-8 code page) fixed it for me.
Answer 5:
Why not just use DTS to import the CSV file directly?
Answer 6:
Yes, this can be done. I used the BCP (Bulk Copy Program) approach to avoid the out-of-memory issue.
Note: tested on SQL Server 2014.
With BCP, you first export the data from the source database to a .bcp file (in a local folder) and then import that .bcp file into the destination database.
The steps are straightforward:
Note:
a) Make sure the empty table already exists in the destination database.
b) Make sure a Temp folder exists on the C drive.
1) Create a bat file named Export_Data.bat with the command below:
bcp.exe [Source_DataBase_Name].[dbo].[TableName] OUT "C:\Temp\TableName.bcp" -S "Computer Name" -U "SQL Server UserName" -P "SQL Server Password" -n -q
pause
2) Run that bat file; a .bcp file will be generated in the Temp folder.
3) Then create another bat file named Import_Data.bat with the command below:
bcp.exe [Destination_DataBase_Name].[dbo].[TableName] IN "C:\Temp\TableName.bcp" -S "Computer Name" -U "SQL Server UserName" -P "SQL Server Password" -n -q
Pause
And here we go!!
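One side note, in case it helps: if you connect with Windows authentication rather than a SQL login, the same commands should work with -T (trusted connection) in place of -U and -P, e.g. (placeholders as above):

bcp.exe [Source_DataBase_Name].[dbo].[TableName] OUT "C:\Temp\TableName.bcp" -S "Computer Name" -T -n -q

After the import, a quick SELECT COUNT(*) against the destination table is an easy way to confirm that all 1.4 million rows made it across.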
Source: https://stackoverflow.com/questions/222442/sql-server-running-large-script-files