I have a SQL Server 2008 database instance on one machine. Now I want to copy this database to another machine. I use the script wizard inside SQL Server Management Studio to generate a script of the database (schema and data), but the resulting script is too big to open and run in Management Studio, so I run it with sqlcmd instead, and that fails with an error.
EDIT: Just noted from your comment that you're running sqlcmd -S server\database -i script.sql. There is a -I switch that stands for "Enable Quoted Identifiers"; try running the command with that switch.
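For example, reusing the exact server and file names from your comment (adjust them to your environment):

sqlcmd -S server\database -i script.sql -I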
Btw, to edit a large file, consider using a nice editor like Notepad++ or UltraEdit. I wouldn't use a workstation without them :)
I ended up on this question after trying to find a solution to a similar problem I had. I also needed to dump a DB (with data) via the Generate Scripts wizard, and the resulting file was too big to be executed from SSMS. So I tried sqlcmd, but ended up with the error:

Sqlcmd: Error: Syntax error at line 10 near command '"' in file 'script.sql'.

It turned out the cause of the issue was a record containing data with jQuery syntax in it: $(".someclass"). The problem is that $(...) is also the syntax sqlcmd uses to substitute scripting variables.

The solution is to disable variable substitution by adding the -x command line argument.
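For example (the server and instance names here are only placeholders):

sqlcmd -S myserver\myinstance -i script.sql -x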
In order to move data from one SQL Server to another (e.g. from a Production environment to a Test environment), it makes sense to use the "Generate Scripts" feature, which is available in the database options in SQL Server Management Studio. The result of this operation is a text file with SQL commands that can be executed on the other SQL Server. Usually these files are too big to execute in SQL Server Management Studio, so we need to use the sqlcmd command line utility from the SQL Server installation package. In most cases the utility works smoothly and no additional user action is necessary.
In some rare cases the sqlcmd utility can fail during the import and raise the following error: "Unclosed quotation mark after the character string ...", which indicates that one of the SQL queries has not been executed. This happens because sqlcmd uses stream processing, i.e. it reads a piece of data, processes it, reads the next piece, and so on. In some cases an input file can contain a huge SQL instruction whose size is bigger than the amount of data sqlcmd can process at a time, so sqlcmd tries to execute a truncated (broken) SQL statement and fails.
In order to fix this issue, two approaches can be used:

The sqlcmd utility accepts the "-a" parameter, which defines the maximum size of a packet (a piece of data) that will be used during processing. The maximum value is 32767 and the default is 4096, so it makes sense to always use this parameter with the maximum value:
sqlcmd -i input.sql -a 32767 -o import_log.txt
If the first approach didn't help and the issue still appears, there is another, more involved solution:
Go to the directory where the SQL file generated by SQL Server Management Studio is located. The steps below assume a Unix-like shell on Windows (for example Git Bash or Cygwin), since they use iconv and sed; that is also why Linux-style slashes "/" are used instead of the Windows-style "\":
cd d:/temp
Change the encoding of the SQL file from UTF-16LE to UTF-8, because sed cannot process UTF-16LE; this conversion is safe for the data. The result will be a new file that we will use in the next step:
iconv -f UTF-16LE -t UTF-8 input.sql > input_utf8.sql
Transform the new file so that each SQL query ends up in its own batch. The result will be a new file that we will use in the next step:
sed -e 's/^INSERT/GO\nINSERT/' input_utf8.sql > input_utf8_adapted.sql
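This substitution simply prepends a GO batch separator before every line that starts with INSERT. For instance, with a hypothetical table dbo.SomeTable, a fragment like

INSERT INTO dbo.SomeTable VALUES (1)
INSERT INTO dbo.SomeTable VALUES (2)

would become

GO
INSERT INTO dbo.SomeTable VALUES (1)
GO
INSERT INTO dbo.SomeTable VALUES (2)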
Now the file "input_utf8_adapted.sql" should be processed by sqlcmd without any issues, so we can execute the following:
sqlcmd -i input_utf8_adapted.sql -a 32767 -o import_log.txt
After execution is done, please check import_log.txt to make sure that no errors appeared.
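A quick way to scan the log in the same Unix-like shell, assuming the log file name used in the -o argument above:

grep -i error import_log.txt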
Not a direct answer to the question, but to sidestep this issue entirely you could use one of the following other methods of copying the database to the new location:

1. Back up the database on the source server and restore it on the destination server.
2. Detach the database, copy its data and log files to the new machine, and attach them there.

Method 1 is usually preferable as it keeps the source DB online, and detaching can cause information held in the master database about the source to be lost (e.g. full text enabled status).
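A minimal sketch of Method 1 using sqlcmd; the server names, database name, logical file names and paths are placeholders that will differ in your environment:

sqlcmd -S sourceserver -Q "BACKUP DATABASE MyDb TO DISK = 'D:\Backups\MyDb.bak'"

Copy MyDb.bak to the destination machine, then restore it there:

sqlcmd -S destserver -Q "RESTORE DATABASE MyDb FROM DISK = 'D:\Backups\MyDb.bak' WITH MOVE 'MyDb' TO 'D:\Data\MyDb.mdf', MOVE 'MyDb_log' TO 'D:\Data\MyDb_log.ldf'"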