Question
We have clients that run a particular piece of software with a large SQL Server database, and they use SSMS to manage it (we don't want to ask them to install any other software on their servers). We need to get a number of tables out of the server using SSMS so they can save them and send them to us on a USB drive. This has to be a relatively easy process; it will be an IT manager, not a DBA or programmer, doing the task.
I have experimented with the Generate Scripts option, but in my test we ended up with a 200GB .sql file. I would then need to edit its first line (the USE [database] statement) to point at a different database to copy back into; otherwise each client would have the same database name and would overwrite another client's data. Editing a line in a 200GB file is not an easy task, and after that I would still have to get the import to work.
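For reference, the line that would need editing at the top of the generated script looks something like this (the database name here is just a placeholder):
USE [ClientDatabaseName]
GO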
Another thought was to use Generate Scripts against a sample database to create the tables etc. on my end, then use the export functionality in SSMS to export the data to a CSV. However, the data will likely be anything but clean, and that could easily cause issues with a CSV. A flat file rather than a CSV might be better, but I am worried the client may get the encoding wrong (and I am not sure how much better messy data survives in a flat file compared to a CSV).
I also considered writing an SQL script of some description that outputs a file (or set of files), but it would have to be simple enough that the client can tell there is nothing suspect in the code, and it would still have the same issue of how to save the data without the possibility of corruption.
Any ideas? We are on Windows Server 2012 R2, and the data may be coming from various versions of SQL Server depending on how recently each company has updated.
Answer 1:
Until there is a better answer, I will just leave what we did here.
We have created a set of instructions and a script that will get the client to create a new database, then use the script to transfer the data over to the new database, and then back up this newly created database.
The script (query) effectively loops through the tables and builds the SQL with:
SET @sql = 'SELECT * INTO [' + @toDB + '].' + @currTable + ' FROM [' + @fromDB + '].' + @currTable
which takes the current table name (@currTable) and copies it from their main database (@fromDB) into the newly created database (@toDB).
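For example, with the variable values used in the script below and the first table in the list, the generated statement comes out as:
SELECT * INTO [main_database_export].[dbo].[table1] FROM [main_database].[dbo].[table1]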
This is not ideal, but for now it seems to be the simplest option for large amounts of data. What would be great is if there were an option, when taking a backup, to choose which tables to include.
For reference if others need to do something like this, here is the script:
--before you run this script, check that the 2 variables at the top are set correctly
--the @toDB variable should be a database you have just created to temporarily store exported data
DECLARE @fromDB VARCHAR(max) = 'main_database' --this should be set to the name of the database you are copying from
DECLARE @toDB VARCHAR(max) = 'main_database_export' --this should be set to the name of the database you are copying to (the temporary one)
/* ------------------------------------------
---------Do not edit from here down---------
------------------------------------------- */
--declare variables to be used in different parts of the script
DECLARE @sql VARCHAR(max)
DECLARE @currPos INT = 1
DECLARE @currTable VARCHAR(max)
DECLARE @tableNames TABLE(id INT, name varchar(max))
--create a list of the tables that we want to copy to the new database (the id must be sequential and start at 1)
INSERT INTO @tableNames VALUES
(1, '[dbo].[table1]'),
(2, '[dbo].[table2]'),
(3, '[dbo].[table3]'),
(4, '[dbo].[table4]')
DECLARE @totalTables INT = 4 --this should always be the number of the last table to be copied, if you add more or take any away, update this
--loop through the tables and copy them across
WHILE (@currPos <= @totalTables)
BEGIN
--get the table name of the table we are up to
SELECT @currTable = name FROM @tableNames WHERE id = @currPos
--create the sql that will copy from the old table into the new table (including the table structure), this table must not exist yet
SET @sql = 'SELECT * INTO [' + @toDB + '].' + @currTable + ' FROM [' + @fromDB + '].' + @currTable
--run the sql statement we just created, this will create the table and copy the content (and leave a message to say how many rows were copied)
EXECUTE (@sql)
--set the counter up one so we move onto the next table
SET @currPos = @currPos+1
--output the name of the table that was just processed (note that no messages will show until the entire script finishes)
PRINT @currTable + ' Copied.'
END
Note that this script is designed to be given to the client; the "Do not edit from here down" comment is an instruction for them (you will still need to edit the table names being copied and the variable holding the total number of tables before sending it).
We then send this with a set of instructions on how to create the new database, run this script, and then back up the newly created database.
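For anyone doing something similar, the surrounding steps are ordinary T-SQL. Here is a minimal sketch, assuming the database name from the script above and example file paths and a client-specific restored name (all of which you would adjust):
-- on the client: create the empty export database before running the copy script
CREATE DATABASE [main_database_export]
GO
-- on the client: back up the export database once the copy script has finished
BACKUP DATABASE [main_database_export]
TO DISK = N'D:\Backups\main_database_export.bak' -- example path, any location with enough space
WITH INIT
GO
-- on our server: restore the backup under a client-specific name to avoid name collisions
-- (the logical file names below assume the defaults SQL Server picks for a database called main_database_export)
RESTORE DATABASE [client_a_export]
FROM DISK = N'D:\Backups\main_database_export.bak'
WITH MOVE 'main_database_export' TO N'D:\Data\client_a_export.mdf',
     MOVE 'main_database_export_log' TO N'D:\Data\client_a_export_log.ldf'
GO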
Source: https://stackoverflow.com/questions/59781099/export-tables-in-ssms-and-import-them-onto-a-different-server