Question
What I want to do is build a dynamic data pull from different SQL source servers (Server1, Server2, Server3, etc.) down to dynamic locations on my SQL Server instances (Dev, Prod) and into databases (database1, database2, etc.).
The tables will be dropped and recreated each time the package runs, so that if the source changes anything (field names, data types, lengths, etc.) I stay in sync with the source servers and the extract still works. I want to pull this down using a single data flow inside a Foreach Loop.
I have a metadata table that holds all the server, database, and table names, and I want to loop through it and pull every listed table down to my server (server1.database1.table_x, server5.database3.table_y, etc.) so that I don't have to build a new data flow for each table.
To do this I have already built the Foreach Loop with an Execute SQL Task that dumps its result set into an Object variable. The Foreach Loop then iterates over that object, which has 7 different fields (Source_Server_Name, Source_Server_Type_Driver, Source_Database, Source_Table, Source_Where_Clause, Source_Connection_String, plus the destination fields), and maps each field into a separate String variable for use inside the loop.
I can change the connections dynamically using the variables, but I can't figure out how to get the column mapping in the data flow to work.
Is there some kind of Script Task I can use to edit the backend XML that will create the column mapping for me so the metadata doesn't error out? Any help would be greatly appreciated :-)
This is the best illustrated example I could find of what I am doing; just remember that I need a different metadata setup for each table I pull down to my server.
http://sql-bi-dev.blogspot.com/2010/07/dynamic-database-connection-using-ssis.html
Answer 1:
The solution I ended up using is BIML, which generates the packages on the fly using dynamic SQL and BIML. Not pretty, but it works :-)
Answer 2:
I have heard that it is possible to dynamically generate and publish the packages, but I would never go that route. I have done something similar using C# code, which can be run from an application, via SQL Agent, or from inside an SSIS package Script Task.
If you try this approach, look into SqlConnection and SqlCommand, and write code that builds the SQL statements dynamically.
For example, issue CREATE TABLE statements with ExecuteNonQuery(), open a SqlDataReader on the source, and pass that reader to SqlBulkCopy to write to the destination.
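As a rough sketch of that approach (not from the original answer): the connection strings, the table name in Main, and the very simplified BuildCreateTableScript helper below are assumptions for illustration. The helper only copies column names, base types, and character lengths from INFORMATION_SCHEMA.COLUMNS; a real version would also handle precision/scale, nullability, schemas, keys, and quoting.

```csharp
using System.Data.SqlClient;
using System.Text;

class DynamicTableCopy
{
    // Hypothetical helper: builds a bare-bones CREATE TABLE script from the
    // source's INFORMATION_SCHEMA.COLUMNS (columns, types, lengths only).
    static string BuildCreateTableScript(SqlConnection source, string tableName)
    {
        var sb = new StringBuilder();
        sb.AppendLine("CREATE TABLE " + tableName + " (");

        var cmd = new SqlCommand(
            @"SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
              FROM INFORMATION_SCHEMA.COLUMNS
              WHERE TABLE_NAME = @table
              ORDER BY ORDINAL_POSITION", source);
        cmd.Parameters.AddWithValue("@table", tableName);

        using (var reader = cmd.ExecuteReader())
        {
            bool first = true;
            while (reader.Read())
            {
                // CHARACTER_MAXIMUM_LENGTH is NULL for non-character types; -1 means MAX.
                string length = reader.IsDBNull(2)
                    ? ""
                    : "(" + (reader.GetInt32(2) == -1 ? "MAX" : reader.GetInt32(2).ToString()) + ")";
                sb.AppendLine("    " + (first ? " " : ",") + "[" + reader.GetString(0) + "] "
                              + reader.GetString(1) + length);
                first = false;
            }
        }
        sb.AppendLine(")");
        return sb.ToString();
    }

    // Drop/recreate the destination table, then stream the source rows across.
    static void CopyTable(string sourceConnStr, string destConnStr,
                          string tableName, string whereClause)
    {
        using (var source = new SqlConnection(sourceConnStr))
        using (var dest = new SqlConnection(destConnStr))
        {
            source.Open();
            dest.Open();

            // Drop and recreate so the destination always matches the current source schema.
            string drop = "IF OBJECT_ID('" + tableName + "') IS NOT NULL DROP TABLE " + tableName + ";";
            new SqlCommand(drop, dest).ExecuteNonQuery();
            new SqlCommand(BuildCreateTableScript(source, tableName), dest).ExecuteNonQuery();

            // Stream rows from the source reader straight into SqlBulkCopy.
            var select = new SqlCommand("SELECT * FROM " + tableName + " " + whereClause, source);
            using (var reader = select.ExecuteReader())
            using (var bulk = new SqlBulkCopy(dest))
            {
                bulk.DestinationTableName = tableName;
                bulk.WriteToServer(reader);
            }
        }
    }

    static void Main()
    {
        // Hypothetical values; in practice these would come from the metadata
        // table that drives the Foreach Loop described in the question.
        CopyTable("Data Source=Server1;Initial Catalog=database1;Integrated Security=SSPI;",
                  "Data Source=DevServer;Initial Catalog=database1;Integrated Security=SSPI;",
                  "table_x",
                  "");
    }
}
```

Because the destination table is recreated from the source's column list in the same order, SqlBulkCopy's default ordinal column mapping is enough here; with differing column orders you would add ColumnMappings explicitly.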
Source: https://stackoverflow.com/questions/40893543/how-to-use-ssis-visual-studio-2008-to-build-dynamic-datapull-from-multiple-sql-s