I used Red Gate SQL Data Compare and generated a .sql file so I could run it on my local machine. But the problem is that the file is over 300 MB, which means I can't copy and paste it.
Your question is quite similar to this one
You can save your file/script as .txt or .sql and run it from SQL Server Management Studio (I think the menu is File > Open > File, then just run the query in the SSMS interface). You might have to update the first line, indicating the database to be created or selected on your local machine.
If you have to do this data transfer very often, you could then go for replication. Depending on your needs, snapshot replication could be enough. If you have to keep the data in sync between your two servers, you could go for a more complex model such as merge replication.
EDIT: I didn't notice that your problem with SSMS was linked to file size. In that case you can go for the command line, as proposed by others, snapshot replication (publish on your main server, subscribe on your local one, replicate, then unsubscribe), or even backup/restore.
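If backup/restore works for you, a minimal sketch would be something like this (the database name, logical file names, and paths below are illustrative; check the actual logical names with RESTORE FILELISTONLY):

-- On the source server:
BACKUP DATABASE MyDatabase TO DISK = 'C:\Backups\MyDatabase.bak';

-- Copy the .bak file to your local machine, then:
RESTORE DATABASE MyDatabase FROM DISK = 'C:\Backups\MyDatabase.bak'
WITH MOVE 'MyDatabase' TO 'C:\Data\MyDatabase.mdf',
     MOVE 'MyDatabase_log' TO 'C:\Data\MyDatabase_log.ldf';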
Open a command prompt with administrator privileges.
Change directory to where the .sql file is stored.
Execute the following command:
sqlcmd -S <server name> -U <user name> -P <password> -d <db name> -i script.sql
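For example, assuming a local SQL Server Express instance and Windows authentication (the server and database names here are placeholders):

sqlcmd -S .\SQLEXPRESS -E -d MyDatabase -i script.sql

The -E switch logs in with your Windows credentials, so -U and -P are not needed.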
Run it at the command line with osql; see here:
http://metrix.fcny.org/wiki/display/dev/How+to+execute+a+.SQL+script+using+OSQL
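A typical osql invocation looks much like the sqlcmd one (osql has been deprecated in favor of sqlcmd, but the basic switches are the same; the server name below is a placeholder):

osql -S ServerName -E -i script.sql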
Hope this helps you!
sqlcmd -S <ServerName\InstanceName> -U UserName -i U:\<Path>\script.sql
I am using MSSQL Express 2014 and none of the solutions worked for me. They all just crashed SQL Server. As I only needed to run a one-off script with many simple insert statements, I got around it by writing a little console app as a very last resort:
class Program
{
    static void Main(string[] args)
    {
        RunScript();
    }

    private static void RunScript()
    {
        // Entity Framework context for the target database.
        using (var db = new My_DataEntities())
        using (var file = new System.IO.StreamReader("c:\\ukpostcodesmssql.sql"))
        {
            string line;
            while ((line = file.ReadLine()) != null)
            {
                // Each line must be a complete statement (e.g. a
                // single-line INSERT); skip blank lines.
                if (!string.IsNullOrWhiteSpace(line))
                    db.Database.ExecuteSqlCommand(line);
            }
        }
    }
}
I had a similar problem. My file with the sql script was over 150 MB in size (almost 900k very simple INSERTs). I used the solution advised by Takuro (as the answer in this question), but I still got an error saying that there was not enough memory ("There is insufficient system memory in resource pool 'internal' to run this query").
What helped me was putting a GO command after every 50k INSERTs.
(This doesn't directly address the question (file size), but I believe it resolves a problem that is indirectly connected with the large size of the sql script itself; in my case, the many insert commands.)
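To illustrate the idea (the table name and values are made up; GO is a batch separator understood by SSMS and sqlcmd, not a T-SQL statement, and it lets the server finish one batch before starting the next):

INSERT INTO Postcodes (Code) VALUES ('AB1 0AA');
-- ... roughly 50k more single-row INSERTs ...
GO
INSERT INTO Postcodes (Code) VALUES ('AB1 0AB');
-- ... next batch of roughly 50k INSERTs ...
GO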