sqlbulkcopy

How do I efficiently store all OpenStreetMap data in an indexed way?

只谈情不闲聊 submitted on 2019-12-22 06:49:55
Question: Note: While I target Windows Phone 7, it doesn't introduce anything besides a size restriction. In an attempt to write a GPS / routing / map application for Windows Phone 7, I'm trying to use OpenStreetMap data, and I want to store it in a SQL Server Compact Edition database on my Windows Phone 7. This is giving me a lot of trouble, so I'm at a loss as to what the right way is... Here is my progress: I've downloaded Belgium.osm.pbf, which contains all the Belgium
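
One approach that keeps the phone-side work small, sketched below under assumptions not in the original question: pre-build the .sdf on a desktop machine with System.Data.SqlServerCe (the Nodes table, its columns and the index name are illustrative), ship it with the app, and open it read-only on the device via LINQ to SQL. The composite index on (Latitude, Longitude) is what makes bounding-box lookups for map display and routing practical.

// Desktop-side sketch only; assumes System.Data.SqlServerCe is available and that the
// schema shown (Nodes with Id/Latitude/Longitude) is an illustrative simplification.
using System.Data.SqlServerCe;

static class OsmDbBuilder
{
    static void Main()
    {
        const string connStr = "Data Source=osm.sdf";
        using (var engine = new SqlCeEngine(connStr))
        {
            engine.CreateDatabase();                    // fails if osm.sdf already exists
        }
        using (var conn = new SqlCeConnection(connStr))
        {
            conn.Open();
            Exec(conn, @"CREATE TABLE Nodes (
                             Id bigint NOT NULL PRIMARY KEY,
                             Latitude float NOT NULL,
                             Longitude float NOT NULL)");
            // Composite index so bounding-box queries don't scan the whole table.
            Exec(conn, "CREATE INDEX IX_Nodes_LatLon ON Nodes (Latitude, Longitude)");
        }
    }

    static void Exec(SqlCeConnection conn, string sql)
    {
        using (var cmd = new SqlCeCommand(sql, conn))
            cmd.ExecuteNonQuery();
    }
}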

Need recommendations on pushing the envelope with SqlBulkCopy on SQL Server

 ̄綄美尐妖づ submitted on 2019-12-21 20:05:20
Question: I am designing an application, one aspect of which is that it is supposed to be able to receive massive amounts of data into a SQL database. I designed the database structure as a single table with a bigint identity, something like this one: CREATE TABLE MainTable ( _id bigint IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED, field1, field2, ... ) I will omit how I intend to perform queries, since that is irrelevant to the question I have. I have written a prototype, which inserts data into this
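
For reference, a minimal sketch of the kind of prototype insert described above, with assumptions flagged: the connection string and the field1/field2 columns are placeholders, the identity column is not mapped so the server assigns it, and TableLock is requested to get the bulk-load code path.

using System.Data;
using System.Data.SqlClient;

static class MainTableLoader
{
    public static void BulkInsert(DataTable batch, string connectionString)
    {
        // TableLock requests a bulk-update lock, which allows minimally logged inserts
        // when the database recovery model permits it.
        using (var bulk = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock))
        {
            bulk.DestinationTableName = "MainTable";
            bulk.BatchSize = 0;          // 0 = send all rows as a single batch
            bulk.BulkCopyTimeout = 0;    // no timeout for very large loads
            bulk.ColumnMappings.Add("field1", "field1");   // _id is omitted on purpose
            bulk.ColumnMappings.Add("field2", "field2");
            bulk.WriteToServer(batch);
        }
    }
}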

SqlBulkCopy slow as molasses

坚强是说给别人听的谎言 submitted on 2019-12-21 17:06:34
Question: I'm looking for the fastest way to load bulk data via C#. I have this script that does the job, but slowly. I have read testimonies that SqlBulkCopy is the fastest. 1000 records take 2.5 seconds; the files contain anywhere from 5000 to 250k records. What are some of the things that can slow it down? Table def: CREATE TABLE [dbo].[tempDispositions]( [QuotaGroup] [varchar](100) NULL, [Country] [varchar](50) NULL, [ServiceGroup] [varchar](50) NULL, [Language] [varchar](50) NULL, [ContactChannel] [varchar](10)
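
The usual suspects are the default 30-second timeout, indexes or triggers on the target table, building the whole DataTable in memory at once, and not taking a table lock. Below is a sketch of the tuning knobs, under illustrative assumptions (the DataTable is assumed to be already populated from the input file):

using System;
using System.Data;
using System.Data.SqlClient;

static class DispositionLoader
{
    public static void Load(DataTable rows, string connectionString)
    {
        var options = SqlBulkCopyOptions.TableLock              // bulk-load lock
                    | SqlBulkCopyOptions.UseInternalTransaction; // one transaction per batch
        using (var bulk = new SqlBulkCopy(connectionString, options))
        {
            bulk.DestinationTableName = "dbo.tempDispositions";
            bulk.BatchSize = 5000;        // flush in chunks instead of one huge transaction
            bulk.BulkCopyTimeout = 600;   // seconds; the 30-second default is easy to hit
            bulk.NotifyAfter = 5000;      // progress reporting
            bulk.SqlRowsCopied += (s, e) => Console.WriteLine(e.RowsCopied + " rows copied");
            bulk.WriteToServer(rows);
        }
    }
}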

How to get identities of inserted data records using SQL bulk copy

孤人 submitted on 2019-12-21 09:07:17
Question: I have an ADO.NET DataTable with about 100,000 records. In this table there is a column xyID which has no values in it, because that column is an auto-generated IDENTITY in my SQL Server database. I need to retrieve the generated IDs for other processes. I am looking for a way to bulk copy this DataTable into the SQL Server database and, within the same "step", to "fill" my DataTable with the generated IDs. How can I retrieve the identity values of records inserted into a table using the
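
SqlBulkCopy itself does not report the generated values, so a common workaround is sketched below; the #Staging layout, dbo.Target and the column names other than xyID are illustrative assumptions. The idea is to bulk copy into a temporary staging table on the same connection and then move the rows with INSERT ... OUTPUT, which streams the new identities back to the client:

using System;
using System.Data;
using System.Data.SqlClient;

static class IdentityFetchingLoader
{
    public static void BulkInsertAndFetchIds(DataTable rows, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // 1. Stage the rows in a temp table that has no identity column.
            using (var create = new SqlCommand(
                "CREATE TABLE #Staging (Name nvarchar(100), Value int)", conn))
                create.ExecuteNonQuery();

            using (var bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "#Staging";   // visible: same session
                bulk.WriteToServer(rows);
            }

            // 2. Move into the real table; OUTPUT returns the server-generated xyID values.
            using (var insert = new SqlCommand(
                @"INSERT INTO dbo.Target (Name, Value)
                  OUTPUT INSERTED.xyID
                  SELECT Name, Value FROM #Staging", conn))
            using (var reader = insert.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader[0]);   // collect the new IDs here
            }
        }
    }
}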

How to retrieve server generated Identity values when using SqlBulkCopy

空扰寡人 submitted on 2019-12-21 04:41:21
Question: I know I can do a bulk insert into my table with an identity column by not specifying SqlBulkCopyOptions.KeepIdentity, as mentioned here. What I would like to be able to do is get the identity values that the server generates and put them in my DataTable, or even a list. I saw this post, but I want my code to be general, and I can't have a version column in all my tables. Any suggestions are much appreciated. Here is my code: public void BulkInsert(DataTable dataTable, string
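
One general-purpose variant, sketched here under assumptions not in the question (the rows are first bulk copied into a #Staging table that carries a client-generated RowKey, and the table/column names are illustrative), uses MERGE ... OUTPUT: unlike INSERT ... OUTPUT it can return a source column next to INSERTED.Id, so each new identity can be written back to the matching DataTable row.

using System.Data;
using System.Data.SqlClient;

static class IdentityCorrelator
{
    public static void FillIdentities(DataTable rows, SqlConnection conn)
    {
        using (var cmd = new SqlCommand(@"
            MERGE INTO dbo.Target AS t
            USING #Staging AS s ON 1 = 0          -- never matches, so every row is inserted
            WHEN NOT MATCHED THEN
                INSERT (Name, Value) VALUES (s.Name, s.Value)
            OUTPUT s.RowKey, INSERTED.Id;", conn))
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                int rowKey = reader.GetInt32(0);              // index into the DataTable
                rows.Rows[rowKey]["Id"] = reader.GetValue(1); // server-generated identity
            }
        }
    }
}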

How to automatically truncate strings when doing a bulk insert?

眉间皱痕 submitted on 2019-12-21 03:50:47
Question: I want to insert many rows (constructed from Entity Framework objects) into SQL Server. The problem is that some string properties exceed the length of their column in the database, which causes an exception, and then none of the rows can be inserted into the database. So I wonder whether there is a way to tell SqlBulkCopy to automatically truncate any over-length values. Of course, I could check and Substring each property that exceeds the allowed length before inserting it into a DataTable, but it
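
As far as I know, SqlBulkCopy has no truncate option, so a client-side fallback is sketched below; the column-length map is a hypothetical input (it could be read from INFORMATION_SCHEMA.COLUMNS), and the truncation is silent by design, which loses data:

using System.Collections.Generic;
using System.Data;

static class StringTruncator
{
    // Clips every mapped string column to its maximum length before WriteToServer.
    public static void TruncateStrings(DataTable table, IDictionary<string, int> maxLengths)
    {
        foreach (DataRow row in table.Rows)
        {
            foreach (var limit in maxLengths)
            {
                var value = row[limit.Key] as string;
                if (value != null && value.Length > limit.Value)
                    row[limit.Key] = value.Substring(0, limit.Value);
            }
        }
    }
}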

Real-time unidirectional synchronization from sql-server to another data repository

我与影子孤独终老i submitted on 2019-12-20 06:39:08
Question: In my previous question on this portal, I asked for insight into syncing data between SQL Server and key-value-based data repositories. For the same problem (one-way, real-time synchronization from SQL Server to HBase or any other database), I need to take care of some performance and latency considerations, and I have not found a foolproof way of doing it. We have multiple SQL 2008 data shards where data is updated from various sources and processed by many processes at the same
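
One option that keeps the SQL side lightweight, sketched purely as an assumption-laden illustration: enable SQL Server Change Tracking on the shard tables and poll CHANGETABLE from a worker that pushes deltas to the target store. The table name dbo.Orders, the Id key column, and PushToTarget are hypothetical placeholders.

using System;
using System.Data.SqlClient;

static class ChangeTrackingPoller
{
    // Returns the new sync version to persist and pass back in on the next poll.
    public static long SyncOnce(string connectionString, long lastSyncVersion)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            var cmd = new SqlCommand(@"
                SELECT CT.Id, CT.SYS_CHANGE_OPERATION, CT.SYS_CHANGE_VERSION
                FROM CHANGETABLE(CHANGES dbo.Orders, @last) AS CT;", conn);
            cmd.Parameters.AddWithValue("@last", lastSyncVersion);

            long newVersion = lastSyncVersion;
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    PushToTarget(reader.GetInt64(0), reader.GetString(1)); // 'I', 'U' or 'D'
                    newVersion = Math.Max(newVersion, reader.GetInt64(2));
                }
            }
            return newVersion;
        }
    }

    static void PushToTarget(long id, string operation)
    {
        // Hypothetical stub: write the change to HBase or another key-value store.
    }
}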

Get the count of rows from a COPY command

偶尔善良 submitted on 2019-12-19 03:39:31
Question: When copying data from a file, you get the count of rows in psql with the "command tag": db=# COPY t FROM '/var/lib/postgres/test.sql'; COPY 10 I need the number of rows and would like to avoid a redundant count() on the table. Is there a way to get this count from COPY directly in a PL/pgSQL function? As far as I know there is none, but maybe I am missing something? This is for PostgreSQL 9.2, but any option in any version would be of interest. Answer 1: Not in PG 9.2, but there is in PG 9.3, courtesy of

How to add gridview rows to a datatable?

不羁的心 submitted on 2019-12-18 06:55:27
Question: I have a GridView which will contain some n number of rows... Now I want to add all rows of the GridView to a DataTable, which will be used for a bulk copy operation... I have found this: http://www.codeproject.com/KB/aspnet/GridView_To_DataTable.aspx But I want all columns of my GridView to be added to the DataRow of the DataTable. Grid: http://img85.imageshack.us/img85/4044/gridp.jpg I want to convert the GridView to a DataTable on submit... Any suggestions? EDIT: The answer below works and I have
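
A sketch of the direct conversion, under assumptions not stated in the question: ASP.NET Web Forms, columns declared as BoundFields so Cells[i].Text holds the displayed value (TemplateField controls would need FindControl instead), and every value comes across as a string.

using System.Data;
using System.Web.UI.WebControls;

static class GridViewConverter
{
    public static DataTable ToDataTable(GridView grid)
    {
        var table = new DataTable();
        foreach (DataControlField column in grid.Columns)
            table.Columns.Add(column.HeaderText, typeof(string));

        foreach (GridViewRow row in grid.Rows)
        {
            var dataRow = table.NewRow();
            for (int i = 0; i < grid.Columns.Count; i++)
                dataRow[i] = row.Cells[i].Text;   // may contain "&nbsp;" for empty cells
            table.Rows.Add(dataRow);
        }
        return table;
    }
}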

How to use SqlBulkCopyColumnMappingCollection?

烈酒焚心 submitted on 2019-12-17 18:53:09
Question: I want to make one SqlBulkCopy method that I can use for all my bulk inserts by passing in specific data through the parameters. Now I need to do mapping on some of them. I don't know how to make a SqlBulkCopyColumnMappingCollection; my plan was to pass the mapping collection in and use it, but I don't know how to create one. I can't instantiate a new object of it. This is what I have now. How can I add the mappings and pass them in? public void BatchBulkCopy(DataTable dataTable, string
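
SqlBulkCopyColumnMappingCollection has no public constructor; the collection only exists on a SqlBulkCopy instance. A sketch of one way around that (parameter and table names are illustrative): accept the individual SqlBulkCopyColumnMapping items and add them to ColumnMappings inside the method.

using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static class BulkCopyHelper
{
    public static void BatchBulkCopy(DataTable dataTable, string destinationTable,
                                     IEnumerable<SqlBulkCopyColumnMapping> mappings,
                                     string connectionString)
    {
        using (var bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.DestinationTableName = destinationTable;
            foreach (var mapping in mappings)
                bulkCopy.ColumnMappings.Add(mapping);   // collection is per SqlBulkCopy instance
            bulkCopy.WriteToServer(dataTable);
        }
    }
}

Callers would build each mapping with new SqlBulkCopyColumnMapping("SourceColumn", "DestinationColumn") and pass them in as a list.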