Migrate large data from an XML file to the database

Submitted by ↘锁芯ラ on 2019-12-13 15:29:47

Question


Q:

I ran into the following problem two weeks ago, and I don't know how to handle it while keeping performance, data integrity, and consistency in mind.

What I want to do is:

I want to migrate the data from an XML file into the corresponding tables in my database.

For example:

  • I have two nodes in the XML file:

    courses, teachers

  • and two matching tables in the database:

    courses, teachers.

I let the user upload the XML file to a folder on my server, then I read the XML file and insert the data into my database.

The problem is:

If any failure happens during the insert operation, I want to delete all the records already inserted into all tables (or roll back).

I started thinking about transactions, where the insertion of each entity would be performed inside a transaction, but I face the following problems:

  1. Should I put all insertions of all entities in one transaction, or use one transaction per entity? (For each upload, either all entity data must be inserted or nothing at all; see the sketch after this list.)

  2. When I have a huge number of records, say 1,500, the following exception appears, and I have found no fix for it:

    This IfxTransaction has completed; it is no longer usable.

  3. My team leader told me not to use transactions, because they will lock the tables and many users use those tables. He wants some other mechanism.
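
A minimal sketch of option 1, one ADO.NET transaction per uploaded file so that either everything commits or everything rolls back. It is written against SQL Server with assumed element, table, and column names; the Informix provider referenced above should follow the same pattern through IfxConnection/IfxTransaction.

    // Sketch of one transaction per uploaded file: either every row from the
    // upload is committed, or the whole thing is rolled back.
    // Element names ("course", "teacher") and columns are assumptions.
    using System.Data.SqlClient;
    using System.Xml.Linq;

    class XmlImporter
    {
        public static void Import(string xmlPath, string connectionString)
        {
            XDocument doc = XDocument.Load(xmlPath);

            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (SqlTransaction tx = connection.BeginTransaction())
                {
                    try
                    {
                        foreach (XElement course in doc.Descendants("course"))
                        {
                            using (var cmd = new SqlCommand(
                                "INSERT INTO courses (name) VALUES (@name)", connection, tx))
                            {
                                cmd.Parameters.AddWithValue("@name", (string)course.Element("name"));
                                cmd.ExecuteNonQuery();
                            }
                        }

                        foreach (XElement teacher in doc.Descendants("teacher"))
                        {
                            using (var cmd = new SqlCommand(
                                "INSERT INTO teachers (name) VALUES (@name)", connection, tx))
                            {
                                cmd.Parameters.AddWithValue("@name", (string)teacher.Element("name"));
                                cmd.ExecuteNonQuery();
                            }
                        }

                        tx.Commit();   // everything succeeded: keep all rows from this upload
                    }
                    catch
                    {
                        tx.Rollback(); // any failure: no rows from this upload remain
                        throw;
                    }
                }
            }
        }
    }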

Please, I would like a detailed solution to my problem: how to handle this case while maintaining performance, data integrity, and consistency.


Answer 1:


I would suggest using SqlBulkCopy. You can google it, or read these two articles (a rough sketch follows the links):

  1. http://blogs.msdn.com/b/nikhilsi/archive/2008/06/11/bulk-insert-into-sql-from-c-app.aspx
  2. http://www.dotnetcurry.com/ShowArticle.aspx?ID=323
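
A rough sketch of the idea, assuming SQL Server as the target and using the courses node from your example; the element and column names are guesses:

    // Rough SqlBulkCopy sketch: load the courses node into a DataTable and
    // push it to the database in one bulk operation instead of row-by-row.
    // Element, table, and column names are assumptions.
    using System.Data;
    using System.Data.SqlClient;
    using System.Xml.Linq;

    class BulkLoader
    {
        public static void LoadCourses(string xmlPath, string connectionString)
        {
            var table = new DataTable("courses");
            table.Columns.Add("name", typeof(string));

            foreach (XElement course in XDocument.Load(xmlPath).Descendants("course"))
            {
                table.Rows.Add((string)course.Element("name"));
            }

            using (var bulkCopy = new SqlBulkCopy(connectionString))
            {
                bulkCopy.DestinationTableName = "courses";
                bulkCopy.ColumnMappings.Add("name", "name");
                bulkCopy.WriteToServer(table);   // single bulk insert
            }
        }
    }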



Answer 2:


Here is a mechanism we used for the same issue: start by saving the data into temporary (staging) tables. If there is no exception after inserting into the temporary tables, run a stored procedure that copies the contents of these temporary tables into the real tables, and after that delete everything from the temporary tables. This way you don't lock access to the real tables the way the transaction mechanism would. There is another mechanism, but if you use it you have to rethink the whole database structure: it is called CQRS (for .NET there is an API called NCQRS).
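
A rough sketch of that flow; the staging table courses_staging and the stored procedure CopyStagingToLive are made-up names that would have to exist in your own schema:

    // Sketch of the staging-table flow: write everything to a staging table
    // first, and only copy to the live tables once all inserts have succeeded.
    // courses_staging and CopyStagingToLive are hypothetical names.
    using System.Data;
    using System.Data.SqlClient;
    using System.Xml.Linq;

    class StagingImporter
    {
        public static void Import(string xmlPath, string connectionString)
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();

                // 1. Insert every record into the staging table; the live tables stay untouched.
                foreach (XElement course in XDocument.Load(xmlPath).Descendants("course"))
                {
                    using (var cmd = new SqlCommand(
                        "INSERT INTO courses_staging (name) VALUES (@name)", connection))
                    {
                        cmd.Parameters.AddWithValue("@name", (string)course.Element("name"));
                        cmd.ExecuteNonQuery();
                    }
                }

                // 2. Reached only if no exception was thrown: the stored procedure
                //    copies staging -> live and then clears the staging table.
                using (var copy = new SqlCommand("CopyStagingToLive", connection))
                {
                    copy.CommandType = CommandType.StoredProcedure;
                    copy.ExecuteNonQuery();
                }
            }
        }
    }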




Answer 3:


If I understand this correctly, you are doing a batch insert. Why not use Spring Batch, which has facilities for restarting from the last failure point, retrying, chunking, partitioning of data, etc.?

I am aware you have tagged asp.net, but the loading of data can happen in a technology-independent, decoupled way, can't it?
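
To illustrate the chunking and restart idea in .NET terms (this is a hand-rolled sketch, not Spring Batch itself; the checkpoint file, chunk size, table, and column are all assumptions): each chunk is committed separately and a checkpoint records the last completed chunk, so a rerun can resume where it stopped.

    // Hand-rolled illustration of chunked inserts with a restart checkpoint.
    // Not Spring Batch: just the same idea expressed with plain ADO.NET.
    using System.Collections.Generic;
    using System.Data.SqlClient;
    using System.IO;
    using System.Linq;

    class ChunkedImporter
    {
        public static void Import(IList<string> courseNames, string connectionString,
                                  string checkpointFile, int chunkSize = 100)
        {
            // Resume from the last committed chunk if a checkpoint exists.
            int startChunk = File.Exists(checkpointFile)
                ? int.Parse(File.ReadAllText(checkpointFile))
                : 0;

            int totalChunks = (courseNames.Count + chunkSize - 1) / chunkSize;

            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();

                for (int chunk = startChunk; chunk < totalChunks; chunk++)
                {
                    var rows = courseNames.Skip(chunk * chunkSize).Take(chunkSize);

                    using (SqlTransaction tx = connection.BeginTransaction())
                    {
                        foreach (string name in rows)
                        {
                            using (var cmd = new SqlCommand(
                                "INSERT INTO courses (name) VALUES (@name)", connection, tx))
                            {
                                cmd.Parameters.AddWithValue("@name", name);
                                cmd.ExecuteNonQuery();
                            }
                        }
                        tx.Commit();   // commit this chunk only
                    }

                    // Checkpoint: the next run starts after this chunk.
                    File.WriteAllText(checkpointFile, (chunk + 1).ToString());
                }
            }
        }
    }

Note that per-chunk commits trade the strict all-or-nothing guarantee for restartability, which is a different answer to the rollback requirement than the single-transaction approach.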



Source: https://stackoverflow.com/questions/6856004/migrate-the-large-data-from-xml-file-to-the-database
