Question
I already wrote a program that uses Sync Framework to sync some tables from Azure to on-premise (a local server). I built the program following this example: How to: Configure and Execute Synchronization with SQL Azure.
It works incredibly fast with tables whose changes are only new rows. The new rows sync perfectly. The first sync took hours, but now every sync takes seconds.
But we have other tables that are deleted and recreated with mostly the same data (99% is the old data and 1% is new). The program that updates these tables deletes all the data and inserts it all again (including the new rows).
In this situation our sync program sees ALL the rows as changed. It is terribly slow: every sync (the first, the second, the third...) takes hours.
Is it possible to fix it?
Thanks,
Answer 1:
I am assuming you're simply deleting the rows and reinserting them, because if you were actually dropping the tables, you would have broken change tracking already: the change-tracking triggers would have been dropped along with the table.
If you're reinserting the rows, then the triggers mark the corresponding rows in the tracking tables as changed. When you sync, those rows are synced again even if you've already synced them before. Since the rows already exist on the destination, the sync runs into conflicts, which slows it down. If you have one million rows, that's one million conflicts that need to be resolved.
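One way to avoid this is to change the update program so it applies only the actual differences (insert the 1% of new rows, update or delete what changed) instead of deleting and reinserting everything, so the change-tracking triggers fire only for rows that really changed. A minimal sketch of the diff idea in plain Python (not Sync Framework; the `compute_diff` helper and the row data are hypothetical, and in practice the result would be applied with per-row INSERT/UPDATE/DELETE statements):

```python
def compute_diff(current, incoming):
    """Compare rows already in the table (current) with the freshly
    generated data (incoming); both are dicts keyed by primary key.
    Returns the minimal change set, so change-tracking triggers fire
    only for rows that actually differ."""
    inserts = {k: v for k, v in incoming.items() if k not in current}
    updates = {k: v for k, v in incoming.items()
               if k in current and current[k] != v}
    deletes = [k for k in current if k not in incoming]
    return inserts, updates, deletes


# Example: regenerated data is mostly identical to what is already there.
current = {1: "a", 2: "b", 3: "c", 4: "d"}
incoming = {1: "a", 2: "b", 3: "c-changed", 5: "e"}  # row 4 gone, row 5 new

inserts, updates, deletes = compute_diff(current, incoming)
print(inserts)  # {5: 'e'}
print(updates)  # {3: 'c-changed'}
print(deletes)  # [4]
```

With this approach only three rows are touched instead of all of them, so the subsequent sync has only three changes to ship and no spurious conflicts.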
Answer 2:
As @JNYRanger put it: "You might need to write your own provider, since the Sync Framework builds special components into each table that has been provisioned. By deleting the table and recreating the table the framework doesn't know what to do with the knowledge it already has."
Source: https://stackoverflow.com/questions/21052917/how-use-sync-framework-to-sync-tables-that-are-recreated-with-mostly-the-same-da