Handling large databases

误落风尘 · 2021-01-31 00:21

I have been working on a web project (ASP.NET) for around six months. The final product is about to go live. The project uses SQL Server as the database. We have done performance

14 Answers
  •  执念已碎
    2021-01-31 01:16

    First off, as many others have said, a few million rows is not large. The current application I'm working on has several tables, all with over a hundred million rows, all of which are normalised.

    We did suffer from some poor performance, but it was caused by leaving the table statistics at their default settings. Inserting a number of records that was small relative to the total size of the table, i.e. inserting a million records into a table containing 100+ million, wasn't triggering an automatic update of the table statistics, so we'd get poor query plans, which manifested as serial plans being produced instead of parallel ones.
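
    One way to address this kind of stale-statistics problem is to refresh statistics manually after large loads, or to rely on the dynamic auto-update threshold on newer versions. A minimal sketch, assuming a hypothetical dbo.Orders table:

        -- Refresh statistics with a full scan after a big batch insert.
        UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

        -- On SQL Server 2008 R2 SP1 through 2014, trace flag 2371 makes the
        -- auto-update threshold dynamic; from 2016 (compatibility level 130)
        -- this behaviour is the default.
        DBCC TRACEON (2371, -1);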

    As to whether denormalising is the right decision: it depends on your schema. Do you regularly have to perform deep queries, i.e. lots of joins, to get at data you need frequent access to? If so, partial denormalisation might be a way forward (one possible sketch below).
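
    One common form of partial denormalisation on SQL Server is an indexed view, which materialises a frequently used join so readers stop paying for it on every query. The answer above doesn't prescribe this specific technique, and the tables and columns here are invented, so treat it as a hypothetical sketch only:

        -- Indexed views require SCHEMABINDING and two-part object names.
        CREATE VIEW dbo.vOrderCustomer
        WITH SCHEMABINDING
        AS
        SELECT o.OrderId, o.OrderDate, c.CustomerId, c.Name
        FROM dbo.Orders AS o
        INNER JOIN dbo.Customers AS c
            ON c.CustomerId = o.CustomerId;
        GO

        -- The unique clustered index is what actually materialises the view.
        CREATE UNIQUE CLUSTERED INDEX IX_vOrderCustomer
            ON dbo.vOrderCustomer (OrderId);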

    BUT NOT BEFORE you've checked your indexing and table statistics strategies.
    Check that you're using sensible, well-structured queries and that your joins are well formed. Check your execution plans to confirm that your queries are actually executing the way you expect.
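
    A quick way to do that check, again with hypothetical table names: the I/O and timing statistics below, together with the actual execution plan (Ctrl+M in SSMS), will expose unexpected table scans or serial execution:

        SET STATISTICS IO ON;
        SET STATISTICS TIME ON;

        -- Watch the logical reads this reports, and inspect the actual plan
        -- for table scans where you expected index seeks.
        SELECT o.OrderId, o.OrderDate, c.Name
        FROM dbo.Orders AS o
        INNER JOIN dbo.Customers AS c
            ON c.CustomerId = o.CustomerId
        WHERE o.OrderDate >= '20210101';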

    As others have said, SQL Profiler and the Database Engine Tuning Advisor actually do a good job of this.

    For me denormalisation is usually near the bottom of my list of things to do.

    If you're still having problems, then check your server software and hardware setup.

    • Are your database and log files on separate physical disks using separate controllers?
    • Does the server have enough memory?
    • Is the log file set to autogrow? If so, is the autogrow increment too low, i.e. is it growing too often? (See the sketch after this list.)
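
    A minimal sketch of checking the current growth settings and replacing a small or percentage-based log autogrow with a fixed increment; 'MyDb' and the logical file name 'MyDb_log' are placeholders for your own database:

        -- Inspect size and growth settings for every file in the database.
        SELECT name, size, growth, is_percent_growth
        FROM sys.master_files
        WHERE database_id = DB_ID('MyDb');

        -- Grow the log in fixed 512 MB steps instead of many tiny increments.
        ALTER DATABASE MyDb
            MODIFY FILE (NAME = MyDb_log, FILEGROWTH = 512MB);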
