Primary key id reaching limit of bigint data type

星月不相逢 2020-12-31 23:18

I have a table that is exposed to large inserts and deletes on a regular basis (and because of this there are large gaps in the number sequence of the primary id column). It made me wonder: can an auto-increment primary key realistically reach the limit of the bigint data type?

4 Answers
  • 2020-12-31 23:20

    I know this was already answered a year ago, but just to continue on Luc Franken's answer:

    If you insert 500 million rows per second, it would take around 1,170 years to reach the limit of an unsigned BIGINT (and about 585 years for a signed one). So yeah, I don't think you need to worry about that.
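
    As a sanity check, that arithmetic can be reproduced directly in T-SQL. A minimal sketch, assuming the unsigned BIGINT maximum of 18,446,744,073,709,551,615 and a 365-day year:

    -- Back-of-the-envelope: years until an unsigned BIGINT identity
    -- runs out at 500 million inserts per second.
    SELECT 18446744073709551615.0   -- unsigned BIGINT maximum
         / 500000000                -- inserts per second
         / 31536000                 -- seconds per year (60*60*24*365)
         AS years_to_exhaustion;
    -- Returns roughly 1,170.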

  • 2020-12-31 23:27

    If you don't actually need that column because you have another identifier that is unique for each record, like a supplied measurement_id, or even a combined key such as measurement_id + location_id, then there is no need for an auto-increment key at all. If there is any chance you won't have a unique key, then definitely create one.

    What if I need a very very big autoincrement ID?

    Are you really sure you have so many inserts and deletes that you will ever reach the limit?
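
    For illustration, a minimal sketch of such a table, where the combined natural key replaces the auto-increment surrogate (the table and column names here are assumptions, not from the question):

    -- The combination (measurement_id, location_id) is unique, so it
    -- serves as the primary key; no auto-increment column is needed.
    CREATE TABLE dbo.measurements (
        measurement_id BIGINT    NOT NULL,
        location_id    INT       NOT NULL,
        measured_at    DATETIME2 NOT NULL,
        CONSTRAINT PK_measurements
            PRIMARY KEY (measurement_id, location_id)
    );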

  • 2020-12-31 23:35

    Maybe it's a bit too late, but you can add a trigger on DELETE.

    Here is sample code for SQL Server:

    CREATE TRIGGER resetidentity
        ON dbo.[table_name]
        FOR DELETE
    AS
    BEGIN
        -- Reseed the identity counter to the current maximum ID, so the
        -- next insert continues at MAX(ID) + 1 instead of the old value.
        DECLARE @MaxID BIGINT;  -- BIGINT, since the id column is a bigint
        SELECT @MaxID = ISNULL(MAX(ID), 0)
        FROM dbo.[table_name];
        DBCC CHECKIDENT ('table_name', RESEED, @MaxID);
    END
    GO
    

    In a nutshell, this will reset your ID (provided the column is an auto-increment primary key). For example: if you have 800 rows and delete the last 400 of them, the next time you insert, the new row will start at ID 401 instead of 801.

    The downside is that it will not close gaps in the middle of the table: if you have 800 rows and delete IDs 200-400, the identity still continues at 801 the next time you write new row(s).
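
    A hypothetical usage sketch of the behaviour described above (assuming dbo.[table_name] has an IDENTITY(1,1) primary key named ID and 800 existing rows):

    DELETE FROM dbo.[table_name] WHERE ID > 400;  -- trigger reseeds to MAX(ID) = 400
    SELECT IDENT_CURRENT('table_name');           -- reports 400
    -- The next INSERT therefore receives ID 401, not 801.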

  • 2020-12-31 23:44

    The maximum value of a signed BIGINT is 9,223,372,036,854,775,807. Assuming the counter starts from 1:

    If we inserted 1 hundred thousand (100,000) records per second into the table, it would take 2,924,712 years.

    If we inserted 1 million (1,000,000) records per second into the table, it would take 292,471 years.

    If we inserted 10 million (10,000,000) records per second into the table, it would take 29,247 years.

    If we inserted 100 million (100,000,000) records per second into the table, it would take 2,925 years.

    If we inserted 1,000 million (1,000,000,000) records per second into the table, it would take 292 years.

    So don't worry about it.
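
    A quick sketch that reproduces these figures, assuming the signed BIGINT maximum and a 365-day year:

    -- Years until a signed BIGINT identity is exhausted, for several
    -- insert rates (rows per second).
    SELECT r.rate_per_sec,
           9223372036854775807.0 / r.rate_per_sec / 31536000 AS years
    FROM (VALUES (100000), (1000000), (10000000),
                 (100000000), (1000000000)) AS r(rate_per_sec);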
