Bulk ingest into Redis

南笙 2020-12-24 08:34

I'm trying to load a large piece of data into Redis as fast as possible.

My data looks like:

771240491921 SOME;STRING;ABOUT;THIS;LENGTH
345928354912
2 Answers
  •  囚心锁ツ
    2020-12-24 08:59

    The fastest way to do this is the following: generate the Redis protocol (RESP) directly from this data. The protocol is documented on the redis.io site and is trivial to produce. Once you have it, save it under the server's configured AOF filename (appendonly.aof by default) and start Redis with append-only mode enabled, and it will load the data on startup.

    Alternatively, you can issue a FLUSHALL command first and then push the generated protocol into your server with netcat, redirecting the output to /dev/null.

    This will be very fast: there is no round-trip time (RTT) to wait on, it's just a bulk load of data.
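A minimal sketch of the protocol-generation step. The RESP framing below follows the format described in the redis.io docs; the choice of `SET` commands and the "key value" split of the sample data are assumptions for illustration:

```python
def gen_redis_proto(*args):
    """Encode one command in the Redis protocol (RESP): an array of bulk strings."""
    out = f"*{len(args)}\r\n"
    for arg in args:
        s = str(arg)
        out += f"${len(s.encode('utf-8'))}\r\n{s}\r\n"
    return out

def lines_to_proto(lines):
    """Turn 'key value' data lines into a RESP stream of SET commands (assumed mapping)."""
    return "".join(
        gen_redis_proto("SET", key, value)
        for key, _, value in (line.partition(" ") for line in lines)
    )
```

The resulting stream is exactly what you would redirect into netcat, or feed to `redis-cli --pipe` for mass insertion.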

    A less hackish way: insert items 1000 at a time using pipelining. It's almost as fast as generating the protocol by hand, but much cleaner :)
