Bulk ingest into Redis

南笙 · 2020-12-24 08:34

I'm trying to load a large piece of data into Redis as fast as possible.

My data looks like:

771240491921 SOME;STRING;ABOUT;THIS;LENGTH
345928354912

2 Answers
  • 2020-12-24 08:50

    I like what Salvatore proposed, but here is one more very clear way: generate a feed of commands for redis-cli, e.g.

    SET xxx yyy
    SET xxx yyy
    SET xxx yyy
    

    Pipe it into redis-cli on a server close to you. Then do SAVE and SHUTDOWN, and move the data file (dump.rdb) to the destination server.
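
    A minimal sketch of such a feed generator, assuming each input line holds a whitespace-separated key and value and the values contain no spaces or quotes (the file names gen_commands.py and data.txt are made up):

        # gen_commands.py - turn "key value" lines into SET commands for redis-cli
        import sys

        for line in sys.stdin:
            parts = line.split(None, 1)   # split on the first run of whitespace
            if len(parts) == 2:           # skip lines with no value
                print(f"SET {parts[0]} {parts[1].strip()}")

    Then pipe it in and persist: python3 gen_commands.py < data.txt | redis-cli, followed by redis-cli SAVE and redis-cli SHUTDOWN.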

  • 2020-12-24 08:59

    The fastest way to do this is the following: generate the Redis protocol out of this data. The documentation for generating the Redis protocol is on the Redis.io site; it is a trivial protocol. Once you have that, just name the file appendonly.log and start Redis in append-only mode.

    You can even do a FLUSHALL command first and then push the data into your server with netcat, redirecting the output to /dev/null.

    This will be super fast: there is no RTT to wait for, it's just bulk loading of data.
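
    A minimal sketch of such a protocol generator, under the same input assumptions as above (the script name gen_protocol.py is made up). Each command is encoded as a RESP array of bulk strings with byte-counted lengths:

        # gen_protocol.py - emit raw Redis protocol (RESP) for SET commands
        import sys

        def resp(*args):
            # An array of bulk strings: *<n>, then $<bytes> plus the payload per arg.
            out = [b"*%d\r\n" % len(args)]
            for arg in args:
                raw = arg.encode("utf-8")
                out.append(b"$%d\r\n%s\r\n" % (len(raw), raw))
            return b"".join(out)

        for line in sys.stdin:
            parts = line.split(None, 1)
            if len(parts) == 2:
                sys.stdout.buffer.write(resp("SET", parts[0], parts[1].strip()))

    Stream it with netcat as described: python3 gen_protocol.py < data.txt | nc <host> 6379 > /dev/null. (Newer versions of redis-cli can also consume this stream directly via redis-cli --pipe.)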

    A less hackish way: just insert things 1000 at a time using pipelining, as in the sketch below. It's almost as fast as generating the protocol, but much cleaner :)
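
    For instance, with the redis-py client (the host, file name, and batch size here are assumptions):

        # bulk_load.py - pipelined SETs, flushed every 1000 commands
        import redis

        r = redis.Redis(host="localhost", port=6379)
        pipe = r.pipeline(transaction=False)  # plain pipelining, no MULTI/EXEC
        pending = 0

        with open("data.txt") as f:
            for line in f:
                parts = line.split(None, 1)
                if len(parts) != 2:
                    continue
                pipe.set(parts[0], parts[1].strip())
                pending += 1
                if pending == 1000:       # one round trip per 1000 commands
                    pipe.execute()
                    pending = 0

        if pending:
            pipe.execute()                # flush the final partial batch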
