I'm trying to load a large piece of data into Redis as fast as possible.
My data looks like:
771240491921 SOME;STRING;ABOUT;THIS;LENGTH
345928354912
I like what Salvatore proposed, but here is one more very clear way: generate a feed for the cli, e.g.
SET xxx yyy
SET xxx yyy
SET xxx yyy
pipe it into redis-cli on a server close to you. Then do SAVE, SHUTDOWN, and move the data file to the destination server.
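The feed above can be sketched as a tiny script. This is a hypothetical sketch, assuming each input line is a key, a space, then the value (as in the question's sample data); the filename `data.txt` is illustrative, not from the original.

```python
def to_cli_feed(lines):
    """Turn 'key value' lines into 'SET key value' commands for redis-cli."""
    commands = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines in the input
        key, _, value = line.partition(" ")
        commands.append(f"SET {key} {value}")
    return commands

# Hypothetical usage (pipe the output into redis-cli):
#   python gen_feed.py < data.txt | redis-cli
# where gen_feed.py prints to_cli_feed(sys.stdin) line by line.
```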
The fastest way to do this is the following: generate the Redis protocol out of this data. The documentation for the Redis protocol is on the redis.io site; it is a trivial protocol. Once you have that, just call it appendonly.log and start Redis in append-only mode.
You can even do a FLUSHALL command and finally push the data into your server with netcat, redirecting the output to /dev/null.
This will be super fast: there is no RTT to wait for, it's just bulk loading of data.
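The protocol-generation step can be sketched like this. The RESP wire format (an array of bulk strings, each prefixed with its byte length) is the one documented on redis.io; the function name and sample keys are my own illustration.

```python
def gen_redis_proto(*args):
    """Encode one command as a RESP array of bulk strings."""
    proto = f"*{len(args)}\r\n".encode()  # number of arguments
    for arg in args:
        data = str(arg).encode()
        # each argument: $<byte-length>\r\n<bytes>\r\n
        proto += b"$" + str(len(data)).encode() + b"\r\n" + data + b"\r\n"
    return proto

# Hypothetical usage: write one encoded command per key/value pair, then
# push the file with netcat, e.g.  cat proto.bin | nc <host> 6379 > /dev/null
```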
A less hackish way: just insert things 1000 at a time using pipelining. It's almost as fast as generating the protocol, but much cleaner :)
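The batching part of that approach can be sketched as below. The chunking helper is mine; the commented-out usage assumes the third-party redis-py client (`pip install redis`) and a reachable server, so only the batching logic runs here.

```python
import itertools

def batches(iterable, size=1000):
    """Yield successive lists of up to `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        chunk = list(itertools.islice(it, size))
        if not chunk:
            return
        yield chunk

# Hypothetical usage with redis-py pipelining (needs a live server):
#   import redis
#   r = redis.Redis()
#   for chunk in batches(pairs, 1000):      # pairs: iterable of (key, value)
#       pipe = r.pipeline(transaction=False)
#       for key, value in chunk:
#           pipe.set(key, value)
#       pipe.execute()                      # one round trip per 1000 SETs
```

Sending 1000 commands per `execute()` amortizes the round-trip time over the whole batch, which is why it approaches raw protocol speed.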