How to efficiently write large amounts of data to Redis
The steps are as follows:

1. Create a text file containing Redis commands, one per line:

SET Key0 Value0
SET Key1 Value1
...
SET KeyN ValueN

If you already have the raw data, constructing this file is straightforward with a few lines of shell or Python, as in the sketch below.
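For instance, a minimal Python sketch, assuming (hypothetically) that the raw data sits in a tab-separated file raw.txt with one key/value pair per line:

#!/usr/bin/env python3
# raw.txt is a hypothetical input: one tab-separated key/value pair per line.
with open('raw.txt') as f:
    for line in f:
        key, value = line.rstrip('\n').split('\t')
        print('SET', key, value)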

2. Convert these commands into the Redis Protocol.

This is necessary because the redis-cli pipe mode expects input in the Redis Protocol rather than plain Redis commands.

See the script that follows for how to convert.
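For reference, the command SET Key0 Value0 encodes in the protocol as an argument count line followed by a length-prefixed line pair per argument, each line terminated by \r\n:

*3
$3
SET
$4
Key0
$6
Value0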

3. Insert the data through a pipe:

cat data.txt | redis-cli --pipe
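On success, redis-cli prints a summary along these lines, with the reply count matching the number of commands sent:

All data transferred. Waiting for the last reply...
Last reply received from server.
errors: 0, replies: 100000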

Shell vs. Redis pipe

The following test compares the efficiency of a shell batch import against the Redis pipe.

Test idea: insert 100,000 records with the same value into Redis via a shell script and via the Redis pipe, respectively, and measure how long each takes.

Shell

The script is as follows:

#!/bin/bash
# 100,000 iterations; each one launches a fresh redis-cli process
# and opens a new connection to the server.
for ((i=0;i<100000;i++))
do
    echo -en "helloworld" | redis-cli -x set name$i >> redis.log
done

The value inserted is always helloworld, but each iteration uses a different key: name0, name1, ..., name99999. Because every iteration spawns a new redis-cli process and opens a new connection, most of the time goes to process and connection overhead rather than to the SET itself.
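Each run can be timed with the shell's time builtin; assuming the script above is saved as shell_insert.sh (a hypothetical name):

# time sh shell_insert.sh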

Redis pipe

Using the Redis pipe takes a little more preparation.

1> First, construct the text file of Redis commands

Here, I chose Python:

#!/usr/bin/env python3
for i in range(100000):
    print('set name' + str(i), 'helloworld')

# python3 1.py > redis_commands.txt
# head -2 redis_commands.txt
set name0 helloworld
set name1 helloworld

2> Translate these commands into the Redis Protocol

Here I use a shell script found on GitHub:

#!/bin/bash
while read CMD; do
    # each command begins with *{number of arguments}\r\n
    XS=($CMD); printf "*${#XS[@]}\r\n"
    # for each argument, append ${length}\r\n{argument}\r\n
    for X in $CMD; do printf "\$${#X}\r\n$X\r\n"; done
done < redis_commands.txt

# sh 20.sh > redis_data.txt
# head -7 redis_data.txt
*3
$3
set
$5
name0
$10
helloworld

This completes the data construction.
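Note that the conversion script word-splits each line, so it assumes keys and values contain no whitespace. As an aside (not part of the original test), the command generation and the protocol conversion can also be collapsed into a single Python script; a minimal sketch:

#!/usr/bin/env python3
import sys

def encode(*args):
    # Encode one command in the Redis Protocol: *<argc>\r\n,
    # then $<byte length>\r\n<arg>\r\n for each argument.
    parts = ['*%d\r\n' % len(args)]
    for a in args:
        parts.append('$%d\r\n%s\r\n' % (len(a.encode()), a))
    return ''.join(parts)

for i in range(100000):
    sys.stdout.write(encode('set', 'name%d' % i, 'helloworld'))

Saved as, say, gen_protocol.py (a hypothetical name), its output feeds the pipe directly:

# python3 gen_protocol.py | redis-cli --pipe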

Test results