Imagine you have a massive data pipeline where thousands of requests per second need to read (that’s easy) or write (that’s harder) data. The obvious, and often right, choice is to use Redis to handle all of that.

But what happens when you launch it in production and need some historical data in order to keep consistency? Of course, that data needs to be imported. There are many ways to achieve that, including writing a custom script. I urge you to have a look at the redis-cli --pipe option, also called Redis Mass Insertion, which leverages the Redis protocol to ingest a lot of data really quickly (way faster than a custom migration script built on a Redis SDK).

Once you have the redis-cli tool, prepare a file with Redis commands and run redis-cli --pipe:

$ cat migration.txt
SET migrated-test-1 value-1
SET migrated-test-2 value-2
SET migrated-test-3 value-3

$ cat migration.txt | redis-cli --pipe
All data transferred. Waiting for the last reply...
Last reply received from server.
errors: 0, replies: 3

$ redis-cli get migrated-test-1
"value-1"
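
Plain inline commands like the ones above work fine, but the Redis Mass Insertion docs recommend emitting the raw Redis protocol (RESP) for the fastest imports. Here is a minimal sketch in Python (the function name and key names are just examples, not part of any official API):

```python
def gen_redis_proto(*args):
    """Encode one Redis command in RESP: *<argc>, then $<byte-len> + value per argument."""
    out = f"*{len(args)}\r\n"
    for arg in args:
        arg = str(arg)
        out += f"${len(arg.encode('utf-8'))}\r\n{arg}\r\n"
    return out

if __name__ == "__main__":
    import sys
    # Emit the same three SET commands as migration.txt, in raw protocol form
    for i in range(1, 4):
        sys.stdout.write(gen_redis_proto("SET", f"migrated-test-{i}", f"value-{i}"))
```

Pipe its output straight into the importer, e.g. python gen_proto.py | redis-cli --pipe.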

If you want to use a different data structure provided by Redis, you can do that too. Let’s take Sorted Sets as an example:

$ cat migration-sortedsets.txt
ZADD migrated-ss-1 1 value1
ZADD migrated-ss-1 2 value2
ZADD migrated-ss-2 1 value3
ZADD migrated-ss-2 2 value4

$ cat migration-sortedsets.txt | redis-cli --pipe
All data transferred. Waiting for the last reply...
Last reply received from server.
errors: 0, replies: 4

$ redis-cli zrange migrated-ss-1 0 -1
1) "value1"
2) "value2"

$ redis-cli zrange migrated-ss-2 0 -1
1) "value3"
2) "value4"

Everything seems perfect so far. But how do you quickly create those import files? I often use a dirty trick which is not the most elegant, but works like a charm: use CONCAT, which is provided by most SQL engines:

mysql> select * from games;
+----------------+----------------+------------+
| studio         | game           | popularity |
+----------------+----------------+------------+
| CD Project Red | Witcher III    |          1 |
| CD Project Red | Cyberpunk 2077 |          2 |
| EA Sports      | Fifa 21        |          1 |
| EA Sports      | NHL 21         |          2 |
+----------------+----------------+------------+
4 rows in set (0.00 sec)

Create a query that transforms your data into Redis commands:

SELECT
  CONCAT('ZADD games-', replace(studio, ' ', '_'), ' ', popularity, ' "', game, '"')
FROM
  games;

Now run the query, save the result to a file, and ingest it:

$ cat query.sql | mysql -uuser -ppass dbname 2>/dev/null | tail -n +2 > redis-import.txt

$ cat redis-import.txt
ZADD games-CD_Project_Red 1 "Witcher III"
ZADD games-CD_Project_Red 2 "Cyberpunk 2077"
ZADD games-EA_Sports 1 "Fifa 21"
ZADD games-EA_Sports 2 "NHL 21"

$ cat redis-import.txt | redis-cli --pipe
All data transferred. Waiting for the last reply...
Last reply received from server.
errors: 0, replies: 4

Our data is here!

$ redis-cli zrange games-CD_Project_Red 0 -1
1) "Witcher III"
2) "Cyberpunk 2077"
$ redis-cli zrange games-EA_Sports 0 -1
1) "Fifa 21"
2) "NHL 21"

Please note that for larger data sets Redis might refuse to ingest everything at once, so I suggest splitting the input into smaller files and importing them in a bash for loop.
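
A minimal sketch of such a splitter in Python (the chunk size, file names, and helper names are just examples; it mimics what the split -l shell utility does):

```python
def split_import_file(path, lines_per_chunk=100_000):
    """Split an import file into numbered parts of at most lines_per_chunk lines.

    Returns the list of part file names; each part can then be fed
    to `redis-cli --pipe` separately.
    """
    part_names = []
    with open(path) as src:
        buf = []
        for line in src:
            buf.append(line)
            if len(buf) == lines_per_chunk:
                part_names.append(_write_part(path, len(part_names), buf))
                buf = []
        if buf:  # flush the last, possibly shorter, chunk
            part_names.append(_write_part(path, len(part_names), buf))
    return part_names

def _write_part(path, index, lines):
    name = f"{path}.part{index:04d}"
    with open(name, "w") as out:
        out.writelines(lines)
    return name
```

Then import each part in a loop, e.g. for f in redis-import.txt.part*; do cat "$f" | redis-cli --pipe; done.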