Quickly ingest initial data into Redis

Imagine you have a massive data pipeline where thousands of requests per second need to read (that’s easy) or write (that’s harder) data. The obvious, and often right, choice is to use Redis to handle all of that.

But what happens when you launch it in production and need some historical data in place to keep things consistent? That data has to be imported first. There are many ways to do it, including writing a custom script, but I urge you to look at the redis-cli --pipe option, also known as Redis Mass Insertion. It lets you leverage the Redis protocol directly to ingest a lot of data very quickly, far faster than migrating it with a custom script built on a Redis SDK.
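For illustration, here is a minimal sketch of the approach, assuming a hypothetical key,value CSV of historical data on stdin. It emits the Redis protocol (RESP) for a series of SET commands, which you then pipe into redis-cli --pipe:

```python
import sys


def gen_redis_proto(*args):
    # Encode one command in the Redis protocol (RESP):
    # *<arg count>\r\n, then $<byte length>\r\n<arg>\r\n for each argument.
    parts = [f"*{len(args)}\r\n"]
    for arg in args:
        arg = str(arg)
        parts.append(f"${len(arg.encode('utf-8'))}\r\n{arg}\r\n")
    return "".join(parts)


if __name__ == "__main__":
    # Assumed input format: one "key,value" pair per line.
    for line in sys.stdin:
        key, value = line.rstrip("\n").split(",", 1)
        # Write raw bytes so \r\n survives untouched on every platform.
        sys.stdout.buffer.write(
            gen_redis_proto("SET", key, value).encode("utf-8")
        )
```

With a (hypothetical) dump file, usage would look like `python gen_proto.py < historical_data.csv | redis-cli --pipe`. Because redis-cli streams the raw protocol to the server instead of issuing one round-trip per command, this is what makes mass insertion so much faster than an SDK-based script.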
