4
votes

The problem

One of my Python Redis clients fails with the following exception:

redis.exceptions.ResponseError: MISCONF Redis is configured to save RDB snapshots, but is currently not able to persist on disk. Commands that may modify the data set are disabled. Please check Redis logs for details about the error.

I have checked the redis machine, and it seems to be out of memory:

free

             total       used       free     shared    buffers     cached
Mem:          3952       3656        295          0          1          9
-/+ buffers/cache:       3645        306
Swap:            0          0          0

top

top - 15:35:03 up 14:09,  1 user,  load average: 0.06, 0.17, 0.16
Tasks: 114 total,   2 running, 112 sleeping,   0 stopped,   0 zombie
%Cpu(s):  0.2 us,  0.3 sy,  0.0 ni, 99.3 id,  0.0 wa,  0.0 hi,  0.0 si,  0.2 st
KiB Mem:   4046852 total,  3746772 used,   300080 free,     1668 buffers
KiB Swap:        0 total,        0 used,        0 free.    11364 cached Mem

  PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND
 1102 root      20   0 3678836 3.485g    736 S   1.3 90.3  10:12.53 redis-server
 1332 ubuntu    20   0   41196   3096    972 S   0.0  0.1   0:00.12 zsh
  676 root      20   0   10216   2292      0 S   0.0  0.1   0:00.03 dhclient
  850 syslog    20   0  255836   2288    124 S   0.0  0.1   0:00.39 rsyslogd

I am using a few dozen Redis DBs in a single Redis instance. Each DB is selected by the numeric id passed to redis-cli, e.g.:

$ redis-cli -n 80
127.0.0.1:6379[80]>

How do I know how much memory each DB consumes, and what the largest keys in each DB are?


2 Answers

1
votes

How do I know how much memory each DB consumes, and what the largest keys in each DB are?

You CANNOT get the memory used by each DB. With the INFO command you can only get the total memory used by the whole Redis instance. Redis records the newly allocated size each time it dynamically allocates memory, but it does not keep that record per DB. It also does not keep any record of the largest keys.
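If it helps, you can see the instance-wide figure and the per-DB key counts (but not per-DB memory) straight from redis-cli; something like this should work, assuming the default host and port:

redis-cli INFO memory | grep used_memory_human   # total memory used by the whole instance
redis-cli INFO keyspace                          # keys/expires count per DB, no memory figure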

Normally, you should configure your Redis instance with maxmemory and maxmemory-policy (i.e. the eviction policy applied once maxmemory is reached).
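For example, in redis.conf (the 3gb value is just a placeholder, pick a limit that fits your machine):

# cap the dataset at ~3 GB and evict least-recently-used keys when the cap is hit
maxmemory 3gb
maxmemory-policy allkeys-lru

The same settings can be applied at runtime, e.g. redis-cli CONFIG SET maxmemory 3gb.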

1
votes

You can write a shell script like this (it shows the key count in each DB):

#!/bin/bash
max_db=501

i=0

while [ $i -lt $max_db ]
do
    echo "db_number: $i"
    redis-cli -n $i dbsize
    i=$((i+1))
done

Example output:

db_number: 0
(integer) 71
db_number: 1
(integer) 0
db_number: 2
(integer) 1
db_number: 3
(integer) 1
db_number: 4
(integer) 0
db_number: 5
(integer) 1
db_number: 6
(integer) 28
db_number: 7
(integer) 1
I know that a single database can still hold one very large key, but in some cases this script can help anyway.
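If you also want a rough idea of the largest keys, a similar loop can run redis-cli --bigkeys against each non-empty DB. --bigkeys samples the keyspace with SCAN, so treat the result as an estimate; adjust max_db as in the script above:

#!/bin/bash
max_db=501

i=0
while [ $i -lt $max_db ]
do
    # only scan DBs that actually contain keys
    if [ "$(redis-cli -n $i dbsize)" -gt 0 ]; then
        echo "=== db $i ==="
        redis-cli -n $i --bigkeys
    fi
    i=$((i+1))
done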