3 votes

I have a Docker container running InfluxDB v1.0 with volume mapping. On the host, the data takes up 22G, whereas if I run du -sh /* inside the container it reports exactly one fifth of that, 4.4G. Yesterday I created a retention policy on each database in InfluxDB that keeps 5 months' worth of data; disk usage was at 94% yesterday and, 24 hours later, it is at 87%. Could this be related to my policies? I also tried deleting manually from each database with a WHERE filter of time < '2018-12-05'. This is the command I'm using for the container:

docker run --name influxdb \
  -p 8083:8083 -p 8086:8086 -p 25826:25826/udp \
  -v $PWD/influxdb:/var/lib/influxdb \
  -v $PWD/influxdb.conf:/etc/influxdb/influxdb.conf:ro \
  -v $PWD/types.db:/usr/share/collectd/types.db:ro \
  influxdb:1.0
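For reference, the retention policy and manual delete I mention above were of roughly this shape (the policy name, database name, and duration here are placeholders, not my exact statements):

CREATE RETENTION POLICY "five_months" ON "mydb" DURATION 150d REPLICATION 1 DEFAULT
DELETE WHERE time < '2018-12-05'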

The mapped volume itself is the same size in both places, but the container's directory under /var/lib/docker/containers/ is 22G.

When I go into that directory, there is a -json.log file that accounts for the 22G.
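A quick way to confirm which container's log file is responsible (this assumes the default json-file logging driver and root access on the host):

du -sh /var/lib/docker/containers/*/*-json.log | sort -h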

Comment (1): Not strictly related, but 1.0 is positively antique. The earliest version supported (or at least mentioned in the documentation) is 1.5. – SiHa

1 Answer

0 votes

I had the same problem and got tired of not finding anything, so I ended up doing this:

echo "" > /var/lib/docker/containers/3cfcad146f78519ea7cfac04dd82c3b92aba58e760f803b74f908c54002ec7bf/3cfcad146f78519ea7cfac04dd82c3b92aba58e760f803b74f908c54002ec7bf-json.log

So far, no side effects.
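If you want to avoid the file growing back: truncate -s 0 <logfile> does the same thing as the echo (without even writing a newline), and Docker's json-file logging driver can also rotate the log for you. The size and file-count values below are only examples:

docker run --log-opt max-size=50m --log-opt max-file=3 ... (rest of the run command as above)

or, for all newly created containers, in /etc/docker/daemon.json (requires a daemon restart):

{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "50m",
    "max-file": "3"
  }
}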