
The Problem

I recently connected to a remote machine and found a disk space warning.

To diagnose the cause of the disk space issue, I used a neat tool called ncdu (NCurses Disk Usage).

sudo ncdu /

Using its excellent interface, I traced the issue to /var/lib/docker/. But Docker's own disk accounting had shown no sign of anything taking too much space.

For example:

$ docker system df
TYPE            TOTAL     ACTIVE    SIZE      RECLAIMABLE
Images          22        9         3.165GB   2.097GB (66%)
Containers      9         7         433.7kB   150B (0%)
Local Volumes   11        4         585.5MB   381.3MB (65%)
Build Cache     0         0         0B        0B

For a more specific and verbose output, try docker system df -v

Anyway, closer inspection showed that the problem file /var/lib/docker/containers/uuid1/uuid2-json.log was a Docker container log file.

For good measure, I checked the size of the rest of the containers' logs:

$ sudo sh -c "du -ch /var/lib/docker/containers/*/*-json.log"
2.8M    /var/lib/docker/containers/409f2a9db0cb4bf3bdd4443bd8ee8618c44dfa4ec77939d5033afc3953d15cb8/409f2a9db0cb4bf3bdd4443bd8ee8618c44dfa4ec77939d5033afc3953d15cb8-json.log
4.0K    /var/lib/docker/containers/609524245e23d3f7b45d197b808ddf6d51d961286134a3686805006c6c6e037c/609524245e23d3f7b45d197b808ddf6d51d961286134a3686805006c6c6e037c-json.log
4.0K    /var/lib/docker/containers/62af58c13b5b08284395fc21748fe42274c1a56b6e832d842c6286d85f51227b/62af58c13b5b08284395fc21748fe42274c1a56b6e832d842c6286d85f51227b-json.log
156K    /var/lib/docker/containers/647362414f7e9f098dd5593334483cf89008a28144134e30a2eb1aa3265a9dd2/647362414f7e9f098dd5593334483cf89008a28144134e30a2eb1aa3265a9dd2-json.log
861M    /var/lib/docker/containers/648ee5d5b7e6bea6ecfe2ebd46cd7bd0a3171e682079a7639588fa6bad5dfa00/648ee5d5b7e6bea6ecfe2ebd46cd7bd0a3171e682079a7639588fa6bad5dfa00-json.log
28K     /var/lib/docker/containers/c348113a6e76d72963e01053abb45b48879419142233fe152db831be1d2343b1/c348113a6e76d72963e01053abb45b48879419142233fe152db831be1d2343b1-json.log
14G     /var/lib/docker/containers/c8d4ee7b4c2b4c906416e9fd2e7f6b56fb054f819fe94efa08f45ccb2de10a86/c8d4ee7b4c2b4c906416e9fd2e7f6b56fb054f819fe94efa08f45ccb2de10a86-json.log
24K     /var/lib/docker/containers/e9289189112c89bd4aa777a44309681b775f0606fec9b76e9ef0ccc62d2b671c/e9289189112c89bd4aa777a44309681b775f0606fec9b76e9ef0ccc62d2b671c-json.log
14G     total
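If ncdu isn't to hand, the same culprit can be found by ranking the log files directly. This assumes the default json-file logging driver and the default Docker data root:

```shell
# Rank container log files by size, largest first
# (paths assume Docker's default data root /var/lib/docker).
sudo sh -c 'du -h /var/lib/docker/containers/*/*-json.log | sort -rh | head -n 3'
```

The directory name in each path is the container ID, so docker inspect --format '{{.Name}}' <id> should tell you which container a big log belongs to.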

The Solution

Docker has settings for controlling the log retention of running containers. The full set of options is documented in Docker's logging driver reference.

In my case, I was using docker-compose and fixed the issue by adding the following under the affected service in my YAML file:

logging:
    options:
        max-size: "100m"
        max-file: "10"
        compress: "true"
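For context, here is a sketch of where that block sits in a compose file (the service name and image are illustrative, not from my actual setup): the logging key goes at the same level as image or ports inside a service.

```yaml
services:
  myapp:                    # hypothetical service name
    image: myapp:latest     # hypothetical image
    logging:
      driver: "json-file"
      options:
        max-size: "100m"    # rotate once a log file reaches 100 MB
        max-file: "10"      # keep at most 10 rotated files
        compress: "true"    # gzip the rotated files
```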

After changing the setting, I recreated the container and let Docker clean up the mess.
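One note on scope: the compose settings above apply per service. To my understanding, the same defaults can be set host-wide for all newly created containers via /etc/docker/daemon.json (followed by a daemon restart), along these lines:

```json
{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "100m",
    "max-file": "10",
    "compress": "true"
  }
}
```

Existing containers keep their old settings until they are recreated.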