I could never find a decent way to forecast index sizes in Elasticsearch. In the Kibana GUI, under Stack Management, you can see the total index size, which you can divide by the number of nodes the index data is stored on to get a rough idea, but you can't visualise it over time.
So, to get a rough average, I wrote some Python code to do what I needed. It does the following:
- Pick an index
- Work out the average size of a document in that index
- Count the number of documents indexed in the previous day
- daily index size = (number of docs that day) x (average doc size)
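The arithmetic behind those steps can be sketched in a few lines of Python. The totals would come from Elasticsearch's index stats and count APIs; the function and figures below are illustrative, not the original script:

```python
def estimate_daily_size(total_size_bytes: int, total_docs: int, docs_that_day: int) -> float:
    """Estimate one day's growth: docs_that_day x average doc size.

    total_size_bytes and total_docs would come from the index stats API,
    docs_that_day from a date-range count query. Names are illustrative.
    """
    if total_docs == 0:
        return 0.0
    avg_doc_size = total_size_bytes / total_docs
    return docs_that_day * avg_doc_size

# e.g. a 45 GB index holding 90 million docs, 18 million of them from yesterday:
daily = estimate_daily_size(45 * 1024**3, 90_000_000, 18_000_000)
print(f"{daily / 1024**3:.1f} GB/day")  # prints "9.0 GB/day"
```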
It's not 100% accurate, but it lets you see the index sizes and forecast some trends.
A Docker image
My Elastic stack runs in Docker, so an image is included with a docker-compose file, but the Python code can be run wherever you want.
You can run the code as often as you want and create a visualisation in Kibana to see the results.
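For Kibana to chart the results, each run needs to write its estimate back into Elasticsearch as a timestamped document. A minimal sketch, assuming the official `elasticsearch` Python client and a cluster at `localhost:9200` (the index and field names here are illustrative):

```python
from datetime import datetime, timezone

def build_metric_doc(index_name: str, daily_size_bytes: float) -> dict:
    """Shape a document Kibana can chart; field names are illustrative."""
    return {
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "index_name": index_name,
        "estimated_daily_bytes": daily_size_bytes,
    }

doc = build_metric_doc("my-logs-index", 9 * 1024**3)

# Writing it back requires the elasticsearch client and a reachable cluster:
# from elasticsearch import Elasticsearch
# es = Elasticsearch("http://localhost:9200")
# es.index(index="index-size-metrics", document=doc)
```

A time-series visualisation on `@timestamp` vs `estimated_daily_bytes`, split by `index_name`, then gives you the per-index growth chart.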
After a few days of running, I can see we are ingesting 9GB a day, and also which index is ingesting the most data.
The green index is the one ingesting the most data each day.