
I am facing a

max values per tag limit exceeded

error when trying to write 200k points to InfluxDB (version 1.2.4) from Java, with a batch size and poll interval specified. I have also set max-values-per-tag = 0 in /etc/influxdb/influxdb.conf, but I still get the following error.

SEVERE: Batch could not be sent. Data will be lost
org.influxdb.InfluxDBException: {"error":"partial write: max-values-per-tag limit exceeded (100453/100000): measurement=\"samplemeasurement\" tag=\"sampletag\" value=\"samplevalue99195\" dropped=806"}
    at org.influxdb.impl.InfluxDBImpl.execute(InfluxDBImpl.java:511)
    at org.influxdb.impl.InfluxDBImpl.write(InfluxDBImpl.java:312)
    at org.influxdb.impl.BatchProcessor.write(BatchProcessor.java:248)
    at org.influxdb.impl.BatchProcessor$2.run(BatchProcessor.java:278)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
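The numbers in the error message line up with the write pattern: each distinct value of a tag key creates a new series, so writing 200k points that each carry a unique tag value must pass the default limit of 100000 somewhere mid-batch. A minimal sketch in plain Java (no InfluxDB client involved; class and value names are illustrative, not from the actual code) of how distinct tag values accumulate:

```java
import java.util.HashSet;
import java.util.Set;

// Each distinct value of a tag key creates a new series in InfluxDB.
// Writing 200k points with a unique tag value per point therefore
// exceeds the default max-values-per-tag limit of 100000.
public class TagCardinalityDemo {

    // Count the distinct values seen for one tag key.
    static int distinctTagValues(String[] tagValues) {
        Set<String> unique = new HashSet<>();
        for (String v : tagValues) {
            unique.add(v);
        }
        return unique.size();
    }

    public static void main(String[] args) {
        String[] values = new String[200_000];
        for (int i = 0; i < values.length; i++) {
            values[i] = "samplevalue" + i;   // one unique tag value per point
        }
        int cardinality = distinctTagValues(values);
        System.out.println("distinct tag values: " + cardinality);
        System.out.println("over default limit:  " + (cardinality > 100_000));
    }
}
```

With one unique value per point, the 100001st point is already over the default limit, which is why the server reports a partial write and drops the excess points.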

2 Answers


This problem occurred for me today. The reason is simple: you cannot define too many different values for a tag.

The following is quoted from the official documentation:

The maximum number of tag values allowed per tag key. The default setting is 100000. Change the setting to 0 to allow an unlimited number of tag values per tag key.
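The setting quoted above lives in the [data] section of influxdb.conf; a minimal fragment (note that influxd must be restarted for config changes to take effect):

```toml
# /etc/influxdb/influxdb.conf
[data]
  # 0 disables the per-tag-key value limit (default: 100000)
  max-values-per-tag = 0
```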

This limit exists because the number of tag values is tied to series cardinality; very high series cardinality can kill the influxd process and cause other damage such as OOM.
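To see how many distinct values a tag key has already accumulated, InfluxQL can list them; a sketch using the measurement and tag names from the question's error message:

```sql
-- list every value recorded for the tag key "sampletag"
SHOW TAG VALUES FROM "samplemeasurement" WITH KEY = "sampletag"
```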

Hope this helps.


I was getting a similar error. I changed max-values-per-tag = 0, but still got the same error. Then I changed max-series-per-database = 0, and it still didn't work.

But eventually I found that the error stops occurring if you reduce the number of measurements per database and the number of rows in each measurement.

As soon as my number of measurements per database exceeded 40-50, I started getting this error. So to reduce the number of measurements I used tags, and I also made sure my database gets refreshed (previous values deleted) every 18 hours, roughly when it reaches 100,000 rows.
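The periodic refresh described above can also be automated with a retention policy, so old points expire on their own instead of being deleted by hand; a sketch assuming a database named "mydb" (a hypothetical name, not from the original post):

```sql
-- automatically expire points older than 18 hours
CREATE RETENTION POLICY "rp_18h" ON "mydb" DURATION 18h REPLICATION 1 DEFAULT
```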

I'm not an expert on this, but it worked for me. Hope it does for you too.