Suppose I have a row of data:

INSERT INTO `canal-test`.my_filed_type (id, my_bigint_unsigned) VALUES (1, 18446744073709551615);
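For context, a minimal table definition that would produce this row (my assumption, since the original DDL is not shown; note that 18446744073709551615 is exactly the maximum of BIGINT UNSIGNED, 2^64 - 1):

CREATE TABLE `canal-test`.my_filed_type (
  id BIGINT NOT NULL PRIMARY KEY,
  my_bigint_unsigned BIGINT UNSIGNED
);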
Since ES cannot hold a value this large in a long field, the sync fails with:
2023-10-08 13:43:29.474 [pool-2-thread-1] ERROR c.a.otter.canal.adapter.launcher.loader.AdapterProcessor - ES sync commit error ElasticsearchException[Elasticsearch exception [type=mapper_parsing_exception, reason=failed to parse field [my_bigint_unsigned] of type [long] in document with id '1'. Preview of field's value: '18446744073709551615']]; nested: ElasticsearchException[Elasticsearch exception [type=input_coercion_exception, reason=Numeric value (18446744073709551615) out of range of long (-9223372036854775808 - 9223372036854775807)
at [Source: (org.elasticsearch.common.bytes.AbstractBytesReference$MarkSupportingStreamInputWrapper); line: 1, column: 43]]];
java.lang.RuntimeException: ES sync commit error ElasticsearchException[Elasticsearch exception [type=mapper_parsing_exception, reason=failed to parse field [my_bigint_unsigned] of type [long] in document with id '1'. Preview of field's value: '18446744073709551615']]; nested: ElasticsearchException[Elasticsearch exception [type=input_coercion_exception, reason=Numeric value (18446744073709551615) out of range of long (-9223372036854775808 - 9223372036854775807)
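One mapping-level workaround, if upgrading is an option: Elasticsearch 7.10+ added an unsigned_long field type that covers the full 0 to 2^64 - 1 range, so this value would no longer overflow long. A sketch (the index name my_index is hypothetical):

PUT my_index
{
  "mappings": {
    "properties": {
      "my_bigint_unsigned": { "type": "unsigned_long" }
    }
  }
}

Alternatively, mapping the field as keyword sidesteps numeric coercion entirely, at the cost of losing numeric range queries on it.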
Once this error appears, how do I clear it? I could even tolerate dropping this binlog entry, because even after I fix the value in MySQL, the old error keeps recurring (my understanding is that the binlog entry is never consumed).
So my question really is: how do I discard abnormal data like this?
1 answer

sauutmhj1:
Is the adapter's canal.conf.retries config item set to -1? If it is set to a positive number, the message is dropped after that many failed retries.
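For reference, a sketch of where this setting lives, assuming the standard canal-adapter application.yml layout (the values shown are illustrative):

canal.conf:
  mode: tcp
  retries: 3    # positive N: give up and drop the message after N failed retries
                # -1: retry forever, so a poison record like this one blocks the sync

With a positive retries value, the oversized row is eventually discarded and the adapter moves past it in the binlog.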