After upgrading to ES 7.9, I now get the following error when searching:
{"error"=>{"root_cause"=>[{"type"=>"illegal_argument_exception", "reason"=>"The length of [data.Basic Information.Doc] field of [59921e665c3e743c5befb1c4] doc of [cases] index has exceeded [1000000] - maximum allowed to be analyzed for highlighting. This maximum can be set by changing the [index.highlight.max_analyzed_offset] index level setting. For large texts, indexing with offsets or term vectors is recommended!"}], "type"=>"search_phase_execution_exception", "reason"=>"all shards failed", "phase"=>"query", "grouped"=>true, "failed_shards"=>[{"shard"=>0, "index"=>"cases", "node"=>"Wrz1BVCJRgOyGOFxC0otMQ", "reason"=>{"type"=>"illegal_argument_exception", "reason"=>"The length of [data.Basic Information.Doc] field of [59921e665c3e743c5befb1c4] doc of [cases] index has exceeded [1000000] - maximum allowed to be analyzed for highlighting. This maximum can be set by changing the [index.highlight.max_analyzed_offset] index level setting. For large texts, indexing with offsets or term vectors is recommended!"}}], "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"The length of [data.Basic Information.Doc] field of [59921e665c3e743c5befb1c4] doc of [cases] index has exceeded [1000000] - maximum allowed to be analyzed for highlighting. This maximum can be set by changing the [index.highlight.max_analyzed_offset] index level setting. For large texts, indexing with offsets or term vectors is recommended!", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"The length of [data.Basic Information.Doc] field of [59921e665c3e743c5befb1c4] doc of [cases] index has exceeded [1000000] - maximum allowed to be analyzed for highlighting. This maximum can be set by changing the [index.highlight.max_analyzed_offset] index level setting. For large texts, indexing with offsets or term vectors is recommended!"}}}, "status"=>400}, @response=#<Net::HTTPBadRequest 400 Bad Request readbody=true>, @headers={"content-type"=>["application/json; charset=UTF-8"], "content-length"=>["1840"]}>
My understanding is that I have to somehow set an analyzer or index setting to use term vectors with offsets, but I don't know how to do that.
Is this something I change in elastic.yml, or do I issue a curl command (and if so, could you help with the curl command)?
Thanks, Kevin
1 answer
daolsyd01#
Maybe this will do the trick.
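As a minimal sketch of that settings change, assuming the cluster is reachable at localhost:9200 and using 2000000 purely as an example value, the per-index limit can be raised with a dynamic settings update:

curl -X PUT "localhost:9200/cases/_settings" -H 'Content-Type: application/json' -d'
{
  "index.highlight.max_analyzed_offset": 2000000
}'

This does not require a restart or a change to elastic.yml; it takes effect immediately on the cases index.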
Also keep in mind:
Plain highlighting for large texts may require a substantial amount of time and memory. To protect against this, the maximum number of characters that will be analyzed has been limited to 1000000. This default limit can be changed for a particular index with the index setting index.highlight.max_analyzed_offset.
Have a look at the documentation on highlighting.
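If you would rather follow the error message's suggestion and index the field with term vectors, the mapping has to be changed and the documents reindexed, since term_vector cannot be changed on an existing field. A rough sketch, assuming the error's field path data.Basic Information.Doc corresponds to nested object properties and using a hypothetical new index name cases_v2:

curl -X PUT "localhost:9200/cases_v2" -H 'Content-Type: application/json' -d'
{
  "mappings": {
    "properties": {
      "data": {
        "properties": {
          "Basic Information": {
            "properties": {
              "Doc": {
                "type": "text",
                "term_vector": "with_positions_offsets"
              }
            }
          }
        }
      }
    }
  }
}'

curl -X POST "localhost:9200/_reindex" -H 'Content-Type: application/json' -d'
{
  "source": { "index": "cases" },
  "dest":   { "index": "cases_v2" }
}'

With term vectors stored, the fast vector highlighter (type: fvh in the highlight request) can highlight the large field without hitting the analyzed-offset limit.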