
What should I do about the following Elasticsearch error?

com.alibaba.datax.common.exception.DataXException: Code:[ESReader-09], Description:[Reading index / type data exception].  - {"root_cause":[{"type":"circuit_breaking_exception","reason":"[fielddata] Data too large, data for [_id] would be [3097501059/2.8gb], which is larger than the limit of [3097362432/2.8gb]","bytes_wanted":3097501059,"bytes_limit":3097362432,"durability":"PERMANENT"},{"type":"circuit_breaking_exception","reason":"[fielddata] Data too large, data for [_id] would be [3101710554/2.8gb], which is larger than the limit of [3097362432/2.8gb]","bytes_wanted":3101710554,"bytes_limit":3097362432,"durability":"PERMANENT"}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"query","grouped":true,"failed_shards":[{"shard":0,"index":"devicedata_2022-07","node":"XicIpBGcTP-Nl9uXPBfYNw","reason":{"type":"exception","reason":"java.util.concurrent.ExecutionException: CircuitBreakingException[[fielddata] Data too large, data for [_id] would be [3097501059/2.8gb], which is larger than the limit of [3097362432/2.8gb]]","caused_by":{"type":"execution_exception","reason":"execution_exception: CircuitBreakingException[[fielddata] Data too large, data for [_id] would be [3097501059/2.8gb], which is larger than the limit of [3097362432/2.8gb]]","caused_by":{"type":"circuit_breaking_exception","reason":"[fielddata] Data too large, data for [_id] would be [3097501059/2.8gb], which is larger than the limit of [3097362432/2.8gb]","bytes_wanted":3097501059,"bytes_limit":3097362432,"durability":"PERMANENT"}}}},{"shard":1,"index":"devicedata_2022-07","node":"lTpJ0vx9SX-7H54FsvEYlA","reason":{"type":"exception","reason":"java.util.concurrent.ExecutionException: CircuitBreakingException[[fielddata] Data too large, data for [_id] would be [3101710554/2.8gb], which is larger than the limit of [3097362432/2.8gb]]","caused_by":{"type":"execution_exception","reason":"execution_exception: CircuitBreakingException[[fielddata] Data too large, data for [_id] 
would be [3101710554/2.8gb], which is larger than the limit of [3097362432/2.8gb]]","caused_by":{"type":"circuit_breaking_exception","reason":"[fielddata] Data too large, data for [_id] would be [3101710554/2.8gb], which is larger than the limit of [3097362432/2.8gb]","bytes_wanted":3101710554,"bytes_limit":3097362432,"durability":"PERMANENT"}}}}],"caused_by":{"type":"circuit_breaking_exception","reason":"[fielddata] Data too large, data for [_id] would be [3097501059/2.8gb], which is larger than the limit of [3097362432/2.8gb]","bytes_wanted":3097501059,"bytes_limit":3097362432,"durability":"PERMANENT"}}

哈喽!小陈 2022-08-08 17:44:16
1 Answer
  • 1. Switch writes to bulk requests. 2. Tune the index refresh interval. 3. Adjust the translog flush interval. 4. Use auto-generated IDs. 5. Upgrade the instance configuration. The main cause is the `_id` field: it does not have doc values enabled, so sorting or aggregating on it loads fielddata onto the heap and can trip the circuit breaker. Avoid aggregation and sort operations on `_id` wherever possible; if you need to sort by document ID, add a separate `id` field with doc values enabled. This answer was compiled from the DingTalk group "Elasticsearch中文技术社区".
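The last point in the answer can be sketched as request bodies. This is an illustrative sketch, not the original poster's configuration: the field name `id` is an assumption, and the index name is taken from the error log. Note that `keyword` fields have doc values enabled by default, so no explicit `doc_values` setting is needed.

```python
import json

# Mapping for a separate sortable ID field. `keyword` fields store
# doc values on disk by default, so sorting on `id` does not build
# heap-resident fielddata the way sorting on the built-in `_id` does.
mapping = {
    "mappings": {
        "properties": {
            # Copy the document ID into this field at index time.
            "id": {"type": "keyword"}
        }
    }
}

# Sort on the new `id` field instead of `_id`.
search_body = {
    "query": {"match_all": {}},
    "sort": [{"id": "asc"}],
}

print(json.dumps(mapping))
print(json.dumps(search_body))
```

These bodies would be sent as `PUT /devicedata_2022-07` (at index creation) and `GET /devicedata_2022-07/_search` respectively; existing indices would need to be reindexed to pick up the new field.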

    2022-08-08 18:15:34

