FATAL [circuit_breaking_exception] [parent] Data too large, data for [<http_request>] would be [250592200/246.0mb], which is larger than the limit of [246546432/235.1mb], real usage: [250592200/246.0mb], new bytes reserved: [0/0b], usages [request=16512/16.1kb, fielddata=11330/11kb, in_flight_requests=0/0b, accounting=7160235/6.8mb], with { bytes_wanted=250592200 & bytes_limit=246546432 & durability="PERMANENT" } :: {"path":"/.kibana","query":{},"statusCode":429,"response":"{\"error\":{\"root_cause\":[{\"type\":\"circuit_breaking_exception\",\"reason\":\"[parent] Data too large, data for [<http_request>] would be [250592200/246.0mb], which is larger than the limit of [246546432/235.1mb], real usage: [250592200/246.0mb], new bytes reserved: [0/0b], usages [request=16512/16.1kb, fielddata=11330/11kb, in_flight_requests=0/0b, accounting=7160235/6.8mb]\",\"bytes_wanted\":250592200,\"bytes_limit\":246546432,\"durability\":\"PERMAN
Kibana fails to start under Linux Docker: Data too large
When starting Kibana inside a Linux Docker container, the startup failed with a 'circuit_breaking_exception': the memory needed for the request exceeded the parent circuit breaker's limit. The problem was resolved by adjusting Elasticsearch's cluster settings, raising the fielddata and request breaker limits to 40% and the total (parent) limit to 100%, then restarting the ES and Kibana services. Note that setting the limit to 100% effectively disables the safeguard and may cause other problems; guidance from readers with more experience is welcome.
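The adjustment described above can be made through Elasticsearch's cluster settings API. A sketch of the request, assuming Elasticsearch is reachable at localhost:9200 (the `indices.breaker.*` setting names are standard ES settings; the host, port, and exact percentages should be adapted to your deployment):

```shell
# Raise the circuit-breaker limits persistently via the cluster settings API.
# fielddata/request go to 40%; total (the parent breaker from the error) to 100%.
curl -X PUT "http://localhost:9200/_cluster/settings" \
  -H 'Content-Type: application/json' \
  -d '{
    "persistent": {
      "indices.breaker.fielddata.limit": "40%",
      "indices.breaker.request.limit": "40%",
      "indices.breaker.total.limit": "100%"
    }
  }'

# Then restart both services (container names are assumptions):
docker restart elasticsearch kibana
```

A less risky long-term alternative to disabling the parent breaker is giving the JVM more heap, e.g. passing `ES_JAVA_OPTS="-Xms1g -Xmx1g"` to the Elasticsearch container, since the breaker limits are percentages of the heap.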
