Apache Doris 中文技术论坛

[Solved] Exception when writing data with Flink

Asked Feb 27, 2024 Modified Feb 28, 2024
Viewed 88
ingestion 2.0

[attached screenshot: 24540eeb06a7ad76efe5dcaf9170f0d.png]

What is causing this, and how can it be fixed?

edited Feb 28, 2024
zhb123319
asked Feb 27, 2024
2 Answers

First, check the status of the BEs: did any BE node hit an abnormal condition while the data was being loaded? If the failed BE node was the coordinator BE, then the Stream Load job will also fail.
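A quick way to check BE liveness is `SHOW BACKENDS` over the FE's MySQL protocol. The sketch below is illustrative: the host, port, and user are hypothetical placeholders, and a canned sample of the command's output is used to demonstrate the filter.

```shell
# Real command (requires a running FE; host/port/user are placeholders):
#   mysql -h 127.0.0.1 -P 9030 -uroot -e 'SHOW BACKENDS\G' | grep -E 'Host|Alive'
# Canned sample of that output, to show how the filter surfaces a dead BE:
printf 'Host: be1\nAlive: true\nHost: be2\nAlive: false\n' | grep -B1 'Alive: false'
```

Any BE that prints `Alive: false` is a candidate cause, especially if it was the coordinator of the failed Stream Load job.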

You can also grep fe.log for "java.net.SocketTimeoutException: Read timed out" to see whether network jitter is involved.
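The grep above can be run as follows; the fe.log path varies by deployment, so the path in the comment is a hypothetical placeholder, and a sample log file is generated here just to make the snippet self-contained.

```shell
# In real use (path is a placeholder for your FE deployment):
#   grep -c "java.net.SocketTimeoutException: Read timed out" /path/to/fe/log/fe.log
# Self-contained demo with a generated sample log line:
printf 'WARN java.net.SocketTimeoutException: Read timed out\n' > /tmp/fe_sample.log
grep -c "java.net.SocketTimeoutException: Read timed out" /tmp/fe_sample.log
```

A nonzero count during the load window suggests FE-to-BE network timeouts rather than a data problem.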

edited Feb 27, 2024
阿渊@SelectDB (if I don't reply here, add me on WeChat via my profile) 9580
answered Feb 27, 2024

It appears to have recovered on its own; the job retried automatically and succeeded.

zhb123319
answered Feb 27, 2024
