```
2018-08-21 05:51:48.895 [destination = zaful , address = /127.0.0.1:3306 , EventParser] WARN  c.a.otter.canal.parse.inbound.mysql.MysqlEventParser - prepare to find start position just last position
 {"identity":{"slaveId":-1,"sourceAddress":{"address":"127.0.0.1","port":3306}},"postion":{"gtid":"","included":false,"journalName":"binlog.002535","position":157229530,"serverId":97153,"timestamp":1534833018000}}
2018-08-21 05:51:48.895 [destination = zaful , address = /127.0.0.1:3306 , EventParser] WARN  c.a.otter.canal.parse.inbound.mysql.MysqlEventParser - find start position : EntryPosition[included=false,journalName=binlog.002535,position=157229530,serverId=97153,gtid=,timestamp=1534833018000]
2018-08-21 05:52:36.469 [destination = zaful , address = /127.0.0.1:3306 , EventParser] ERROR c.a.o.canal.parse.inbound.mysql.dbsync.DirectLogFetcher - Socket timeout expired, closing connection
java.net.SocketTimeoutException: Timeout occurred, failed to read 9471 bytes in 25000 milliseconds.
	at com.alibaba.otter.canal.parse.driver.mysql.socket.BioSocketChannel.read(BioSocketChannel.java:123) ~[canal.parse.driver-1.0.26-SNAPSHOT.jar:na]
	at com.alibaba.otter.canal.parse.inbound.mysql.dbsync.DirectLogFetcher.fetch0(DirectLogFetcher.java:174) ~[canal.parse-1.0.26-SNAPSHOT.jar:na]
	at com.alibaba.otter.canal.parse.inbound.mysql.dbsync.DirectLogFetcher.fetch(DirectLogFetcher.java:85) ~[canal.parse-1.0.26-SNAPSHOT.jar:na]
	at com.alibaba.otter.canal.parse.inbound.mysql.MysqlConnection.dump(MysqlConnection.java:206) [canal.parse-1.0.26-SNAPSHOT.jar:na]
	at com.alibaba.otter.canal.parse.inbound.AbstractEventParser$3.run(AbstractEventParser.java:240) [canal.parse-1.0.26-SNAPSHOT.jar:na]
	at java.lang.Thread.run(Thread.java:745) [na:1.8.0_91]
2018-08-21 05:52:36.470 [destination = zaful , address = /127.0.0.1:3306 , EventParser] ERROR c.a.otter.canal.parse.inbound.mysql.MysqlEventParser - dump address 127.0.0.1/127.0.0.1:3306 has an error, retrying. caused by
java.net.SocketTimeoutException: Timeout occurred, failed to read 9471 bytes in 25000 milliseconds
	at com.alibaba.otter.canal.parse.driver.mysql.socket.BioSocketChannel.read(BioSocketChannel.java:123) ~[canal.parse.driver-1.0.26-SNAPSHOT.jar:na]
	at com.alibaba.otter.canal.parse.inbound.mysql.dbsync.DirectLogFetcher.fetch0(DirectLogFetcher.java:174) ~[canal.parse-1.0.26-SNAPSHOT.jar:na]
	at com.alibaba.otter.canal.parse.inbound.mysql.dbsync.DirectLogFetcher.fetch(DirectLogFetcher.java:85) ~[canal.parse-1.0.26-SNAPSHOT.jar:na]
	at com.alibaba.otter.canal.parse.inbound.mysql.MysqlConnection.dump(MysqlConnection.java:206) ~[canal.parse-1.0.26-SNAPSHOT.jar:na]
	at com.alibaba.otter.canal.parse.inbound.AbstractEventParser$3.run(AbstractEventParser.java:240) ~[canal.parse-1.0.26-SNAPSHOT.jar:na]
	at java.lang.Thread.run(Thread.java:745) [na:1.8.0_91]
2018-08-21 05:52:36.470 [destination = zaful , address = /127.0.0.1:3306 , EventParser] ERROR com.alibaba.otter.canal.common.alarm.LogAlarmHandler - destination:zaful[java.net.SocketTimeoutException: Timeout occurred, failed to read 9471 bytes in 25000 milliseconds
	at com.alibaba.otter.canal.parse.driver.mysql.socket.BioSocketChannel.read(BioSocketChannel.java:123)
	at com.alibaba.otter.canal.parse.inbound.mysql.dbsync.DirectLogFetcher.fetch0(DirectLogFetcher.java:174)
	at com.alibaba.otter.canal.parse.inbound.mysql.dbsync.DirectLogFetcher.fetch(DirectLogFetcher.java:85)
	at com.alibaba.otter.canal.parse.inbound.mysql.MysqlConnection.dump(MysqlConnection.java:206)
	at com.alibaba.otter.canal.parse.inbound.AbstractEventParser$3.run(AbstractEventParser.java:240)
	at java.lang.Thread.run(Thread.java:745)
]
```
After startup the program runs fine for a while, then keeps throwing this error. Each time a fetch times out, Canal kills the connection to MySQL and retries, only to time out again, looping like this endlessly. Restarting does not help. Deleting meta.dat lets it run normally for a while, but after that the timeout errors come back. I really cannot understand why, with many binlog files still unconsumed, reading a few KB of data cannot finish within 25 seconds! Has anyone run into the same problem?
Original question by GitHub user fangchunsheng
Try version 1.1.0 instead.
If you have confirmed that the read call on socketInputStream is what times out, consider a network problem. Check whether the NIC used by the dump connection is being saturated by other traffic.
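If upgrading alone does not help, the dump connection's socket settings can also be tuned in `canal.properties`. The key names below exist in the 1.x series; the values are illustrative only, and whether `soTimeout` governs this exact read is an assumption worth verifying against your version:

```properties
# Network settings for the dump connection (example values, not recommendations)
canal.instance.network.receiveBufferSize = 16384
canal.instance.network.sendBufferSize = 16384
# Socket read timeout in seconds; raising it only hides the stall, it does
# not explain why the master stopped sending for 25 s
canal.instance.network.soTimeout = 30
```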
Original answer by GitHub user agapple