
Flink 1.11: error when submitting a SQL job to a YARN session

I am trying to upgrade flink-sql-gateway (https://github.com/ververica/flink-sql-gateway) to a 1.11-compatible version. After submitting a Flink SQL job (which uses the HBase connector) to a YARN session, it fails at runtime with:

org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hbase.util.ByteStringer
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:248)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:221)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:388)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:362)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:142)
    at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:80)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hbase.util.ByteStringer
    at org.apache.hadoop.hbase.protobuf.RequestConverter.buildRegionSpecifier(RequestConverter.java:1053)
    at org.apache.hadoop.hbase.protobuf.RequestConverter.buildScanRequest(RequestConverter.java:496)
    at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:402)
    at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:274)
    at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:62)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:219)
    ... 7 more

After some searching, I suspect the cause is a wrong version of protobuf-java pulled in by hbase-protobuf. But how can I inspect the runtime classpath of the JobManager and TaskManagers and see which jars they actually loaded? Any analysis approach or method would be appreciated, thanks.

*From the volunteer-curated Flink mailing list archive
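One general way to answer the "which jar did this class come from" part of the question (my own sketch, not from the original thread): Flink's JobManager/TaskManager startup logs already print the container classpath, and `yarn logs -applicationId <appId>` retrieves those logs. Beyond that, a small probe class like the one below can be packaged into the job jar and invoked at runtime to locate the jar that actually supplied a class. The class names probed in `main` are only examples; on a TaskManager you would probe `com.google.protobuf.ByteString` and `org.apache.hadoop.hbase.util.ByteStringer`.

```java
public class ClasspathProbe {

    /**
     * Returns the jar or directory a class was loaded from,
     * or a note if it came from the bootstrap/platform loader
     * or is missing entirely.
     */
    static String locate(String className) {
        try {
            Class<?> k = Class.forName(className);
            java.security.ProtectionDomain pd = k.getProtectionDomain();
            java.security.CodeSource cs = (pd == null) ? null : pd.getCodeSource();
            return (cs == null || cs.getLocation() == null)
                    ? className + " -> (bootstrap / platform class loader)"
                    : className + " -> " + cs.getLocation();
        } catch (ClassNotFoundException e) {
            return className + " -> NOT FOUND on this classpath";
        }
    }

    public static void main(String[] args) {
        // The full classpath this JVM was started with
        System.out.println("java.class.path = " + System.getProperty("java.class.path"));
        // Example probes; replace with the classes you suspect,
        // e.g. com.google.protobuf.ByteString
        System.out.println(locate("java.lang.String"));
        System.out.println(locate("com.example.DoesNotExist"));
    }
}
```

Printing `CodeSource.getLocation()` pinpoints exactly which jar shadowed the expected one, which is usually faster than diffing full classpath strings.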

Guest sadna6pkvqnz6, 2021-12-07 17:21:42
1 answer
  • Finally found the problem. The cause is that flink-dist-*.jar bundles a newer protobuf-java (3.7.1), in which LiteralByteString is a private inner class of ByteString:

    private static class LiteralByteString extends ByteString.LeafByteString {
        private static final long serialVersionUID = 1L;

        protected final byte[] bytes;

        /**
         * Creates a {@code LiteralByteString} backed by the given array, without copying.
         *
         * @param bytes array to wrap
         */
        LiteralByteString(byte[] bytes) {
            if (bytes == null) {
                throw new NullPointerException();
            }
            this.bytes = bytes;
        }
    }

    Meanwhile, when the HBase connector (1.4.3) initializes org.apache.hadoop.hbase.util.ByteStringer while reading data, it calls new LiteralByteString(). Because the class is no longer accessible in the bundled protobuf-java, that initialization fails, which surfaces as java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hbase.util.ByteStringer.

    Fix: exclude the protobuf-java (3.7.1) dependency when building Flink, and submit the job with protobuf-java:2.5.0 as a dependency instead. *From the volunteer-curated Flink mailing list archive
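In Maven terms, the fix described above might look roughly like the fragment below. This is a hedged sketch: the artifact that drags in protobuf-java 3.7.1, and the connector coordinates (`flink-connector-hbase_2.11` is assumed here), may differ in your build, so adjust the groupId/artifactId to whatever your dependency tree (`mvn dependency:tree`) actually shows.

```xml
<!-- Exclude the transitively bundled protobuf-java from the offending
     dependency (the HBase connector is used here only as an example) -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-hbase_2.11</artifactId>
    <version>1.11.2</version>
    <exclusions>
        <exclusion>
            <groupId>com.google.protobuf</groupId>
            <artifactId>protobuf-java</artifactId>
        </exclusion>
    </exclusions>
</dependency>

<!-- Pin the protobuf version that HBase 1.4.x was built against -->
<dependency>
    <groupId>com.google.protobuf</groupId>
    <artifactId>protobuf-java</artifactId>
    <version>2.5.0</version>
</dependency>
```

The design point is the usual one for shaded-dependency clashes: make sure only one protobuf-java version reaches the runtime classpath, and make it the one the client library (HBase 1.4.x) was compiled against.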

    2021-12-07 20:39:45