Hello, I am reading an HBase table with the DataStream API using HBaseRowInputFormat, with the schema built from an HBaseTableSchema. The code is as follows:
val env = StreamExecutionEnvironment.getExecutionEnvironment
val hbaseTableSchema = TableSchema.builder()
  .add(TableColumn.of("id", DataTypes.STRING()))
  .add(TableColumn.of("f1", DataTypes.ROW(DataTypes.FIELD("value", DataTypes.STRING()))))
  .build()
val schema = HBaseTableSchema.fromTableSchema(hbaseTableSchema)
val ds: DataStream[Row] = env.createInput(new HBaseRowInputFormat(
  hbaseConfig(),
  tableName,
  schema
))
ds.print()
env.execute(this.getClass.getSimpleName)

Running it throws the following error:
java.lang.RuntimeException: Row arity of from (2) does not match this serializers field length (1).
    at org.apache.flink.api.java.typeutils.runtime.RowSerializer.copy(RowSerializer.java:113)
    at org.apache.flink.api.java.typeutils.runtime.RowSerializer.copy(RowSerializer.java:58)
    at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:715)
    at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:692)
    at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:672)
    at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:52)
    at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:30)
    at org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collect(StreamSourceContexts.java:104)
    at org.apache.flink.streaming.api.functions.source.InputFormatSourceFunction.run(InputFormatSourceFunction.java:93)
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:100)
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:63)
    at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:213)
I traced the cause to this part of the HBaseRowInputFormat source:

@Override
public TypeInformation getProducedType() {
    // split the fieldNames
    String[] famNames = schema.getFamilyNames();
    TypeInformation<?>[] typeInfos = new TypeInformation[famNames.length];
    int i = 0;
    for (String family : famNames) {
        typeInfos[i] = new RowTypeInfo(
            schema.getQualifierTypes(family),
            schema.getQualifierNames(family));
        i++;
    }
    return new RowTypeInfo(typeInfos, famNames);
}

When building the TypeInformation here, the rowkey type is not included.
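The mechanics of the failure can be illustrated without Flink at all. The sketch below mimics the precondition that RowSerializer.copy enforces: the declared type above covers only the column families (arity 1 for one family), while the emitted row also carries the rowkey (arity 2). The class and method names here are hypothetical stand-ins, not Flink's actual API.

```java
// Dependency-free sketch of the arity mismatch. "copyRow" is a hypothetical
// stand-in for the length check inside Flink's RowSerializer.copy.
public class ArityMismatchSketch {

    static Object[] copyRow(Object[] row, int declaredArity) {
        if (row.length != declaredArity) {
            throw new RuntimeException("Row arity of from (" + row.length
                    + ") does not match this serializers field length ("
                    + declaredArity + ").");
        }
        return row.clone();
    }

    public static void main(String[] args) {
        // getProducedType() counted only the families: one family "f1" -> arity 1
        int declaredArity = 1;
        // but the input format emits rowkey + family row -> arity 2
        Object[] emitted = {"some-rowkey", new Object[]{"value"}};
        try {
            copyRow(emitted, declaredArity);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
        // a declared type that counts the rowkey as one extra field succeeds
        copyRow(emitted, 2);
    }
}
```

This is why the error message reads "Row arity of from (2) does not match this serializers field length (1)": two fields emitted, one field declared.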
So is this a bug? *From the volunteer-compiled Flink mailing-list archive
You can refer to the implementation inside HBaseTableSource:
HBaseTableSchema hbaseSchema = new HBaseTableSchema();
hbaseSchema.addColumn(xxx);
hbaseSchema.setRowKey(xxx);

execEnv.createInput(new HBaseRowInputFormat(conf, tableName, hbaseSchema), getReturnType())
    .name(explainSource());

*From the volunteer-compiled Flink mailing-list archive