summary.typeQuotaInfos.typeQuotaInfo[3].type

Overview: summary.typeQuotaInfos.typeQuotaInfo[3].type


Abstract

When calling the utility method ==fs.getContentSummary(path)== to get size information for HBase tables on HDFS, the call fails with the following error:

java.io.IOException: com.google.protobuf.ServiceException: com.google.protobuf.UninitializedMessageException: Message missing required fields: summary.typeQuotaInfos.typeQuotaInfo[3].type
    at org.apache.hadoop.ipc.ProtobufHelper.getRemoteException(ProtobufHelper.java:47) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getContentSummary(ClientNamenodeProtocolTranslatorPB.java:809) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at sun.reflect.GeneratedMethodAccessor163.invoke(Unknown Source) ~[na:na]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_231]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_231]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at com.sun.proxy.$Proxy213.getContentSummary(Unknown Source) ~[na:na]
    at org.apache.hadoop.hdfs.DFSClient.getContentSummary(DFSClient.java:3040) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at org.apache.hadoop.hdfs.DistributedFileSystem$15.doCall(DistributedFileSystem.java:725) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at org.apache.hadoop.hdfs.DistributedFileSystem$15.doCall(DistributedFileSystem.java:721) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at org.apache.hadoop.hdfs.DistributedFileSystem.getContentSummary(DistributedFileSystem.java:721) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at com.geespace.microservices.directory.assets.service.impl.DataAssetsScreenServiceImpl.updateAssetsSize(DataAssetsScreenServiceImpl.java:153) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at com.geespace.microservices.directory.assets.service.impl.DataAssetsScreenServiceImpl$$EnhancerBySpringCGLIB$$cc83a803.invoke(<generated>) [api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218) [api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:749) [api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) [api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:294) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:98) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186) [api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at org.springframework.aop.interceptor.AsyncExecutionInterceptor.lambda$invoke$0(AsyncExecutionInterceptor.java:115) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[na:1.8.0_231]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[na:1.8.0_231]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[na:1.8.0_231]
    at java.lang.Thread.run(Thread.java:748) ~[na:1.8.0_231]
Caused by: com.google.protobuf.ServiceException: com.google.protobuf.UninitializedMessageException: Message missing required fields: summary.typeQuotaInfos.typeQuotaInfo[3].type
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:271) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at com.sun.proxy.$Proxy212.getContentSummary(Unknown Source) ~[na:na]
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getContentSummary(ClientNamenodeProtocolTranslatorPB.java:806) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    ... 24 common frames omitted
Caused by: com.google.protobuf.UninitializedMessageException: Message missing required fields: summary.typeQuotaInfos.typeQuotaInfo[3].type
    at com.google.protobuf.AbstractMessage$Builder.newUninitializedMessageException(AbstractMessage.java:770) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$GetContentSummaryResponseProto$Builder.build(ClientNamenodeProtocolProtos.java) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$GetContentSummaryResponseProto$Builder.build(ClientNamenodeProtocolProtos.java) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:261) ~[api-gateway-1.0-SNAPSHOT.jar:1.0-SNAPSHOT]
    ... 26 common frames omitted
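
For reference, a minimal sketch of the kind of call that triggers this error. The NameNode URI and the HBase table path below are placeholders, not values from the original post:

// Minimal sketch; the URI and path are hypothetical placeholders.
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.ContentSummary;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ContentSummaryDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hypothetical NameNode address and HBase table directory.
        try (FileSystem fs = FileSystem.get(new URI("hdfs://namenode:8020"), conf)) {
            ContentSummary summary = fs.getContentSummary(new Path("/hbase/data/default/my_table"));
            // getLength() returns the total number of bytes under the path.
            System.out.println("bytes under path: " + summary.getLength());
        }
    }
}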

Cause 1:

The Hadoop version running on the server differs from the Hadoop version the Spring Boot project depends on; the server's Hadoop version is the one that must be matched.

Solution:

Change the Hadoop version in the pom so that it matches the server, for example by defining a single version property, as in the sketch below.
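
A minimal sketch, assuming the cluster runs Hadoop 3.1.1; substitute whatever version your server actually runs, and list whichever Hadoop artifacts your project declares:

<properties>
    <!-- Keep this in sync with the Hadoop version installed on the server. -->
    <hadoop.version>3.1.1</hadoop.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
</dependencies>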


Cause 2:

The Spring Boot project pulls in many dependencies, including hidden transitive ones, so more than one Hadoop version ends up on the classpath; the RPC then goes through a version that does not match the server and fails. (The original post includes a screenshot of the duplicate versions here: image.png.)

Fix 1:

Set the hadoop-common dependency to 3.1.1, matching the server:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>3.1.1</version>
</dependency>

Fix 2:

There is another case: the Hadoop dependencies are already set to the correct 3.1.1 version, but some HBase dependency transitively brings in a 2.7.7 version. Find the offending dependency (for example with mvn dependency:tree) and exclude it, so that only the single 3.1.1 version remains; otherwise the dependency report may still show two versions. Add exclusions as follows:

==Note: I was lazy and copy-pasted all the exclusions; normally you only need to exclude the artifacts that actually conflict.==

<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>2.0.0</version>
    <exclusions>
        <exclusion>
            <artifactId>hadoop-hdfs</artifactId>
            <groupId>org.apache.hadoop</groupId>
        </exclusion>
        <exclusion>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <groupId>org.apache.hadoop</groupId>
        </exclusion>
        <exclusion>
            <artifactId>hadoop-annotations</artifactId>
            <groupId>org.apache.hadoop</groupId>
        </exclusion>
        <exclusion>
            <artifactId>hadoop-yarn-api</artifactId>
            <groupId>org.apache.hadoop</groupId>
        </exclusion>
        <exclusion>
            <artifactId>hadoop-yarn-common</artifactId>
            <groupId>org.apache.hadoop</groupId>
        </exclusion>
        <exclusion>
            <artifactId>hbase-hadoop-compat</artifactId>
            <groupId>org.apache.hbase</groupId>
        </exclusion>
        <exclusion>
            <artifactId>hbase-hadoop2-compat</artifactId>
            <groupId>org.apache.hbase</groupId>
        </exclusion>
        <exclusion>
            <artifactId>hadoop-common</artifactId>
            <groupId>org.apache.hadoop</groupId>
        </exclusion>
        <exclusion>
            <artifactId>hadoop-client</artifactId>
            <groupId>org.apache.hadoop</groupId>
        </exclusion>
    </exclusions>
</dependency>
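
As an alternative to per-dependency exclusions, a sketch using Maven's dependencyManagement to force every transitive Hadoop artifact onto one version (the artifact list here is illustrative, not exhaustive; dependencyManagement overrides transitive versions in any reasonably modern Maven):

<dependencyManagement>
    <dependencies>
        <!-- Pin Hadoop artifacts that may arrive transitively to the cluster version. -->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>3.1.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>3.1.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>3.1.1</version>
        </dependency>
    </dependencies>
</dependencyManagement>

After rebuilding, mvn dependency:tree -Dincludes=org.apache.hadoop should list only the 3.1.1 artifacts.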

Discussion in other articles

1. For example this thread: http://mail-archives.apache.org/mod_mbox/hadoop-hdfs-issues/202012.mbox/%3CJIRA.13337627.1603879426000.278677.1607068440721@Atlassian.JIRA%3E
It attributes the error to a NameNode/DataNode version mismatch after an upgrade. I am not sure whether the NameNode and DataNode can be upgraded separately, so please weigh in via the comments; if you have been through such an upgrade, reply and we can learn from each other.
