Hive job throws java.io.FileNotFoundException: /.../container_000001 (Is a directory)


Scenario: a Hive job hangs while running.

Error: as seen in the RM Web UI / container log
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop/hadoop-2.7.2/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop-2.7.2/tmp/nm-local-dir/usercache/root/appcache/application_1468583637020_0006/filecache/10/job.jar/job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /hadoop/hadoop-2.7.2/logs/userlogs/application_1468583637020_0006/container_e58_1468583637020_0006_01_000001 (Is a directory)
 at java.io.FileOutputStream.open(Native Method)
 at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
 at java.io.FileOutputStream.<init>(FileOutputStream.java:142)
 at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
 at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
 at org.apache.hadoop.yarn.ContainerLogAppender.activateOptions(ContainerLogAppender.java:55)
 at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
 at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
 at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
 at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
 at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
 at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
 at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
 at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
 at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
 at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
 at org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:64)
 at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:285)
 at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:156)
 at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:132)
 at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:657)
 at org.apache.hadoop.service.AbstractService.<clinit>(AbstractService.java:43)
Jul 19, 2016 7:32:00 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.JAXBContextResolver as a provider class
Jul 19, 2016 7:32:00 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Jul 19, 2016 7:32:00 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.AMWebServices as a root resource class
Jul 19, 2016 7:32:00 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Jul 19, 2016 7:32:00 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.app.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Jul 19, 2016 7:32:01 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Jul 19, 2016 7:32:01 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.app.webapp.AMWebServices to GuiceManagedComponentProvider with the scope "PerRequest"


Analysis:
1>. Ran a wordcount job to test MapReduce -- OK (see the command sketch after this list).
2>. Tune the YARN resource parameters.
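
A quick way to run that check (a minimal sketch, assuming the Hadoop home visible in the log above, /hadoop/hadoop-2.7.2; the HDFS paths /tmp/wc_in and /tmp/wc_out are hypothetical):

cd /hadoop/hadoop-2.7.2
# Stage a small text file into a scratch HDFS directory (both HDFS paths are made up for this sketch).
bin/hdfs dfs -mkdir -p /tmp/wc_in
bin/hdfs dfs -put etc/hadoop/core-site.xml /tmp/wc_in
# Submit the bundled wordcount example; the output directory must not exist yet.
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar wordcount /tmp/wc_in /tmp/wc_out
# Inspect the result.
bin/hdfs dfs -cat /tmp/wc_out/part-r-00000

If this completes normally, basic MapReduce submission works, which points the investigation at the YARN resource settings.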

Solution: modify yarn-site.xml on every node, then restart the services for the change to take effect (a distribution/restart sketch follows the config below).


<property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>8192</value>
    <description>Total memory on this node that the NodeManager may allocate to containers</description>
</property>
<property>
    <name>yarn.scheduler.minimum-allocation-mb</name>
    <value>1024</value>
    <description>Minimum memory a single task may request; default 1024 MB</description>
</property>
<property>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>8192</value>
    <description>Maximum memory a single task may request; default 8192 MB</description>
</property>
<property>
    <name>yarn.nodemanager.resource.cpu-vcores</name>
    <value>4</value>
</property>
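
To roll the change out, a minimal sketch of copying the updated file to every node and restarting YARN (the Hadoop home is taken from the log above; the hostnames node1/node2/node3 are hypothetical placeholders for the actual NodeManager hosts):

# Hypothetical node list -- replace with the real cluster hosts.
for host in node1 node2 node3; do
    scp /hadoop/hadoop-2.7.2/etc/hadoop/yarn-site.xml ${host}:/hadoop/hadoop-2.7.2/etc/hadoop/
done

# Restart YARN from the ResourceManager node so the new values take effect.
/hadoop/hadoop-2.7.2/sbin/stop-yarn.sh
/hadoop/hadoop-2.7.2/sbin/start-yarn.sh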
