Alibaba Cloud Developer Community > wsc449

Fixing the HBase startup error: "No such file or directory"

Summary: After setting up HBase, I was looking forward to starting it up and giving it a test run, only to hit error after error at startup: java_home, hbase, hadoop and so on could not be found, no such ...

Scenario

After setting up HBase, I was looking forward to starting it up and giving it a test run, but startup produced a string of errors: java_home, hbase, hadoop and the like could not be found, "no such file or directory!"

[root@hadoop0 bin]# start-hbase.sh 
/opt/hbase1.2.6/conf/hbase-env.sh: line 50: export JAVA_HOME=/opt/jdk1.8: No such file or directory
/opt/hbase1.2.6/conf/hbase-env.sh: line 52: export HBASE_HOME=/opt/hbase1.2.6: No such file or directory
/opt/hbase1.2.6/conf/hbase-env.sh: line 53: export HBASE_CLASSPATH=/opt/hadoop2.6.0/etc/hadoop: No such file or directory
/opt/hbase1.2.6/conf/hbase-env.sh: line 54: export HBASE_PID_DIR=/opt/hbase1.2.6/pids: No such file or directory
starting master, logging to /opt/hbase1.2.6/logs/hbase-root-master-hadoop0.out
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option PermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option PermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option PermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option PermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option PermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
hadoop0: /opt/hbase1.2.6/conf/hbase-env.sh: line 50: export JAVA_HOME=/opt/jdk1.8: No such file or directory
hadoop0: /opt/hbase1.2.6/conf/hbase-env.sh: line 52: export HBASE_HOME=/opt/hbase1.2.6: No such file or directory
hadoop0: /opt/hbase1.2.6/conf/hbase-env.sh: line 53: export HBASE_CLASSPATH=/opt/hadoop2.6.0/etc/hadoop: No such file or directory
hadoop0: /opt/hbase1.2.6/conf/hbase-env.sh: line 54: export HBASE_PID_DIR=/opt/hbase1.2.6/pids: No such file or directory
hadoop0: +======================================================================+
hadoop0: |                    Error: JAVA_HOME is not set                       |
hadoop0: +----------------------------------------------------------------------+
hadoop0: | Please download the latest Sun JDK from the Sun Java web site        |
hadoop0: |     > http://www.oracle.com/technetwork/java/javase/downloads        |
hadoop0: |                                                                      |
hadoop0: | HBase requires Java 1.7 or later.                                    |
hadoop0: +======================================================================+
hadoop2: /opt/hbase1.2.6/conf/hbase-env.sh: line 50: export JAVA_HOME=/opt/jdk1.8: No such file or directory
hadoop2: /opt/hbase1.2.6/conf/hbase-env.sh: line 51: export HADOOP_HOME=/opt/hadoop2.6.0: No such file or directory
hadoop2: /opt/hbase1.2.6/conf/hbase-env.sh: line 52: export HBASE_HOME=/opt/hbase1.2.6: No such file or directory
hadoop2: /opt/hbase1.2.6/conf/hbase-env.sh: line 53: export HBASE_CLASSPATH=/opt/hadoop2.6.0/etc/hadoop: No such file or directory
hadoop1: /opt/hbase1.2.6/conf/hbase-env.sh: line 50: export JAVA_HOME=/opt/jdk1.8: No such file or directory
hadoop1: /opt/hbase1.2.6/conf/hbase-env.sh: line 51: export HADOOP_HOME=/opt/hadoop2.6.0: No such file or directory
hadoop2: /opt/hbase1.2.6/conf/hbase-env.sh: line 54: export HBASE_PID_DIR=/opt/hbase1.2.6/pids: No such file or directory
hadoop1: /opt/hbase1.2.6/conf/hbase-env.sh: line 52: export HBASE_HOME=/opt/hbase1.2.6: No such file or directory
hadoop1: /opt/hbase1.2.6/conf/hbase-env.sh: line 53: export HBASE_CLASSPATH=/opt/hadoop2.6.0/etc/hadoop: No such file or directory
hadoop2: /opt/hbase1.2.6/conf/hbase-env.sh: line 55: $'export\302\240HBASE_MANAGES_ZK=false': command not found
hadoop1: /opt/hbase1.2.6/conf/hbase-env.sh: line 54: export HBASE_PID_DIR=/opt/hbase1.2.6/pids: No such file or directory
hadoop1: /opt/hbase1.2.6/conf/hbase-env.sh: line 55: $'export\302\240HBASE_MANAGES_ZK=false': command not found
hadoop2: +======================================================================+
hadoop2: |                    Error: JAVA_HOME is not set                       |
hadoop2: +----------------------------------------------------------------------+
hadoop2: | Please download the latest Sun JDK from the Sun Java web site        |
hadoop2: |     > http://www.oracle.com/technetwork/java/javase/downloads        |
hadoop2: |                                                                      |
hadoop2: | HBase requires Java 1.7 or later.                                    |
hadoop2: +======================================================================+
hadoop1: +======================================================================+
hadoop1: |                    Error: JAVA_HOME is not set                       |
hadoop1: +----------------------------------------------------------------------+
hadoop1: | Please download the latest Sun JDK from the Sun Java web site        |
hadoop1: |     > http://www.oracle.com/technetwork/java/javase/downloads        |
hadoop1: |                                                                      |
hadoop1: | HBase requires Java 1.7 or later.                                    |
hadoop1: +======================================================================+
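The giveaway is in the slave-node lines above: bash prints `$'export\302\240HBASE_MANAGES_ZK=false': command not found`. The octal escape `\302\240` is the UTF-8 byte pair `C2 A0`, a non-breaking space, typically picked up by copy-pasting the config from a web page. Because a non-breaking space is not a word separator for bash, the whole line becomes one unrecognizable "command". A minimal sketch of reproducing and locating such invisible bytes (assuming GNU grep with `-P` support):

```shell
# Recreate the failure in a throwaway script: an invisible
# non-breaking space (UTF-8 bytes C2 A0, printed by bash as
# \302\240) sits between "export" and the variable name.
tmp=$(mktemp)
printf 'export\xc2\xa0HBASE_MANAGES_ZK=false\n' > "$tmp"

# bash sees one opaque "command" instead of an export statement
bash "$tmp" 2>&1 | head -n 1

# list every line containing non-ASCII bytes, with line numbers
grep -nP '[\x80-\xFF]' "$tmp"

rm -f "$tmp"
```

Running the same `grep` against your real `hbase-env.sh` will point straight at any pasted full-width or non-breaking characters.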

Solution

# Inspect the hbase-env.sh file

 # cd /opt/hbase1.2.6/conf
 # vim hbase-env.sh

export HBASE_MANAGES_ZK=false
export JAVA_HOME="/opt/jdk1.8"
export HADOOP_HOME="/opt/hadoop2.6.0"
export HBASE_HOME="/opt/hbase1.2.6"
export HBASE_CLASSPATH="/opt/hadoop2.6.0/etc/hadoop"
export HBASE_PID_DIR="/opt/hbase1.2.6/pids"

# Retype the whole block above by hand. Do not copy-paste it from a web page: pasted text can carry full-width or non-breaking characters that break the export lines, so the environment variables are never set.
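If retyping feels error-prone, the stray bytes can also be stripped in place. A sketch using GNU sed's `\xHH` escapes (the path is the one used in this article; adjust it to your install):

```shell
# Replace UTF-8 non-breaking spaces (C2 A0) with ordinary spaces
# and drop any Windows carriage returns; LC_ALL=C forces a
# byte-wise edit so the escapes match reliably (GNU sed).
LC_ALL=C sed -i -e 's/\xc2\xa0/ /g' -e 's/\r$//' /opt/hbase1.2.6/conf/hbase-env.sh

# confirm the file now parses cleanly as a shell script
bash -n /opt/hbase1.2.6/conf/hbase-env.sh && echo "hbase-env.sh syntax OK"
```

`bash -n` only checks syntax without executing anything, so it is a safe final sanity check before restarting HBase.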

After making the fix, start HBase directly on the master node:

 # start-hbase.sh    

Note: HBase only needs to be started on the master node; the slave nodes will start their HBase services automatically.
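The usual way to confirm the daemons actually came up is to run `jps` on each host and look for `HMaster` on the master and `HRegionServer` on the slaves. As a small convenience, the same check can be scripted; `check_daemon` below is a hypothetical helper, not part of HBase:

```shell
# Hypothetical helper: report whether a named daemon appears in
# the process table (pgrep -f matches against the full command line)
check_daemon() {
  if pgrep -f "$1" > /dev/null; then
    echo "$1 is running"
  else
    echo "$1 is NOT running"
  fi
}

# on the master node
check_daemon HMaster
# on each slave node
check_daemon HRegionServer
```

If a daemon is missing, the corresponding `.out` and `.log` files under `/opt/hbase1.2.6/logs/` are the first place to look.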


