Compiling Hadoop 2 on 64-bit Linux

Introduction:

The hadoop2 binary packages released by Apache are built on 32-bit machines, while production environments generally run 64-bit Linux, so the distribution has to be recompiled on a 64-bit machine.
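Before starting, it can help to confirm that the build machine really is 64-bit; this is an extra sanity check, not part of the original instructions:

# Should print x86_64 on a 64-bit Linux system
uname -m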
The official build instructions can be found in BUILDING.txt under hadoop-2.2.0-src:
Build instructions for Hadoop

----------------------------------------------------------------------------------
Requirements:

* Unix System
* JDK 1.6+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)

----------------------------------------------------------------------------------
...
Steps:
1. Install JDK 1.6+ (verify with: java -version)
2. Install Maven 3.0 or later (verify with: mvn -version)
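For steps 1 and 2, if the JDK and Maven were installed from tarballs rather than yum packages, the environment usually needs to be set up along these lines; the install paths below are only illustrative, adjust them to your own layout:

# Illustrative paths -- replace with wherever the JDK and Maven were unpacked
export JAVA_HOME=/usr/local/jdk1.7.0_55
export MAVEN_HOME=/usr/local/apache-maven-3.0.5
export PATH=$JAVA_HOME/bin:$MAVEN_HOME/bin:$PATH

# Verify both tools are on the PATH
java -version
mvn -version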
3. Install ProtocolBuffer 2.5.0 (verify with: protoc --version). Download: https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
# Building and installing protobuf requires Internet access so that its build dependencies can be installed online with yum. Without Internet access this gets tedious: every dependency package has to be downloaded and installed by hand.
sudo yum install gcc 
sudo yum install gcc-c++ 
sudo yum install make

# Extract protobuf
sudo tar -zxvf protobuf-2.5.0.tar.gz
# Enter the protobuf-2.5.0 directory
cd protobuf-2.5.0
# Configure, build and install
sudo ./configure
sudo make
sudo make install
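Protobuf installs its libraries under /usr/local/lib by default, so on some systems the dynamic linker cache may need refreshing before protoc runs; the version check below is the same verification mentioned in step 3:

# Refresh the shared library cache (may be required on some systems)
sudo ldconfig
# Should print: libprotoc 2.5.0
protoc --version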
4. Install CMake 2.6 or newer
sudo yum install cmake
sudo yum install openssl-devel
sudo yum install ncurses-devel
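As with the tools above, it does not hurt to confirm that the installed CMake meets the 2.6-or-newer requirement:

# Expect 2.6 or newer
cmake --version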
5. Compile hadoop-2.2.0
# Extract hadoop-2.2.0-src.tar.gz
tar -zxvf hadoop-2.2.0-src.tar.gz
# Enter the hadoop-2.2.0-src directory
cd hadoop-2.2.0-src

# hadoop-common-project/hadoop-auth/pom.xml in hadoop-2.2.0-src has a bug and needs a small fix before building
vim hadoop-common-project/hadoop-auth/pom.xml
# Add the following dependency inside the <dependencies> element
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-util</artifactId>
  <scope>test</scope>
</dependency>

# Build: skip the tests and activate the dist and native profiles so the 64-bit native libraries are produced
mvn package -DskipTests -Pdist,native

# The compiled hadoop-2.2.0 distribution ends up under hadoop-2.2.0-src/hadoop-dist/target. A successful build finishes with a Maven reactor summary like the one below:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................ SUCCESS [ 1.228 s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [ 0.894 s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [ 1.809 s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [ 0.222 s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [ 1.198 s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [ 2.205 s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [ 2.169 s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [ 1.583 s]
[INFO] Apache Hadoop Common .............................. SUCCESS [01:02 min]
[INFO] Apache Hadoop NFS ................................. SUCCESS [ 5.132 s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [ 0.038 s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [01:02 min]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [ 9.002 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [ 4.995 s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [ 2.647 s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [ 0.058 s]
[INFO] hadoop-yarn ....................................... SUCCESS [ 0.138 s]
[INFO] hadoop-yarn-api ................................... SUCCESS [ 31.854 s]
[INFO] hadoop-yarn-common ................................ SUCCESS [ 20.121 s]
[INFO] hadoop-yarn-server ................................ SUCCESS [ 0.105 s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [ 5.776 s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [ 10.490 s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [ 3.321 s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [ 8.311 s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [ 0.510 s]
[INFO] hadoop-yarn-client ................................ SUCCESS [ 3.929 s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [ 0.060 s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [ 1.720 s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [ 0.062 s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [ 16.204 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [ 1.779 s]
[INFO] hadoop-yarn-site .................................. SUCCESS [ 0.111 s]
[INFO] hadoop-yarn-project ............................... SUCCESS [ 2.287 s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [ 11.855 s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [ 2.560 s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [ 6.985 s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [ 3.319 s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [ 4.021 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [ 1.508 s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [ 4.176 s]
[INFO] hadoop-mapreduce .................................. SUCCESS [ 2.367 s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [ 2.902 s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [ 5.365 s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [ 1.673 s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [ 4.095 s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [ 2.962 s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [ 2.089 s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [ 2.190 s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [ 5.887 s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [ 1.149 s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [ 0.028 s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [ 6.514 s]
[INFO] Apache Hadoop Client .............................. SUCCESS [ 2.199 s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [ 0.121 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS (#### seeing BUILD SUCCESS here means the compilation succeeded ####)
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 05:39 min
[INFO] Finished at: 2014-05-23T13:37:06+08:00
[INFO] Final Memory: 135M/300M
[INFO] ------------------------------------------------------------------------
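Once the build succeeds, a quick way to confirm that the native libraries really are 64-bit is to inspect them with file. This check is not part of the original post, and the path assumes the default layout produced by the dist profile:

# Run from hadoop-2.2.0-src; the library should be reported as a 64-bit ELF shared object
file hadoop-dist/target/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0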


This article is reproduced from the SummerChill cnblogs blog. Original link: http://www.cnblogs.com/DreamDrive/p/4546792.html. Please contact the original author before republishing.
