
How do you typically sync Hadoop configuration files across a cluster?

This question comes from the Alibaba Cloud developer community's 11 vertical technical-domain developer groups. https://developer.aliyun.com/article/706511 Follow the link to join the groups for the technology areas that interest you.

新闻小助手 2019-08-29 17:15:07
2 answers
    1. Use an admin control machine, ideally set up with passwordless SSH login to every node in the Hadoop cluster, and use it to scp the Hadoop configuration files out to each node.
    2. Alternatively, upload the Hadoop configuration files to shared storage such as OSS, and have every node in the cluster download the files from that shared storage.
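    The first approach above can be sketched as a small shell script run on the control machine. The hostnames, config path, and the DRY_RUN switch are illustrative assumptions, not values from the original answer; substitute your own cluster's node list and config directory.

    ```shell
    #!/bin/sh
    # Sketch: push the Hadoop config directory from an admin control machine
    # to every node over passwordless SSH. Hostnames and paths are placeholders.

    CONF_DIR=/etc/hadoop/conf              # config dir, same path on every node
    NODES="hadoop-node-01 hadoop-node-02"  # space-separated node hostnames (example values)
    DRY_RUN=${DRY_RUN:-1}                  # default: only print the commands

    for node in $NODES; do
      if [ "$DRY_RUN" = "1" ]; then
        # Preview mode: show what would be copied without touching the cluster.
        echo scp -r "$CONF_DIR" "$node:$(dirname "$CONF_DIR")"
      else
        scp -r "$CONF_DIR" "$node:$(dirname "$CONF_DIR")"
      fi
    done
    ```

    Running it with DRY_RUN=0 performs the actual copy; for large clusters, tools such as pdsh or Ansible do the same fan-out more conveniently.
    
    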
    2020-03-03 20:27:28
  • A sample Maven POM for a client project built against the CDH 4.5 Hadoop stack:

    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
        <modelVersion>4.0.0</modelVersion>
        <groupId>winksi.com.cn</groupId>
        <artifactId>hadoopTest</artifactId>
        <version>0.0.1-SNAPSHOT</version>

        <!-- Extra repositories: the CDH artifacts are not in Maven Central,
             so Cloudera's repository must be configured to download them. -->
        <repositories>
            <repository>
                <id>cloudera</id>
                <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
            </repository>
            <repository>
                <id>spring-repo</id>
                <url>http://repo.springsource.org/libs-milestone</url>
            </repository>
        </repositories>

        <properties>
            <java.version>1.6</java.version>
            <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
            <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
            <!-- Spring -->
            <spring-framework.version>3.2.3.RELEASE</spring-framework.version>
            <!-- Logging -->
            <logback.version>1.0.13</logback.version>
            <slf4j.version>1.7.5</slf4j.version>
            <!-- Test -->
            <junit.version>4.11</junit.version>
            <!-- Hadoop -->
            <CDH.version>2.0.0-cdh4.5.0</CDH.version>
            <hadoop.core.version>2.0.0-mr1-cdh4.5.0</hadoop.core.version>
            <groovy.version>1.8.5</groovy.version>
        </properties>

        <dependencies>
            <!-- Spring and Transactions -->
            <dependency>
                <groupId>org.springframework</groupId>
                <artifactId>spring-context</artifactId>
                <version>${spring-framework.version}</version>
            </dependency>
            <dependency>
                <groupId>org.springframework</groupId>
                <artifactId>spring-tx</artifactId>
                <version>${spring-framework.version}</version>
            </dependency>

            <!-- Spring JDBC -->
            <dependency>
                <groupId>org.springframework</groupId>
                <artifactId>spring-jdbc</artifactId>
                <version>${spring-framework.version}</version>
            </dependency>

            <!-- Logging with SLF4J & LogBack -->
            <dependency>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-api</artifactId>
                <version>${slf4j.version}</version>
                <scope>compile</scope>
            </dependency>
            <dependency>
                <groupId>ch.qos.logback</groupId>
                <artifactId>logback-classic</artifactId>
                <version>${logback.version}</version>
                <scope>runtime</scope>
            </dependency>

            <!-- Test artifacts -->
            <dependency>
                <groupId>org.springframework</groupId>
                <artifactId>spring-test</artifactId>
                <version>${spring-framework.version}</version>
                <scope>test</scope>
            </dependency>
            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <version>${junit.version}</version>
                <scope>test</scope>
            </dependency>
            <dependency>
                <groupId>com.alibaba</groupId>
                <artifactId>fastjson</artifactId>
                <version>1.1.37</version>
            </dependency>

            <!-- Hadoop -->
            <dependency>
                <groupId>org.springframework.data</groupId>
                <artifactId>spring-data-hadoop</artifactId>
                <version>1.0.2.RELEASE-cdh4</version>
            </dependency>
            <dependency>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-hdfs</artifactId>
                <version>${CDH.version}</version>
            </dependency>
            <dependency>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-core</artifactId>
                <version>${hadoop.core.version}</version>
            </dependency>
            <dependency>
                <groupId>org.codehaus.groovy</groupId>
                <artifactId>groovy</artifactId>
                <version>${groovy.version}</version>
            </dependency>
            <dependency>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-client</artifactId>
                <version>${CDH.version}</version>
            </dependency>
            <dependency>
                <groupId>org.apache.hive</groupId>
                <artifactId>hive-service</artifactId>
                <version>0.10.0-cdh4.5.0</version>
            </dependency>
            <dependency>
                <groupId>org.apache.hbase</groupId>
                <artifactId>hbase</artifactId>
                <version>0.94.6-cdh4.5.0</version>
            </dependency>
        </dependencies>
    </project>
    

    Answer sourced from the web, for reference only; hope it helps.

    2019-10-14 19:10:51