22/01/18 10:47:13 WARN YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
22/01/18 10:47:13 WARN MetricsSystem: Stopping a MetricsSystem that is not running
22/01/18 10:47:13 ERROR Main: Failed to initialize Spark session.
org.apache.spark.SparkException: Application application_1640680307634_6059 failed 2 times due to AM Container for appattempt_1640680307634_6059_000002 exited with exitCode: 1
Failing this attempt.Diagnostics: [2022-01-18 10:47:13.222]Exception from container-launch.
Container id: container_e35_1640680307634_6059_02_000001
Exit code: 1
[2022-01-18 10:47:13.222]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
/hadoop/yarn/local/usercache/root/appcache/application_1640680307634_6059/container_e35_1640680307634_6059_02_000001/launch_container.sh: line 38: :$PWD:$PWD/__spark_conf__:$PWD/__spark_libs__/*:$HADOOP_CONF_DIR:/usr/hdp/3.1.4.0-315/hadoop/*:/usr/hdp/3.1.4.0-315/hadoop/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure:$PWD/__spark_conf__/__hadoop_conf__: bad substitution
Deployed with Ambari 2.7.4, but the Spark version shipped on Ambari is 2.3.0. I wanted to use Spark 2.4.5+, so I downloaded spark-2.4.8-bin-hadoop2.7 from the official site and deployed it. At first it complained that the jersey configuration could not be found; after copying the jersey 1.19 jars into the Spark jars directory, I got the error above.
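The `bad substitution` in launch_container.sh typically means the classpath inherited from HDP's Hadoop configuration contains the literal placeholder `${hdp.version}` (visible in the error line: `/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar`), which a non-HDP Spark build does not resolve. A commonly suggested workaround is to define `hdp.version` as a JVM system property for both the driver and the YARN ApplicationMaster; the sketch below assumes the HDP version `3.1.4.0-315` that appears in the log paths:

```properties
# spark-defaults.conf — possible workaround for the
# "${hdp.version} ... bad substitution" error on HDP clusters.
# Replace 3.1.4.0-315 with the version shown in your /usr/hdp/ paths.
spark.driver.extraJavaOptions    -Dhdp.version=3.1.4.0-315
spark.yarn.am.extraJavaOptions   -Dhdp.version=3.1.4.0-315
```

Alternatively, some setups put `-Dhdp.version=3.1.4.0-315` in a `java-opts` file under `$SPARK_HOME/conf`, or hard-code the version in place of `${hdp.version}` in `mapred-site.xml`. This is a sketch of the usual fix, not a verified solution for this specific cluster.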