Symptom: the log reports the following error:
Caused by: java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:3236)
...
at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1542)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:881)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
Solution: the main JAR is too large, which causes an OOM on the DataWorks side when the job is submitted. Reduce the size of the main JAR, and reference the other dependencies through the spark.hadoop.odps.cupid.resources parameter instead of bundling them.
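For illustration, a minimal sketch of this setup, assuming the dependency JARs have already been uploaded to the MaxCompute project as resources; the project name my_project and the JAR names are hypothetical placeholders:

# Upload each dependency JAR once as a MaxCompute resource (in odpscmd):
add jar dep-a.jar;
add jar dep-b.jar;

# Then, in the Spark job configuration (spark-defaults.conf or --conf on
# spark-submit), reference the resources as <project>.<resource>, comma-separated:
spark.hadoop.odps.cupid.resources=my_project.dep-a.jar,my_project.dep-b.jar

With the dependencies pulled in this way, they can be excluded from the main JAR at build time, which keeps the artifact uploaded through DataWorks small enough to avoid the OOM.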