
Flink 1.12: what can I do when submitting a job in yarn-application mode fails?

The Flink job is submitted via a script with the following command: /bin/flink run-application -t yarn-application -Dyarn.provided.lib.dirs="hdfs://xx/flink120/" hdfs://xx/flink-example.jar --sqlFilePath /xxx/kafka2print.sql

The Flink lib directory and the user jar have already been uploaded to the HDFS paths above, but the submission throws the following error:

The program finished with the following exception:

org.apache.flink.client.deployment.ClusterDeploymentException: Couldn't deploy Yarn Application Cluster
	at org.apache.flink.yarn.YarnClusterDescriptor.deployApplicationCluster(YarnClusterDescriptor.java:465)
	at org.apache.flink.client.deployment.application.cli.ApplicationClusterDeployer.run(ApplicationClusterDeployer.java:67)
	at org.apache.flink.client.cli.CliFrontend.runApplication(CliFrontend.java:213)
	at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1061)
	at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1136)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
	at org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
	at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1136)
Caused by: java.lang.IllegalArgumentException: Wrong FS: hdfs://xx/flink120/, expected: file:///
	at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:648)
	at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:82)
	at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:606)
	at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:824)
	at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:601)
	at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:428)
	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1425)
	at org.apache.flink.yarn.YarnApplicationFileUploader.lambda$getAllFilesInProvidedLibDirs$2(YarnApplicationFileUploader.java:469)
	at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedConsumer$3(FunctionUtils.java:93)
	at java.util.ArrayList.forEach(ArrayList.java:1257)
	at org.apache.flink.yarn.YarnApplicationFileUploader.getAllFilesInProvidedLibDirs(YarnApplicationFileUploader.java:466)
	at org.apache.flink.yarn.YarnApplicationFileUploader.<init>(YarnApplicationFileUploader.java:106)
	at org.apache.flink.yarn.YarnApplicationFileUploader.from(YarnApplicationFileUploader.java:381)
	at org.apache.flink.yarn.YarnClusterDescriptor.startAppMaster(YarnClusterDescriptor.java:789)
	at org.apache.flink.yarn.YarnClusterDescriptor.deployInternal(YarnClusterDescriptor.java:592)
	at org.apache.flink.yarn.YarnClusterDescriptor.deployApplicationCluster(YarnClusterDescriptor.java:458)
	... 9 more

*From the volunteer-curated Flink mailing list archive

EXCEED 2021-12-01 14:19:46
1 answer
  • Hi, according to your log, the job failed to start because of: Caused by: java.lang.IllegalArgumentException: Wrong FS: hdfs://xx/flink120/, expected: file:/// It looks like the path you configured does not match the filesystem scheme the client expects: the hdfs:// path is being resolved against the local filesystem. You need to fix this mismatch. *From the volunteer-curated Flink mailing list archive
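    A "Wrong FS: … expected: file:///" error on the client side commonly means the JVM that runs the flink CLI never loaded the Hadoop configuration, so fs.defaultFS fell back to the local filesystem and the hdfs:// path was rejected. A minimal sketch of the usual fix, assuming a standard Hadoop config directory at /etc/hadoop/conf (that path is a placeholder, not taken from the original post):

    ```shell
    # Point the Flink client at the Hadoop config so hdfs:// resolves to
    # HDFS instead of the local filesystem; core-site.xml under this
    # directory must set fs.defaultFS to the cluster's hdfs:// address.
    export HADOOP_CONF_DIR=/etc/hadoop/conf

    # Put the Hadoop client jars on Flink's classpath; '|| true' keeps the
    # script going on machines where the hadoop CLI is not installed.
    export HADOOP_CLASSPATH="$(hadoop classpath 2>/dev/null || true)"

    echo "HADOOP_CONF_DIR=$HADOOP_CONF_DIR"

    # Then resubmit with the same command as above:
    # ./bin/flink run-application -t yarn-application \
    #     -Dyarn.provided.lib.dirs="hdfs://xx/flink120/" \
    #     hdfs://xx/flink-example.jar --sqlFilePath /xxx/kafka2print.sql
    ```

    If the error persists after this, verify that the core-site.xml in that directory actually sets fs.defaultFS to an hdfs:// URI rather than the default file:///.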

    2021-12-01 15:09:13