
Can the cause of this failure be determined from the following Flink log?

Log Type: directory.info

Log Upload Time: Fri Apr 21 14:49:18 +0800 2023

Log Length: 3795

ls -l:
total 28
-rw-r--r--. 1 root root 74 Apr 21 22:42 container_tokens
-rwx------. 1 root root 696 Apr 21 22:42 default_container_executor_session.sh
-rwx------. 1 root root 751 Apr 21 22:42 default_container_executor.sh
lrwxrwxrwx. 1 root root 174 Apr 21 22:42 flink-conf.yaml -> /home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1681917036129_0006/filecache/10/application_1681917036129_0006-flink-conf.yaml2550516985980504386.tmp
lrwxrwxrwx. 1 root root 131 Apr 21 22:42 flink-dist_2.11-1.13.0.jar -> /home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1681917036129_0006/filecache/16/flink-dist_2.11-1.13.0.jar
lrwxrwxrwx. 1 root root 132 Apr 21 22:42 flink-test-1.0-SNAPSHOT.jar -> /home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1681917036129_0006/filecache/20/flink-test-1.0-SNAPSHOT.jar
lrwxrwxrwx. 1 root root 158 Apr 21 22:42 job.graph -> /home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1681917036129_0006/filecache/15/application_1681917036129_00061210674536691687301.tmp
-rwx------. 1 root root 11318 Apr 21 22:42 launch_container.sh
drwxr-xr-x. 2 root root 4096 Apr 21 22:42 lib
drwx--x---. 2 root root 6 Apr 21 22:42 tmp
find -L . -maxdepth 5 -ls:
34614961 4 drwx--x--- 4 root root 4096 Apr 21 22:42 .
101816336 0 drwx--x--- 2 root root 6 Apr 21 22:42 ./tmp
34614962 4 -rw-r--r-- 1 root root 74 Apr 21 22:42 ./container_tokens
34614963 4 -rw-r--r-- 1 root root 12 Apr 21 22:42 ./.container_tokens.crc
34614964 12 -rwx------ 1 root root 11318 Apr 21 22:42 ./launch_container.sh
34614965 4 -rw-r--r-- 1 root root 100 Apr 21 22:42 ./.launch_container.sh.crc
34614966 4 -rwx------ 1 root root 696 Apr 21 22:42 ./default_container_executor_session.sh
34614967 4 -rw-r--r-- 1 root root 16 Apr 21 22:42 ./.default_container_executor_session.sh.crc
34614968 4 -rwx------ 1 root root 751 Apr 21 22:42 ./default_container_executor.sh
34614969 4 -rw-r--r-- 1 root root 16 Apr 21 22:42 ./.default_container_executor.sh.crc
1596385 4 drwxr-xr-x 2 root root 4096 Apr 21 22:42 ./lib
68303630 35616 -r-x------ 1 root root 36468695 Apr 21 22:42 ./lib/flink-table_2.11-1.13.0.jar
1596377 68 -r-x------ 1 root root 67114 Apr 21 22:42 ./lib/log4j-1.2-api-2.12.1.jar
101816327 51200 -r-x------ 1 root root 52426893 Apr 21 22:42 ./lib/hudi-flink1.13-bundle_2.11-0.11.1.jar
1596371 152 -r-x------ 1 root root 151996 Apr 21 22:42 ./lib/flink-json-1.13.0.jar
68303634 7532 -r-x------ 1 root root 7709740 Apr 21 22:42 ./lib/flink-shaded-zookeeper-3.4.14.jar
1596374 96 -r-x------ 1 root root 96144 Apr 21 22:42 ./lib/flink-csv-1.13.0.jar
34614953 24 -r-x------ 1 root root 23518 Apr 21 22:42 ./lib/log4j-slf4j-impl-2.12.1.jar
67269675 40056 -r-x------ 1 root root 41013390 Apr 21 22:42 ./lib/flink-table-blink_2.11-1.13.0.jar
34614956 1636 -r-x------ 1 root root 1674433 Apr 21 22:42 ./lib/log4j-core-2.12.1.jar
34614959 272 -r-x------ 1 root root 276771 Apr 21 22:42 ./lib/log4j-api-2.12.1.jar
101816333 155668 -r-x------ 1 root root 159400645 Apr 21 22:42 ./flink-test-1.0-SNAPSHOT.jar
34614950 4 -r-x------ 1 root root 672 Apr 21 22:42 ./flink-conf.yaml
101816330 112932 -r-x------ 1 root root 115641566 Apr 21 22:42 ./flink-dist_2.11-1.13.0.jar
68303627 16 -r-x------ 1 root root 15676 Apr 21 22:42 ./job.graph
broken symlinks(find -L . -maxdepth 5 -type l -ls):

Log Type: jobmanager.err

Log Upload Time: Fri Apr 21 14:49:18 +0800 2023

Log Length: 97

Error: Could not find or load main class org.apache.flink.yarn.entrypoint.YarnJobClusterEntrypoint

Log Type: jobmanager.out

Log Upload Time: Fri Apr 21 14:49:18 +0800 2023

Log Length: 0

Log Type: launch_container.sh

Log Upload Time: Fri Apr 21 14:49:18 +0800 2023

Log Length: 11318

Showing 4096 bytes of 11318 total.

i-flink1.13-bundle_2.11-0.11.1.jar" "lib/hudi-flink1.13-bundle_2.11-0.11.1.jar"
mkdir -p lib
ln -sf -- "/home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1681917036129_0006/filecache/13/flink-json-1.13.0.jar" "lib/flink-json-1.13.0.jar"
mkdir -p lib
ln -sf -- "/home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1681917036129_0006/filecache/23/flink-shaded-zookeeper-3.4.14.jar" "lib/flink-shaded-zookeeper-3.4.14.jar"
ln -sf -- "/home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1681917036129_0006/filecache/20/flink-test-1.0-SNAPSHOT.jar" "flink-test-1.0-SNAPSHOT.jar"
ln -sf -- "/home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1681917036129_0006/filecache/10/application_1681917036129_0006-flink-conf.yaml2550516985980504386.tmp" "flink-conf.yaml"
ln -sf -- "/home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1681917036129_0006/filecache/16/flink-dist_2.11-1.13.0.jar" "flink-dist_2.11-1.13.0.jar"
mkdir -p lib
ln -sf -- "/home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1681917036129_0006/filecache/17/flink-csv-1.13.0.jar" "lib/flink-csv-1.13.0.jar"
ln -sf -- "/home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1681917036129_0006/filecache/15/application_1681917036129_00061210674536691687301.tmp" "job.graph"
mkdir -p lib
ln -sf -- "/home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1681917036129_0006/filecache/14/log4j-slf4j-impl-2.12.1.jar" "lib/log4j-slf4j-impl-2.12.1.jar"
mkdir -p lib
ln -sf -- "/home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1681917036129_0006/filecache/11/flink-table-blink_2.11-1.13.0.jar" "lib/flink-table-blink_2.11-1.13.0.jar"
mkdir -p lib
ln -sf -- "/home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1681917036129_0006/filecache/18/log4j-core-2.12.1.jar" "lib/log4j-core-2.12.1.jar"
mkdir -p lib
ln -sf -- "/home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1681917036129_0006/filecache/22/log4j-api-2.12.1.jar" "lib/log4j-api-2.12.1.jar"
echo "Copying debugging information"

Creating copy of launch script

cp "launch_container.sh" "/home/hadoop-3.1.3/logs/userlogs/application_1681917036129_0006/container_1681917036129_0006_01_000001/launch_container.sh" chmod 640 "/home/hadoop-3.1.3/logs/userlogs/application_1681917036129_0006/container_1681917036129_0006_01_000001/launch_container.sh"

Determining directory contents

echo "ls -l:" 1>"/home/hadoop-3.1.3/logs/userlogs/application_1681917036129_0006/container_1681917036129_0006_01_000001/directory.info" ls -l 1>>"/home/hadoop-3.1.3/logs/userlogs/application_1681917036129_0006/container_1681917036129_0006_01_000001/directory.info" echo "find -L . -maxdepth 5 -ls:" 1>>"/home/hadoop-3.1.3/logs/userlogs/application_1681917036129_0006/container_1681917036129_0006_01_000001/directory.info" find -L . -maxdepth 5 -ls 1>>"/home/hadoop-3.1.3/logs/userlogs/application_1681917036129_0006/container_1681917036129_0006_01_000001/directory.info" echo "broken symlinks(find -L . -maxdepth 5 -type l -ls):" 1>>"/home/hadoop-3.1.3/logs/userlogs/application_1681917036129_0006/container_1681917036129_0006_01_000001/directory.info" find -L . -maxdepth 5 -type l -ls 1>>"/home/hadoop-3.1.3/logs/userlogs/application_1681917036129_0006/container_1681917036129_0006_01_000001/directory.info" echo "Launching container" exec /bin/bash -c "$JAVA_HOME/bin/java -Xmx1073741824 -Xms1073741824 -XX:MaxMetaspaceSize=268435456 org.apache.flink.yarn.entrypoint.YarnJobClusterEntrypoint -D jobmanager.memory.off-heap.size=134217728b -D jobmanager.memory.jvm-overhead.min=201326592b -D jobmanager.memory.jvm-metaspace.size=268435456b -D jobmanager.memory.heap.size=1073741824b -D jobmanager.memory.jvm-overhead.max=201326592b 1> /home/hadoop-3.1.3/logs/userlogs/application_1681917036129_0006/container_1681917036129_0006_01_000001/jobmanager.out 2> /home/hadoop-3.1.3/logs/userlogs/application_1681917036129_0006/container_1681917036129_0006_01_000001/jobmanager.err"

Log Type: prelaunch.err

Log Upload Time: Fri Apr 21 14:49:18 +0800 2023

Log Length: 0

Log Type: prelaunch.out

Log Upload Time: Fri Apr 21 14:49:18 +0800 2023

Log Length: 100

Setting up env variables
Setting up job resources
Copying debugging information
Launching container

冰激凌甜筒 2023-04-23 22:58:36
1 Answer
  • Hello. Based on the log you provided, I cannot directly determine the exact cause, but here are some clues that may help you narrow the problem down.

    Log type: directory.info

    This file is not produced by your job itself; it is the debug listing of the container's working directory that YARN's launch_container.sh writes out (the ls -l and find commands shown further down in this log). It tells you which jars, configuration files, and symlinks were localized into the container, which is exactly what you want to inspect when a class or configuration file cannot be found.

    Log upload time: Fri Apr 21 14:49:18 +0800 2023

    The log was uploaded on April 21, 2023 at 14:49:18. If you changed the job or its environment around that time, that change may be related to the problem.

    Log length: 3795

    This log is 3,795 bytes long. Note that the web view truncates longer files (launch_container.sh above is cut to its last 4,096 bytes); if you need the complete output, you can pull the full aggregated container logs, as sketched below.
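    A minimal sketch of fetching the complete logs with the standard YARN CLI, assuming log aggregation is enabled on your cluster; the application and container IDs are taken from the paths in the log above:

    # Dump all container logs for the application into one file
    yarn logs -applicationId application_1681917036129_0006 > app_1681917036129_0006.log

    # Or fetch only the JobManager container shown in this log
    yarn logs -applicationId application_1681917036129_0006 -containerId container_1681917036129_0006_01_000001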

    It is recommended that you check the job's configuration and make sure it is compatible with your data and environment. In this case the most telling entry is jobmanager.err, which reports that the JVM could not find or load the main class org.apache.flink.yarn.entrypoint.YarnJobClusterEntrypoint, so it is also worth verifying that the Flink distribution jar was localized correctly and that the classpath set up by launch_container.sh is intact (see the sketch below). Once the job actually starts, you can additionally use Flink's monitoring and profiling tools to look for performance bottlenecks.
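    A minimal diagnostic sketch, assuming you run it on the NodeManager host that produced this log, that unzip is available, and that the localized files and log directory have not been cleaned up yet; all paths are taken from the listings above:

    # 1. Verify the entrypoint class is actually inside the localized flink-dist jar
    unzip -l /home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1681917036129_0006/filecache/16/flink-dist_2.11-1.13.0.jar | grep YarnJobClusterEntrypoint

    # 2. Inspect how launch_container.sh exports CLASSPATH before exec-ing the JVM
    #    (the excerpt above only shows the last 4096 bytes, so the export line is cut off)
    grep -n "CLASSPATH" /home/hadoop-3.1.3/logs/userlogs/application_1681917036129_0006/container_1681917036129_0006_01_000001/launch_container.sh

    If the class is present in the jar but still cannot be loaded, the CLASSPATH exported by launch_container.sh (or a corrupted or partially uploaded jar) is the usual suspect.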

    2024-02-28 17:52:11
