apiVersion: batch/v1
kind: Job
metadata:
  name: flink-jobmanager
  namespace: flink-resource
spec:
  template:
    metadata:
      labels:
        app: flink
        component: jobmanager
    spec:
      restartPolicy: OnFailure
      containers:
        - name: jobmanager
          image: apache/flink:1.13.1
          env:
          args: ["--job-classname", "com.ponshine.CorrelateEngine"]
          ports:
            - containerPort: 6123
              name: rpc
            - containerPort: 6124
              name: blob-server
            - containerPort: 8081
              name: webui
          livenessProbe:
            tcpSocket:
              port: 6123
            initialDelaySeconds: 30
            periodSeconds: 60
          volumeMounts:
            - name: flink-config-volume
              mountPath: /opt/flink/conf
            - name: job-artifacts-volume
              mountPath: /opt/flink/usrlib
          # securityContext:
          #   runAsUser: 9999  # refer to the _flink_ user in the official Flink image; change if necessary
      volumes:
        - name: flink-config-volume
          configMap:
            name: flink-config
            items:
              - key: config.yaml
                path: config.yaml
              - key: log4j-console.properties
                path: log4j-console.properties
        - name: job-artifacts-volume
          persistentVolumeClaim:
            claimName: hostpath-pvc
This is the configuration I use to start a Flink cluster in Rancher. No matter how I configure it, the pod keeps logging:

grep: /opt/flink/conf/flink-conf.yaml: No such file or directory
/docker-entrypoint.sh: line 73: /opt/flink/conf/flink-conf.yaml: Read-only file system
(the two lines above repeat on every start attempt)
/docker-entrypoint.sh: line 88: /opt/flink/conf/flink-conf.yaml: No such file or directory
error: exec: "--job-classname": executable file not found in $PATH

However, when I manually start a throwaway container from the same Flink image with docker, all of these files and paths do exist. Could someone help me figure out what is going wrong?
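
A plausible reading of the log above (an assumption based on how the apache/flink:1.13.1 docker-entrypoint.sh behaves, not something confirmed in this thread): the 1.13 entrypoint greps and edits /opt/flink/conf/flink-conf.yaml under that exact name, so a ConfigMap item mounted as config.yaml never produces the file it looks for, and because ConfigMap mounts are read-only the script cannot create or append to it either; separately, the first element of args must be an entrypoint mode such as standalone-job, otherwise the script execs "--job-classname" as if it were a binary. A minimal sketch of the two fragments that would change, assuming the flink-config ConfigMap stores the configuration under a key renamed to flink-conf.yaml and that it already contains the options the entrypoint would otherwise try to append (jobmanager.rpc.address, blob.server.port, and so on):

          # Sketch: the first args element selects the entrypoint mode
          # (application mode here) instead of being exec'd as a command.
          args: ["standalone-job", "--job-classname", "com.ponshine.CorrelateEngine"]

      # Sketch: mount the configuration under the file name the 1.13 entrypoint expects.
      # Assumes the ConfigMap key has been renamed from config.yaml to flink-conf.yaml.
      volumes:
        - name: flink-config-volume
          configMap:
            name: flink-config
            items:
              - key: flink-conf.yaml
                path: flink-conf.yaml
              - key: log4j-console.properties
                path: log4j-console.properties

With a read-only ConfigMap mount the "Read-only file system" warnings can still appear, but they should be harmless once flink-conf.yaml exists and already carries those options.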