Flink CDC writing to Hudi fails with: org.apache.flink.runtime.client.JobInitializationException: Could not start the JobMaster.
at org.apache.flink.runtime.jobmaster.DefaultJobMasterServiceProcess.lambda$new$0(DefaultJobMasterServiceProcess.java:97)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1609)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.util.FlinkRuntimeException: Failed to create checkpoint storage at checkpoint coordinator side.
at java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:273)
at java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:280)
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1606)
... 3 more
Caused by: org.apache.flink.util.FlinkRuntimeException: Failed to create checkpoint storage at checkpoint coordinator side.
at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.&lt;init&gt;(CheckpointCoordinator.java:324)
This error means Flink failed to create the checkpoint storage on the checkpoint coordinator side. You can try the following to resolve it:
- Increase the memory allocated to the Flink job. When submitting the job, raise the
taskmanager.memory.process.size
setting (the -ytm flag on YARN). For example, to allocate 2 GB to each TaskManager:
flink run -m yarn-cluster -yjm 2g -ytm 2g -c com.example.MyJob /path/to/your/jarfile.jar
- Check that your Hudi configuration is correct. Make sure the relevant options in the Hudi configuration file (e.g.
hudi-conf.yaml
) are set properly, for example:
hoodie:
  datasource:
    write:
      type: hudi
      table: my_table
      hive_sync_enable: true
      hive_database: my_database
      hive_table: my_table
      hive_partition_fields: partition_field1,partition_field2
      hive_partition_extractor_class: org.apache.hudi.hive.MultiPartKeysValueExtractor
      hive_jdbc_url: jdbc:hive2://localhost:10000/my_database
      hive_username: hive_user
      hive_password: hive_password
- If the problem persists, try upgrading Flink and Hudi to their latest versions, or consult the official Flink and Hudi documentation for more information about this error.
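Since the exception is thrown while the CheckpointCoordinator creates its checkpoint storage, it is also worth verifying Flink's own checkpoint settings: the configured checkpoint directory must exist and be writable by the job. A minimal flink-conf.yaml sketch (the HDFS path and interval are assumptions — substitute your own values):

```yaml
# flink-conf.yaml — checkpoint storage settings (path is an example)
state.backend: rocksdb                             # or hashmap, depending on your setup
state.checkpoints.dir: hdfs:///flink/checkpoints   # must be reachable and writable by the job
execution.checkpointing.interval: 30s              # enable periodic checkpoints
```

If state.checkpoints.dir points at a filesystem Flink cannot reach (wrong scheme, missing HDFS client jars, or insufficient permissions), the JobMaster fails with exactly this "Failed to create checkpoint storage" error before the job even starts.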