Spark - ReturnStatementInClosureException: Return statements aren't allowed in Spark closures

Summary: When calling the filter function on a Spark RDD, the driver gets stuck and reports ReturnStatementInClosureException: Return statements aren't allowed in Spark closures, i.e., return cannot be used inside a Spark closure.

1. Introduction

When calling the filter function on a Spark RDD, the driver gets stuck and reports ReturnStatementInClosureException: Return statements aren't allowed in Spark closures, i.e., return cannot be used inside a Spark closure:


2. Use Case

The error above was triggered by using return inside the closure passed to rdd.filter while filtering by id:

rdd.filter(arr => {
  val id = arr(0)
  val l = id.length()
  if (l <= 8) return false // non-local return: rejected by Spark's ClosureCleaner
  if (id.startsWith("1")) {
    true
  } else {
    false
  }
})
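For context, a minimal self-contained sketch of the failing pattern is shown below (object, method, and data names are illustrative, and a local SparkSession is assumed). The return only compiles because it targets the enclosing method, not the lambda itself; Spark's ClosureCleaner detects the resulting non-local return and throws ReturnStatementInClosureException on the driver as soon as filter is called, before any job runs:

import org.apache.spark.rdd.RDD
import org.apache.spark.sql.SparkSession

object ReturnInClosureDemo {

  // `return false` compiles only because this method returns Boolean;
  // the return targets containsValidId, not the filter lambda.
  def containsValidId(rdd: RDD[Array[String]]): Boolean = {
    val filtered = rdd.filter { arr =>
      val id = arr(0)
      if (id.length <= 8) return false // rejected by Spark's ClosureCleaner
      id.startsWith("1")
    }
    !filtered.isEmpty()
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").appName("demo").getOrCreate()
    val rdd = spark.sparkContext.parallelize(Seq(Array("123456789"), Array("1234")))
    containsValidId(rdd) // throws ReturnStatementInClosureException at the filter call
    spark.stop()
  }
}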


Fix:

Avoid the return keyword inside the closure and let the last expression produce the result instead:

rdd.filter(arr => {
  val id = arr(0)
  val l = id.length()
  if (l <= 8) {
    false
  } else if (id.startsWith("1")) {
    true
  } else {
    false
  }
})
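As a further simplification (a sketch equivalent to the fixed version above), the whole predicate collapses into a single boolean expression, which removes any temptation to reach for return:

rdd.filter { arr =>
  val id = arr(0)
  id.length() > 8 && id.startsWith("1")
}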


