Spark - ReturnStatementInClosureException: Return statements aren't allowed in Spark closures

Summary: When calling the filter function on a Spark RDD, the driver gets stuck and reports ReturnStatementInClosureException: Return statements aren't allowed in Spark closures. In other words, `return` cannot be used inside a Spark closure.

I. Introduction

When calling the filter function on a Spark RDD, the driver gets stuck and reports ReturnStatementInClosureException: Return statements aren't allowed in Spark closures. In other words, `return` cannot be used inside a Spark closure:


II. Usage Scenario

The error above is triggered by using `return` inside an `rdd.filter` closure while filtering ids:

rdd.filter(arr => {
    val id = arr(0)
    val l = id.length()
    // `return` inside a closure triggers ReturnStatementInClosureException
    if (l <= 8) return false
    if (id.startsWith("1")) {
        true
    } else {
        false
    }
})
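The underlying reason can be seen in plain Scala, with no Spark dependency: `return` inside a lambda does not exit the lambda, it performs a non-local return from the enclosing method (implemented by throwing `scala.runtime.NonLocalReturnControl`). A Spark closure is serialized and executed on remote executors, where no enclosing method exists to return to, so Spark rejects such closures up front. A minimal sketch of the non-local-return behavior (method and object names are illustrative):

```scala
object NonLocalReturnDemo {
  // Returns the first id of length <= 8, or "none" if there is none.
  def firstShortId(ids: Seq[String]): String = {
    ids.foreach { id =>
      // This `return` exits firstShortId itself, not just the lambda
      // passed to foreach -- a non-local return.
      if (id.length <= 8) return id
    }
    "none"
  }

  def main(args: Array[String]): Unit = {
    println(firstShortId(Seq("123456789", "1234", "987654321"))) // prints "1234"
  }
}
```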


The fix:

Simply avoid the `return` keyword inside the closure; the value of the last expression evaluated becomes the closure's result:

rdd.filter(arr => {
    val id = arr(0)
    val l = id.length()
    // the last expression evaluated is the closure's result, no `return` needed
    if (l <= 8) {
        false
    } else if (id.startsWith("1")) {
        true
    } else {
        false
    }
})
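If the predicate is complex enough that early exits genuinely help readability, another option is to move it into a named method, where `return` is an ordinary early exit, and pass that method to `filter`. A minimal sketch under that approach (object, method, and sample data are illustrative):

```scala
object FilterExample {
  // Keep ids longer than 8 characters that start with "1".
  // Inside a named method, `return` is a normal early exit.
  def keep(arr: Array[String]): Boolean = {
    val id = arr(0)
    if (id.length <= 8) return false
    id.startsWith("1")
  }

  def main(args: Array[String]): Unit = {
    // Stand-in for the RDD's records; with Spark this would be rdd.filter(keep)
    val data = Seq(Array("123456789"), Array("987654321"), Array("1234"))
    val kept = data.filter(keep)
    println(kept.map(_(0)).mkString(",")) // prints "123456789"
  }
}
```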


