
What are the steps to view the execution plan via TableEnvironment.explain_sql in PyFlink?

Guest qzzytmszf3zhq 2021-12-07 14:49:38
1 Answer
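The output below is the string that TableEnvironment.explain_sql returns for a SQL statement. A minimal sketch of the steps that could produce a plan like it, assuming a hypothetical Python UDF named sub_string, an in-memory source table and a print sink (none of these are taken from the original question):

    from pyflink.table import EnvironmentSettings, TableEnvironment, DataTypes
    from pyflink.table.udf import udf

    # Create a streaming TableEnvironment.
    t_env = TableEnvironment.create(
        EnvironmentSettings.new_instance().in_streaming_mode().build())

    # A Python scalar UDF; the name "sub_string" matches the plan below,
    # but its body here is only a guess for illustration.
    @udf(result_type=DataTypes.STRING())
    def sub_string(s, begin, end):
        return s[begin:end]

    t_env.create_temporary_function("sub_string", sub_string)

    # A small in-memory source and a print sink, purely for illustration.
    t_env.create_temporary_view(
        "my_source", t_env.from_elements([("flink",), ("pyflink",)], ["a"]))
    t_env.execute_sql(
        "CREATE TABLE my_sink (r STRING) WITH ('connector' = 'print')")

    # explain_sql returns the abstract syntax tree, the optimized logical
    # plan and the physical execution plan of the statement as one string.
    print(t_env.explain_sql(
        "INSERT INTO my_sink SELECT sub_string(a, 2, 4) FROM my_source"))

Running such a script prints a plan of the following form: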
    == Abstract Syntax Tree ==
    LogicalSink(table=[default_catalog.default_database.my_sink], fields=[EXPR$0])
    +- LogicalProject(EXPR$0=[sub_string($0, 2, 4)])
       +- LogicalTableScan(table=[[default_catalog, default_database, Unregistered_TableSource_1143388267, source: [PythonInputFormatTableSource(a)]]])
    
    == Optimized Logical Plan ==
    Sink(table=[default_catalog.default_database.my_sink], fields=[EXPR$0])
    +- PythonCalc(select=[sub_string(a, 2, 4) AS EXPR$0])
       +- LegacyTableSourceScan(table=[[default_catalog, default_database, Unregistered_TableSource_1143388267, source: [PythonInputFormatTableSource(a)]]], fields=[a])
    
    == Physical Execution Plan ==
    Stage 1 : Data Source
        content : Source: PythonInputFormatTableSource(a)
    
        Stage 2 : Operator
            content : SourceConversion(table=[default_catalog.default_database.Unregistered_TableSource_1143388267, source: [PythonInputFormatTableSource(a)]], fields=[a])
            ship_strategy : FORWARD
    
            Stage 3 : Operator
                content : StreamExecPythonCalc
                ship_strategy : FORWARD
    
                Stage 4 : Data Sink
                    content : Sink: Sink(table=[default_catalog.default_database.my_sink], fields=[EXPR$0])
                    ship_strategy : FORWARD
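
The three sections correspond to the statement as parsed (abstract syntax tree), the plan after optimization, where the call to the Python UDF is grouped into a PythonCalc node, and the physical stages (source, source conversion, Python calc, sink) that are actually submitted for execution.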
    
    2021-12-07 14:49:51