
How do you read and write Parquet files in Spark?


游客qzzytmszf3zhq 2021-12-09 20:02:27
1 answer
    // Encoders for most common types are automatically provided by importing spark.implicits._
    import spark.implicits._

    val peopleDF = spark.read.json("examples/src/main/resources/people.json")

    // DataFrames can be saved as Parquet files, maintaining the schema information
    peopleDF.write.parquet("people.parquet")

    // Read in the parquet file created above
    // Parquet files are self-describing so the schema is preserved
    // The result of loading a Parquet file is also a DataFrame
    val parquetFileDF = spark.read.parquet("people.parquet")

    // Parquet files can also be used to create a temporary view and then used in SQL statements
    parquetFileDF.createOrReplaceTempView("parquetFile")
    val namesDF = spark.sql("SELECT name FROM parquetFile WHERE age BETWEEN 13 AND 19")
    namesDF.map(attributes => "Name: " + attributes(0)).show()
    // +------------+
    // |       value|
    // +------------+
    // |Name: Justin|
    // +------------+
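For context, the snippet above assumes an existing `SparkSession` named `spark` and the sample `people.json` shipped with the Spark distribution. The same write-then-read round trip can be sketched as a self-contained local-mode program; the object name `ParquetRoundTrip`, the sample rows, and the `/tmp/people_roundtrip` output path are illustrative, not part of the original answer:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

object ParquetRoundTrip {
  // Writes a small DataFrame as Parquet under `path`, then reads it back.
  // The schema (name: string, age: int) is recovered from the file itself,
  // since Parquet is self-describing.
  def roundTrip(spark: SparkSession, path: String): DataFrame = {
    import spark.implicits._
    val people = Seq(("Justin", 19), ("Michael", 29)).toDF("name", "age")
    people.write.mode("overwrite").parquet(path)
    spark.read.parquet(path)
  }

  def main(args: Array[String]): Unit = {
    // local[*] runs Spark in-process, so no cluster is needed for the demo
    val spark = SparkSession.builder()
      .appName("ParquetRoundTrip")
      .master("local[*]")
      .getOrCreate()
    roundTrip(spark, "/tmp/people_roundtrip").printSchema()
    spark.stop()
  }
}
```

Note the `mode("overwrite")`: without it, `DataFrameWriter.parquet` fails if the target path already exists, which is the default `errorifexists` behavior.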
    
    2021-12-09 20:02:43
