How can we read all Parquet files in a directory in spark-sql? We are following this example, which shows how to read a single file:
// Read in the parquet file created above. Parquet files are self-describing, so the schema is preserved.
// The result of loading a Parquet file is also a SchemaRDD.
val parquetFile = sqlContext.parquetFile("people.parquet")

// Parquet files can also be registered as tables and then used in SQL statements.
parquetFile.registerTempTable("parquetFile")
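For context, a sketch of what we are trying: as far as we understand, `parquetFile` accepts a directory path (Spark's Parquet reader scans all part files under a directory), so something like the following should load every Parquet file in the directory. The directory name `people_dir/` is just a placeholder for our path:

```scala
// Assumption: passing a directory instead of a single file makes Spark
// read every Parquet part file it finds under that directory.
val allParquet = sqlContext.parquetFile("people_dir/")

// The combined result can then be registered and queried like a single table.
allParquet.registerTempTable("people")
val adults = sqlContext.sql("SELECT name FROM people WHERE age >= 18")
```

Is this the recommended approach, or is there a way to pass an explicit glob or list of files instead?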