Once you have loaded information into a DataFrame, you can use the
*mapPartitions* or *foreachPartition* operations to identify the
partitions and operate on them.

http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.DataFrame
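For example (a minimal sketch, assuming df is a DataFrame you have already
loaded, e.g. via sqlContext.read.parquet(...); the row counting is only an
illustration of per-partition work):

    import org.apache.spark.TaskContext

    // Identify each physical partition by its index and count its rows.
    // Going through df.rdd gives access to mapPartitionsWithIndex.
    val counts = df.rdd
      .mapPartitionsWithIndex { (idx, rows) => Iterator((idx, rows.size)) }
      .collect()
    counts.foreach { case (idx, n) => println(s"partition $idx: $n rows") }

    // Or run side-effecting work per partition without returning anything
    // (the println output lands in the executor logs, not the driver):
    df.foreachPartition { rows =>
      println(s"partition ${TaskContext.get.partitionId} has ${rows.size} rows")
    }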


On Thu, Feb 25, 2016 at 9:24 AM, Deenar Toraskar <deenar.toras...@gmail.com>
wrote:

> Hi
>
> How does one check for the presence of a partition in a Spark SQL
> partitioned table (saved using dataframe.write.partitionBy("partCol"), not
> Hive-compatible tables), other than physically checking the directory on
> HDFS or doing a count(*) with the partition cols in the where clause?
>
>
> Regards
> Deenar
>
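For reference, the setup Deenar describes looks roughly like this (a sketch
only; the path, column name, and partition value are placeholders, and
df/sqlContext are assumed to be in scope):

    import sqlContext.implicits._  // for the $"col" syntax

    // Write a table partitioned by partCol (one directory per value):
    df.write.partitionBy("partCol").parquet("/data/mytable")

    // The count-based workaround from the question, here for the
    // hypothetical partition value "2016-02-25":
    val present = sqlContext.read.parquet("/data/mytable")
      .filter($"partCol" === "2016-02-25")
      .count() > 0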
