[ https://issues.apache.org/jira/browse/PARQUET-1989?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17711367#comment-17711367 ]

Steve Loughran commented on PARQUET-1989:
-----------------------------------------

You might want a design that can do the scan as a Spark RDD, where the RDD is
simply the deep listFiles(path) scan of the directory tree. This would scale
far better for a massive dataset than even a parallelised scan within a single
process.
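
As a rough sketch of that shape (mine, not from the ticket; verifyFile is a
hypothetical placeholder for the actual parquet-cli check), the driver does
the deep listing once and parallelises the per-file checks:

  import org.apache.hadoop.conf.Configuration
  import org.apache.hadoop.fs.Path
  import org.apache.spark.sql.SparkSession
  import scala.collection.mutable.ArrayBuffer

  object VerifyEncryptedFiles {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .appName("verify-encryption").getOrCreate()
      val sc = spark.sparkContext

      // Deep, recursive listing of the tree, done once on the driver.
      val root = new Path(args(0))
      val fs = root.getFileSystem(new Configuration())
      val files = ArrayBuffer[String]()
      val it = fs.listFiles(root, true) // recursive
      while (it.hasNext) {
        val s = it.next()
        if (s.getPath.getName.endsWith(".parquet")) files += s.getPath.toString
      }

      // One partition per file, so each verification is its own task.
      val results = sc.parallelize(files.toSeq, math.max(1, files.size))
        .map(path => (path, verifyFile(path)))
        .collect()

      results.filterNot(_._2).foreach {
        case (p, _) => println(s"NOT ENCRYPTED: $p")
      }
    }

    // Placeholder: the real check would parse the footer and the modules.
    def verifyFile(path: String): Boolean = ???
  }

One partition per file keeps a straggler on one big file from holding up a
batch of other paths, at the cost of a little more scheduler overhead.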

I do have an RDD which can do line-by-line work, with the locality of each
task determined from its file, which lets you schedule the work on the HDFS
nodes holding the data; unfortunately it needs to live in the o.a.spark
package to build:
https://github.com/hortonworks-spark/cloud-integration/blob/master/spark-cloud-integration/src/main/scala/org/apache/spark/cloudera/ParallelizedWithLocalityRDD.scala

...that could maybe be added to Spark itself.
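
For comparison, stock Spark's SparkContext.makeRDD already accepts per-element
location preferences (a Seq of (item, hosts) pairs); a minimal sketch of
feeding it HDFS block locations (the pairing logic is my assumption, not what
the linked class does internally):

  import org.apache.hadoop.fs.{FileSystem, Path}
  import org.apache.spark.SparkContext

  // Pair each file with the hosts holding its blocks, then hand the pairs
  // to makeRDD so the scheduler can place each task near its data.
  def filesWithLocality(sc: SparkContext, fs: FileSystem, paths: Seq[Path]) = {
    val withHosts = paths.map { p =>
      val status = fs.getFileStatus(p)
      val hosts = fs.getFileBlockLocations(status, 0, status.getLen)
        .flatMap(_.getHosts).distinct.toSeq
      (p.toString, hosts)
    }
    sc.makeRDD(withHosts) // RDD[String] with per-element preferred locations
  }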



> Deep verification of encrypted files
> ------------------------------------
>
>                 Key: PARQUET-1989
>                 URL: https://issues.apache.org/jira/browse/PARQUET-1989
>             Project: Parquet
>          Issue Type: New Feature
>          Components: parquet-cli
>            Reporter: Gidon Gershinsky
>            Assignee: Maya Anderson
>            Priority: Major
>             Fix For: 1.14.0
>
>
> A tool that verifies the encryption of Parquet files in a given folder. It
> analyzes the footer, and then every module (page headers, pages, column
> indexes, bloom filters), making sure they are encrypted (in the relevant
> columns) and potentially checking the encryption keys.
> We'll start with a design doc, open for discussion.
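
One cheap first-pass check along those lines (my sketch, not the proposed
design): a file written with footer encryption ends with the magic bytes
"PARE" rather than "PAR1", so the footer mode can be classified without any
keys; plaintext-footer encryption keeps "PAR1" and needs the deeper
module-level checks described above.

  import java.nio.charset.StandardCharsets
  import org.apache.hadoop.fs.{FileSystem, Path}

  // True iff the file's trailing magic is "PARE" (encrypted footer).
  def hasEncryptedFooter(fs: FileSystem, path: Path): Boolean = {
    val len = fs.getFileStatus(path).getLen
    val in = fs.open(path)
    try {
      in.seek(len - 4)
      val magic = new Array[Byte](4)
      in.readFully(magic)
      new String(magic, StandardCharsets.US_ASCII) == "PARE"
    } finally in.close()
  }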


