aokolnychyi commented on a change in pull request #26751: [SPARK-30107][SQL] Expose nested schema pruning to all V2 sources
URL: https://github.com/apache/spark/pull/26751#discussion_r356602388
 
 

 ##########
 File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/FileScanBuilder.scala
 ##########
 @@ -27,15 +27,20 @@ abstract class FileScanBuilder(
     dataSchema: StructType) extends ScanBuilder with SupportsPushDownRequiredColumns {
   private val partitionSchema = fileIndex.partitionSchema
   private val isCaseSensitive = sparkSession.sessionState.conf.caseSensitiveAnalysis
+  protected val supportsNestedSchemaPruning: Boolean = false
   protected var requiredSchema = StructType(dataSchema.fields ++ partitionSchema.fields)
 
   override def pruneColumns(requiredSchema: StructType): Unit = {
 +    // [SPARK-30107] While the passed `requiredSchema` always has pruned nested columns, the actual
 +    // data schema of this scan is determined in `readDataSchema`. File formats that don't support
 +    // nested schema pruning use `requiredSchema` only as a reference and perform the pruning partially.
     this.requiredSchema = requiredSchema
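
 For context (not part of the diff): below is a minimal sketch of how a format-specific builder could opt into the new flag. `MyFormatScanBuilder` and `MyFormatScan` are made-up names, the imports assume the Spark 3.0 connector packages, and a real built-in source would live alongside `FileScanBuilder` in `org.apache.spark.sql.execution.datasources.v2`.

 ```scala
 import org.apache.spark.sql.SparkSession
 import org.apache.spark.sql.connector.read.Scan
 import org.apache.spark.sql.execution.datasources.PartitioningAwareFileIndex
 import org.apache.spark.sql.execution.datasources.v2.FileScanBuilder
 import org.apache.spark.sql.types.StructType

 // Hypothetical: a trivial Scan that only reports its schema. A real file
 // source would extend FileScan and wire up partitions and reader factories.
 case class MyFormatScan(schema: StructType) extends Scan {
   override def readSchema(): StructType = schema
 }

 // Hypothetical builder for a format whose reader can skip unused nested fields.
 case class MyFormatScanBuilder(
     sparkSession: SparkSession,
     fileIndex: PartitioningAwareFileIndex,
     dataSchema: StructType)
   extends FileScanBuilder(sparkSession, fileIndex, dataSchema) {

   // Opting in: with the flag set, the nested pruning carried by the
   // `requiredSchema` passed to `pruneColumns` is kept in `readDataSchema()`.
   override protected val supportsNestedSchemaPruning: Boolean = true

   override def build(): Scan = MyFormatScan(readDataSchema())
 }
 ```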
 
 Review comment:
   I think you are right: it seems that data sources such as CSV and JSON simply ignore columns that are not needed in their readers (e.g. via `spark.sql.csv.parser.columnPruning.enabled`).
   
   While CSV doesn't support nested data, JSON could potentially benefit from this. I haven't checked the JSON reader in detail to see whether it would need any changes. Sounds like a potential follow-up?
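   
   For reference, a rough sketch of the reader-side CSV pruning mentioned above (illustrative only; the path and column name are placeholders, and it assumes an active `SparkSession` named `spark`):
   
   ```scala
   // CSV column pruning happens in the uniVocity parser when this conf is
   // enabled (it defaults to true in recent Spark versions); it is separate
   // from the nested schema pruning in this PR, since CSV has no nested types.
   spark.conf.set("spark.sql.csv.parser.columnPruning.enabled", "true")

   spark.read
     .option("header", "true")
     .csv("/path/to/data.csv") // placeholder path
     .select("id")             // only `id` needs to be materialized by the parser
     .show()
   ```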
