rangadi commented on code in PR #38922:
URL: https://github.com/apache/spark/pull/38922#discussion_r1053681591


##########
connector/protobuf/src/main/scala/org/apache/spark/sql/protobuf/utils/ProtobufOptions.scala:
##########
@@ -38,6 +38,14 @@ private[sql] class ProtobufOptions(
 
   val parseMode: ParseMode =
     parameters.get("mode").map(ParseMode.fromString).getOrElse(FailFastMode)
+
+  // Setting `recursive.fields.max.depth` to 0 drops all recursive fields,
+  // 1 allows a field to recurse once, 2 allows it to recurse twice, and so on.
+  // A value greater than 10 is not permitted. If the option is not specified,
+  // the default value is -1, meaning recursive fields are not permitted.
+  // If a protobuf record has more depth for recursive fields than the allowed
+  // value, it will be truncated and some fields may be discarded.
+  val recursiveFieldMaxDepth: Int = parameters.getOrElse("recursive.fields.max.depth", "-1").toInt
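
For reference, a minimal usage sketch of the new option (a sketch only: it assumes a DataFrame `df` with a binary `value` column, and the `Person` message name and descriptor path are hypothetical):

```scala
import org.apache.spark.sql.protobuf.functions.from_protobuf

// Allow recursive fields to be expanded up to twice before truncation.
// "recursive.fields.max.depth" is the option added in this diff.
val options = java.util.Collections.singletonMap("recursive.fields.max.depth", "2")

val parsed = df.select(
  from_protobuf(df("value"), "Person", "/tmp/person.desc", options).as("person"))
```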

Review Comment:
   @cloud-fan this is in line with options for the Kafka source, e.g. the 'kafka.' prefix allows setting Kafka client configs.
   
   In addition, we will be passing more options, e.g. for schema registry auth configs. They will have a prefix like 'confluent.schemaregistry.[actual registry client conf]'.
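   
   To illustrate the convention, a sketch of how such prefixed options could be collected (the helper name and exact prefix string are illustrative, not from this PR):
   
   ```scala
   // Hypothetical helper: extract schema registry client configs from the source
   // options by stripping a fixed prefix, analogous to how the Kafka source
   // strips its 'kafka.' prefix before passing configs to the Kafka client.
   def registryClientConfigs(parameters: Map[String, String]): Map[String, String] = {
     val prefix = "confluent.schemaregistry."
     parameters
       .filter { case (key, _) => key.startsWith(prefix) }
       .map { case (key, value) => key.stripPrefix(prefix) -> value }
   }
   ```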
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

