SandishKumarHN opened a new pull request, #37972:
URL: https://github.com/apache/spark/pull/37972

   From SandishKumarHN(sanysand...@gmail.com) and Mohan 
Parthasarathy(mposde...@gmail.com)
   
   # Introduction
   
   Protocol buffers are Google's language-neutral, platform-neutral, extensible 
mechanism for serializing structured data, and they are widely used in 
Kafka-based data pipelines. Unlike Avro, Spark has no native support for 
protobuf. This PR adds two new functions, from_proto/to_proto, to read and 
write protobuf data within a DataFrame.
   
   The implementation is closely modeled after the Avro implementation so that 
the changes are easy to understand and review.
   
   The following is an example of typical usage.
   
    
   
   ```scala
   // `from_proto` requires absolute path of Protobuf schema file
   // and the protobuf message within the file
   val userProtoFile = "./examples/src/main/resources/user.desc"
   val userProtoMsg = "User"
   
   val df = spark
     .readStream
     .format("kafka")
     .option("kafka.bootstrap.servers", "host1:port1,host2:port2")
     .option("subscribe", "proto-topic-in")
     .load()
   
   // 1. Decode the Protobuf data into a struct;
   // 2. Filter by column `favorite_color`;
   // 3. Encode the column `name` in Protobuf format.
   val output = df
     .select(from_proto('value, userProtoFile, userProtoMsg) as 'user)
     .where("user.favorite_color == \"red\"")
     .select(to_proto($"user.name") as 'value)
   
   val query = output
     .writeStream
     .format("kafka")
     .option("kafka.bootstrap.servers", "host1:port1,host2:port2")
     .option("topic", "proto-topic-out")
     .start()
   ```
   
   The new functions are very similar to their Avro counterparts:
   
   - from_proto requires the protobuf descriptor file and the message type 
within that file, much as from_avro requires the JSON schema.
   - to_proto, like to_avro, does not require the descriptor file, since it can 
build the schema (protobuf descriptor) from the Catalyst types. Like to_avro, 
to_proto can also take the descriptor file explicitly to describe the schema.
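
   The asymmetry above (the descriptor file is required for reading but 
optional for writing) can be sketched with hypothetical stubs; these are 
illustrative only, not Spark's actual API:
   
   ```scala
   // Illustrative stubs (not the PR's real code): reading always resolves the
   // schema from the .desc file, while writing can derive one from Catalyst
   // types or accept an explicit descriptor file.
   object SchemaSourceSketch {
     final case class Descriptor(message: String, source: String)
   
     // from_proto-style: the descriptor must come from the descriptor file.
     def descriptorForRead(descFile: String, message: String): Descriptor =
       Descriptor(message, s"file:$descFile")
   
     // to_proto-style: derive the descriptor from the Catalyst struct fields...
     def descriptorForWrite(catalystFields: Seq[String]): Descriptor =
       Descriptor(catalystFields.mkString(","), "derived-from-catalyst")
   
     // ...or accept an explicit descriptor file, as with from_proto.
     def descriptorForWrite(descFile: String, message: String): Descriptor =
       Descriptor(message, s"file:$descFile")
   }
   ```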
   
   ## What is supported
   
   - Protobuf format proto3 is supported (even though proto2 and proto3 are 
interoperable, we have explicitly tested only with proto3)
   - Google protobuf supported types
       - Scalar value types
       - Enumerations
       - Message types as field types
       - Nested messages
       - Maps
       - Unknown fields: well-formed protocol buffer serialized data 
representing fields that the parser does not recognize. The original version of 
proto3 dropped unknown fields during parsing; they have been retained again 
since protobuf 3.5. This feature is needed to detect schemas that do not match 
the message type and to support the FAILFAST and PERMISSIVE modes.
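
   For reference, a plausible mapping from protobuf scalar types to Catalyst 
types, following the Avro-style conventions (illustrative; the exact mapping 
used by this PR may differ in detail):
   
   ```scala
   // Sketch of an assumed protobuf-scalar-to-Catalyst type mapping.
   // The names below are the natural candidates, not a quote of the PR's code.
   object ScalarTypeMapping {
     val protoToCatalyst: Map[String, String] = Map(
       "int32"  -> "IntegerType",
       "int64"  -> "LongType",
       "float"  -> "FloatType",
       "double" -> "DoubleType",
       "bool"   -> "BooleanType",
       "string" -> "StringType",
       "bytes"  -> "BinaryType"
     )
   }
   ```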
   
   ## What is not supported
   
   - Any: requires knowledge of the underlying object type when deserializing 
the message, and is generally not considered type-safe
   - OneOf: requires knowledge of the object type that was encoded when 
deserializing the message
   - Custom options: an advanced protobuf feature that lets users define their 
own options
   - Catalyst types that are not natively supported in protobuf. An exception 
will be thrown during serialization when the following types are encountered:
       - DecimalType
       - DateType
       - TimestampType
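
   The serialization-time check described above can be sketched as follows (an 
illustrative stub, not the PR's actual code; the exception type is an 
assumption):
   
   ```scala
   // Illustrative sketch: reject the Catalyst types the PR lists as
   // unsupported before attempting protobuf serialization.
   object UnsupportedTypeCheck {
     private val unsupported = Set("DecimalType", "DateType", "TimestampType")
   
     def validate(catalystType: String): Unit =
       if (unsupported.contains(catalystType))
         throw new IllegalArgumentException(
           s"Cannot serialize Catalyst type $catalystType to protobuf")
   }
   ```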
   
   ## Test cases covered
   
   Tests have been written to test at different levels
   
   - from_proto / to_proto (ProtoFunctionSuite)
   - ProtoToCatalyst / CatalystToProto (ProtoCatalystDataConversionSuite)
   - ProtoDeserializer / ProtoSerializer (ProtoSerdeSuite)
   
   ### ProtoFunctionSuite
   
   A set of roundtrip tests that go through to_proto(from_proto) or 
from_proto(to_proto) and compare the results. Some tests are repeated with 
to_proto called without a descriptor file, so that the protobuf descriptor is 
built from the Catalyst types.
   
   - roundtrip in to_proto and from_proto for struct for protobuf scalar types
   - roundtrip in to_proto(without descriptor params) and from_proto - struct 
for protobuf scalar types
   - roundtrip in from_proto and to_proto - Repeated protobuf types
   - roundtrip in from_proto and to_proto - Repeated Message Once
   - roundtrip in from_proto and to_proto(without descriptor params) - Repeated 
Message Once
   - roundtrip in from_proto and to_proto - Repeated Message Twice
   - roundtrip in from_proto and to_proto(without descriptor params) - Repeated 
Message Twice
   - roundtrip in from_proto and to_proto - Map
   - roundtrip in from_proto and to_proto(without descriptor params) - Map
   - roundtrip in from_proto and to_proto - Enum
   - roundtrip in from_proto and to_proto - Multiple Message
   - roundtrip in from_proto and to_proto(without descriptor params) - Multiple 
Message
   - roundtrip in to_proto and from_proto - with null
   
   ### ProtoSerdeSuite
   
   - Test basic conversion - serialize(deserialize(message)) == message
   - Fail to convert with field type mismatch - make sure the right exception 
is thrown for incompatible schemas in both the serializer and the deserializer
   - Fail to convert with missing nested Proto fields
   - Fail to convert with deeply nested field type mismatch
   - Fail to convert with missing Catalyst fields
   
   ### ProtoCatalystDataConversionSuite
   
   - ProtoToCatalyst(to_proto(basic_catalyst_types)): Boolean, Integer, Double, 
Float, Binary, String, Byte, Short
   - Handle unsupported input of Message type: serialize a message first, then 
deserialize it using a bad schema. Tested with FAILFAST to get an exception and 
PERMISSIVE to get a null row
   - filter push-down to proto deserializer: filter rows based on a predicate 
during proto deserialization
   - Test ProtoDeserializer with binary message type
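
   The FAILFAST/PERMISSIVE behavior exercised by these tests can be sketched in 
plain Scala (a simplified model of the semantics described above, not the 
actual deserializer):
   
   ```scala
   // Illustrative model: on a schema mismatch, FAILFAST surfaces the parse
   // error while PERMISSIVE yields None (surfaced to the caller as a null row).
   object ParseModeSketch {
     sealed trait Mode
     case object FailFast extends Mode
     case object Permissive extends Mode
   
     def decode(bytes: Array[Byte], schemaMatches: Boolean, mode: Mode): Option[String] =
       if (schemaMatches) Some(new String(bytes, "UTF-8"))
       else mode match {
         case FailFast   => throw new RuntimeException("Malformed protobuf record")
         case Permissive => None
       }
   }
   ```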


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

