Hello!

I'm trying to create my own Spark Connect plugin by implementing
`org.apache.spark.sql.connect.plugin.RelationPlugin`. My target is Spark 4.0.

What I did:

1. I added a compile-time dependency on `spark-connect-common_2.13`
2. I added `import public "spark/connect/base.proto";` to my proto file
3. I added a field `spark.connect.Plan data = 1;` to my message
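For reference, the steps above correspond to a .proto file roughly like the
following sketch. The package, options, and the `MyRelation` message name are
my placeholders, not anything from the Spark sources:

```protobuf
// Sketch of a custom plugin message embedding a Spark Connect plan.
// Package, java_package, and message names are hypothetical.
syntax = "proto3";

package example.plugin;

// Pulls in the spark.connect message definitions (step 2).
import public "spark/connect/base.proto";

option java_multiple_files = true;
option java_package = "com.example.plugin.proto";

// Custom relation carrying a full Spark Connect plan (step 3).
message MyRelation {
  spark.connect.Plan data = 1;
}
```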

I can successfully run `mvn generate-sources`, but the generated code does
not compile.

I get a lot of errors, for example:
- `org.sparkproject.connect.protobuf.Descriptors.FileDescriptor cannot
be converted to com.google.protobuf.Descriptors.FileDescriptor`
- `type argument org.apache.spark.connect.proto.Plan is not within
bounds of type-variable MType`
- `org.apache.spark.connect.proto.Plan cannot be converted to
com.google.protobuf.MessageLite`

I do not use the `LITE` runtime optimization for my proto messages. I even
tried explicitly adding `option java_generate_equals_and_hash = true;` and
`option optimize_for = SPEED;`, but it did not help.

Is there any way to use the `spark.connect` protos from `spark-connect-
common` in custom plugins? Or is there another way to pass a `DataFrame`
from the client to a plugin that implements `RelationPlugin`?

Thanks in advance!
