Github user dongjoon-hyun commented on the issue: https://github.com/apache/spark/pull/21878

Thank you, @gengliangwang and @HyukjinKwon. For the message, I followed the convention used by Apache Spark `external` modules; e.g., we need to specify `kafka` like the following. `avro` is also designed as an `external` module, so we should follow the same approach.

```
./bin/spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.1
```

@gengliangwang, I assumed that you are the best person to add the `avro` documentation. If you need me to do something, please feel free to let me know. :)
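Following that same `external`-module convention, the `avro` data source would presumably be pulled in with an analogous `--packages` coordinate. This is a sketch only; the exact artifact name, Scala suffix, and version below are assumptions, not confirmed by this PR:

```
# Hypothetical invocation, mirroring the kafka example above
./bin/spark-shell --packages org.apache.spark:spark-avro_2.11:<spark-version>
```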