[GitHub] [hudi] leesf commented on issue #5488: [SUPPORT] Read hive Table fail when HoodieCatalog used
leesf commented on issue #5488: URL: https://github.com/apache/hudi/issues/5488#issuecomment-1146421860

Closing the issue, @parisni; please reopen if you have new problems.

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: commits-unsubscr...@hudi.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [hudi] leesf commented on issue #5488: [SUPPORT] Read hive Table fail when HoodieCatalog used
leesf commented on issue #5488: URL: https://github.com/apache/hudi/issues/5488#issuecomment-1132950771

> yeah sorry this was internal code leading to the same result:
>
> ```
> SparkSession.builder().config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.hudi.catalog.HoodieCatalog").config("spark.sql.extensions", "org.apache.spark.sql.hudi.HoodieSparkSessionExtension").getOrCreate()
> ```
>
> you can test my code snippet in the OP and reproduce the error on your side
>
> […]

I tried in my local env and it shows the same result as xushiyan pasted.
[GitHub] [hudi] leesf commented on issue #5488: [SUPPORT] Read hive Table fail when HoodieCatalog used
leesf commented on issue #5488: URL: https://github.com/apache/hudi/issues/5488#issuecomment-1132552145

> nope, sadly adding the below configs doesn't solve the issue
>
> ```
> sparkConf.set(
>     "spark.sql.catalog.spark_catalog", "org.apache.spark.sql.hudi.catalog.HoodieCatalog");
> sparkConf.set("spark.sql.extensions", "org.apache.spark.sql.hudi.HoodieSparkSessionExtension");
> ```

@parisni I think you have not set the config correctly; please use the following code:

```
SparkSession.builder().config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.hudi.catalog.HoodieCatalog").config("spark.sql.extensions", "org.apache.spark.sql.hudi.HoodieSparkSessionExtension").getOrCreate()
```

or use this command to open a new Spark shell:

```
pyspark --packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.11.0 \
  --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
  --conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog' \
  --conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
```
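The launch command above is just three `--conf` key/value pairs plus the Hudi bundle package. As a minimal sketch, the flags can be assembled programmatically; the config keys and values are taken verbatim from the comment, while the helper code itself is only illustrative:

```python
# Build the --conf flags recommended in the comment for launching pyspark
# with HoodieCatalog. Keys/values are copied verbatim from the thread;
# the assembly logic is illustrative only.
hudi_confs = {
    "spark.serializer": "org.apache.spark.serializer.KryoSerializer",
    "spark.sql.catalog.spark_catalog":
        "org.apache.spark.sql.hudi.catalog.HoodieCatalog",
    "spark.sql.extensions":
        "org.apache.spark.sql.hudi.HoodieSparkSessionExtension",
}

# Render each pair as a quoted --conf flag, preserving insertion order.
flags = " ".join(f"--conf '{k}={v}'" for k, v in hudi_confs.items())
command = (
    "pyspark --packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.11.0 "
    + flags
)
print(command)
```

Note that both the catalog setting and the `spark.sql.extensions` setting must be present; as the thread shows, supplying only the catalog config still fails to read the Hive table.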
[GitHub] [hudi] leesf commented on issue #5488: [SUPPORT] Read hive Table fail when HoodieCatalog used
leesf commented on issue #5488: URL: https://github.com/apache/hudi/issues/5488#issuecomment-1126579180

The conf `--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'` is missing from the [Spark Guide](https://hudi.apache.org/docs/quick-start-guide) for Scala and Python.
[GitHub] [hudi] leesf commented on issue #5488: [SUPPORT] Read hive Table fail when HoodieCatalog used
leesf commented on issue #5488: URL: https://github.com/apache/hudi/issues/5488#issuecomment-1126577288

@parisni would you please add the conf `--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'` as well? This should solve your problem.