AwasthiSomesh commented on issue #12235:
URL: https://github.com/apache/iceberg/issues/12235#issuecomment-2653907845
@nastra My use case is transferring data from a source to a target:
the source reads data from AWSDATACATALOG1 and the target writes into
AWSDATACATALOG2.
How can a single Spark session be given two default catalog values?
Is there a way to do this, or is it a limitation on the Spark side?
Could you please confirm this point only.
**Note:-** Spark allows the "spark.sql.extensions" key to take two different
values, comma-separated, but it does not allow this for other keys:
.config("spark.sql.extensions",
  "org.apache.iceberg.spark.extensions.DeltaSparkSessionExtensions,
   org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
Thanks,
Somesh