I am calling the Spark Dataset API (the map method) and getting exceptions when
task results are deserialized. I am calling this API from Clojure using
standard JVM interop syntax.
This gist has a tiny Clojure program that shows the problem, as well as the
corresponding (working) Scala program.
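For context, a minimal Scala Dataset.map call of the kind involved might look like the sketch below (an illustration assuming Spark 2.x; the object name and data are placeholders, not taken from the gist). In Scala the Encoder for the result type is supplied implicitly via spark.implicits._; from Clojure there is no implicit resolution, so the MapFunction overload of map plus an explicit Encoder (e.g. from Encoders) has to be used, and the Clojure function class must be serializable and present on the executor classpath.

```scala
import org.apache.spark.sql.SparkSession

// Sketch: Dataset.map in Scala, where Encoder[Int] is resolved
// implicitly. This is the overload that "just works" in Scala but
// is not directly reachable through plain JVM interop.
object MapExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("map-example")
      .getOrCreate()
    import spark.implicits._

    val ds = spark.createDataset(Seq(1, 2, 3))
    val doubled = ds.map(_ * 2) // Encoder[Int] supplied implicitly
    doubled.show()

    spark.stop()
  }
}
```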
Dear Spark Expert,
I have an issue with the `--conf 'spark.redaction.regex'` setting below.
Issue:
I am passing some secret keys in the spark-submit command and am using the
following to redact the key: --conf 'spark.redaction.regex='secret_key'
Though it is working, the secret_key is still visible in the Spark UI while the job runs.
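One likely culprit is shell quoting: in `--conf 'spark.redaction.regex='secret_key'` the single quotes are unbalanced, so the regex Spark actually receives may not be what was intended. Note also that `spark.redaction.regex` matches configuration property *names*; the values of matching properties are then redacted in the UI and logs. It does not scrub arbitrary strings. A corrected invocation might look like the following sketch (the application class, jar, property name `spark.myapp.secret_key`, and value are placeholders):

```shell
# Redact the value of any config property whose name matches the regex.
# (?i)secret|password|token is Spark's documented default for
# spark.redaction.regex; it is spelled out here for clarity.
spark-submit \
  --conf "spark.redaction.regex=(?i)secret|password|token" \
  --conf "spark.myapp.secret_key=REPLACE_ME" \
  --class com.example.Main \
  myapp.jar
```

If the secret shows up inside SQL text or explain output rather than as a property value, the related `spark.redaction.string.regex` setting is the one to check.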
I hit the same problem in Spark 2.3.x and found the pull request
https://github.com/apache/spark/pull/24616 that was meant to fix it. At first I thought it would
solve this problem, but it seems it does not. I'm wondering which part limits the
delete performance?
From: Kohki Nishio [mailto:tarop...@gmail.com]
Sent: