The Apache Spark on Kubernetes Community Development Project is pleased to
announce the latest release of Apache Spark with a native scheduler backend
for Kubernetes! Features in this release include:


- Cluster-mode submission of Spark jobs to a Kubernetes cluster
- Support for Scala, Java, and PySpark
- Static and dynamic allocation for executors
- Automatic staging of local resources onto driver and executor pods
- Configurable security and credential management
- HDFS, running on the Kubernetes cluster or externally
- Launching jobs through kubectl proxy
- Built against Apache Spark 2.1 and 2.2
- Support for Kubernetes 1.5 - 1.7
- Pre-built Docker images
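For example, a cluster-mode submission against this scheduler backend looks roughly like the sketch below. The API server address, namespace, image tags, and jar path are placeholders for illustration, and the exact `spark.kubernetes.*` property names should be checked against the project's user documentation for the release you are running:

```shell
# Sketch: submit the SparkPi example in cluster mode to a Kubernetes cluster.
# Replace <k8s-apiserver-host>:<port> with your cluster's API server, and the
# image tags and jar path with the ones matching your release.
bin/spark-submit \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  --master k8s://https://<k8s-apiserver-host>:<port> \
  --conf spark.kubernetes.namespace=default \
  --conf spark.executor.instances=2 \
  --conf spark.kubernetes.driver.docker.image=kubespark/spark-driver:<tag> \
  --conf spark.kubernetes.executor.docker.image=kubespark/spark-executor:<tag> \
  local:///opt/spark/examples/jars/spark-examples.jar
```

The `k8s://` master URL tells spark-submit to use the Kubernetes scheduler backend, and `local://` marks a jar already present inside the Docker images rather than one staged from the client machine.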


Apache Spark on Kubernetes is being developed as an independent community
project, with several companies actively collaborating. The project resides
in the apache-spark-on-k8s GitHub organization and tracks upstream Apache
Spark releases:

https://github.com/apache-spark-on-k8s/spark

If you have any questions or issues, we are happy to help! Please feel free
to reach out to the Spark on Kubernetes community on these channels:

- Slack: https://kubernetes.slack.com #sig-big-data
- User documentation: https://apache-spark-on-k8s.github.io/userdocs/
- GitHub issues: https://github.com/apache-spark-on-k8s/spark/issues
- SIG: https://github.com/kubernetes/community/tree/master/sig-big-data
