I haven't used Spark in the last year and a half. I am about to start a
project with a new team, and we need to decide whether to use PySpark or
Scala.

We are NOT a Java shop, so some of the build tools and procedures will
involve a learning overhead if we go the Scala route. What I want to know
is: is the Scala version of Spark still far enough ahead of PySpark to be
worth that initial training overhead?

In particular, we will be using Spark Streaming. A couple of years ago
that practically forced the choice of Scala. Is this still the case?
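
For concreteness, here is a rough sketch of the kind of job we'd be
writing in PySpark: a minimal Structured Streaming word count. The socket
source on localhost:9999 is just an illustrative stand-in, not our actual
source.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import explode, split

    spark = SparkSession.builder.appName("StreamingSketch").getOrCreate()

    # Read a stream of lines from a socket
    # (placeholder source; run `nc -lk 9999` to feed it)
    lines = (spark.readStream
             .format("socket")
             .option("host", "localhost")
             .option("port", 9999)
             .load())

    # Split each line into words and keep a running count per word
    words = lines.select(explode(split(lines.value, " ")).alias("word"))
    counts = words.groupBy("word").count()

    # Print the updated counts to the console after each micro-batch
    query = (counts.writeStream
             .outputMode("complete")
             .format("console")
             .start())
    query.awaitTermination()

If the Python API now covers this kind of pipeline as completely as the
Scala one does, that would change the calculus for us.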

Thanks in advance!
