hello spark-world,

I am new to Spark and want to learn how to use it.

I come from the Python world.

I see an example at the URL below:

http://spark.apache.org/docs/latest/ml-pipeline.html#example-estimator-transformer-and-param

What would be an optimal way to run the above example?

In the Python world I would just pass the name of the script to the
interpreter on the command line.

In the Spark world, would people just start spark-shell and paste the code
in interactively?
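For what it's worth, here is how I imagine that might look (the file name is just one I made up for the example code):

```shell
# Guess 1: save the example into a local .scala file and have
# spark-shell evaluate it non-interactively with -i
spark-shell -i LogisticRegressionExample.scala

# Guess 2: start spark-shell and use the REPL's :paste command,
# then paste the example and finish with Ctrl-D
spark-shell
```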

Or perhaps people would follow the self-contained application example here,
which uses a combination of sbt and spark-submit:

http://spark.apache.org/docs/latest/quick-start.html#self-contained-applications

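If I read that quick-start page correctly, the workflow would be roughly the following (the project layout, class name, and Scala version are taken from the guide and may differ on my machine):

```shell
# Hypothetical project layout, per the quick-start guide:
#   build.sbt
#   src/main/scala/SimpleApp.scala

# Package the application into a jar with sbt
sbt package

# Submit the jar to Spark; the Scala version in the jar path
# depends on the local setup
spark-submit \
  --class "SimpleApp" \
  --master "local[4]" \
  target/scala-2.12/simple-project_2.12-1.0.jar
```

Is that the usual way, or is it overkill for just trying out an example?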

Or perhaps people usually have a Java mindset and use an IDE suited to
Spark development?
If so, which would be considered the best IDE for Spark? IntelliJ?
