You can run Spark code from the command line or by building a JAR file
(via IntelliJ or another IDE); however, you may wish to try a Databricks
Community Edition account instead. Databricks offers Spark as a managed
service, and you can run Spark commands one at a time in interactive
notebooks. There are built-in visualization tools, with the ability to
integrate with third-party ones if you wish.
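
For example, here is a minimal sketch (trimmed down from the
Estimator/Transformer example you linked) of the kind of thing you can run
cell by cell in a notebook, or paste into spark-shell, where the spark
SparkSession is already defined for you:

    import org.apache.spark.ml.classification.LogisticRegression
    import org.apache.spark.ml.linalg.Vectors

    // Tiny training set from the docs example; "spark" is the
    // SparkSession that spark-shell and Databricks notebooks predefine
    val training = spark.createDataFrame(Seq(
      (1.0, Vectors.dense(0.0, 1.1, 0.1)),
      (0.0, Vectors.dense(2.0, 1.0, -1.0)),
      (0.0, Vectors.dense(2.0, 1.3, 1.0)),
      (1.0, Vectors.dense(0.0, 1.2, -0.5))
    )).toDF("label", "features")

    // LogisticRegression is an Estimator; fit() returns a Model,
    // which is a Transformer
    val lr = new LogisticRegression().setMaxIter(10).setRegParam(0.01)
    val model = lr.fit(training)
    println(model.coefficients)

Each statement runs as you submit it, which makes it easy to inspect
intermediate results as you go.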

This style of development is very similar to IPython, and the free
accounts come with a 6GB cluster. Databricks also provides many example
notebooks to help you learn various aspects of Spark.

https://databricks.com/try-databricks
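
If you do want the self-contained route from the quick-start guide you
found, the workflow is roughly: write a small application, build a JAR
with sbt package, and hand that JAR to spark-submit. A minimal sketch,
adapted from that guide (the file path is just a placeholder):

    // SimpleApp.scala
    import org.apache.spark.sql.SparkSession

    object SimpleApp {
      def main(args: Array[String]): Unit = {
        // Placeholder path: point this at any text file you have
        val logFile = "README.md"
        val spark = SparkSession.builder
          .appName("Simple Application")
          .getOrCreate()
        val logData = spark.read.textFile(logFile).cache()
        val numAs = logData.filter(_.contains("a")).count()
        val numBs = logData.filter(_.contains("b")).count()
        println(s"Lines with a: $numAs, lines with b: $numBs")
        spark.stop()
      }
    }

After sbt package, something like

    spark-submit --class SimpleApp target/scala-2.11/simple-project_2.11-1.0.jar

runs it locally (the exact JAR name depends on your build.sbt).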

Thanks,
Kevin

On Fri, Sep 23, 2016 at 2:37 PM, Dan Bikle <bikle...@gmail.com> wrote:

> hello spark-world,
>
> I am new to Spark and want to learn how to use it.
>
> I come from the Python world.
>
> I see an example at the url below:
>
> http://spark.apache.org/docs/latest/ml-pipeline.html#example-estimator-transformer-and-param
>
> What would be an optimal way to run the above example?
>
> In the Python world I would just feed the name of the script to Python on
> the command line.
>
> In the spark-world, would people just start spark-shell and paste the
> code in with the mouse?
>
> Perhaps people would follow the example here, which uses a combination of
> sbt and spark-submit:
>
> http://spark.apache.org/docs/latest/quick-start.html#self-contained-applications
>
> ??
>
> Perhaps people usually have a Java mindset and use an IDE built for
> Spark development?
> If so, which would be considered the best IDE for Spark? IntelliJ?
>
>
