Thanks guys, this is very useful :)

@Stephen, I know spark-shell will create a SparkContext for me. But I
don't understand why we still need to do "new SparkContext(...)" in our
own code. Shouldn't we be able to get it from somewhere, e.g. something
like "SparkContext.get"?
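
Right now, to be concrete, we write something like this (the app name
is just a placeholder):

  val conf = new SparkConf().setAppName("MyApp")
  val sc = new SparkContext(conf)

and I was hoping for an accessor that returns the shell's existing
context instead.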

Another question: if I want my Spark code to run on YARN later, how
should I create the SparkContext? Or can I just specify "--master yarn"
on the command line?
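
In other words, could I leave the master out of the code entirely and
launch with something like this (the class and jar names are just
placeholders)?

  spark-submit --master yarn --class com.example.MyApp myapp.jar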


Thanks,
David


On Fri, Mar 6, 2015 at 12:38 PM Koen Vantomme <koen.vanto...@gmail.com>
wrote:

> Use the spark-shell command and the shell will open.
> Type :paste, paste your code, then press Ctrl-D to run it.
>
> To open spark-shell:
> cd spark/bin
> ./spark-shell
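>
> For example, once the shell is up a SparkContext is already available
> as sc, so a quick sanity check (just a toy job) is:
>
>   scala> sc.parallelize(1 to 100).sum()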
>
> Sent from my iPhone
>
> On 6 Mar 2015, at 02:28, "fightf...@163.com" <fightf...@163.com>
> wrote:
>
> Hi,
>
> You can first set up a Scala IDE, say IntelliJ IDEA or Eclipse, to
> develop and debug your Spark program.
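>
> For instance, here is a minimal sketch of a program you can run
> straight from the IDE (the object and app names are placeholders;
> setting the master to local[*] runs Spark in-process, so no
> spark-submit is needed):
>
>   import org.apache.spark.{SparkConf, SparkContext}
>
>   object MyApp {
>     def main(args: Array[String]): Unit = {
>       val conf = new SparkConf().setAppName("MyApp").setMaster("local[*]")
>       val sc = new SparkContext(conf)
>       // tiny job just to verify the setup works
>       println(sc.parallelize(1 to 100).sum())
>       sc.stop()
>     }
>   }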
>
> Thanks,
> Sun.
>
> ------------------------------
> fightf...@163.com
>
>
> *From:* Xi Shen <davidshe...@gmail.com>
> *Date:* 2015-03-06 09:19
> *To:* user@spark.apache.org
> *Subject:* Spark code development practice
> Hi,
>
> I am new to Spark. I see every Spark program has a main() function. I
> wonder if I can run a Spark program directly, without using
> spark-submit. I think that would make early development and debugging
> easier.
>
>
> Thanks,
> David
>
>
