Re: Spark 2.0.0 won't let you create a new SparkContext?

2016-09-13 Thread Mark Hamstra
It sounds like you should be writing an application and not trying to force the spark-shell to do more than what it was intended for.
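For what that might look like: a minimal sketch of a standalone Spark 2.0 application that keeps the ES configuration in code rather than on the spark-shell command line (the es.* keys and values here are placeholders, not settings confirmed in this thread):

    import org.apache.spark.sql.SparkSession

    object EsReadJob {
      def main(args: Array[String]): Unit = {
        // Configuration lives in code instead of on the spark-shell
        // command line; the es.* values below are placeholders.
        val spark = SparkSession.builder()
          .appName("es-read-job")
          .config("es.nodes", "localhost")
          .config("es.port", "9200")
          .getOrCreate()

        // ... read from ES, transform, write results ...

        spark.stop()
      }
    }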

Re: Spark 2.0.0 won't let you create a new SparkContext?

2016-09-13 Thread Kevin Burton
I sort of agree, but the problem is that some of this should be code. Some of our ES indexes have 100-200 columns. Defining which ones are arrays on the command line is going to get ugly fast.
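To illustrate the point, a sketch of keeping those hints in code instead. This assumes the elasticsearch-hadoop setting es.read.field.as.array.include for marking fields as arrays (verify that your es-hadoop version supports it); the field names are invented:

    // Build the (long) list of array-typed fields in code rather than
    // as command-line flags. Field names are made up for illustration.
    val arrayFields = Seq("tags", "authors", "mentions") // ...and many more

    val df = spark.read
      .format("org.elasticsearch.spark.sql")
      .option("es.read.field.as.array.include", arrayFields.mkString(","))
      .load("myindex/mytype")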

Re: Spark 2.0.0 won't let you create a new SparkContext?

2016-09-13 Thread Sean Owen
You would generally use --conf to set this on the command line if using the shell.
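For example (a sketch: spark-submit only forwards spark.-prefixed properties from --conf, and elasticsearch-hadoop documents a spark.es.* form of its settings for exactly this case; the host and port are placeholders):

    spark-shell --conf spark.es.nodes=localhost --conf spark.es.port=9200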

Re: Spark 2.0.0 won't let you create a new SparkContext?

2016-09-13 Thread Kevin Burton
The problem is that without a new SparkContext with a custom conf, elasticsearch-hadoop is refusing to read in settings about the ES setup... If I do sc.stop() and then create a new one, it seems to work fine. But it isn't really documented anywhere, and all the existing documentation is now
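A minimal sketch of the stop-and-recreate approach described above (the es.* keys and values are placeholders; the master and app name are normally inherited from the shell's original submission via system properties):

    // Run inside spark-shell after the default context has started.
    import org.apache.spark.{SparkConf, SparkContext}

    sc.stop()  // stop the shell's default context first

    // new SparkConf() picks up the spark.* system properties set by
    // spark-submit; the es.* values below are placeholders.
    val conf = new SparkConf()
      .set("es.nodes", "localhost")
      .set("es.port", "9200")

    val sc2 = new SparkContext(conf)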

Re: Spark 2.0.0 won't let you create a new SparkContext?

2016-09-13 Thread Mich Talebzadeh
I think this works in a shell, but you need to allow multiple spark contexts:

    Spark context Web UI available at http://50.140.197.217:5
    Spark context available as 'sc' (master = local, app id = local-1473789661846).
    Spark session available as 'spark'.
    Welcome to __
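A sketch of that escape hatch. spark.driver.allowMultipleContexts existed in Spark 1.x/2.x but was never recommended and was removed in later releases; the master and app name below are placeholders:

    import org.apache.spark.{SparkConf, SparkContext}

    // Allows a second SparkContext alongside the shell's 'sc'. Use with
    // caution: multiple contexts in one JVM were never officially supported.
    val conf = new SparkConf()
      .setAppName("second-context")  // placeholder
      .setMaster("local[*]")         // placeholder
      .set("spark.driver.allowMultipleContexts", "true")

    val sc2 = new SparkContext(conf)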

Re: Spark 2.0.0 won't let you create a new SparkContext?

2016-09-13 Thread Marcelo Vanzin
You're running spark-shell. It already creates a SparkContext for you and makes it available in a variable called "sc". If you want to change the config of spark-shell's context, you need to use a command line option. (Or stop the existing context first, although I'm not sure how well that will work.)

Re: Spark 2.0.0 won't let you create a new SparkContext?

2016-09-13 Thread Sean Owen
But you're in the shell there, which already has a SparkContext for you as sc.

Spark 2.0.0 won't let you create a new SparkContext?

2016-09-13 Thread Kevin Burton
I'm rather confused here as to what to do about creating a new SparkContext. Spark 2.0 prevents it... (exception included below) yet a TON of examples I've seen basically tell you to create a new SparkContext as standard practice:
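For context, the pattern Spark 2.x itself steers you toward is SparkSession.builder().getOrCreate() rather than constructing a second SparkContext. A sketch follows (the es.* config is a placeholder); note that options passed this way are applied to the existing session, not to the already-started SparkContext's SparkConf, which is exactly why the replies above suggest --conf or stop-and-recreate:

    import org.apache.spark.sql.SparkSession

    // Reuses the running session instead of constructing a new
    // SparkContext (which Spark 2.0 refuses when one already exists).
    val spark = SparkSession.builder()
      .config("es.nodes", "localhost")  // placeholder; may not reach the
      .getOrCreate()                    // running SparkContext's SparkConf

    val sc = spark.sparkContext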