You can also access the SparkConf via sc.getConf in the Spark shell, though
for StreamingContext you can refer to sc directly, as Akhil suggested.
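To make that concrete, here is a minimal sketch for the Spark shell (assumes a running shell where sc is already bound; the one-second batch interval just mirrors the examples in this thread):

```scala
import org.apache.spark.streaming.{Seconds, StreamingContext}

// The shell already provides a SparkContext bound to `sc`,
// so reuse it rather than constructing one from a SparkConf.
val ssc = new StreamingContext(sc, Seconds(1))

// The shell's configuration is still reachable through the existing context:
val conf = sc.getConf
println(conf.get("spark.app.name"))
```

Calling new StreamingContext(conf, Seconds(1)) inside the shell is what triggers the SPARK-2243 error, because that constructor tries to create a second SparkContext in the same JVM; passing sc instead (or calling sc.stop() first if you really need a fresh context) avoids it.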

On Sun, Dec 28, 2014 at 12:13 AM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> In the shell you could do:
>
> val ssc = new StreamingContext(sc, Seconds(1))
>
> as sc is the SparkContext, which is already instantiated.
>
> Thanks
> Best Regards
>
> On Sun, Dec 28, 2014 at 6:55 AM, Thomas Frisk <tfris...@gmail.com> wrote:
>
>> Yes you are right - thanks for that :)
>>
>> On 27 December 2014 at 23:18, Ilya Ganelin <ilgan...@gmail.com> wrote:
>>
>>> Are you trying to do this in the shell? Shell is instantiated with a
>>> spark context named sc.
>>>
>>> -Ilya Ganelin
>>>
>>> On Sat, Dec 27, 2014 at 5:24 PM, tfrisk <tfris...@gmail.com> wrote:
>>>
>>>>
>>>> Hi,
>>>>
>>>> Doing:
>>>>    val ssc = new StreamingContext(conf, Seconds(1))
>>>>
>>>> and getting:
>>>>    Only one SparkContext may be running in this JVM (see SPARK-2243). To
>>>> ignore this error, set spark.driver.allowMultipleContexts = true.
>>>>
>>>>
>>>> But I don't think that I have another SparkContext running. Is there
>>>> any way I can check this or force-kill it? I've tried restarting the
>>>> server as I'm desperate, but I still get the same issue. I was not
>>>> getting this earlier today.
>>>>
>>>> Any help much appreciated .....
>>>>
>>>> Thanks,
>>>>
>>>> Thomas
>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> View this message in context:
>>>> http://apache-spark-user-list.1001560.n3.nabble.com/Problem-with-StreamingContext-getting-SPARK-2243-tp20869.html
>>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>>
>>>> ---------------------------------------------------------------------
>>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>>
>>>>
>>>
>>
>
