Re: Streaming Job gives error after changing to version 1.5.2

2015-11-19 Thread swetha kasireddy
That was actually an issue with our Mesos setup.

On Wed, Nov 18, 2015 at 5:29 PM, Tathagata Das <t...@databricks.com> wrote:

> If possible, could you give us the root cause and solution for future
> readers of this thread.

Re: Streaming Job gives error after changing to version 1.5.2

2015-11-18 Thread swetha kasireddy
It works fine after some changes.

-Thanks,
Swetha

On Tue, Nov 17, 2015 at 10:22 PM, Tathagata Das <t...@databricks.com> wrote:

> Can you verify that the cluster is running the correct version of Spark,
> 1.5.2?

Re: Streaming Job gives error after changing to version 1.5.2

2015-11-18 Thread Tathagata Das
If possible, could you give us the root cause and solution for future
readers of this thread?

On Wed, Nov 18, 2015 at 6:37 AM, swetha kasireddy <swethakasire...@gmail.com> wrote:

> It works fine after some changes.
>
> -Thanks,
> Swetha

Re: Streaming Job gives error after changing to version 1.5.2

2015-11-17 Thread Tathagata Das
Are you running 1.5.2-compiled jar on a Spark 1.5.2 cluster?

On Tue, Nov 17, 2015 at 5:34 PM, swetha wrote:

>
>
> Hi,
>
> I see a java.lang.NoClassDefFoundError after changing the Streaming job
> version to 1.5.2. Any idea why this is happening? Following are my
> dependencies and the error that I get.
>
>   <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-core_2.10</artifactId>
>     <version>${sparkVersion}</version>
>     <scope>provided</scope>
>   </dependency>
>
>   <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-streaming_2.10</artifactId>
>     <version>${sparkVersion}</version>
>     <scope>provided</scope>
>   </dependency>
>
>   <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-sql_2.10</artifactId>
>     <version>${sparkVersion}</version>
>     <scope>provided</scope>
>   </dependency>
>
>   <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-hive_2.10</artifactId>
>     <version>${sparkVersion}</version>
>     <scope>provided</scope>
>   </dependency>
>
>   <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-streaming-kafka_2.10</artifactId>
>     <version>${sparkVersion}</version>
>   </dependency>
>
>
> Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/spark/streaming/StreamingContext
> at java.lang.Class.getDeclaredMethods0(Native Method)
> at java.lang.Class.privateGetDeclaredMethods(Class.java:2693)
> at java.lang.Class.privateGetMethodRecursive(Class.java:3040)
> at java.lang.Class.getMethod0(Class.java:3010)
> at java.lang.Class.getMethod(Class.java:1776)
> at
> com.intellij.rt.execution.application.AppMain.main(AppMain.java:125)
> Caused by: java.lang.ClassNotFoundException:
> org.apache.spark.streaming.StreamingContext
> at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Streaming-Job-gives-error-after-changing-to-version-1-5-2-tp25406.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
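[Editor's note] The two behaviors described later in this thread (compile scope working locally, provided scope being appropriate when the cluster supplies the Spark jars) can be reconciled with a Maven profile that switches the scope per environment. This is a sketch, not from the original thread; the property name `spark.scope` and profile id `local` are made up for illustration:

```xml
<!-- Hypothetical sketch: default to provided scope for cluster builds;
     activate the "local" profile for IDE/local runs so the Spark classes
     stay on the runtime classpath. -->
<properties>
  <sparkVersion>1.5.2</sparkVersion>
  <spark.scope>provided</spark.scope>
</properties>

<profiles>
  <profile>
    <id>local</id>
    <properties>
      <spark.scope>compile</spark.scope>
    </properties>
  </profile>
</profiles>

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>${sparkVersion}</version>
    <scope>${spark.scope}</scope>
  </dependency>
</dependencies>
```

With this layout, `mvn package` builds a jar that assumes the cluster provides Spark, while `mvn -Plocal ...` (or enabling the profile in the IDE's Maven settings) keeps the Spark classes available locally and avoids the NoClassDefFoundError above.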


Re: Streaming Job gives error after changing to version 1.5.2

2015-11-17 Thread swetha kasireddy
I see this error locally.

On Tue, Nov 17, 2015 at 5:44 PM, Tathagata Das wrote:

> Are you running 1.5.2-compiled jar on a Spark 1.5.2 cluster?


Re: Streaming Job gives error after changing to version 1.5.2

2015-11-17 Thread Tathagata Das
Can you verify that the cluster is running the correct version of Spark,
1.5.2?

On Tue, Nov 17, 2015 at 7:23 PM, swetha kasireddy <swethakasire...@gmail.com> wrote:

> Sorry, compile scope makes it work locally, but the cluster
> still seems to have issues with provided scope. Basically, it
> does not seem to process any records; no data is shown in any of the tabs
> of the Streaming UI except the Streaming tab. Executors, Storage, Stages,
> etc. show empty RDDs.
>
> On Tue, Nov 17, 2015 at 7:19 PM, swetha kasireddy <
> swethakasire...@gmail.com> wrote:
>
>> Hi TD,
>>
>> Basically, I see two issues. With provided scope, the job does
>> not start locally. It does start in the cluster, but it seems no data is
>> getting processed.
>>
>> Thanks,
>> Swetha
>>
>> On Tue, Nov 17, 2015 at 7:04 PM, Tim Barthram <tim.barth...@iag.com.au>
>> wrote:
>>
>>> If you are running a local context, could it be that you should use:
>>>
>>>     <scope>provided</scope>
>>>
>>> ?
>>>
>>> Thanks,
>>> Tim
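[Editor's note] One detail in the dependency list above is worth calling out: spark-streaming-kafka is left at the default compile scope because, unlike the core/streaming/sql/hive modules, it is not part of the Spark assembly on the cluster and therefore has to be bundled into the application jar. A minimal maven-shade-plugin sketch for building such an uber jar follows; the plugin version shown is an assumption, not from the thread:

```xml
<!-- Sketch: shade compile-scope dependencies (e.g. spark-streaming-kafka)
     into the application jar. Dependencies with provided scope are left
     out of the shaded jar by default. Plugin version is an assumption. -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.4.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

The shaded jar can then be run with spark-submit; since the Spark modules are provided, they are resolved from the cluster's own installation, which is why the cluster's Spark version has to match the 1.5.2 build, as TD asks earlier in the thread.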