It works fine after some changes.

-Thanks,
Swetha

On Tue, Nov 17, 2015 at 10:22 PM, Tathagata Das <t...@databricks.com> wrote:

> Can you verify that the cluster is running the correct version of Spark, 1.5.2?
>
> On Tue, Nov 17, 2015 at 7:23 PM, swetha kasireddy <
> swethakasire...@gmail.com> wrote:
>
>> Sorry, <scope>compile</scope> makes it work locally. But the cluster
>> still seems to have issues with <scope>provided</scope>: it does not seem
>> to process any records, and no data is shown in any tab of the UI except
>> the Streaming tab. Executors, Storage, Stages, etc. show empty RDDs.
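>>
>> One way to handle this split (just a sketch; the sparkScope property and
>> the local profile id below are illustrative names, not something from
>> this thread) is to drive the Spark dependency scope from a Maven property
>> that defaults to provided for the cluster build and is overridden to
>> compile for local runs:
>>
>>     <properties>
>>         <!-- default for cluster builds: Spark classes come from the cluster -->
>>         <sparkScope>provided</sparkScope>
>>     </properties>
>>
>>     <profiles>
>>         <profile>
>>             <!-- activate with "mvn package -Plocal" when running in the IDE -->
>>             <id>local</id>
>>             <properties>
>>                 <sparkScope>compile</sparkScope>
>>             </properties>
>>         </profile>
>>     </profiles>
>>
>>     <dependency>
>>         <groupId>org.apache.spark</groupId>
>>         <artifactId>spark-streaming_2.10</artifactId>
>>         <version>${sparkVersion}</version>
>>         <scope>${sparkScope}</scope>
>>     </dependency>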
>>
>> On Tue, Nov 17, 2015 at 7:19 PM, swetha kasireddy <
>> swethakasire...@gmail.com> wrote:
>>
>>> Hi TD,
>>>
>>> Basically, I see two issues. With <scope>provided</scope> the job does
>>> not start locally. It does start on the cluster, but it seems no data is
>>> getting processed.
>>>
>>> Thanks,
>>> Swetha
>>>
>>> On Tue, Nov 17, 2015 at 7:04 PM, Tim Barthram <tim.barth...@iag.com.au>
>>> wrote:
>>>
>>>> If you are running a local context, could it be that you should use:
>>>>
>>>>             <scope>provided</scope>
>>>>
>>>> ?
>>>>
>>>>
>>>>
>>>> Thanks,
>>>>
>>>> Tim
>>>>
>>>>
>>>>
>>>> From: swetha kasireddy [mailto:swethakasire...@gmail.com]
>>>> Sent: Wednesday, 18 November 2015 2:01 PM
>>>> To: Tathagata Das
>>>> Cc: user
>>>> Subject: Re: Streaming Job gives error after changing to version 1.5.2
>>>>
>>>>
>>>>
>>>> I see this error locally.
>>>>
>>>>
>>>>
>>>> On Tue, Nov 17, 2015 at 5:44 PM, Tathagata Das <t...@databricks.com>
>>>> wrote:
>>>>
>>>> Are you running a 1.5.2-compiled jar on a Spark 1.5.2 cluster?
>>>>
>>>>
>>>>
>>>> On Tue, Nov 17, 2015 at 5:34 PM, swetha <swethakasire...@gmail.com>
>>>> wrote:
>>>>
>>>>
>>>>
>>>> Hi,
>>>>
>>>> I see a java.lang.NoClassDefFoundError after changing the Streaming job's
>>>> Spark version to 1.5.2. Any idea why this is happening? Following are my
>>>> dependencies and the error that I get.
>>>>
>>>>         <dependency>
>>>>             <groupId>org.apache.spark</groupId>
>>>>             <artifactId>spark-core_2.10</artifactId>
>>>>             <version>${sparkVersion}</version>
>>>>             <scope>provided</scope>
>>>>         </dependency>
>>>>
>>>>
>>>>         <dependency>
>>>>             <groupId>org.apache.spark</groupId>
>>>>             <artifactId>spark-streaming_2.10</artifactId>
>>>>             <version>${sparkVersion}</version>
>>>>             <scope>provided</scope>
>>>>         </dependency>
>>>>
>>>>
>>>>         <dependency>
>>>>             <groupId>org.apache.spark</groupId>
>>>>             <artifactId>spark-sql_2.10</artifactId>
>>>>             <version>${sparkVersion}</version>
>>>>             <scope>provided</scope>
>>>>         </dependency>
>>>>
>>>>
>>>>         <dependency>
>>>>             <groupId>org.apache.spark</groupId>
>>>>             <artifactId>spark-hive_2.10</artifactId>
>>>>             <version>${sparkVersion}</version>
>>>>             <scope>provided</scope>
>>>>         </dependency>
>>>>
>>>>
>>>>
>>>>         <dependency>
>>>>             <groupId>org.apache.spark</groupId>
>>>>             <artifactId>spark-streaming-kafka_2.10</artifactId>
>>>>             <version>${sparkVersion}</version>
>>>>         </dependency>
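>>>>
>>>> Note that spark-streaming-kafka_2.10 is, correctly, not marked provided:
>>>> the Kafka receiver classes are not part of the Spark assembly on the
>>>> cluster, so they have to end up in the application jar (or be passed to
>>>> spark-submit with --jars). A minimal maven-shade-plugin sketch for
>>>> building such an uber jar; the plugin version here is illustrative:
>>>>
>>>>         <build>
>>>>             <plugins>
>>>>                 <plugin>
>>>>                     <groupId>org.apache.maven.plugins</groupId>
>>>>                     <artifactId>maven-shade-plugin</artifactId>
>>>>                     <version>2.4.1</version>
>>>>                     <executions>
>>>>                         <execution>
>>>>                             <!-- build an uber jar at package time so the
>>>>                                  Kafka classes ship with the application -->
>>>>                             <phase>package</phase>
>>>>                             <goals>
>>>>                                 <goal>shade</goal>
>>>>                             </goals>
>>>>                         </execution>
>>>>                     </executions>
>>>>                 </plugin>
>>>>             </plugins>
>>>>         </build>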
>>>>
>>>>
>>>> Exception in thread "main" java.lang.NoClassDefFoundError:
>>>> org/apache/spark/streaming/StreamingContext
>>>>         at java.lang.Class.getDeclaredMethods0(Native Method)
>>>>         at java.lang.Class.privateGetDeclaredMethods(Class.java:2693)
>>>>         at java.lang.Class.privateGetMethodRecursive(Class.java:3040)
>>>>         at java.lang.Class.getMethod0(Class.java:3010)
>>>>         at java.lang.Class.getMethod(Class.java:1776)
>>>>         at
>>>> com.intellij.rt.execution.application.AppMain.main(AppMain.java:125)
>>>> Caused by: java.lang.ClassNotFoundException:
>>>> org.apache.spark.streaming.StreamingContext
>>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
>>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>>
>>>>
>>>>
>>>> --
>>>> View this message in context:
>>>> http://apache-spark-user-list.1001560.n3.nabble.com/Streaming-Job-gives-error-after-changing-to-version-1-5-2-tp25406.html
>>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>>
>>>>
>>>
>>>
>>
>
