Try to reproduce what the spark-submit shell script does, setting up the class 
path etc. 
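
One way to approximate that from Eclipse is to tell the SparkConf which jar to 
ship to the executors, so they can load the Twitter receiver classes. A rough 
sketch, assuming the jar-with-dependencies artifact from the pom further down 
the thread has already been built with mvn package (the path is an assumption 
based on that pom): 

    SparkConf conf = new SparkConf()
            .setAppName("Simple Application")
            .setMaster("spark://rethink-node01:7077")
            // Ship the assembled jar to the executors, which is roughly what
            // spark-submit does when it is handed the fat jar. The path below
            // is an assumption based on the pom further down the thread.
            .setJars(new String[] {
                    "target/SparkFirstTry-0.0.1-SNAPSHOT-jar-with-dependencies.jar"
            });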

Sent from my rotary phone. 


> On Nov 9, 2015, at 7:07 AM, Tathagata Das <t...@databricks.com> wrote:
> 
> You cannot submit from Eclipse to a cluster that easily. You can run locally 
> (with the master set to local...), and it should work with just the pom.
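> For instance, a minimal sketch of that change (local[2] is only an example; 
> any local master with at least two threads leaves one free for the receiver): 
> 
>     SparkConf conf = new SparkConf()
>             .setAppName("Simple Application")
>             // Run inside the Eclipse JVM instead of the standalone cluster.
>             .setMaster("local[2]");
>     JavaStreamingContext sc = new JavaStreamingContext(conf, new Duration(1000));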
> 
>> On Mon, Nov 9, 2015 at 2:49 AM, أنس الليثي <dev.fano...@gmail.com> wrote:
>> If I package the application and submit it, it works fine, but I need to run 
>> it from Eclipse. 
>> 
>> Is there any problem with running the application from Eclipse? 
>> 
>> 
>> 
>>> On 9 November 2015 at 12:27, Tathagata Das <t...@databricks.com> wrote:
>>> How are you submitting the Spark application? 
>>> You are supposed to submit the fat jar of the application that includes the 
>>> spark-streaming-twitter dependency (and its transitive dependencies) but not 
>>> spark-streaming and spark-core. 
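>>> For reference, a typical submission of that fat jar would look something 
>>> like the following (the jar name is an assumption based on the 
>>> jar-with-dependencies assembly configured in the pom below, after running 
>>> mvn package): 
>>> 
>>>     spark-submit \
>>>       --class com.test.sparkTest.SimpleApp \
>>>       --master spark://rethink-node01:7077 \
>>>       target/SparkFirstTry-0.0.1-SNAPSHOT-jar-with-dependencies.jar
>>> 
>>> Because spark-core and spark-streaming are marked as provided, they stay 
>>> out of the fat jar and are supplied by the cluster itself. 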
>>> 
>>>> On Mon, Nov 9, 2015 at 1:02 AM, أنس الليثي <dev.fano...@gmail.com> wrote:
>>>> I tried removing Maven and adding the dependencies manually via Build 
>>>> Path > Configure Build Path > Add External JARs, but it did not work.
>>>> 
>>>> I tried creating another project and copying the code from the first app, 
>>>> but the problem is still the same. 
>>>> 
>>>> I even tried switching to another version of Eclipse, but the same problem 
>>>> exists.
>>>> 
>>>> :( :( :( :( 
>>>> 
>>>>> On 9 November 2015 at 10:47, أنس الليثي <dev.fano...@gmail.com> wrote:
>>>>> I tried both; the same exception is still thrown. 
>>>>> 
>>>>>> On 9 November 2015 at 10:45, Sean Owen <so...@cloudera.com> wrote:
>>>>>> You included a very old version of the Twitter jar - 1.0.0. Did you mean 
>>>>>> 1.5.1?
>>>>>> 
>>>>>> On Mon, Nov 9, 2015 at 7:36 AM, fanooos <dev.fano...@gmail.com> wrote:
>>>>>> > This is my first Spark Streaming application. The setup is as follows:
>>>>>> >
>>>>>> > Three nodes running a Spark cluster: one master node and two slaves.
>>>>>> >
>>>>>> > The application is a simple Java application that streams from Twitter,
>>>>>> > with dependencies managed by Maven.
>>>>>> >
>>>>>> > Here is the code of the application:
>>>>>> >
>>>>>> > import org.apache.spark.SparkConf;
>>>>>> > import org.apache.spark.api.java.function.Function;
>>>>>> > import org.apache.spark.streaming.Duration;
>>>>>> > import org.apache.spark.streaming.api.java.JavaDStream;
>>>>>> > import org.apache.spark.streaming.api.java.JavaStreamingContext;
>>>>>> > import org.apache.spark.streaming.twitter.TwitterUtils;
>>>>>> >
>>>>>> > import twitter4j.Status;
>>>>>> > import twitter4j.auth.OAuthAuthorization;
>>>>>> > import twitter4j.conf.ConfigurationBuilder;
>>>>>> >
>>>>>> > public class SimpleApp {
>>>>>> >
>>>>>> >     public static void main(String[] args) {
>>>>>> >
>>>>>> >         SparkConf conf = new SparkConf().setAppName("Simple
>>>>>> > Application").setMaster("spark://rethink-node01:7077");
>>>>>> >
>>>>>> >         JavaStreamingContext sc = new JavaStreamingContext(conf, new
>>>>>> > Duration(1000));
>>>>>> >
>>>>>> >         ConfigurationBuilder cb = new ConfigurationBuilder();
>>>>>> >
>>>>>> >         cb.setDebugEnabled(true).setOAuthConsumerKey("ConsumerKey")
>>>>>> >                 .setOAuthConsumerSecret("ConsumerSecret")
>>>>>> >                 .setOAuthAccessToken("AccessToken")
>>>>>> >                 .setOAuthAccessTokenSecret("TokenSecret");
>>>>>> >
>>>>>> >         OAuthAuthorization auth = new OAuthAuthorization(cb.build());
>>>>>> >
>>>>>> >         JavaDStream<Status> tweets = TwitterUtils.createStream(sc, 
>>>>>> > auth);
>>>>>> >
>>>>>> >          JavaDStream<String> statuses = tweets.map(new Function<Status,
>>>>>> > String>() {
>>>>>> >              public String call(Status status) throws Exception {
>>>>>> >                 return status.getText();
>>>>>> >             }
>>>>>> >         });
>>>>>> >
>>>>>> >          statuses.print();
>>>>>> >
>>>>>> >          sc.start();
>>>>>> >
>>>>>> >          sc.awaitTermination();
>>>>>> >
>>>>>> >     }
>>>>>> >
>>>>>> > }
>>>>>> >
>>>>>> >
>>>>>> > Here is the pom file:
>>>>>> >
>>>>>> > <project xmlns="http://maven.apache.org/POM/4.0.0"
>>>>>> > xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
>>>>>> >     xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
>>>>>> > http://maven.apache.org/xsd/maven-4.0.0.xsd">
>>>>>> >     <modelVersion>4.0.0</modelVersion>
>>>>>> >     <groupId>SparkFirstTry</groupId>
>>>>>> >     <artifactId>SparkFirstTry</artifactId>
>>>>>> >     <version>0.0.1-SNAPSHOT</version>
>>>>>> >
>>>>>> >     <dependencies>
>>>>>> >         <dependency>
>>>>>> >             <groupId>org.apache.spark</groupId>
>>>>>> >             <artifactId>spark-core_2.10</artifactId>
>>>>>> >             <version>1.5.1</version>
>>>>>> >             <scope>provided</scope>
>>>>>> >         </dependency>
>>>>>> >
>>>>>> >         <dependency>
>>>>>> >             <groupId>org.apache.spark</groupId>
>>>>>> >             <artifactId>spark-streaming_2.10</artifactId>
>>>>>> >             <version>1.5.1</version>
>>>>>> >             <scope>provided</scope>
>>>>>> >         </dependency>
>>>>>> >
>>>>>> >         <dependency>
>>>>>> >             <groupId>org.twitter4j</groupId>
>>>>>> >             <artifactId>twitter4j-stream</artifactId>
>>>>>> >             <version>3.0.3</version>
>>>>>> >         </dependency>
>>>>>> >         <dependency>
>>>>>> >             <groupId>org.apache.spark</groupId>
>>>>>> >             <artifactId>spark-streaming-twitter_2.10</artifactId>
>>>>>> >             <version>1.0.0</version>
>>>>>> >         </dependency>
>>>>>> >
>>>>>> >     </dependencies>
>>>>>> >
>>>>>> >     <build>
>>>>>> >         <sourceDirectory>src</sourceDirectory>
>>>>>> >         <plugins>
>>>>>> >             <plugin>
>>>>>> >                 <artifactId>maven-compiler-plugin</artifactId>
>>>>>> >                 <version>3.3</version>
>>>>>> >                 <configuration>
>>>>>> >                     <source>1.8</source>
>>>>>> >                     <target>1.8</target>
>>>>>> >                 </configuration>
>>>>>> >             </plugin>
>>>>>> >             <plugin>
>>>>>> >                 <artifactId>maven-assembly-plugin</artifactId>
>>>>>> >                 <configuration>
>>>>>> >                     <archive>
>>>>>> >                         <manifest>
>>>>>> >
>>>>>> > <mainClass>com.test.sparkTest.SimpleApp</mainClass>
>>>>>> >                         </manifest>
>>>>>> >                     </archive>
>>>>>> >                     <descriptorRefs>
>>>>>> >                         
>>>>>> > <descriptorRef>jar-with-dependencies</descriptorRef>
>>>>>> >                     </descriptorRefs>
>>>>>> >                 </configuration>
>>>>>> >             </plugin>
>>>>>> >
>>>>>> >         </plugins>
>>>>>> >     </build>
>>>>>> > </project>
>>>>>> >
>>>>>> >
>>>>>> > The application starts successfully, but no tweets come and this 
>>>>>> > exception is thrown:
>>>>>> >
>>>>>> > 15/11/08 15:55:46 WARN TaskSetManager: Lost task 0.0 in stage 4.0 (TID 
>>>>>> > 78,
>>>>>> > 192.168.122.39): java.io.IOException: java.lang.ClassNotFoundException:
>>>>>> > org.apache.spark.streaming.twitter.TwitterReceiver
>>>>>> >     at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1163)
>>>>>> >     at
>>>>>> > org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:70)
>>>>>> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>> >     at
>>>>>> > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>>>> >     at
>>>>>> > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>> >     at java.lang.reflect.Method.invoke(Method.java:497)
>>>>>> >     at
>>>>>> > java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1058)
>>>>>> >     at 
>>>>>> > java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
>>>>>> >     at
>>>>>> > java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>>>>>> >     at 
>>>>>> > java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>>>>>> >     at
>>>>>> > java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>>>>>> >     at 
>>>>>> > java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>>>>>> >     at
>>>>>> > java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>>>>>> >     at 
>>>>>> > java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>>>>>> >     at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
>>>>>> >     at
>>>>>> > org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:72)
>>>>>> >     at
>>>>>> > org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:98)
>>>>>> >     at 
>>>>>> > org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194)
>>>>>> >     at
>>>>>> > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>>>>> >     at
>>>>>> > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>>>>> >     at java.lang.Thread.run(Thread.java:745)
>>>>>> > Caused by: java.lang.ClassNotFoundException:
>>>>>> > org.apache.spark.streaming.twitter.TwitterReceiver
>>>>>> >     at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>>>>> >     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>>>> >     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>>>> >     at java.lang.Class.forName0(Native Method)
>>>>>> >     at java.lang.Class.forName(Class.java:348)
>>>>>> >     at
>>>>>> > org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
>>>>>> >     at
>>>>>> > java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)
>>>>>> >     at 
>>>>>> > java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
>>>>>> >     at
>>>>>> > java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
>>>>>> >     at 
>>>>>> > java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>>>>>> >     at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1707)
>>>>>> >     at 
>>>>>> > java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1345)
>>>>>> >     at
>>>>>> > java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>>>>>> >     at 
>>>>>> > java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>>>>>> >     at
>>>>>> > java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>>>>>> >     at 
>>>>>> > java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>>>>>> >     at
>>>>>> > java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>>>>>> >     at
>>>>>> > java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:501)
>>>>>> >     at
>>>>>> > org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:74)
>>>>>> >     at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1160)
>>>>>> >     ... 20 more
>>>>>> > I run the application locally from Eclipse. It is clear that the problem 
>>>>>> > is in the dependencies, but I cannot figure out how to solve it.
>>>>>> >
>>>>>> >
>>>>>> >
>>>>> 
>>>>> 
>>>>> 
>>>>> -- 
>>>>> Anas Rabei
>>>>> Senior Software Developer
>>>>> Mubasher.info
>>>>> anas.ra...@mubasher.info
>>>> 
>>>> 
>>>> 
>>>> -- 
>>>> Anas Rabei
>>>> Senior Software Developer
>>>> Mubasher.info
>>>> anas.ra...@mubasher.info
>>> 
>> 
>> 
>> 
>> -- 
>> Anas Rabei
>> Senior Software Developer
>> Mubasher.info
>> anas.ra...@mubasher.info
> 
