Can you try with Maven?

diff --git a/streaming/pom.xml b/streaming/pom.xml
index b8b8f2e..6cc8102 100644
--- a/streaming/pom.xml
+++ b/streaming/pom.xml
@@ -68,6 +68,11 @@
       <artifactId>junit-interface</artifactId>
       <scope>test</scope>
     </dependency>
+    <dependency>
+      <groupId>com.datastax.spark</groupId>
+      <artifactId>spark-cassandra-connector_2.10</artifactId>
+      <version>1.1.0</version>
+    </dependency>
   </dependencies>
   <build>
     <outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>

You can use the following command:
mvn -pl core,streaming package -DskipTests
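
If you would rather stay on sbt, one thing worth trying (just a sketch, and I am assuming the problem is the connector's own transitive Spark artifacts shadowing the spark-streaming classes, which I haven't verified) is to exclude those artifacts from the connector so that only the Spark versions you declare end up on the compile classpath:

val spark = "org.apache.spark" %% "spark-core" % "1.1.0"
val sparkStreaming = "org.apache.spark" %% "spark-streaming" % "1.1.0"
// Exclude the connector's transitive Spark modules (artifact names assume the
// 2.10 cross-build); adjust the list if it pulls in others.
val cassandraConnector = ("com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0")
  .exclude("org.apache.spark", "spark-core_2.10")
  .exclude("org.apache.spark", "spark-streaming_2.10")

libraryDependencies ++= Seq(cassandraConnector, spark, sparkStreaming)

Even if that doesn't fix it, the result would at least tell us whether the connector's transitive dependencies are what is interfering with the streaming classes.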

Cheers

On Fri, Dec 5, 2014 at 9:35 AM, Ashic Mahtab <as...@live.com> wrote:

> Hi,
>
> Seems adding the cassandra connector and spark streaming causes "issues".
> I've added my build and code files. Running "sbt compile" gives weird errors
> like Seconds is not part of org.apache.spark.streaming and object Receiver
> is not a member of package org.apache.spark.streaming.receiver. If I take
> out cassandraConnector from the list of dependencies, "sbt compile"
> succeeds.
>
>
> How is adding the dependency removing things from spark streaming
> packages? Is there something I can do (perhaps in sbt) to not have this
> break?
>
>
> Here's my build file:
>
> import sbt.Keys._
> import sbt._
>
> name := "untitled99"
>
> version := "1.0"
>
> scalaVersion := "2.10.4"
>
> val spark = "org.apache.spark" %% "spark-core" % "1.1.0"
> val sparkStreaming = "org.apache.spark" %% "spark-streaming" % "1.1.0"
> val cassandraConnector = "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0" withSources() withJavadoc()
>
> libraryDependencies ++= Seq(cassandraConnector, spark, sparkStreaming)
>
> resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
>
> And here's my code:
>
> import org.apache.spark.SparkContext
> import org.apache.spark.storage.StorageLevel
> import org.apache.spark.streaming.{Seconds, StreamingContext}
> import org.apache.spark.streaming.receiver.Receiver
>
> object Foo {
>   def main(args: Array[String]) {
>     val context = new SparkContext()
>     val ssc = new StreamingContext(context, Seconds(2))
>   }
> }
>
> class Bar extends Receiver[Int](StorageLevel.MEMORY_AND_DISK_2) {
>   override def onStart(): Unit = ???
>
>   override def onStop(): Unit = ???
> }
>
