Re: [Structured spark streaming] How does cassandra connector readstream deal with deleted records

2020-06-26 Thread Russell Spitzer
The connector uses Java driver CQL requests under the hood, which means it responds to the changing database like a normal application would. This means retries may result in a different set of data than the original request if the underlying database changed. On Fri, Jun 26, 2020, 9:42 PM Jungtaek

Re: [Structured spark streaming] How does cassandra connector readstream deal with deleted records

2020-06-26 Thread Jungtaek Lim
I'm not sure how it is implemented, but in general I wouldn't expect such behavior from connectors that read from non-streaming storage. The query result may depend on "when" the records are fetched. If you need to reflect the changes in your query you'll probably want to find a way

[Structured spark streaming] How does cassandra connector readstream deal with deleted records

2020-06-24 Thread Rahul Kumar
Hello everyone, I was wondering how the Cassandra Spark connector deals with deleted/updated records during a readstream operation. If a record was already fetched into Spark memory, and it got updated or deleted in the database, does it get reflected in a streaming join? Thanks, Rahul

Re: Apache Spark or Spark-Cassandra-Connector doesn't look like it is reading multiple partitions in parallel.

2016-11-26 Thread kant kodali
… HDFS and then read it using spark.read.json. Cheers, Anastasios. On Sat, Nov 26, 2016 at 9:34 AM, kant kodali <kanth...@gmail.com> wrote: <http://stackoverflow.com/questions/40797231/apache-spark

Re: Apache Spark or Spark-Cassandra-Connector doesn't look like it is reading multiple partitions in parallel.

2016-11-26 Thread Anastasios Zouzias
…apache-spark-or-spark-cassandra-connector-doesnt-look-like-it-is-reading-multipl?noredirect=1#> Apache Spark or Spark-Cassandra-Connector doesn't look like it is reading multiple partitions in parallel. Here is my code using spark-shell …

Apache Spark or Spark-Cassandra-Connector doesn't look like it is reading multiple partitions in parallel.

2016-11-26 Thread kant kodali
<http://stackoverflow.com/questions/40797231/apache-spark-or-spark-cassandra-connector-doesnt-look-like-it-is-reading-multipl?noredirect=1#> Apache Spark or Spark-Cassandra-Connector doesn't look like it is reading multiple partitions in parallel. Here is my code
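For context, a minimal spark-shell sketch of the kind of check being discussed. The keyspace and table names are placeholders, not the ones from the original post, and it assumes the connector jar is already on the shell's classpath.

    import com.datastax.spark.connector._

    // Each Spark partition corresponds to a group of Cassandra token ranges and is read by one task,
    // so read parallelism is bounded by the partition count and the executor cores available.
    val rdd = sc.cassandraTable("my_keyspace", "my_table")
    println(rdd.partitions.length)   // number of read tasks Spark will schedule
    println(rdd.count())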

Re: Python - Spark Cassandra Connector on DC/OS

2016-11-01 Thread Andrew Holway
Sorry: Spark 2.0.0. On Tue, Nov 1, 2016 at 10:04 AM, Andrew Holway <andrew.hol...@otternetworks.de> wrote: Hello, I've been getting pretty serious with DC/OS which I guess could be described as a somewhat polished distribution of Mesos. I'm not sure how relevant DC/OS is to this

Python - Spark Cassandra Connector on DC/OS

2016-11-01 Thread Andrew Holway
Hello, I've been getting pretty serious with DC/OS, which I guess could be described as a somewhat polished distribution of Mesos. I'm not sure how relevant DC/OS is to this problem. I am using this pyspark program to test the cassandra connection: http://bit.ly/2eWAfxm (github) I can see that the

Re: unresolved dependency: datastax#spark-cassandra-connector;2.0.0-s_2.11-M3-20-g75719df: not found

2016-09-21 Thread Kevin Mellott
… You can verify the available versions by searching Maven at http://search.maven.org. Thanks, Kevin. On Wed, Sep 21, 2016 at 3:38 AM, muhammet pakyürek <mpa...@hotmail.com> wrote: while i run the spark-shell as below spark-shell --jars '/home/ktuser/spark-cassandra-connector

unresolved dependency: datastax#spark-cassandra-connector;2.0.0-s_2.11-M3-20-g75719df: not found

2016-09-21 Thread muhammet pakyürek
When I run the spark-shell as below: spark-shell --jars '/home/ktuser/spark-cassandra-connector/target/scala-2.11/root_2.11-2.0.0-M3-20-g75719df.jar' --packages datastax:spark-cassandra-connector:2.0.0-s_2.11-M3-20-g75719df --conf spark.cassandra.connection.host=localhost I get the error

Is cassandra 3.7 compatible with datastax Spark Cassandra Connector 2.0?

2016-09-19 Thread muhammet pakyürek

Re: clear steps for installation of spark, cassandra and cassandra connector to run on spyder 2.3.7 using python 3.5 and anaconda 2.4 ipython 4.0

2016-09-06 Thread ayan guha
Spark has pretty extensive documentation; that should be your starting point. I do not use Cassandra much, but the Cassandra connector should be a Spark package, so look for the Spark Packages website. If I may say so, all docs should be one or two Google searches away :) On 6 Sep 2016 20:34, "muh

clear steps for installation of spark, cassandra and cassandra connector to run on spyder 2.3.7 using python 3.5 and anaconda 2.4 ipython 4.0

2016-09-06 Thread muhammet pakyürek
Could you send me documents and links satisfying all the above requirements for installing spark, cassandra and the cassandra connector to run on spyder 2.3.7 using python 3.5 and anaconda 2.4 ipython 4.0

Spark-Cassandra connector

2016-06-21 Thread Joaquin Alzola
Hi List, I am trying to install the Spark-Cassandra connector through maven or sbt but neither works. Both of them try to connect to the Internet (to which I have no connection) to download certain files. Is there a way to install the files manually? I downloaded from the maven repository

Re: pyspark spark-cassandra-connector java.io.IOException: Failed to open native connection to Cassandra at {192.168.1.126}:9042

2016-03-09 Thread Andy Davidson
1.3.0 \ --packages datastax:spark-cassandra-connector:1.6.0-M1-s_2.10" export PYSPARK_PYTHON=python3 export PYSPARK_DRIVER_PYTHON=python3 IPYTHON_OPTS=notebook $SPARK_ROOT/bin/pyspark $extraPkgs --conf spark.cassandra.connection.host=localhost --conf spark.cassandra.connection.port=9043 $* df = sqlCo

Re: pyspark spark-cassandra-connector java.io.IOException: Failed to open native connection to Cassandra at {192.168.1.126}:9042

2016-03-08 Thread Ted Yu
…connection.port https://github.com/datastax/spark-cassandra-connector/blob/master/doc/reference.md#cassandra-connection-parameters Looking at the logs, it seems your port config is not being set and it's falling back to default. Let me know if that helps. Saurab

Re: pyspark spark-cassandra-connector java.io.IOException: Failed to open native connection to Cassandra at {192.168.1.126}:9042

2016-03-08 Thread Saurabh Bajaj
Hi Andy, I believe you need to set the host and port settings separately: spark.cassandra.connection.host and spark.cassandra.connection.port (see https://github.com/datastax/spark-cassandra-connector/blob/master/doc/reference.md#cassandra-connection-parameters). Looking at the logs, it seems your port
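For reference, a hedged sketch of setting those two properties programmatically, shown in Scala (the thread itself uses PySpark, where the same keys can be passed with --conf as in the original post). Host and port values here are placeholders.

    import org.apache.spark.{SparkConf, SparkContext}

    // The two settings named above; values are placeholders.
    val conf = new SparkConf()
      .setAppName("cassandra-example")
      .set("spark.cassandra.connection.host", "192.168.1.126")
      .set("spark.cassandra.connection.port", "9043")
    val sc = new SparkContext(conf)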

Re: pyspark spark-cassandra-connector java.io.IOException: Failed to open native connection to Cassandra at {192.168.1.126}:9042

2016-03-08 Thread Andy Davidson
Hi Ted, I believe by default Cassandra listens on 9042. From: Ted Yu <yuzhih...@gmail.com> Date: Tuesday, March 8, 2016 at 6:11 PM To: Andrew Davidson <a...@santacruzintegration.com> Cc: "user @spark" <user@spark.apache.org> Subject: Re: pyspark spark-cassandra-c

Re: pyspark spark-cassandra-connector java.io.IOException: Failed to open native connection to Cassandra at {192.168.1.126}:9042

2016-03-08 Thread Ted Yu
Have you contacted the spark-cassandra-connector mailing list? I wonder where the port 9042 came from. Cheers. On Tue, Mar 8, 2016 at 6:02 PM, Andy Davidson <a...@santacruzintegration.com> wrote: I am using spark-1.6.0-bin-hadoop2.6. I am trying to write a python note

pyspark spark-cassandra-connector java.io.IOException: Failed to open native connection to Cassandra at {192.168.1.126}:9042

2016-03-08 Thread Andy Davidson
am doing wrong : java.io.IOException: Failed to open native connection to Cassandra at {192.168.1.126}:9042 Thanks in advance Andy $ extraPkgs="--packages com.databricks:spark-csv_2.11:1.3.0 \ --packages datastax:spark-cassandra-connector:1.6.0-M1-s_2.11" $ export PYSP

Re: metrics not reported by spark-cassandra-connector

2016-02-23 Thread Sa Xiao
Hi Yin, Thanks for your reply. I didn't realize there is a specific mailing list for the spark-cassandra-connector. I will ask there. Thanks! -Sa On Tuesday, February 23, 2016, Yin Yang <yy201...@gmail.com> wrote: Hi, Sa: Have you asked on the spark-cassandra-connector mailing list?

Re: metrics not reported by spark-cassandra-connector

2016-02-23 Thread Yin Yang
Hi, Sa: Have you asked on the spark-cassandra-connector mailing list? It seems you would get a better response there. Cheers

metrics not reported by spark-cassandra-connector

2016-02-23 Thread Sa Xiao
Hi there, I am trying to enable metrics collection by the spark-cassandra-connector, following the instructions here: https://github.com/datastax/spark-cassandra-connector/blob/master/doc/11_metrics.md However, I was not able to see any metrics reported. I'm using spark-cassandra-connector_2.10

Re: [Cassandra-Connector] No Such Method Error despite correct versions

2016-02-22 Thread Jan Algermissen
… Spark 1.5.2, Cassandra java-driver 3.0.0, Cassandra-Connector 1.5.0-RC1, all with Scala 2.11.7. Nevertheless, I get the following error from my Spark job: java.lang.NoSuchMethodError: com.datastax.driver.core.TableMetadata.getIndexes()Ljava/util/List;

[Cassandra-Connector] No Such Method Error despite correct versions

2016-02-22 Thread Jan Algermissen
Hi, I am using Cassandra 2.1.5, Spark 1.5.2, Cassandra java-driver 3.0.0 and Cassandra-Connector 1.5.0-RC1, all with Scala 2.11.7. Nevertheless, I get the following error from my Spark job: java.lang.NoSuchMethodError: com.datastax.driver.core.TableMetadata.getIndexes()Ljava/util/List
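A NoSuchMethodError like this usually means the java-driver on the classpath is not the one the connector was compiled against. A minimal build.sbt sketch, assuming sbt is in use: declare only the connector and let it resolve a compatible driver, rather than also pinning cassandra-driver-core 3.0.0.

    // Sketch, not a verified fix: omit an explicit cassandra-driver-core dependency so the
    // connector pulls in the java-driver version it was built against.
    scalaVersion := "2.11.7"
    libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.5.0-RC1"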

spark-cassandra-connector BulkOutputWriter

2016-02-09 Thread Alexandr Dzhagriev
Hello all, I looked through the cassandra spark integration ( https://github.com/datastax/spark-cassandra-connector) and couldn't find any usages of the BulkOutputWriter ( http://www.datastax.com/dev/blog/bulk-loading) - an awesome tool for creating local sstables, which could be later uploaded

RE: spark-cassandra-connector BulkOutputWriter

2016-02-09 Thread Mohammed Guller
Alex – I suggest posting this question on the Spark Cassandra Connector mailing list. The SCC developers are pretty responsive. Mohammed Author: Big Data Analytics with Spark<http://www.amazon.com/Big-Data-Analytics-Spark-Practitioners/dp/1484209656/> From: Alexandr Dzhagriev [mail

Re: Spark 1.5.2 compatible spark-cassandra-connector

2015-12-29 Thread fightf...@163.com
Hi, Vivek M, I have tried the 1.5.x spark-cassandra connector and indeed encountered some classpath issues, mainly for the guava dependency. I believe that can be solved by some maven config, but have not tried that yet. Best, Sun. fightf...@163.com From: vivek.meghanat...@wipro.com Date

Re: Spark 1.5.2 compatible spark-cassandra-connector

2015-12-29 Thread mwy
…<user@spark.apache.org> Subject: Re: Spark 1.5.2 compatible spark-cassandra-connector Hi, Vivek M, I have tried the 1.5.x spark-cassandra connector and indeed encountered some classpath issues, mainly for the guava dependency. I believe that can be solved by some maven config, but have not tried

RE: Spark 1.5.2 compatible spark-cassandra-connector

2015-12-29 Thread vivek.meghanathan
…<user@spark.apache.org> Subject: Re: Spark 1.5.2 compatible spark-cassandra-connector 2.10-1.5.0-M3 & spark 1.5.2 work for me. The jar is built by sbt-assembly. Just for reference. From: "fightf...@163.com" <fightf...@163.com

Spark 1.5.2 compatible spark-cassandra-connector

2015-12-29 Thread vivek.meghanathan
All, What is the compatible spark-cassandra-connector for spark 1.5.2? I can only find the latest connector version spark-cassandra-connector_2.10-1.5.0-M3, which has a dependency on Spark 1.5.1. Can we use the same for 1.5.2? Do any classpath issues need to be handled, or do any jars need

Re: error in spark cassandra connector

2015-12-24 Thread Ted Yu
Mind providing a bit more detail? Release of Spark, version of the Cassandra connector, how the job was submitted, and the complete stack trace. Thanks. On Thu, Dec 24, 2015 at 2:06 AM, Vijay Kandiboyina <vi...@inndata.in> wrote: java.lang.NoClassDefFoundError: com/datastax/spark/c

error in spark cassandra connector

2015-12-24 Thread Vijay Kandiboyina
java.lang.NoClassDefFoundError: com/datastax/spark/connector/rdd/CassandraTableScanRDD

Spark- Cassandra Connector Error

2015-11-25 Thread ahlusar
…stackoverflow.com/questions/33896937/spark-connector-error-warn-nettyutil-found-nettys-native-epoll-transport-but Thank you for your help and for your support.

Re: Spark-Cassandra-connector

2015-08-21 Thread Ted Yu
…is YES, is there a way to create a connection pool for each executor, so that multiple tasks can dump data to Cassandra in parallel? Regards, Samya

Spark-Cassandra-connector

2015-08-20 Thread Samya

Re: Spark Cassandra Connector issue

2015-08-10 Thread satish chandra j
Hi All, I have tried the commands as mentioned below but it still errors: dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld --jars /home/missingmerch/postgresql-9.4-1201.jdbc41.jar,/home/missingmerch/dse.jar,/home/missingmerch/spark-cassandra-connector-java_2.10-1.1.1.jar

Spark Cassandra Connector issue

2015-08-10 Thread satish chandra j
Hi All, Please help me to fix a Spark Cassandra Connector issue, find the details below *Command:* dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld --jars ///home/missingmerch/postgresql-9.4-1201.jdbc41.jar ///home/missingmerch/etl-0.0.1-SNAPSHOT.jar *Error:* WARN 2015

Re: Spark Cassandra Connector issue

2015-08-10 Thread Dean Wampler
…twitter.com/deanwampler http://polyglotprogramming.com On Mon, Aug 10, 2015 at 7:44 AM, satish chandra j <jsatishchan...@gmail.com> wrote: Hi All, Please help me to fix a Spark Cassandra Connector issue, find the details below *Command:* dse spark-submit --master spark://10.246.43.15:7077 --class

Re: Spark Cassandra Connector issue

2015-08-10 Thread Dean Wampler
…-1201.jdbc41.jar,/home/missingmerch/dse.jar,/home/missingmerch/spark-cassandra-connector-java_2.10-1.1.1.jar /home/missingmerch/etl-0.0.1-SNAPSHOT.jar I also removed the extra //. Or put file: in front of them so they are proper URLs. Note the snapshot jar isn't in the --jars list. I assume that's

Re: Spark Cassandra Connector issue

2015-08-10 Thread satish chandra j
Hi, Thanks for the quick input, now I am getting a class-not-found error *Command:* dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld --jars ///home/missingmerch/postgresql-9.4-1201.jdbc41.jar ///home/missingmerch/dse.jar ///home/missingmerch/spark-cassandra-connector-java_2.10

Spark-Cassandra connector DataFrame

2015-07-28 Thread simon wang
Hi, I would like to get recommendations on using the Spark-Cassandra connector DataFrame feature. I was trying to save a DataFrame containing 8 million rows to Cassandra through the Spark-Cassandra connector. Based on the Spark log, this single action took about 60 minutes to complete. I think
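For context, a minimal sketch of the DataFrame write path being described, assuming the connector is on the classpath; df stands for the 8-million-row DataFrame and the keyspace/table names are placeholders.

    // df is an existing DataFrame; keyspace and table names are placeholders.
    df.write
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "my_keyspace", "table" -> "my_table"))
      .mode("append")
      .save()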

Re: Spark Cassandra connector number of Tasks

2015-05-10 Thread vijaypawnarkar
Looking for help with this. Thank you!

Spark Cassandra connector number of Tasks

2015-05-08 Thread vijaypawnarkar
I am using the Spark Cassandra connector to work with a table with 3 million records, using the .where() API to work with only certain rows in this table. The where clause filters the data to 1 rows. CassandraJavaUtil.javaFunctions(sparkContext) .cassandraTable(KEY_SPACE, MY_TABLE
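For reference, a hedged Scala equivalent of the Java call above (column name and value are placeholders). The .where() predicate is pushed down to Cassandra, but the number of tasks still follows how the connector splits the token ring into Spark partitions, not how many rows survive the filter.

    import com.datastax.spark.connector._

    // Placeholder keyspace, table, column and value.
    val filtered = sc.cassandraTable("key_space", "my_table").where("customer_id = ?", "42")
    println(filtered.partitions.length)   // task count comes from the partitioning, not the filter
    println(filtered.count())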

Spark Cassandra Connector

2015-04-18 Thread DStrip
libraryDependencies += com.datastax.spark %% spark-cassandra-connector % 1.2.0-rc3 in order to create the previous jar version of the connector and not the default one (i.e. spark-cassandra-connector-assembly-1.3.0-SNAPSHOT.jar). I am really new to working with sbt. Any guidance / help would be really
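The dependency line as archived appears to have lost its string quotes; in a build.sbt it would normally look like the sketch below.

    // build.sbt sketch; version taken from the post above.
    libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.2.0-rc3"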

Re: Using the DataStax Cassandra Connector from PySpark

2014-12-26 Thread Stephen Boesch
...@gmail.com: Hi there, I'm using Spark 1.1.0 and experimenting with trying to use the DataStax Cassandra Connector (https://github.com/datastax/spark-cassandra-connector) from within PySpark. As a baby step, I'm simply trying to validate that I have access to classes that I'd need via Py4J

Spark Cassandra Connector proper usage

2014-10-23 Thread Ashic Mahtab
Connector, I see I can do this: https://github.com/datastax/spark-cassandra-connector/blob/master/doc/1_connecting.md#connecting-manually-to-cassandra Now I don't want to open a session and close it for every row in the source (am I right in not wanting this? Usually, I have one session for the entire
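For context, a hedged sketch of the usual way to avoid a session per row: keep a serializable CassandraConnector and do the work inside withSessionDo once per partition. The row type, table and CQL below are placeholders, not the poster's actual schema.

    import com.datastax.spark.connector.cql.CassandraConnector
    import org.apache.spark.rdd.RDD

    case class Update(id: String, value: String)          // placeholder row type
    val connector = CassandraConnector(sc.getConf)        // serializable, safe to capture in closures

    def writeUpdates(source: RDD[Update]): Unit =
      source.foreachPartition { rows =>
        connector.withSessionDo { session =>              // one session lookup per partition, not per row
          rows.foreach { u =>
            session.execute("UPDATE my_keyspace.my_table SET value = ? WHERE id = ?", u.value, u.id)
          }
        }
      }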

RE: Spark Cassandra Connector proper usage

2014-10-23 Thread Ashic Mahtab
, Ashic. Date: Thu, 23 Oct 2014 14:27:47 +0200 Subject: Re: Spark Cassandra Connector proper usage From: gerard.m...@gmail.com To: as...@live.com Ashic, With the Spark-cassandra connector you would typically create an RDD from the source table, update what you need, filter out what you don't update

Re: Spark Cassandra Connector proper usage

2014-10-23 Thread Gerard Maas
about session management in order to issue custom update queries. Thanks, Ashic. -- Date: Thu, 23 Oct 2014 14:27:47 +0200 Subject: Re: Spark Cassandra Connector proper usage From: gerard.m...@gmail.com To: as...@live.com Ashic, With the Spark-cassandra connector

RE: Spark Cassandra Connector proper usage

2014-10-23 Thread Ashic Mahtab
Hi Gerard, I've gone with option 1, and it seems to be working well. Option 2 is also quite interesting. Thanks for your help in this. Regards, Ashic. From: gerard.m...@gmail.com Date: Thu, 23 Oct 2014 17:07:56 +0200 Subject: Re: Spark Cassandra Connector proper usage To: as...@live.com CC: user

Spark Cassandra connector issue

2014-10-21 Thread Ankur Srivastava
Hi, I am creating a Cassandra Java RDD and transforming it using the where clause. It works fine when I run it outside the mapValues, but when I put the code in mapValues I get an error while creating the transformation. Below is my sample code: CassandraJavaRDD<ReferenceData>

Using the DataStax Cassandra Connector from PySpark

2014-10-21 Thread Mike Sukmanowsky
Hi there, I'm using Spark 1.1.0 and experimenting with trying to use the DataStax Cassandra Connector (https://github.com/datastax/spark-cassandra-connector) from within PySpark. As a baby step, I'm simply trying to validate that I have access to classes that I'd need via Py4J. Sample python

Re: Spark Cassandra connector issue

2014-10-21 Thread Ankur Srivastava
Is this because I am calling a transformation function on an RDD from inside another transformation function? Is it not allowed? Thanks, Ankur On Oct 21, 2014 1:59 PM, Ankur Srivastava <ankur.srivast...@gmail.com> wrote: Hi Gerard, this is the code that may be helpful. public class
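Spark does not allow invoking one RDD's transformations (such as .where on the Cassandra RDD) from inside another transformation's closure. If the intent is a per-record lookup against a Cassandra table, one common alternative in later connector releases is joinWithCassandraTable; a hedged sketch with placeholder names follows.

    import com.datastax.spark.connector._

    // Case class fields are assumed to match the table's partition key columns (placeholders).
    case class LookupKey(id: String)
    val keys = sc.parallelize(Seq(LookupKey("a"), LookupKey("b")))

    // Joins each key against my_keyspace.reference_data on the Cassandra side,
    // instead of calling cassandraRdd.where(...) inside mapValues.
    val joined = keys.joinWithCassandraTable("my_keyspace", "reference_data")
    joined.collect().foreach(println)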

Spark Cassandra Connector Issue and performance

2014-09-24 Thread pouryas
…like to know if people are seeing good performance reading from Cassandra using Spark as opposed to reading data from HDFS. Kind of an open question, but I would like to see how others are using it.

Anyone have successful recipe for spark cassandra connector?

2014-09-19 Thread gzoller
I'm running out of options trying to integrate cassandra, spark, and the spark-cassandra-connector. I quickly found out just grabbing the latest versions of everything (drivers, etc.) doesn't work--binary incompatibilities it would seem. So last I tried using versions of drivers from the spark

problem in using Spark-Cassandra connector

2014-09-11 Thread Karunya Padala
Hi, I am new to Spark. I encountered an issue when trying to connect to Cassandra using the Spark Cassandra connector. Can anyone help me? Following are the details. 1) Following are the Spark and Cassandra versions I am using on Lubuntu 12.0: i) spark-1.0.2-bin-hadoop2 ii) apache-cassandra-2.0.10 2

Re: problem in using Spark-Cassandra connector

2014-09-11 Thread Reddy Raja
…encountered an issue when trying to connect to Cassandra using the Spark Cassandra connector. Can anyone help me? Following are the details. 1) Following are the Spark and Cassandra versions I am using on Lubuntu 12.0: i) spark-1.0.2-bin-hadoop2 ii) apache-cassandra-2.0.10 2) In Cassandra, I created

RE: problem in using Spark-Cassandra connector

2014-09-11 Thread Karunya Padala
…Padala Cc: u...@spark.incubator.apache.org Subject: Re: problem in using Spark-Cassandra connector You will have to create the keyspace and table. See the message "Table not found: EmailKeySpace.Emails". Looks like you have not created the Emails table. On Thu, Sep 11, 2014 at 6:04 PM, Karunya

Cassandra connector

2014-09-10 Thread wwilkins
Hi, I am having difficulty getting the Cassandra connector running within the spark shell. My jars look like: [wwilkins@phel-spark-001 logs]$ ls -altr /opt/connector/ total 14588 drwxr-xr-x. 5 root root 4096 Sep 9 22:15 .. -rw-r--r-- 1 root root 242809 Sep 9 22:20 spark-cassandra

Re: Cassandra connector

2014-09-10 Thread gtinside
Are you using Spark 1.1? If yes, you would have to update the datastax cassandra connector code and remove the references to log methods from CassandraConnector.scala. Regards, Gaurav

RE: Cassandra connector

2014-09-10 Thread Wade Wilkins
…AM To: u...@spark.incubator.apache.org Subject: Re: Cassandra connector Are you using Spark 1.1? If yes, you would have to update the datastax cassandra connector code and remove the references to log methods from CassandraConnector.scala. Regards, Gaurav

Spark-cassandra-connector 1.0.0-rc5: java.io.NotSerializableException

2014-09-05 Thread Shing Hing Man
Hi, My version of Spark is 1.0.2. I am trying to use Spark-cassandra-connector to execute an update CQL statement inside a CassandraConnector(conf).withSessionDo block: CassandraConnector(conf).withSessionDo { session => { myRdd.foreach { case (ip

Re: How to use spark-cassandra-connector in spark-shell?

2014-08-08 Thread chutium
…).count

Re: How to use spark-cassandra-connector in spark-shell?

2014-08-08 Thread Thomas Nieborowski
…("cassandra.connection.host", "your-cassandra-host") val sc = new SparkContext("local[1]", "cassandra-driver", conf) import com.datastax.driver.spark._ sc.cassandraTable("db1", "table1").select("key").count

How to use spark-cassandra-connector in spark-shell?

2014-08-07 Thread Gary Zhao
Hello Is it possible to use spark-cassandra-connector in spark-shell? Thanks Gary

Re: How to use spark-cassandra-connector in spark-shell?

2014-08-07 Thread Andrew Ash
Yes, I've done it before. On Thu, Aug 7, 2014 at 10:18 PM, Gary Zhao garyz...@gmail.com wrote: Hello Is it possible to use spark-cassandra-connector in spark-shell? Thanks Gary

Re: How to use spark-cassandra-connector in spark-shell?

2014-08-07 Thread Gary Zhao
Thanks Andrew. How did you do it? On Thu, Aug 7, 2014 at 10:20 PM, Andrew Ash and...@andrewash.com wrote: Yes, I've done it before. On Thu, Aug 7, 2014 at 10:18 PM, Gary Zhao garyz...@gmail.com wrote: Hello Is it possible to use spark-cassandra-connector in spark-shell? Thanks Gary

Re: How to use spark-cassandra-connector in spark-shell?

2014-08-07 Thread Andrew Ash
I don't remember the details, but I think it just took adding the spark-cassandra-connector jar to the spark shell's classpath with --jars or maybe ADD_JARS and then it worked. On Thu, Aug 7, 2014 at 10:24 PM, Gary Zhao garyz...@gmail.com wrote: Thanks Andrew. How did you do it? On Thu, Aug
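For reference, a sketch of that kind of spark-shell session, assuming the connector (assembly) jar was passed via --jars and the Cassandra host via --conf; the jar name, host, keyspace and table below are placeholders.

    // spark-shell --jars spark-cassandra-connector-assembly.jar \
    //             --conf spark.cassandra.connection.host=127.0.0.1
    import com.datastax.spark.connector._

    val rows = sc.cassandraTable("my_keyspace", "my_table")
    println(rows.count())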

spark-cassandra-connector issue

2014-08-06 Thread Gary Zhao
Hello, I'm trying to modify the Spark sample app to integrate with Cassandra, however I see an exception when submitting the app. Does anyone know why it happens? Exception in thread "main" java.lang.NoClassDefFoundError: com/datastax/spark/connector/rdd/reader/RowReaderFactory at