; HDFS and then read it using spark.read.json.
>
> Cheers,
> Anastasios
>
>
>
> On Sat, Nov 26, 2016 at 9:34 AM, kant kodali <kanth...@gmail.com> wrote:
>
>> <http://stackoverflow.com/questions/40797231/apache-spark-or-spark-cassandra-connector-doesnt-look-like-it-is-reading-multipl?noredirect=1#>
>
> Apache Spark or Spark-Cassandra-Connector doesn't look like it is reading
> multiple partitions in parallel.
>
> Here is my code using spark-shell
>
<http://stackoverflow.com/questions/40797231/apache-spark-or-spark-cassandra-connector-doesnt-look-like-it-is-reading-multipl?noredirect=1#>
Apache Spark or Spark-Cassandra-Connector doesn't look like it is reading
multiple partitions in parallel.
Here is my code
Sorry: Spark 2.0.0
On Tue, Nov 1, 2016 at 10:04 AM, Andrew Holway <
andrew.hol...@otternetworks.de> wrote:
> Hello,
>
> I've been getting pretty serious with DC/OS which I guess could be
> described as a somewhat polished distribution of Mesos. I'm not sure how
> relevant DC/OS is to this
Hello,
I've been getting pretty serious with DC/OS, which I guess could be
described as a somewhat polished distribution of Mesos. I'm not sure how
relevant DC/OS is to this problem.
I am using this pyspark program to test the cassandra connection:
http://bit.ly/2eWAfxm (github)
I can see that the
You can verify the available versions by searching Maven at
http://search.maven.org.
Thanks,
Kevin
On Wed, Sep 21, 2016 at 3:38 AM, muhammet pakyürek <mpa...@hotmail.com>
wrote:
> while i run the spark-shell as below
>
> spark-shell --jars '/home/ktuser/spark-cassandra-
> connector
When I run the spark-shell as below
spark-shell --jars
'/home/ktuser/spark-cassandra-connector/target/scala-2.11/root_2.11-2.0.0-M3-20-g75719df.jar'
--packages datastax:spark-cassandra-connector:2.0.0-s_2.11-M3-20-g75719df
--conf spark.cassandra.connection.host=localhost
I get the error
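For reference, a typical spark-shell launch for the connector uses either --packages (which resolves the artifact from a repository) or --jars (a local assembly jar), not usually both for the same artifact. A sketch, with illustrative coordinates and host:

```shell
# Hypothetical sketch: pull the connector via --packages and point it at Cassandra.
spark-shell \
  --packages datastax:spark-cassandra-connector:2.0.0-M3-s_2.11 \
  --conf spark.cassandra.connection.host=127.0.0.1
```

Mixing a locally built snapshot jar with a --packages coordinate for the same connector is a common source of classpath conflicts.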
Hi List
I am trying to install the Spark-Cassandra connector through maven or sbt but
neither works.
Both of them try to connect to the Internet (to which I have no connection) to
download certain files.
Is there a way to install the files manually?
I downloaded from the maven repository
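For a fully offline setup, one option (a sketch, assuming the jar was downloaded on a connected machine first; the file name and version below are illustrative) is to install the artifact into the local Maven repository by hand:

```shell
# Hypothetical sketch: install a pre-downloaded connector jar into ~/.m2
# so Maven or sbt can resolve it without network access.
mvn install:install-file \
  -Dfile=spark-cassandra-connector_2.10-1.5.0-M3.jar \
  -DgroupId=com.datastax.spark \
  -DartifactId=spark-cassandra-connector_2.10 \
  -Dversion=1.5.0-M3 \
  -Dpackaging=jar
```

Note that the connector's transitive dependencies (e.g. the Cassandra driver) have to be installed the same way for a build to succeed offline.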
"9043")
sqlContext.setConf("host","localhost")
sqlContext.setConf("port","9043")
Thanks
Andy
From: Saurabh Bajaj <bajaj.onl...@gmail.com>
Date: Tuesday, March 8, 2016 at 9:13 PM
To: Andrew Davidson <a...@santacruzintegration.com>
Cc: Te
onnection.port
>
> https://github.com/datastax/spark-cassandra-connector/blob/master/doc/reference.md#cassandra-connection-parameters
>
> Looking at the logs, it seems your port config is not being set and it's
> falling back to default.
> Let me know if that helps.
>
> Saurab
Hi Andy,
I believe you need to set the host and port settings separately
spark.cassandra.connection.host
spark.cassandra.connection.port
https://github.com/datastax/spark-cassandra-connector/blob/master/doc/reference.md#cassandra-connection-parameters
Looking at the logs, it seems your port
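Putting both settings together at launch time might look like this (a sketch; the tunnel port 9043 comes from the thread, the package coordinate is illustrative):

```shell
# Hypothetical sketch: pass the connector settings Saurabh names as --conf flags.
pyspark \
  --packages com.datastax.spark:spark-cassandra-connector_2.10:1.5.0-M3 \
  --conf spark.cassandra.connection.host=127.0.0.1 \
  --conf spark.cassandra.connection.port=9043
```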
Hi Ted
I believe by default cassandra listens on 9042
From: Ted Yu <yuzhih...@gmail.com>
Date: Tuesday, March 8, 2016 at 6:11 PM
To: Andrew Davidson <a...@santacruzintegration.com>
Cc: "user @spark" <user@spark.apache.org>
Subject: Re: pyspark spark-cassandra-c
Have you contacted the spark-cassandra-connector mailing list?
I wonder where the port 9042 came from.
Cheers
On Tue, Mar 8, 2016 at 6:02 PM, Andy Davidson <a...@santacruzintegration.com
> wrote:
>
> I am using spark-1.6.0-bin-hadoop2.6. I am trying to write a python
> note
I am using spark-1.6.0-bin-hadoop2.6. I am trying to write a python notebook
that reads a data frame from Cassandra.
I connect to Cassandra using an ssh tunnel running on port 9043. CQLSH works,
however I cannot figure out how to configure my notebook. I have tried
various hacks; any idea what I
Hi Yin,
Thanks for your reply. I didn't realize there is a specific mailing list
for spark-Cassandra-connector. I will ask there. Thanks!
-Sa
On Tuesday, February 23, 2016, Yin Yang <yy201...@gmail.com> wrote:
> Hi, Sa:
> Have you asked on spark-cassandra-connector mailing list ?
&
Hi, Sa:
Have you asked on the spark-cassandra-connector mailing list?
Seems you would get better response there.
Cheers
Hi there,
I am trying to enable the metrics collection by spark-cassandra-connector,
following the instructions here:
https://github.com/datastax/spark-cassandra-connector/blob/master/doc/11_metrics.md
However, I was not able to see any metrics reported. I'm using
spark-cassandra-connector_2.10
Hello all,
I looked through the cassandra spark integration (
https://github.com/datastax/spark-cassandra-connector) and couldn't find
any usages of the BulkOutputWriter (
http://www.datastax.com/dev/blog/bulk-loading) - an awesome tool for
creating local sstables, which could be later uploaded
Alex – I suggest posting this question on the Spark Cassandra Connector mailing
list. The SCC developers are pretty responsive.
Mohammed
Author: Big Data Analytics with
Spark<http://www.amazon.com/Big-Data-Analytics-Spark-Practitioners/dp/1484209656/>
From: Alexandr Dzhagriev [mail
Hi, Vivek M
I have tried the 1.5.x spark-cassandra connector and indeed encountered some
classpath issues, mainly for the guava dependency.
I believe that can be solved by some maven config, but have not tried that yet.
Best,
Sun.
fightf...@163.com
From: vivek.meghanat...@wipro.com
Date
t;user@spark.apache.org>
Subject: Re: Spark 1.5.2 compatible spark-cassandra-connector
Hi, Vivek M
I have tried the 1.5.x spark-cassandra connector and indeed encountered some
classpath issues, mainly for the guava dependency.
I believe that can be solved by some maven config, but have not tried
r <user@spark.apache.org>
Subject: Re: Spark 1.5.2 compatible spark-cassandra-connector
2.10-1.5.0-M3 & spark 1.5.2 work for me. The jar is built by sbt-assembly.
Just for reference.
From: "fightf...@163.com<mailto:fightf...@163.com>"
<fightf...@163.com<mailt
All,
What is the compatible spark-cassandra-connector for Spark 1.5.2? I can only
find the latest connector version spark-cassandra-connector_2.10-1.5.0-M3, which
has a dependency on Spark 1.5.1. Can we use the same for 1.5.2? Are there any
classpath issues that need to be handled, or any jars that need
Mind providing a bit more detail?
Release of Spark
version of Cassandra connector
How job was submitted
complete stack trace
Thanks
On Thu, Dec 24, 2015 at 2:06 AM, Vijay Kandiboyina wrote:
> java.lang.NoClassDefFoundError:
>
java.lang.NoClassDefFoundError:
com/datastax/spark/connector/rdd/CassandraTableScanRDD
erflow.com/questions/33896937/spark-connector-error-warn-nettyutil-found-nettys-native-epoll-transport-but
Thank you for your help and for your support.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Cassandra-Connector-Error-tp25483.html
Sent from
is YES, is there a way to create a connection pool for
each executor, so that multiple tasks can dump data to Cassandra in
parallel?
Regards,
Samya
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Cassandra-connector-tp24378.html
Sent from
.1001560.n3.nabble.com/Spark-Cassandra-connector-tp24378.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h
HI All,
I have tried the commands as mentioned below, but it still errors
dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld
--jars /home/missingmerch/
postgresql-9.4-1201.jdbc41.jar,/home/missingmerch/dse.jar,/home/missingmerch/spark-
cassandra-connector-java_2.10-1.1.1.jar
HI All,
Please help me fix the Spark Cassandra Connector issue; the details are
below
*Command:*
dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld
--jars ///home/missingmerch/postgresql-9.4-1201.jdbc41.jar
///home/missingmerch/etl-0.0.1-SNAPSHOT.jar
*Error:*
WARN 2015
://twitter.com/deanwampler
http://polyglotprogramming.com
On Mon, Aug 10, 2015 at 7:44 AM, satish chandra j jsatishchan...@gmail.com
wrote:
HI All,
Please help me fix the Spark Cassandra Connector issue; the details are
below
*Command:*
dse spark-submit --master spark://10.246.43.15:7077 --class
-1201.jdbc41.jar,/home/missingmerch/dse.jar,/home/missingmerch/spark-
cassandra-connector-java_2.10-1.1.1.jar /home/missingmerch/etl-0.0.
1-SNAPSHOT.jar
I also removed the extra //. Or put file: in front of them so they are
proper URLs. Note the snapshot jar isn't in the --jars list. I assume
that's
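Putting that advice together, the corrected command would look something like this (a sketch; paths are taken from the thread, with file: prefixes added so each entry is a proper URL):

```shell
# Hypothetical sketch: --jars entries as file: URLs, application jar listed last.
dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld \
  --jars file:/home/missingmerch/postgresql-9.4-1201.jdbc41.jar,file:/home/missingmerch/dse.jar,file:/home/missingmerch/spark-cassandra-connector-java_2.10-1.1.1.jar \
  /home/missingmerch/etl-0.0.1-SNAPSHOT.jar
```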
Hi,
Thanks for the quick input; now I am getting a class-not-found error
*Command:*
dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld
--jars ///home/missingmerch/postgresql-9.4-1201.jdbc41.jar
///home/missingmerch/dse.jar
///home/missingmerch/spark-cassandra-connector-java_2.10
Hi,
I would like to get the recommendations to use Spark-Cassandra connector
DataFrame feature.
I was trying to save a Dataframe containing 8 Million rows to Cassandra through
the Spark-Cassandra connector. Based on the Spark log, this single action took
about 60 minutes to complete. I think
Looking for help with this. Thank you!
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Cassandra-connector-number-of-Tasks-tp22820p22839.html
Sent from the Apache Spark User List mailing list archive at Nabble.com
I am using the Spark Cassandra connector to work with a table with 3 million
records, using the .where() API to work with only certain rows in this table.
The where clause filters the data down to 1 row.
CassandraJavaUtil.javaFunctions(sparkContext) .cassandraTable(KEY_SPACE,
MY_TABLE
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.2.0-rc3"
in order to create the previous jar version of the connector and not the
default one
(i.e. spark-cassandra-connector-assembly-1.3.0-SNAPSHOT.jar )
I am really new on working with sbt. Any guidance / help would be really
Connector, I see I can
do this:
https://github.com/datastax/spark-cassandra-connector/blob/master/doc/1_connecting.md#connecting-manually-to-cassandra
Now I don't want to open a session and close it for every row in the source (am
I right in not wanting this? Usually, I have one session for the entire
,
Ashic.
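The usual answer here is one session per partition rather than per row. A minimal Python sketch of the pattern (the connect/execute/close calls are hypothetical stand-ins, not the connector's API):

```python
def write_partition(rows, connect):
    """Open one session for the whole partition, reuse it for every row, then close it."""
    session = connect()  # one connection per partition, not per row
    written = 0
    try:
        for row in rows:
            # Hypothetical execute call; with a real driver this would be a bound statement.
            session.execute("INSERT INTO ks.table (k, v) VALUES (?, ?)", row)
            written += 1
    finally:
        session.close()  # released once, after the partition is drained
    return written
```

In Spark terms this is what foreachPartition gives you; the connector's own save methods do this pooling internally, so explicit session management is only needed for custom queries.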
Date: Thu, 23 Oct 2014 14:27:47 +0200
Subject: Re: Spark Cassandra Connector proper usage
From: gerard.m...@gmail.com
To: as...@live.com
Ashic,
With the Spark-cassandra connector you would typically create an RDD from the
source table, update what you need, filter out what you don't update
about session management in order to issue custom update queries.
Thanks,
Ashic.
--
Date: Thu, 23 Oct 2014 14:27:47 +0200
Subject: Re: Spark Cassandra Connector proper usage
From: gerard.m...@gmail.com
To: as...@live.com
Ashic,
With the Spark-cassandra connector
Hi Gerard,
I've gone with option 1, and it seems to be working well. Option 2 is also quite
interesting. Thanks for your help in this.
Regards,
Ashic.
From: gerard.m...@gmail.com
Date: Thu, 23 Oct 2014 17:07:56 +0200
Subject: Re: Spark Cassandra Connector proper usage
To: as...@live.com
CC: user
Hi,
I am creating a cassandra java rdd and transforming it using the where
clause.
It works fine when I run it outside the mapValues, but when I put the code
in mapValues I get an error while creating the transformation.
Below is my sample code:
CassandraJavaRDD<ReferenceData>
Is this because I am calling a transformation function on an rdd from
inside another transformation function?
Is it not allowed?
Thanks
Ankut
On Oct 21, 2014 1:59 PM, Ankur Srivastava ankur.srivast...@gmail.com
wrote:
Hi Gerard,
this is the code that may be helpful.
public class
like to know if people are seeing good performance
reading from Cassandra using Spark as opposed to reading data from HDFS. Kind
of an open question, but I would like to see how others are using it.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Cassandra
I'm running out of options trying to integrate cassandra, spark, and the
spark-cassandra-connector.
I quickly found out just grabbing the latest versions of everything
(drivers, etc.) doesn't work--binary incompatibilities it would seem.
So last I tried using versions of drivers from the
spark
Hi,
I am new to Spark. I encountered an issue when trying to connect to Cassandra
using the Spark Cassandra connector. Can anyone help me? Following are the details.
1) Following Spark and Cassandra versions I am using on LUbuntu12.0.
i)spark-1.0.2-bin-hadoop2
ii) apache-cassandra-2.0.10
2
encountered an issue when trying to connect to
Cassandra using the Spark Cassandra connector. Can anyone help me? Following
are the details.
1) Following Spark and Cassandra versions I am using on LUbuntu12.0.
i)spark-1.0.2-bin-hadoop2
ii) apache-cassandra-2.0.10
2) In the Cassandra, i created
Padala
Cc: u...@spark.incubator.apache.org
Subject: Re: problem in using Spark-Cassandra connector
You will have to create the keyspace and table.
See the message,
Table not found: EmailKeySpace.Emails
Looks like you have not created the Emails table.
On Thu, Sep 11, 2014 at 6:04 PM, Karunya
Hi,
My version of Spark is 1.0.2.
I am trying to use the Spark-cassandra-connector to execute a CQL update
statement inside
a CassandraConnector(conf).withSessionDo block:
CassandraConnector(conf).withSessionDo { session =>
myRdd.foreach {
case (ip
).count
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-use-spark-cassandra-connector-in-spark-shell-tp11757p11781.html
Sent from the Apache Spark User List mailing list archive at Nabble.com
("cassandra.connection.host",
"your-cassandra-host")
val sc = new SparkContext("local[1]", "cassandra-driver", conf)
import com.datastax.driver.spark._
sc.cassandraTable("db1", "table1").select("key").count
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-use-spark-cassandra
Hello
Is it possible to use spark-cassandra-connector in spark-shell?
Thanks
Gary
Yes, I've done it before.
On Thu, Aug 7, 2014 at 10:18 PM, Gary Zhao garyz...@gmail.com wrote:
Hello
Is it possible to use spark-cassandra-connector in spark-shell?
Thanks
Gary
Thanks Andrew. How did you do it?
On Thu, Aug 7, 2014 at 10:20 PM, Andrew Ash and...@andrewash.com wrote:
Yes, I've done it before.
On Thu, Aug 7, 2014 at 10:18 PM, Gary Zhao garyz...@gmail.com wrote:
Hello
Is it possible to use spark-cassandra-connector in spark-shell?
Thanks
Gary
I don't remember the details, but I think it just took adding the
spark-cassandra-connector jar to the spark shell's classpath with --jars or
maybe ADD_JARS and then it worked.
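Concretely, the two approaches Andrew mentions would look roughly like this (a sketch; the assembly jar path is illustrative):

```shell
# Option 1: pass the connector assembly on the spark-shell command line
spark-shell --jars /path/to/spark-cassandra-connector-assembly.jar

# Option 2 (older Spark releases): the ADD_JARS environment variable
ADD_JARS=/path/to/spark-cassandra-connector-assembly.jar spark-shell
```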
On Thu, Aug 7, 2014 at 10:24 PM, Gary Zhao garyz...@gmail.com wrote:
Thanks Andrew. How did you do it?
On Thu, Aug
Hello
I'm trying to modify Spark sample app to integrate with Cassandra, however
I saw exception when submitting the app. Anyone knows why it happens?
Exception in thread "main" java.lang.NoClassDefFoundError:
com/datastax/spark/connector/rdd/reader/RowReaderFactory
at