It is strange that writes work but reads do not. If it were a Cassandra
connectivity issue, neither writes nor reads would work. Perhaps the problem
is somewhere else.

Can you send the complete exception trace?

Also, just to make sure that there is no DNS issue, try this:
~/cassandra/apache-cassandra-2.1.5$ bin/cassandra-cli -h 127.0.0.1 -p 9160
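Since the errors in this thread mention 127.0.1.1, it may also be worth looking at /etc/hosts: on Debian/Ubuntu the machine's hostname is often mapped to 127.0.1.1, which can surprise anything that resolves the hostname instead of localhost. A quick check (the entries shown in the comment are only a typical example; your output will differ):

```shell
# Show the loopback entries in /etc/hosts.
# A typical Ubuntu box prints something like:
#   127.0.0.1   localhost
#   127.0.1.1   yourhostname    (example only)
grep '^127\.' /etc/hosts
```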

Mohammed

From: Yasemin Kaya [mailto:godo...@gmail.com]
Sent: Tuesday, June 9, 2015 11:32 AM
To: Yana Kadiyska
Cc: Gerard Maas; Mohammed Guller; user@spark.apache.org
Subject: Re: Cassandra Submit

I removed the core and streaming jars, and the exception is still the same.

I tried what you said; here are the results:

~/cassandra/apache-cassandra-2.1.5$ bin/cassandra-cli -h localhost -p 9160
Connected to: "Test Cluster" on localhost/9160
Welcome to Cassandra CLI version 2.1.5

The CLI is deprecated and will be removed in Cassandra 3.0.  Consider migrating 
to cqlsh.
CQL is fully backwards compatible with Thrift data; see 
http://www.datastax.com/dev/blog/thrift-to-cql3

Type 'help;' or '?' for help.
Type 'quit;' or 'exit;' to quit.

[default@unknown]

and

~/cassandra/apache-cassandra-2.1.5$ bin/cqlsh
Connected to Test Cluster at 127.0.0.1:9042.
[cqlsh 5.0.1 | Cassandra 2.1.5 | CQL spec 3.2.0 | Native protocol v3]
Use HELP for help.
cqlsh>

Thank you for your kind responses ...


2015-06-09 20:59 GMT+03:00 Yana Kadiyska <yana.kadiy...@gmail.com>:
Hm, the jars look OK, although it's a bit of a mess -- you have spark-assembly
1.3.0 but core and streaming 1.3.1. It's generally a bad idea to mix versions.
spark-assembly bundles all the Spark packages, so either use the individual
jars or use spark-assembly, but don't mix them like you've shown.

As to the port issue -- what about this:

$bin/cassandra-cli -h localhost -p 9160
Connected to: "Test Cluster" on localhost/9160
Welcome to Cassandra CLI version 2.1.5


On Tue, Jun 9, 2015 at 1:29 PM, Yasemin Kaya <godo...@gmail.com> wrote:
My jar files are:

cassandra-driver-core-2.1.5.jar
cassandra-thrift-2.1.3.jar
guava-18.jar
jsr166e-1.1.0.jar
spark-assembly-1.3.0.jar
spark-cassandra-connector_2.10-1.3.0-M1.jar
spark-cassandra-connector-java_2.10-1.3.0-M1.jar
spark-core_2.10-1.3.1.jar
spark-streaming_2.10-1.3.1.jar

And my code is from the DataStax spark-cassandra-connector demo:
https://github.com/datastax/spark-cassandra-connector/blob/master/spark-cassandra-connector-demos/simple-demos/src/main/java/com/datastax/spark/connector/demo/JavaApiDemo.java

Thanks a lot.
yasemin

2015-06-09 18:58 GMT+03:00 Yana Kadiyska <yana.kadiy...@gmail.com>:
Hm. Yeah, your port is good... have you seen this thread:
http://stackoverflow.com/questions/27288380/fail-to-use-spark-cassandra-connector
? It seems you might be running into version-mismatch issues.

What versions of Spark/Cassandra-connector are you trying to use?

On Tue, Jun 9, 2015 at 10:18 AM, Yasemin Kaya <godo...@gmail.com> wrote:
Sorry, to answer my own question: I ran "lsof -i:9160" in the terminal, and the result is

lsof -i:9160
COMMAND  PID    USER   FD   TYPE DEVICE SIZE/OFF NODE NAME
java    7597 inosens  101u  IPv4  85754      0t0  TCP localhost:9160 (LISTEN)

So is port 9160 available or not?

2015-06-09 17:16 GMT+03:00 Yasemin Kaya <godo...@gmail.com>:
Yes, my Cassandra is listening on 9160, I think. Actually, I know it from the
yaml file, which includes:

rpc_address: localhost
# port for Thrift to listen for clients on
rpc_port: 9160

I checked the port with "nc -z localhost 9160; echo $?" and it returns "0". I
think that means it is closed; should I open this port?
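(A side note on that check: `nc -z` follows the usual shell convention, where an exit status of 0 means success -- so "0" here indicates the connection succeeded and the port is open, not closed. A minimal sketch of the convention:)

```shell
# Shell convention: exit status 0 = success, non-zero = failure.
true;  echo $?   # prints 0
false; echo $?   # prints 1
# So "nc -z localhost 9160; echo $?" printing 0 means the TCP
# connection succeeded, i.e. port 9160 is open.
```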

2015-06-09 16:55 GMT+03:00 Yana Kadiyska <yana.kadiy...@gmail.com>:
Is your cassandra installation actually listening on 9160?

lsof -i :9160
COMMAND   PID     USER   FD   TYPE   DEVICE SIZE/OFF NODE NAME
java    29232 ykadiysk   69u  IPv4 42152497      0t0  TCP localhost:9160 (LISTEN)
I am running an out-of-the-box Cassandra conf where

rpc_address: localhost
# port for Thrift to listen for clients on
rpc_port: 9160



On Tue, Jun 9, 2015 at 7:36 AM, Yasemin Kaya <godo...@gmail.com> wrote:
I couldn't find any solution. I can write but I can't read from Cassandra.

2015-06-09 8:52 GMT+03:00 Yasemin Kaya <godo...@gmail.com>:
Thanks a lot, Mohammed, Gerard and Yana.
I can write to the table, but I still get an exception. It says "Exception in thread
"main" java.io.IOException: Failed to open thrift connection to Cassandra at
127.0.0.1:9160"

In yaml file :
rpc_address: localhost
rpc_port: 9160

And at project :

.set("spark.cassandra.connection.host", "127.0.0.1")
.set("spark.cassandra.connection.rpc.port", "9160");

or

.set("spark.cassandra.connection.host", "localhost")
.set("spark.cassandra.connection.rpc.port", "9160");

Whatever I put in the settings, I get the same exception. Any help?


2015-06-08 18:23 GMT+03:00 Yana Kadiyska <yana.kadiy...@gmail.com>:
Yes, whatever you put for listen_address in cassandra.yaml. Also, you should
try to connect to your Cassandra cluster via bin/cqlsh to make sure you have
connectivity before you try to make a connection via Spark.

On Mon, Jun 8, 2015 at 4:43 AM, Yasemin Kaya <godo...@gmail.com> wrote:
Hi,
I run my project locally. How can I find the IP address of my Cassandra host?
From cassandra.yaml, or somewhere else?

yasemin

2015-06-08 11:27 GMT+03:00 Gerard Maas <gerard.m...@gmail.com>:
????? = <ip address of your cassandra host>

On Mon, Jun 8, 2015 at 10:12 AM, Yasemin Kaya <godo...@gmail.com> wrote:
Hi ,

How can I find spark.cassandra.connection.host? And what should I change?
Should I change cassandra.yaml?

The error says "Exception in thread "main" java.io.IOException: Failed to open
native connection to Cassandra at {127.0.1.1}:9042"

What should I put here:

SparkConf sparkConf = new
SparkConf().setAppName("JavaApiDemo").set("spark.driver.allowMultipleContexts",
"true").set("spark.cassandra.connection.host", ?????);

Best
yasemin

2015-06-06 3:04 GMT+03:00 Mohammed Guller <moham...@glassbeam.com>:
Check your spark.cassandra.connection.host setting. It should be pointing to 
one of your Cassandra nodes.

Mohammed

From: Yasemin Kaya [mailto:godo...@gmail.com]
Sent: Friday, June 5, 2015 7:31 AM
To: user@spark.apache.org
Subject: Cassandra Submit

Hi,

I am using Cassandra as the database in my project. I got this error: Exception
in thread "main" java.io.IOException: Failed to open native connection to
Cassandra at {127.0.1.1}:9042

I think I have to modify the submit line. What should I add or remove when I 
submit my project?

Best,
yasemin


--
hiç ender hiç



--
hiç ender hiç




--
hiç ender hiç




--
hiç ender hiç



--
hiç ender hiç




--
hiç ender hiç



--
hiç ender hiç




--
hiç ender hiç




--
hiç ender hiç
