Re: spark cassandra issue

2016-09-04 Thread Selvam Raman
Hi Russell, if possible please help me to solve the below issue.

val df = sqlContext.read.
  format("org.apache.spark.sql.cassandra").
  options(Map("c_table" -> "restt", "keyspace" -> "sss")).
  load()

com.datastax.driver.core.TransportException: [/192.23.2.100:9042] Cannot connect at

Re: spark cassandra issue

2016-09-04 Thread Russell Spitzer
This would also be a better question for the SCC user list :) https://groups.google.com/a/lists.datastax.com/forum/#!forum/spark-connector-user On Sun, Sep 4, 2016 at 9:31 AM Russell Spitzer wrote:

Re: spark cassandra issue

2016-09-04 Thread Russell Spitzer
https://github.com/datastax/spark-cassandra-connector/blob/v1.3.1/doc/14_data_frames.md

In Spark 1.3 it was illegal to use "table" as a key in Spark SQL, so in that version of Spark the connector needed to use the option "c_table".

val df = sqlContext.read.
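Putting Russell's point together with the snippet from the first post, the full DataFrame read for connector 1.3.x would look roughly like the sketch below (keyspace "sss" and table "restt" are taken from the original post; this assumes a spark-shell session where `sqlContext` is already defined and a reachable Cassandra cluster):

```scala
// Spark 1.3.x + spark-cassandra-connector 1.3.x:
// "table" was a reserved option key in Spark SQL at the time,
// so the connector expected "c_table" instead.
val df = sqlContext.read.
  format("org.apache.spark.sql.cassandra").
  options(Map("c_table" -> "restt", "keyspace" -> "sss")).
  load()

df.printSchema()
```

In later Spark/connector versions the restriction was lifted and the plain "table" key is used.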

Re: spark cassandra issue

2016-09-04 Thread Mich Talebzadeh
And your Cassandra table is there, etc.?

Dr Mich Talebzadeh
LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
http://talebzadehmich.wordpress.com

Re: spark cassandra issue

2016-09-04 Thread Selvam Raman
Hey Mich, I am using the same one right now. Thanks for the reply.

import org.apache.spark.sql.cassandra._
import com.datastax.spark.connector._ // loads implicit functions
sc.cassandraTable("keyspace name", "table name")

On Sun, Sep 4, 2016 at 8:48 PM, Mich Talebzadeh

Re: spark cassandra issue

2016-09-04 Thread Mich Talebzadeh
Hi Selvam, I don't deal with Cassandra, but have you tried the other options described here: https://github.com/datastax/spark-cassandra-connector/blob/master/doc/2_loading.md To get a Spark RDD that represents a Cassandra table, call the cassandraTable method on the SparkContext object. import
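The RDD-based path Mich refers to can be sketched as follows (keyspace "sss" and table "restt" are reused from the first post; this assumes a spark-shell session with the connector on the classpath and `spark.cassandra.connection.host` pointing at a reachable node):

```scala
// The connector's implicits add cassandraTable to SparkContext.
import com.datastax.spark.connector._

// Returns a CassandraRDD[CassandraRow]; arguments are (keyspace, table).
val rdd = sc.cassandraTable("sss", "restt")

// Any action will trigger the actual connection attempt,
// so a TransportException would surface here as well.
rdd.take(5).foreach(println)
```

Note that either API (RDD or DataFrame) fails with the same TransportException if the node at port 9042 is unreachable, so the error in this thread points at connectivity or configuration rather than the read API used.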

Re: spark cassandra issue

2016-09-04 Thread Selvam Raman
It's very urgent. Please help me, guys.

On Sun, Sep 4, 2016 at 8:05 PM, Selvam Raman wrote:
> Please help me to solve the issue.
>
> spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.10:1.3.0
> --conf spark.cassandra.connection.host=**
>
> val df =

spark cassandra issue

2016-09-04 Thread Selvam Raman
Please help me to solve the issue.

spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.10:1.3.0 --conf spark.cassandra.connection.host=**

val df = sqlContext.read.
  format("org.apache.spark.sql.cassandra").
  options(Map("table" -> "", "keyspace" -> "***")).
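For reference, a working version of the read attempted above would look roughly like this under connector 1.3.0 (the host, keyspace, and table are masked in the original post, so the placeholder values here are illustrative only; note the "c_table" key required by that connector version, as explained later in the thread):

```scala
// Launched as:
//   spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.10:1.3.0 \
//     --conf spark.cassandra.connection.host=<cassandra-host>   // placeholder
val df = sqlContext.read.
  format("org.apache.spark.sql.cassandra").
  options(Map(
    "c_table"  -> "<table>",     // placeholder; 1.3.x uses "c_table", not "table"
    "keyspace" -> "<keyspace>")). // placeholder
  load()

df.show()
```

If the host is wrong or port 9042 is not reachable, `load()` fails with the TransportException shown at the top of this thread before any query runs.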