Put both jars in SPARK_CLASSPATH as a colon-separated string.
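A minimal sketch of this suggestion, assuming hypothetical jar locations (substitute the paths to your actual MySQL JDBC and spark-cassandra-connector jars): SPARK_CLASSPATH is a single PATH-style string, so the two jars are joined with a colon.

```python
import os

# Hypothetical jar paths -- substitute your real MySQL JDBC and
# spark-cassandra-connector jar locations.
jars = [
    "/opt/jars/mysql-connector-java.jar",
    "/opt/jars/spark-cassandra-connector.jar",
]

# SPARK_CLASSPATH takes one colon-separated string, like PATH.
os.environ["SPARK_CLASSPATH"] = ":".join(jars)
print(os.environ["SPARK_CLASSPATH"])
```

Set this in the environment before launching pyspark so the driver picks up both jars.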
On 10 Aug 2015 06:20, Jonathan Haddad j...@jonhaddad.com wrote:
I'm trying to write a simple job for PySpark 1.4 migrating data from MySQL
to Cassandra. I can work with either the MySQL JDBC jar or the Cassandra
jar separately without issue, but when I try to reference both of them it
throws an exception:
Py4JJavaError: An error occurred while calling
I'm trying to use the json4s library in a Spark job to push data back into
Kafka. Everything was working fine when I was hard-coding a string, but now
that I'm trying to render a string from a simple map, it's failing. The
code works in the sbt console.
working console code:
Write out the RDD to a Cassandra table. The DataStax spark-cassandra-connector
provides saveToCassandra() for this purpose.
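For PySpark specifically, saveToCassandra() is part of the connector's Scala RDD API; from Spark 1.4 the same connector also exposes a Spark SQL data source, so a DataFrame can be written from Python without calling it directly. A sketch, with a hypothetical helper name, keyspace, and table, assuming the connector jar is on the classpath:

```python
def save_df_to_cassandra(df, keyspace, table):
    """Write a PySpark DataFrame to a Cassandra table through the
    connector's Spark SQL data source (spark-cassandra-connector
    1.4+ must be on the classpath). Roughly equivalent in effect
    to the Scala RDD API's saveToCassandra()."""
    (df.write
       .format("org.apache.spark.sql.cassandra")
       .options(keyspace=keyspace, table=table)
       .save())
```

The target keyspace and table must already exist; the data source maps DataFrame columns to Cassandra columns by name.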
On Tue Feb 03 2015 at 8:59:15 AM Adamantios Corais
adamantios.cor...@gmail.com wrote:
Hi,
After some research I have decided that Spark (SQL) would be ideal for
building an OLAP engine.
Could you be hitting this? https://issues.apache.org/jira/browse/SPARK-3178
On Sun, Aug 24, 2014 at 10:21 AM, Forest D dev24a...@gmail.com wrote:
Hi folks,
I have been trying to run the AMPLab’s twitter streaming example
This is probably a bit ridiculous, but I'm wondering if it's possible
to use Scala libraries in a Python module? The Cassandra connector
here https://github.com/datastax/spark-cassandra-connector is in
Scala; would I need a Python version of that library to use Python
Spark?
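On this point, a Python port is generally not needed: a Scala library like the connector can be used from PySpark as long as its jar is on the Spark classpath and it exposes a Spark SQL data source, as spark-cassandra-connector does from 1.4. A sketch with a hypothetical helper name, keyspace, and table:

```python
def read_cassandra_table(sqlContext, keyspace, table):
    """Load a Cassandra table as a PySpark DataFrame. Only the Scala
    connector jar is needed on the Spark classpath; no Python port
    of the library is required."""
    return (sqlContext.read
            .format("org.apache.spark.sql.cassandra")
            .options(keyspace=keyspace, table=table)
            .load())
```

The returned DataFrame can then be queried or joined like any other Spark SQL source.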
Personally I have no