Hi Niko,
It looks like you are calling a method on DStream that does not exist.
Check out:
https://spark.apache.org/docs/1.1.0/streaming-programming-guide.html#output-operations-on-dstreams
for the method saveAsTextFiles
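A minimal sketch of how it is used (the socket source and output prefix here are just assumptions for illustration):

```scala
// Assumes an existing StreamingContext `ssc`.
val lines = ssc.socketTextStream("localhost", 9999)
// saveAsTextFiles is defined on DStream (not on RDD): each batch is
// written out to a directory named "<prefix>-<batch time>.<suffix>".
lines.saveAsTextFiles("hdfs:///tmp/lines", "txt")
```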
Harold
On Fri, Nov 14, 2014 at 10:39 AM, Niko Gamulin niko.gamu
Hi Kevin,
Yes, Spark can read and write to Cassandra without Hadoop. Have you seen
this:
https://github.com/datastax/spark-cassandra-connector
Harold
On Wed, Nov 12, 2014 at 9:28 PM, Kevin Burton bur...@spinn3r.com wrote:
We have all our data in Cassandra, so I'd prefer not to have to bring
this, and was willing to
share as an example :) This seems to be the exact use case that will help
me!
Thanks!
Harold
In the examples I've seen, inserting into Cassandra is
something like:
val collection = sc.parallelize(Seq(foo, bar))
Where foo and bar could be elements in the arr array. So I would like
to know how to insert into Cassandra at the worker level.
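As one hedged sketch of worker-level inserts, assuming the DataStax spark-cassandra-connector is on the classpath (the keyspace, table, and column names are made up):

```scala
import com.datastax.spark.connector._ // adds saveToCassandra to RDDs

// Build the RDD on the driver; the connector then writes each partition
// to Cassandra directly from the workers, so no SparkContext reference
// ever needs to be serialized into a closure.
val collection = sc.parallelize(Seq(("foo", 1), ("bar", 2)))
collection.saveToCassandra("my_keyspace", "my_table", SomeColumns("word", "count"))
```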
Best wishes,
Harold
On Thu, Oct 30, 2014 at 11:48
Hi all,
In Spark Streaming, when I do foreachRDD on my DStreams, I get a
NonSerializable exception when I try to do something like:
dstream.foreachRDD { rdd =>
  val data = sc.parallelize(Seq(("test", "blah")))
}
Is there any way around that?
Thanks,
Harold
Hi,
Sorry, there's a typo there:
val arr = rdd.toArray
Harold
On Thu, Oct 30, 2014 at 9:58 AM, Harold Nguyen har...@nexgate.com wrote:
Hi all,
I'd like to be able to modify values in a DStream, and then send it off to
an external source like Cassandra, but I keep getting Serialization
errors. It seems I can't do this within
foreachRDD but only at the driver level. How do I use the arr variable
to do something like that?
Thanks for any help,
Harold
(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 18 more
Thanks,
Harold
, 2014 at 11:50 AM, Harold Nguyen har...@nexgate.com wrote:
Hi,
Just wondering if you've seen the following error when reading from Kafka:
ERROR ReceiverTracker: Deregistered receiver for stream 0: Error starting
receiver 0 - java.lang.NoClassDefFoundError: scala/reflect/ClassManifest
wrote:
Looks like the kafka jar that you are using isn't compatible with your
scala version.
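For example, in an sbt build, using `%%` lets sbt append the Scala binary version to the artifact name, so the Kafka integration jar matches your compiler (the versions below are illustrative):

```scala
// build.sbt
scalaVersion := "2.10.4"
// "%%" resolves to spark-streaming-kafka_2.10, matching scalaVersion above.
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.1.0"
```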
Thanks
Best Regards
On Wed, Oct 29, 2014 at 11:50 AM, Harold Nguyen har...@nexgate.com wrote:
Hi,
Just wondering if you've seen the following error when reading from Kafka:
ERROR ReceiverTracker
Hi all,
I followed the guide here:
http://spark.apache.org/docs/latest/streaming-kinesis-integration.html
But got this error:
Exception in thread main java.lang.NoClassDefFoundError:
com/amazonaws/auth/AWSCredentialsProvider
Would you happen to know what dependency or jar is needed?
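If it helps, that class comes from the AWS Java SDK, which the Kinesis integration module pulls in transitively; a hedged sbt sketch (the version is a guess, chosen to match your Spark release):

```scala
// build.sbt: spark-streaming-kinesis-asl transitively provides
// com.amazonaws.auth.AWSCredentialsProvider via the AWS Java SDK.
libraryDependencies += "org.apache.spark" %% "spark-streaming-kinesis-asl" % "1.1.0"
```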
Harold
/DnsResolver;)V
It looks very similar to this post:
http://stackoverflow.com/questions/24788949/nosuchmethoderror-while-running-aws-s3-client-on-spark-while-javap-shows-otherwi
Since I'm a little new to everything, would someone be able to provide
step-by-step guidance for that?
Harold
On Wed, Oct 29
,
Harold
this:
("SECRETWORDthebest_hello", 2), ("SECRETWORDthebest_world", 2),
("SECRETWORDthebest_you", 1), etc...
Harold
On Wed, Oct 29, 2014 at 3:36 PM, Sean Owen so...@cloudera.com wrote:
What would it mean to make a DStream into a String? It's inherently a
sequence of things over time, each of which might be a string.
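So at best you can get one String per batch, e.g. (a sketch; `wordCounts` is a stand-in for your DStream):

```scala
wordCounts.foreachRDD { rdd =>
  // collect() brings this batch's elements back to the driver,
  // where they can be joined into a single String.
  val s: String = rdd.collect().mkString(", ")
  println(s)
}
```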
/spark/connector/mapper/ColumnMapper
Thanks,
Harold
-connector/target/scala-2.10/spark-cassandra-connector-assembly-1.2.0-SNAPSHOT.jar
Am I issuing the spark-submit command incorrectly? Each of the workers has
that built jar in their respective directories
(spark-cassandra-connector-assembly-1.2.0-SNAPSHOT.jar)
Thanks,
Harold
,
"com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0-alpha3"
withSources() withJavadoc(),
"org.apache.spark" %% "spark-sql" % "1.1.0"
)
=
Any help would be appreciated! Thanks so much!
Harold
, Oct 27, 2014 at 9:22 PM, Harold Nguyen har...@nexgate.com wrote:
Hi Spark friends,
I'm trying to connect Spark Streaming to Cassandra by modifying the
NetworkWordCount.scala streaming example, making as few
changes as possible while having it insert data into Cassandra.
Could you