Hi All,
I am desperately looking for some help.
My cluster has 6 nodes, each with a dual-core CPU and 8GB of RAM. The Spark version
running on the cluster is spark-0.9.0-incubating-bin-cdh4.
I am getting an OutOfMemoryError when running a Spark Streaming job
(the non-streaming version works fine) which queries Cassandra.
Hi,
I managed to solve the issue. The problem was related to Netty. (Ref.
https://spark-project.atlassian.net/browse/SPARK-1138)
I changed the dependencies to add Netty exclusions and included Netty 3.6.6 as
an explicit dependency.
org.apache.spark
spark-
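The idea is to exclude the Netty version that the existing artifacts drag in
transitively and then pin Netty 3.6.6 explicitly. Roughly like the sketch below;
the exact artifacts that need the exclusion may differ in your build, so treat
this as the shape of the change rather than my full pom.xml:

<!-- Exclude the transitively pulled-in Netty (per SPARK-1138); the same
     exclusion may be needed on the Cassandra artifacts as well. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>0.9.0-incubating</version>
    <exclusions>
        <exclusion>
            <groupId>io.netty</groupId>
            <artifactId>netty</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<!-- Pin Netty 3.6.6 so a single version ends up on the classpath -->
<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty</artifactId>
    <version>3.6.6.Final</version>
</dependency>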
Hi,
In standalone mode I am trying to perform some Cassandra CQL read/write
operations. The following are my Maven dependencies:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>0.9.0-incubating</version>
</dependency>
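For context, the read path I am attempting follows the CassandraCQLTest example
that ships with Spark, i.e. reading through the CQL paging input format from the
Cassandra Hadoop package. This is only a sketch of the shape; the host, keyspace,
column family and the "value" column below are placeholders:

import java.nio.ByteBuffer

import org.apache.cassandra.hadoop.ConfigHelper
import org.apache.cassandra.hadoop.cql3.{CqlConfigHelper, CqlPagingInputFormat}
import org.apache.cassandra.utils.ByteBufferUtil
import org.apache.hadoop.mapreduce.Job
import org.apache.spark.SparkContext

object CassandraCqlRead {
  def main(args: Array[String]) {
    val sc = new SparkContext("local[2]", "CassandraCqlRead")

    // Hadoop configuration carrying the Cassandra connection details
    // (host, port, keyspace and column family are placeholders).
    val job = new Job()
    ConfigHelper.setInputInitialAddress(job.getConfiguration, "127.0.0.1")
    ConfigHelper.setInputRpcPort(job.getConfiguration, "9160")
    ConfigHelper.setInputColumnFamily(job.getConfiguration, "my_keyspace", "my_table")
    ConfigHelper.setInputPartitioner(job.getConfiguration, "Murmur3Partitioner")
    CqlConfigHelper.setInputCQLPageRowSize(job.getConfiguration, "100")

    // Each record is (key columns, value columns), both java.util.Map[String, ByteBuffer].
    val rows = sc.newAPIHadoopRDD(job.getConfiguration,
      classOf[CqlPagingInputFormat],
      classOf[java.util.Map[String, ByteBuffer]],
      classOf[java.util.Map[String, ByteBuffer]])

    // Decode a hypothetical text column named "value" and count the rows read.
    val values = rows.map { case (_, columns) => ByteBufferUtil.string(columns.get("value")) }
    println("rows read: " + values.count())

    sc.stop()
  }
}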
Thanks Mayur for your clarification.
Thanks Mayur for your response.
I think I need to clarify the first part of my query. The JSON-based REST
API will be called by external interfaces, and these requests need to be
processed in streaming mode in Spark. I am not clear about the following
points:
1. How can the JSON request strings (50 per second) be fed into Spark Streaming
for processing?
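To make the question concrete, here is a rough sketch of what I imagine the
consuming side could look like, assuming the Spring controller forwards each JSON
request body as one line to a TCP socket (Kafka would be another option for the
transport). The host, port and the "eventType" field are placeholders:

import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.StreamingContext._ // pair-DStream operations in 0.9
import scala.util.parsing.json.JSON

object JsonStreamAggregator {
  def main(args: Array[String]) {
    // 5-second batches; the interval would need tuning against the SLA.
    val ssc = new StreamingContext("local[2]", "JsonStreamAggregator", Seconds(5))

    // The Spring layer writes each JSON request body as a single line to this
    // socket (host and port are placeholders).
    val lines = ssc.socketTextStream("rest-forwarder-host", 9999)

    // Parse each JSON string and pull out a hypothetical "eventType" field.
    val events = lines.flatMap { line =>
      JSON.parseFull(line) match {
        case Some(m: Map[_, _]) =>
          m.asInstanceOf[Map[String, Any]].get("eventType").map(_.toString).toList
        case _ => Nil // drop malformed payloads
      }
    }

    // Count events per type over a sliding 60-second window, refreshed every batch.
    val counts = events.map(e => (e, 1)).reduceByKeyAndWindow(_ + _, Seconds(60), Seconds(5))
    counts.print()

    ssc.start()
    ssc.awaitTermination()
  }
}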
Hi,
I am very new to Spark and am currently trying to implement a use case. We have
a JSON-based REST API implemented in Spring which gets around 50 calls/sec.
I would like to stream these JSON strings to Spark for processing and
aggregation. We have a strict SLA and would like to know the best way to
achieve this.