batch insert in the DB. How about this use case: we process (parse the JSON string into an object) on the workers, send those objects back to the master, and then issue one bulk-insert request. Is there any benefit to sending records individually through a connection pool versus doing the bulk operation on the master?
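For what it's worth, the two options in the question can be sketched side by side. This is only a sketch: it reuses the names from the code sample later in the thread (`unionStreams`, `InteractionParser`, `interactionDAL`), and the `bulkInsert` method is a hypothetical addition to the DAL, not something from the original code.

```scala
// Option A: parse on the workers, collect to the driver ("master"),
// then issue a single bulk insert. Simple, but every record must fit
// in driver memory and the insert happens on one node.
unionStreams.foreachRDD { rdd =>
  val interactions = rdd
    .map(bytes => InteractionParser.parser(new String(bytes, "UTF-8")))
    .collect()
  if (interactions.nonEmpty) interactionDAL.bulkInsert(interactions)
}

// Option B: bulk insert per partition, in parallel on the executors.
// No driver bottleneck; one batch/connection per partition instead of
// one round trip per record.
unionStreams.foreachRDD { rdd =>
  rdd.foreachPartition { iter =>
    val batch = iter
      .map(bytes => InteractionParser.parser(new String(bytes, "UTF-8")))
      .toSeq
    if (batch.nonEmpty) interactionDAL.bulkInsert(batch)
  }
}
```

Option B keeps the batching benefit of a bulk insert without funnelling everything through the master, which is usually the deciding factor once the stream volume grows.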
A.K.M
Thanks Chris,
That is what I wanted to know :)
A.K.M. Ashrafuzzaman
Lead Software Engineer
NewsCred
(M) 880-175-5592433
Twitter | Blog | Facebook
Check out The Academy, your #1 source
for free content marketing resources
On Mar 2, 2015, at 2:04 AM, Chris Fregly ch...@fregly.com wrote:
hey
Sorry guys, my bad.
Here is a high-level code sample:
val unionStreams = ssc.union(kinesisStreams)
unionStreams.foreachRDD { rdd =>
  rdd.foreach { tweet =>
    val strTweet = new String(tweet, "UTF-8")
    val interaction = InteractionParser.parser(strTweet)
    interactionDAL.insert(interaction)
  }
}
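One thing worth noting about the sample above: `interactionDAL` is referenced inside `rdd.foreach`, so it gets serialized from the driver and shipped to every executor, and the insert happens once per record. A common alternative is to create the DAL/connection once per partition on the executor side. A hedged sketch (the `createInteractionDAL` factory is hypothetical, not from the original code):

```scala
unionStreams.foreachRDD { rdd =>
  rdd.foreachPartition { tweets =>
    // Construct the connection on the executor, once per partition,
    // instead of serializing a driver-side instance.
    val dal = createInteractionDAL() // hypothetical factory
    tweets.foreach { tweet =>
      val strTweet = new String(tweet, "UTF-8")
      dal.insert(InteractionParser.parser(strTweet))
    }
    dal.close()
  }
}
```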
the issue. I will do a memory leak test. But this is a simple and small application, and I don't see a leak there with the naked eye. Can anyone help me with how I should investigate?
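For a first pass at a leak investigation, the standard JVM tools work on the Spark driver/executor processes. A sketch of the usual steps (the pid and dump path are placeholders to fill in):

```shell
# Find the Spark JVM process id (adjust the grep for your app name)
jps -lm | grep -i spark

# Sample GC behaviour every 5 seconds; an old generation that keeps
# growing even after full GCs is the usual leak signal
jstat -gcutil <pid> 5000

# Take a heap dump of live objects for offline analysis
# (open it in Eclipse MAT or VisualVM to see what dominates the heap)
jmap -dump:live,format=b,file=/tmp/spark-heap.hprof <pid>
```

These run against a live JVM, so they are meant as an interactive recipe rather than a script.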
On Nov 26, 2014, at 6:23 PM, A.K.M. Ashrafuzzaman ashrafuzzaman...@gmail.com
wrote:
Hi guys,
When we are consuming Kinesis from EC2, whether the stream actually gets consumed depends on the configuration:
4 cores Single machine - works
2 cores Single machine - does not work
2 cores 2 workers - does not work
So my question is that do we need a cluster of (#KinesisShards + 1) workers to
be able to consume from Kinesis?
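Those results are consistent with how Spark Streaming receivers work: each Kinesis shard gets its own receiver, and each receiver permanently occupies one core, so the application needs more cores in total than receivers to have any left for processing batches. A sketch of the usual setup against the Spark 1.1 Kinesis API (stream name and endpoint are placeholders):

```scala
// One receiver (and thus one dedicated core) per Kinesis shard.
// Total cores must exceed numShards, or batches are never processed.
val numShards = 2 // e.g. from DescribeStream on the Kinesis stream
val kinesisStreams = (0 until numShards).map { _ =>
  KinesisUtils.createStream(
    ssc,
    "my-stream",                              // placeholder stream name
    "kinesis.us-east-1.amazonaws.com",        // placeholder endpoint
    Duration(2000),                           // checkpoint interval
    InitialPositionInStream.LATEST,
    StorageLevel.MEMORY_AND_DISK_2)
}
val unionStream = ssc.union(kinesisStreams)
```

That would explain the table above: with 2 shards, a 2-core machine has both cores pinned by receivers and nothing left to process, while 4 cores leaves headroom.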
We are using:
scala: 2.10.4
java version: 1.8.0_25
Spark: 1.1.0
spark-streaming-kinesis-asl: 1.1.0