Hi,
I have grouped all my customers in a JavaPairRDD
by their customerId (of Long type). This means every customerId has a List of
ProductBean.
Now I want to save every ProductBean to the DB irrespective of customerId. I got
all the values by using the method
JavaRDD values =
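For what it's worth, a minimal sketch of the flatten-and-save step described above, assuming a `JavaPairRDD<Long, List<ProductBean>>` named `grouped` and a hypothetical `saveToDb()` helper (neither name comes from the thread). In the Spark 1.x Java API, `flatMap` takes a function returning an `Iterable`, so a `List` can be returned directly:

```java
// Sketch, assuming grouped is a JavaPairRDD<Long, List<ProductBean>>
// (customerId -> that customer's ProductBeans).
JavaRDD<List<ProductBean>> lists = grouped.values();   // drop the customerId keys
JavaRDD<ProductBean> allProducts =
        lists.flatMap(list -> list);                   // Spark 1.x flatMap returns Iterable

allProducts.foreachPartition(products -> {
    // open one DB connection per partition instead of one per record
    while (products.hasNext()) {
        saveToDb(products.next());                     // hypothetical DB writer
    }
});
```

Using foreachPartition rather than foreach keeps connection churn down when writing to a database.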
Hi,
I have developed a Spark real-time app and started Spark standalone on my
laptop. But when I try to submit that app to Spark, it always stays
in the WAITING state and Cores is always zero.
I have set:
export SPARK_WORKER_CORES="2"
export SPARK_EXECUTOR_CORES="1"
in spark-env.sh, but still nothing changes.
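In case it helps: an app stuck in WAITING with zero cores usually means the master cannot find free resources for it. A hedged sketch of capping what the app itself asks for via SparkConf (the app name, master URL, and values here are assumptions, not from the thread):

```java
import org.apache.spark.SparkConf;

// Sketch (assumed values): cap the app's own request so the standalone
// master can actually schedule it on a small laptop cluster.
SparkConf conf = new SparkConf()
        .setAppName("realtime-app")                 // hypothetical app name
        .setMaster("spark://localhost:7077")        // assumed master URL
        .set("spark.cores.max", "2")                // total cores the app may take
        .set("spark.executor.memory", "512m");      // keep below worker memory
```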
om/0LjTWLfm
Thanks
Shams
On Thu, Mar 10, 2016 at 8:11 PM, Ted Yu <yuzhih...@gmail.com> wrote:
> Can you provide a bit more information ?
>
> Release of Spark
> command for submitting your app
> code snippet of your app
> pastebin of log
>
> Thanks
>
> On Thu
Hi,
I want to kill a Spark Streaming job gracefully, so that whatever Spark has
picked up from Kafka gets processed. My Spark version is 1.6.0.
When I tried killing the Spark Streaming job from the Spark UI, it doesn't stop
the app completely. In the Spark UI the job is moved to the COMPLETED section, but in the log it
Does anyone have any idea, or should I raise a bug for that?
Thanks,
Shams
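For future readers: Spark 1.x offers a couple of graceful-stop handles; a sketch (the class and method names below are mine, not from the thread):

```java
import org.apache.spark.streaming.api.java.JavaStreamingContext;

// Sketch: two ways to stop a 1.x streaming app gracefully.
// 1) Before starting the context, let Spark handle SIGTERM itself:
//      conf.set("spark.streaming.stopGracefullyOnShutdown", "true");
// 2) Or register a shutdown hook that drains in-flight batches:
public final class GracefulStop {
    public static void install(final JavaStreamingContext jssc) {
        Runtime.getRuntime().addShutdownHook(new Thread(new Runnable() {
            @Override public void run() {
                // stopSparkContext = true, stopGracefully = true
                jssc.stop(true, true);
            }
        }));
    }
}
```

Killing from the Spark UI does not take either of these paths, which may explain why the app is not stopped cleanly.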
On Fri, Mar 11, 2016 at 3:40 PM, Shams ul Haque <sham...@cashcare.in> wrote:
> Hi,
>
> I want to kill a Spark Streaming job gracefully, so that whatever Spark
> has picked up from Kafka gets processed
Hi,
I want to implement streaming using a MongoDB tailable cursor. Please give me a
hint on how I can do this.
I think I have to extend some class and use its methods to do the job.
Thanks and regards
Shams ul Haque
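One common route (my assumption, not something confirmed in the thread) is a custom receiver: extend Spark Streaming's Receiver, read from a tailable cursor on a capped collection in a background thread, and push each document into Spark with store(). The Mongo-side calls below are placeholders, not a real driver API:

```java
import org.apache.spark.storage.StorageLevel;
import org.apache.spark.streaming.receiver.Receiver;

// Sketch of a custom receiver fed by a MongoDB tailable cursor.
public class MongoTailableReceiver extends Receiver<String> {
    public MongoTailableReceiver() {
        super(StorageLevel.MEMORY_AND_DISK_2());
    }
    @Override public void onStart() {
        new Thread(new Runnable() {
            @Override public void run() {
                // Placeholder pseudocode: open a tailable cursor on a capped
                // collection and push each document into Spark as it arrives.
                // while (!isStopped() && cursor.hasNext()) {
                //     store(cursor.next().toJson());
                // }
            }
        }).start();
    }
    @Override public void onStop() {
        // close the cursor / connection here
    }
}
```

It would then be wired in with something like jssc.receiverStream(new MongoTailableReceiver()).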
> - Hareesh
>
> On 3 May 2016 at 12:35, Shams ul Haque <sham...@cashcare.in> wrote:
>
>> Hi all,
>>
>> I am facing a strange issue when running a Spark Streaming app.
>>
>> What I was doing is: when I submit my app via *spark-submit*, it works
>> fine
processing all data received from Kafka and then gets shut down.
I am trying to figure out why this is happening. Please help me if you know
anything.
Thanks and regards
Shams ul Haque
Hey Hareesh,
Thanks for the help; they were starving. I increased the cores and memory on
that machine, and now it is working fine.
Thanks again
On Tue, May 3, 2016 at 12:57 PM, Shams ul Haque <sham...@cashcare.in> wrote:
> No, I made a cluster of 2 machines. And after submission
Hi,
I have a cluster of 4 machines for Spark. I want my Spark app to run on 2
machines only, and the other 2 machines to be left for other Spark apps.
So my question is: can I restrict my app to run on those 2 machines only by
passing some IPs at the time of setting SparkConf, or by any other setting?
Thanks,