Request for a working example of using Pregel API in GraphX using Spark Scala
Hello All,

I am a beginner in Spark, trying to use GraphX for iterative processing driven by a Kafka stream. I am looking for a Git reference to a real application example, in Scala. Please reply with any pointer to one, or if someone is trying to build one, I could join them.

Regards,
Basavaraj K N
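While waiting for a full GraphX reference: the real entry point is `Graph.pregel` in `org.apache.spark.graphx`, which takes an initial message, an iteration limit, and three functions (a vertex program, a send-message function, and a merge-message function). As a hedged illustration of that pattern, the sketch below simulates the Pregel loop on a single machine for single-source shortest paths over a tiny hard-coded graph. It uses no Spark at all, so everything except the three-function Pregel shape itself is illustrative.

```scala
// Minimal single-machine sketch of the Pregel vertex-programming model,
// computing single-source shortest paths. No Spark involved; in GraphX the
// same three roles (vprog, sendMsg, mergeMsg) are passed to Graph.pregel.
object PregelSketch {
  type VertexId = Long

  // Directed weighted edges: (src, dst, weight). Toy data for illustration.
  val edges: Seq[(VertexId, VertexId, Double)] =
    Seq((1L, 2L, 1.0), (2L, 3L, 2.0), (1L, 3L, 5.0))

  def shortestPaths(source: VertexId): Map[VertexId, Double] = {
    val vertices = edges.flatMap { case (s, d, _) => Seq(s, d) }.distinct
    // Vertex state: current best-known distance from the source
    var dist: Map[VertexId, Double] =
      vertices.map(v => v -> (if (v == source) 0.0 else Double.PositiveInfinity)).toMap
    var active = true
    while (active) {
      // sendMsg: each edge proposes an improved distance to its destination
      val messages: Map[VertexId, Double] = edges
        .flatMap { case (s, d, w) =>
          if (dist(s) + w < dist(d)) Some(d -> (dist(s) + w)) else None
        }
        .groupBy(_._1)
        .map { case (v, ms) => v -> ms.map(_._2).min } // mergeMsg: keep the minimum
      // vprog: a vertex adopts the smaller distance if a message arrived;
      // the loop stops when no vertex received a message (all votes to halt)
      active = messages.nonEmpty
      dist = dist ++ messages
    }
    dist
  }

  def main(args: Array[String]): Unit =
    println(shortestPaths(1L)) // distances from vertex 1
}
```

In real GraphX the same three functions would be arguments to `graph.pregel(initialMsg, maxIterations, activeDirection)(vprog, sendMsg, mergeMsg)`, with Spark handling partitioning and message shipping.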
Re: Checking if cascading graph computation is possible in Spark
I have checked broadcast of accumulated values, but not Arbitrary Stateful Streaming. But I am not sure how that helps here.

On Fri, 5 Apr 2019, 10:13 pm Jason Nerothin, wrote:
> Have you looked at Arbitrary Stateful Streaming and Broadcast Accumulators?
>
> On Fri, Apr 5, 2019 at 10:55 AM Basavaraj wrote:
>> Hi,
>>
>> I have two questions.
>>
>> #1: I am trying to process events in real time. The outcome of the processing has to find a node in the GraphX graph and update it (in case of an anomaly or state change). If a node is updated, I have to update the related nodes as well. I want to know whether GraphX provides native support for this.
>>
>> #2: I want to do the above in an event-driven way, without batches (I tried micro-batches, but realised that is not what I want), i.e., handling each event as soon as it arrives on my stream rather than accumulating events.
>>
>> I humbly welcome any pointers and constructive criticism.
>>
>> Regards,
>> Basav

--
Thanks,
Jason
Checking if cascading graph computation is possible in Spark
Hi,

I have two questions.

#1: I am trying to process events in real time. The outcome of the processing has to find a node in the GraphX graph and update it (in case of an anomaly or state change). If a node is updated, I have to update the related nodes as well. I want to know whether GraphX provides native support for this.

#2: I want to do the above in an event-driven way, without batches (I tried micro-batches, but realised that is not what I want), i.e., handling each event as soon as it arrives on my stream rather than accumulating events.

I humbly welcome any pointers and constructive criticism.

Regards,
Basav
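On question #1: GraphX is RDD-based and batch-oriented, with no built-in event-driven cascading update; the usual options are to rerun a Pregel computation on the modified graph or to maintain the adjacency state outside GraphX. As a hedged illustration of the cascade step itself, here is a minimal single-machine sketch (plain Scala, no Spark; the adjacency data and names are made up) that finds every node transitively related to a changed node:

```scala
// Hypothetical sketch of cascading a state change through a graph.
// Plain Scala, no Spark; the "related" map and all values are illustrative.
object CascadeSketch {
  // Undirected "related-to" adjacency
  val related: Map[Long, Set[Long]] = Map(
    1L -> Set(2L),
    2L -> Set(1L, 3L),
    3L -> Set(2L),
    4L -> Set.empty[Long] // unrelated node: must not be touched by a cascade from 1
  )

  /** Returns the set of nodes whose state must be updated when `start`
    * changes: everything reachable through the "related" edges (BFS). */
  def cascade(start: Long): Set[Long] = {
    @annotation.tailrec
    def loop(frontier: Set[Long], visited: Set[Long]): Set[Long] =
      if (frontier.isEmpty) visited
      else {
        val next = frontier.flatMap(related.getOrElse(_, Set.empty[Long])) -- visited
        loop(next, visited ++ next)
      }
    loop(Set(start), Set(start))
  }
}
```

On question #2: Structured Streaming's arbitrary stateful operators (the suggestion in the reply above) still run on micro-batches by default, so a strictly per-event trigger would need a different mechanism; treat this sketch only as the shape of the cascade, not as a streaming integration.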
Re: Job submission API
The following might be helpful:

http://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/What-dependencies-to-submit-Spark-jobs-programmatically-not-via/td-p/24721
http://blog.sequenceiq.com/blog/2014/08/22/spark-submit-in-java/

On 7 April 2015 at 16:32, michal.klo...@gmail.com wrote:
> A SparkContext can submit jobs remotely.
>
> The spark-submit options in general can be populated into a SparkConf and passed in when you create a SparkContext.
>
> We personally have not had too much success with yarn-client remote submission, but standalone cluster mode was easy to get going.
>
> M
>
> On Apr 7, 2015, at 7:01 PM, Prashant Kommireddi wrote:
>
> Hello folks,
>
> Newbie here! Just had a quick question: is there a job submission API, such as the one Hadoop has
> (https://hadoop.apache.org/docs/r2.3.0/api/org/apache/hadoop/mapreduce/Job.html#submit()),
> for submitting Spark jobs to a YARN cluster? I see in the examples that bin/spark-submit is what's out there, but I couldn't find any APIs around it.
>
> Thanks,
> Prashant

--
Regards
vybs
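One later addition worth noting: Spark now ships a programmatic launcher, `org.apache.spark.launcher.SparkLauncher`, which wraps spark-submit in a Java/Scala API. Since using it requires the Spark launcher jar on the classpath, the self-contained sketch below instead just assembles an equivalent spark-submit command line; the jar path, main class, and option choices are all placeholders, not a definitive recipe.

```scala
// Sketch of submitting a Spark job to YARN from code by shelling out to
// spark-submit. All paths, class names, and options below are placeholders.
object SubmitSketch {
  def submitCommand(appJar: String, mainClass: String): Seq[String] =
    Seq(
      "spark-submit",
      "--master", "yarn",          // submit to a YARN cluster
      "--deploy-mode", "cluster",  // driver runs inside the cluster
      "--class", mainClass,
      appJar
    )

  def main(args: Array[String]): Unit = {
    val cmd = submitCommand("/path/to/app.jar", "com.example.Main")
    // To actually launch it (requires spark-submit on the PATH):
    //   new ProcessBuilder(cmd: _*).inheritIO().start().waitFor()
    println(cmd.mkString(" "))
  }
}
```

`SparkLauncher` offers the same knobs as methods (`setAppResource`, `setMainClass`, `setMaster`) and avoids shelling out, so prefer it when the Spark jars are available to your submitting process.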