Re: Caching causes later actions to get stuck

2015-11-01 Thread Sampo Niskanen
Hi, Any ideas what's going wrong or how to fix it? Do I have to downgrade to 0.9.x to be able to use Spark? Best regards, *Sampo Niskanen* *Lead developer / Wellmo* sampo.niska...@wellmo.com +358 40 820 5291 On Fri, Oct 30, 2015 at 4:57 PM, Sampo Niskanen wrote: > Hi,

Caching causes later actions to get stuck

2015-10-30 Thread Sampo Niskanen
but couldn't make out anything useful. I'm also facing another issue with loading a lot of data from MongoDB, which might be related, but the error is different: https://groups.google.com/forum/#!topic/mongodb-user/Knj406szd74 Any ideas? *Sampo Niskanen* *Lead developer / Wellmo* sampo.niska...@wellmo.com +358 40 820 5291
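
For reference, a minimal sketch of the cache-then-act pattern the thread describes; the data source below is a placeholder rather than the poster's actual MongoDB load, and the local cluster setup is assumed.

    import org.apache.spark.SparkContext
    import org.apache.spark.storage.StorageLevel

    object CacheThenCount {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext("local[*]", "cache-then-count")  // hypothetical setup
        // Placeholder records; the thread loads its data from MongoDB instead.
        val records = sc.parallelize(1 to 1000000).map(i => (i, i.toString))
        records.persist(StorageLevel.MEMORY_ONLY)  // same effect as records.cache()
        // The reported symptom: actions issued after caching hang, while the
        // same pipeline without persist() completes.
        println(records.count())
        sc.stop()
      }
    }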

Re: Analyzing consecutive elements

2015-10-22 Thread Sampo Niskanen
t; > > sc.stop() > } > } > > > > prints > > > > WrappedArray(((1,A),(3,B)), ((3,B),(7,C)), ((7,C),(8,D)), ((8,D),(9,E))) > > > > Otherwise you could try to convert your RDD to a DataFrame then use windowing > functions in SparkSQL with

Re: Analyzing consecutive elements

2015-10-22 Thread Sampo Niskanen
or session id) that makes your algorithm parallel. In that case you can use the snippet above in a reduceByKey. > hope this helps > -adrian > Sent from my iPhone > On 22 Oct 2015, at 09:36, Sampo Niskanen wrote: > Hi, > I have anal
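
The point of the reply is that a per-user or per-session key makes the pairing parallel. A sketch of that idea, using groupByKey plus an in-memory sort rather than the reduceByKey the reply mentions; the record shape (sessionId, (timestamp, value)) is made up for illustration:

    import org.apache.spark.SparkContext

    object ConsecutivePairsPerSession {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext("local[*]", "pairs-per-session")  // hypothetical setup
        // Hypothetical records: (sessionId, (timestamp, value))
        val events = sc.parallelize(Seq(
          ("s1", (1L, "A")), ("s1", (3L, "B")), ("s1", (7L, "C")),
          ("s2", (2L, "X")), ("s2", (5L, "Y"))
        ))

        val pairsPerSession = events
          .groupByKey()                            // one session's events land on one task
          .mapValues { evs =>
            val ordered = evs.toSeq.sortBy(_._1)   // order by timestamp within the session
            ordered.zip(ordered.drop(1))           // consecutive (previous, next) pairs
          }

        pairsPerSession.collect().foreach(println)
        sc.stop()
      }
    }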

Analyzing consecutive elements

2015-10-21 Thread Sampo Niskanen
some other way to analyze time-related elements.) How can this be achieved? *Sampo Niskanen* *Lead developer / Wellmo* sampo.niska...@wellmo.com +358 40 820 5291

Re: Implementing a custom Spark shell

2014-03-06 Thread Sampo Niskanen
(line 2528) Thanks. *Sampo Niskanen* *Lead developer / Wellmo* sampo.niska...@wellmo.com +358 40 820 5291 On Fri, Feb 28, 2014 at 10:46 AM, Prashant Sharma wrote: > You can enable debug logging for the repl, thankfully it uses Spark's logging framework. Trouble must be with wrappe
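
Prashant's suggestion refers to Spark's log4j-based logging. One way to turn on repl debug output from the custom shell's own startup code — a sketch, with the logger name assumed from the Spark REPL's package — would be:

    import org.apache.log4j.{Level, Logger}

    // Assumed logger name, based on the org.apache.spark.repl package;
    // run this before starting the interpreter loop.
    Logger.getLogger("org.apache.spark.repl").setLevel(Level.DEBUG)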

Re: Implementing a custom Spark shell

2014-02-27 Thread Sampo Niskanen
va:606) at sbt.Run.invokeMain(Run.scala:68) at sbt.Run.run0(Run.scala:61) at sbt.Run.execute$1(Run.scala:50) at sbt.Run$$anonfun$run$1.apply$mcV$sp(Run.scala:54) at sbt.TrapExit$.executeMain$1(TrapExit.scala:33) at sbt.TrapExit$$anon$1.run(TrapExit.scala:42) Spark context available as sc.

Implementing a custom Spark shell

2014-02-25 Thread Sampo Niskanen
ected within the standard spark-shell? Thanks. *Sampo Niskanen* *Lead developer / Wellmo*