RE: Spark and Play

2014-11-13 Thread Mohammed Guller
Hi Patrick,
Although we are able to use Spark 1.1.0 with Play 2.2.x, as you mentioned, Akka 
incompatibility prevents us from using Spark with the current stable releases 
of Play (2.3.6) and Akka (2.3.7). Are there any plans to address this issue in 
Spark 1.2?

Thanks,
Mohammed
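
One build-side mitigation sometimes tried in this situation (a sketch only, not something verified in this thread, and it cannot fix a genuine binary incompatibility between Akka releases) is to force a single Akka version at resolution time with sbt's dependencyOverrides:

```scala
// build.sbt -- sketch; the versions shown are the ones discussed in this thread
dependencyOverrides ++= Set(
  "com.typesafe.akka" %% "akka-actor" % "2.2.3",
  "com.typesafe.akka" %% "akka-slf4j" % "2.2.3"
)
```

This only selects which version wins when several are requested; if the libraries actually call incompatible Akka APIs, the conflict remains at runtime.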

From: John Meehan [mailto:jnmee...@gmail.com]
Sent: Tuesday, November 11, 2014 11:35 PM
To: Mohammed Guller
Cc: Patrick Wendell; Akshat Aranya; user@spark.apache.org
Subject: Re: Spark and Play

[quoted text trimmed; the earlier messages in this thread appear in full below]


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Spark and Play

2014-11-12 Thread Donald Szeto
Hi Akshat,

If your application is to serve results directly from a SparkContext, you
may want to take a look at http://prediction.io. It integrates Spark with
spray.io (another REST/web toolkit by Typesafe). Some heavy lifting is done
here:
https://github.com/PredictionIO/PredictionIO/blob/develop/core/src/main/scala/workflow/CreateServer.scala

Regards,
Donald

On Tue, Nov 11, 2014 at 11:35 PM, John Meehan jnmee...@gmail.com wrote:

 [quoted text trimmed; John Meehan's message of 2014-11-11 appears in full below]




-- 
Donald Szeto
PredictionIO


Re: Spark and Play

2014-11-11 Thread Patrick Wendell
Hi There,

Because Akka versions are not binary compatible with one another, it
might not be possible to integrate Play with Spark 1.1.0.

- Patrick

On Tue, Nov 11, 2014 at 8:21 AM, Akshat Aranya aara...@gmail.com wrote:
 Hi,

 Sorry if this has been asked before; I didn't find a satisfactory answer
 when searching.  How can I integrate a Play application with Spark?  I'm
 getting into issues of akka-actor versions.  Play 2.2.x uses akka-actor 2.0,
 whereas Play 2.3.x uses akka-actor 2.3.4, neither of which works well with
 Spark 1.1.0.  Is there something I should do with libraryDependencies in my
 build.sbt to make it work?

 Thanks,
 Akshat




RE: Spark and Play

2014-11-11 Thread Mohammed Guller
Actually, it is possible to integrate Spark 1.1.0 with Play 2.2.x.

Here is a sample build.sbt file:

name := "xyz"

version := "0.1"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  jdbc,
  anorm,
  cache,
  "org.apache.spark" %% "spark-core" % "1.1.0",
  "com.typesafe.akka" %% "akka-actor" % "2.2.3",
  "com.typesafe.akka" %% "akka-slf4j" % "2.2.3",
  "org.apache.spark" %% "spark-sql" % "1.1.0"
)

play.Project.playScalaSettings
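
When pinning Akka alongside Spark like this, it helps to confirm which versions actually end up on the classpath. One way (an assumption on my part; the plugin and version are not from this thread) is the sbt-dependency-graph plugin:

```scala
// project/plugins.sbt -- illustrative; check the plugin's README for a release
// matching your sbt version
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")
```

Then `sbt dependency-graph` (or `dependency-tree`) prints the resolved tree, making it easy to spot a second Akka version being pulled in transitively.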


Mohammed

-Original Message-
From: Patrick Wendell [mailto:pwend...@gmail.com] 
Sent: Tuesday, November 11, 2014 2:06 PM
To: Akshat Aranya
Cc: user@spark.apache.org
Subject: Re: Spark and Play

[quoted text trimmed; Patrick Wendell's message of 2014-11-11 appears in full above]



Re: Spark and Play

2014-11-11 Thread John Meehan
You can also build a Play 2.2.x + Spark 1.1.0 fat jar with sbt-assembly,
e.g. for yarn-client support or for use with spark-shell when debugging:

play.Project.playScalaSettings

libraryDependencies ~= { _ map {
  case m if m.organization == "com.typesafe.play" =>
    m.exclude("commons-logging", "commons-logging")
  case m => m
}}

assemblySettings

test in assembly := {}

mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    case m if m.toLowerCase.endsWith("manifest.mf") => MergeStrategy.discard
    case m if m.startsWith("META-INF") => MergeStrategy.discard
    case PathList("javax", "servlet", xs @ _*) => MergeStrategy.first
    case PathList("org", "apache", xs @ _*) => MergeStrategy.first
    case PathList("org", "jboss", xs @ _*) => MergeStrategy.first
    case PathList("org", "slf4j", xs @ _*) => MergeStrategy.discard
    case "about.html" => MergeStrategy.rename
    case "reference.conf" => MergeStrategy.concat
    case _ => MergeStrategy.first
  }
}
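
A detail John's snippet leaves implicit: assemblySettings and mergeStrategy come from the sbt-assembly plugin, so it must be declared in project/plugins.sbt. A minimal sketch (the version shown is an assumption; use whichever release matches your sbt version):

```scala
// project/plugins.sbt -- brings the `assembly` task, assemblySettings,
// and MergeStrategy into scope
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
```

After `sbt assembly`, the resulting fat jar can be launched with spark-submit or added to a spark-shell session via its --jars option.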

On Tue, Nov 11, 2014 at 3:04 PM, Mohammed Guller moham...@glassbeam.com
wrote:

 [quoted text trimmed; Mohammed Guller's and Patrick Wendell's messages appear in full above]