Re: What does Blockchain technology mean for Big Data? And how will Hadoop/Spark play a role with it?

2017-12-19 Thread Ryan C. Kleck
IMO blockchain won’t be doing anything for big data anytime soon. It is not
distributed (it’s “decentralized”): all blockchain data is replicated to ALL
nodes in the network. At the moment, a game involving breeding cats is already
clogging up the Ethereum blockchain. Blockchain might be able to replace YARN
at some point, but it won’t be able to replace HDFS, MR, or Spark.

Regards,
Ryan Kleck

On Dec 19, 2017, at 7:29 AM, Vadim Semenov wrote:

I think it means that we can replace HDFS with a blockchain-based FS, and then 
offload some processing to smart contracts.





Re: What does Blockchain technology mean for Big Data? And how will Hadoop/Spark play a role with it?

2017-12-19 Thread Vadim Semenov
I think it means that we can replace HDFS with a blockchain-based FS, and
then offload some processing to smart contracts.
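
(Purely as an illustration of that idea: if a blockchain-backed store exposed
a Hadoop-compatible FileSystem, Spark could read from it like any other
storage layer. The bfs:// scheme below is hypothetical; no such connector is
implied to exist.)

// Hypothetical sketch: "bfs" stands in for an imaginary blockchain-backed
// FileSystem implementation registered with the Hadoop FileSystem API.
val events = sc.textFile("bfs://some-chain/events/2017/12/")
println(events.count())  // the processing model is unchanged; only storage moves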

On Mon, Dec 18, 2017 at 11:59 PM, KhajaAsmath Mohammed <mdkhajaasm...@gmail.com> wrote:

> I am looking for the same answer too .. will wait for a response from other people.
>
> Sent from my iPhone


Re: What does Blockchain technology mean for Big Data? And how will Hadoop/Spark play a role with it?

2017-12-18 Thread KhajaAsmath Mohammed
I am looking for the same answer too .. will wait for a response from other people.

Sent from my iPhone

> On Dec 18, 2017, at 10:56 PM, Gaurav1809 wrote:
> 
> Hi All,
> 
> Will Big Data tools & technologies work with Blockchain in the future? If there are any
> possible use cases that anyone is likely to face, please share them.
> 
> Thanks
> Gaurav




What does Blockchain technology mean for Big Data? And how will Hadoop/Spark play a role with it?

2017-12-18 Thread Gaurav1809
Hi All,

Will Big Data tools & technologies work with Blockchain in the future? If there are any
possible use cases that anyone is likely to face, please share them.

Thanks
Gaurav







RE: Spark and Play

2014-11-13 Thread Mohammed Guller
Hi Patrick,
Although we are able to use Spark 1.1.0 with Play 2.2.x, as you mentioned, Akka 
incompatibility prevents us from using Spark with the current stable releases 
of Play (2.3.6) and Akka (2.3.7). Are there any plans to address this issue in 
Spark 1.2?

Thanks,
Mohammed

From: John Meehan [mailto:jnmee...@gmail.com]
Sent: Tuesday, November 11, 2014 11:35 PM
To: Mohammed Guller
Cc: Patrick Wendell; Akshat Aranya; user@spark.apache.org
Subject: Re: Spark and Play

You can also build a Play 2.2.x + Spark 1.1.0 fat jar with sbt-assembly for,
e.g., yarn-client support, or for use with spark-shell when debugging:

play.Project.playScalaSettings

libraryDependencies ~= { _ map {
  case m if m.organization == "com.typesafe.play" =>
    m.exclude("commons-logging", "commons-logging")
  case m => m
}}

assemblySettings

test in assembly := {}

mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    case m if m.toLowerCase.endsWith("manifest.mf") => MergeStrategy.discard
    case m if m.startsWith("META-INF") => MergeStrategy.discard
    case PathList("javax", "servlet", xs @ _*) => MergeStrategy.first
    case PathList("org", "apache", xs @ _*) => MergeStrategy.first
    case PathList("org", "jboss", xs @ _*) => MergeStrategy.first
    case PathList("org", "slf4j", xs @ _*) => MergeStrategy.discard
    case "about.html" => MergeStrategy.rename
    case "reference.conf" => MergeStrategy.concat
    case _ => MergeStrategy.first
  }
}




Re: Spark and Play

2014-11-12 Thread Donald Szeto
Hi Akshat,

If your application is to serve results directly from a SparkContext, you
may want to take a look at http://prediction.io. It integrates Spark with
spray.io (another REST/web toolkit by Typesafe). Some heavy lifting is done
here:
https://github.com/PredictionIO/PredictionIO/blob/develop/core/src/main/scala/workflow/CreateServer.scala

Regards,
Donald

On Tue, Nov 11, 2014 at 11:35 PM, John Meehan jnmee...@gmail.com wrote:

 You can also build a Play 2.2.x + Spark 1.1.0 fat jar with sbt-assembly for,
 e.g., yarn-client support, or for use with spark-shell when debugging:

 play.Project.playScalaSettings

 libraryDependencies ~= { _ map {
   case m if m.organization == "com.typesafe.play" =>
     m.exclude("commons-logging", "commons-logging")
   case m => m
 }}

 assemblySettings

 test in assembly := {}

 mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
   {
     case m if m.toLowerCase.endsWith("manifest.mf") => MergeStrategy.discard
     case m if m.startsWith("META-INF") => MergeStrategy.discard
     case PathList("javax", "servlet", xs @ _*) => MergeStrategy.first
     case PathList("org", "apache", xs @ _*) => MergeStrategy.first
     case PathList("org", "jboss", xs @ _*) => MergeStrategy.first
     case PathList("org", "slf4j", xs @ _*) => MergeStrategy.discard
     case "about.html" => MergeStrategy.rename
     case "reference.conf" => MergeStrategy.concat
     case _ => MergeStrategy.first
   }
 }






-- 
Donald Szeto
PredictionIO


Spark and Play

2014-11-11 Thread Akshat Aranya
Hi,

Sorry if this has been asked before; I didn't find a satisfactory answer
when searching.  How can I integrate a Play application with Spark?  I'm
getting into issues of akka-actor versions.  Play 2.2.x uses akka-actor
2.0, whereas Play 2.3.x uses akka-actor 2.3.4, neither of which works properly
with Spark 1.1.0.  Is there something I should do with libraryDependencies
in my build.sbt to make it work?

Thanks,
Akshat


Re: Spark and Play

2014-11-11 Thread Patrick Wendell
Hi There,

Because Akka versions are not binary compatible with one another, it
might not be possible to integrate Play with Spark 1.1.0.
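
(If it helps to see which Akka version sbt actually resolves in a mixed
Play/Spark build, a dependency-graph plugin can show the conflict. A minimal
sketch, assuming the sbt-dependency-graph plugin; the coordinates and version
below are a guess, so verify them before use.)

// project/plugins.sbt -- assumed plugin coordinates and version
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")

// Then run `sbt dependencyTree` and look for conflicting
// com.typesafe.akka:akka-actor entries pulled in by Play and Spark.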

- Patrick

On Tue, Nov 11, 2014 at 8:21 AM, Akshat Aranya aara...@gmail.com wrote:
 Hi,

 Sorry if this has been asked before; I didn't find a satisfactory answer
 when searching.  How can I integrate a Play application with Spark?  I'm
 getting into issues of akka-actor versions.  Play 2.2.x uses akka-actor 2.0,
 whereas Play 2.3.x uses akka-actor 2.3.4, neither of which work fine with
 Spark 1.1.0.  Is there something I should do with libraryDependencies in my
 build.sbt to make it work?

 Thanks,
 Akshat




RE: Spark and Play

2014-11-11 Thread Mohammed Guller
Actually, it is possible to integrate Spark 1.1.0 with Play 2.2.x.

Here is a sample build.sbt file:

name := xyz

version := 0.1 

scalaVersion := 2.10.4

libraryDependencies ++= Seq(
  jdbc,
  anorm,
  cache,
  org.apache.spark %% spark-core % 1.1.0,
  com.typesafe.akka %% akka-actor % 2.2.3,
  com.typesafe.akka %% akka-slf4j % 2.2.3,
  org.apache.spark %% spark-sql % 1.1.0
)

play.Project.playScalaSettings
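
(A possible variation, assuming sbt 0.13 syntax: dependencyOverrides can force
a single Akka version across the whole build instead of relying on the Seq
above. A minimal sketch, not something the setup above requires:)

// Force one Akka version build-wide; 2.2.3 mirrors the Seq above, but
// match it to whatever your Play and Spark releases actually expect.
dependencyOverrides ++= Set(
  "com.typesafe.akka" %% "akka-actor" % "2.2.3",
  "com.typesafe.akka" %% "akka-slf4j" % "2.2.3"
)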


Mohammed

-Original Message-
From: Patrick Wendell [mailto:pwend...@gmail.com] 
Sent: Tuesday, November 11, 2014 2:06 PM
To: Akshat Aranya
Cc: user@spark.apache.org
Subject: Re: Spark and Play

Hi There,

Because Akka versions are not binary compatible with one another, it might not 
be possible to integrate Play with Spark 1.1.0.

- Patrick






Re: Spark and Play

2014-11-11 Thread John Meehan
You can also build a Play 2.2.x + Spark 1.1.0 fat jar with sbt-assembly for,
e.g., yarn-client support, or for use with spark-shell when debugging:

play.Project.playScalaSettings

libraryDependencies ~= { _ map {
  case m if m.organization == "com.typesafe.play" =>
    m.exclude("commons-logging", "commons-logging")
  case m => m
}}

assemblySettings

test in assembly := {}

mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    case m if m.toLowerCase.endsWith("manifest.mf") => MergeStrategy.discard
    case m if m.startsWith("META-INF") => MergeStrategy.discard
    case PathList("javax", "servlet", xs @ _*) => MergeStrategy.first
    case PathList("org", "apache", xs @ _*) => MergeStrategy.first
    case PathList("org", "jboss", xs @ _*) => MergeStrategy.first
    case PathList("org", "slf4j", xs @ _*) => MergeStrategy.discard
    case "about.html" => MergeStrategy.rename
    case "reference.conf" => MergeStrategy.concat
    case _ => MergeStrategy.first
  }
}
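
(For completeness: the snippet above assumes the sbt-assembly plugin is
already on the build. A minimal sketch; the plugin version below is a guess
for that era, so check it against your sbt release.)

// project/plugins.sbt -- version assumed, verify before use
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

// Build the fat jar, then load it into spark-shell, e.g.:
//   sbt assembly
//   spark-shell --jars target/scala-2.10/<your-app>-assembly-0.1.jar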

On Tue, Nov 11, 2014 at 3:04 PM, Mohammed Guller moham...@glassbeam.com
wrote:

 Actually, it is possible to integrate Spark 1.1.0 with Play 2.2.x.

 Here is a sample build.sbt file:

 name := "xyz"

 version := "0.1"

 scalaVersion := "2.10.4"

 libraryDependencies ++= Seq(
   jdbc,
   anorm,
   cache,
   "org.apache.spark" %% "spark-core" % "1.1.0",
   "com.typesafe.akka" %% "akka-actor" % "2.2.3",
   "com.typesafe.akka" %% "akka-slf4j" % "2.2.3",
   "org.apache.spark" %% "spark-sql" % "1.1.0"
 )

 play.Project.playScalaSettings


 Mohammed


