[ https://issues.apache.org/jira/browse/MAHOUT-1653?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14619697#comment-14619697 ]

ASF GitHub Bot commented on MAHOUT-1653:
----------------------------------------

Github user dlyubimov commented on a diff in the pull request:

    https://github.com/apache/mahout/pull/146#discussion_r34216367
  
    --- Diff: spark-shell/src/main/scala/org/apache/mahout/sparkbindings/shell/MahoutSparkILoop.scala ---
    @@ -48,55 +83,87 @@ class MahoutSparkILoop extends SparkILoop {
     
         conf.set("spark.executor.memory", "1g")
     
    -    sparkContext = mahoutSparkContext(
    +    sdc = mahoutSparkContext(
           masterUrl = master,
           appName = "Mahout Spark Shell",
           customJars = jars,
           sparkConf = conf
         )
     
    -    echo("Created spark context..")
    +    _interp.sparkContext = sdc
    +
    +    echoToShell("Created spark context..")
         sparkContext
       }
     
    +  // this is technically not part of Spark's explicitly defined Developer API though
    +  // nothing in the SparkILoopInit.scala file is marked as such.
       override def initializeSpark() {
    -    intp.beQuietDuring {
    -      command("""
     
    -         @transient implicit val sdc: org.apache.mahout.math.drm.DistributedContext =
    -            new org.apache.mahout.sparkbindings.SparkDistributedContext(
    -            org.apache.spark.repl.Main.interp.createSparkContext())
    +    _interp.beQuietDuring {
    +
    +      // get the spark context, at the same time create and store a mahout distributed context.
    +      _interp.interpret("""
    +         @transient val sc = {
    --- End diff --
    
    I honestly don't know if Spark declares it as implicit. If it does not, we
    don't have to either, I suppose.
    
    The reason the Mahout context is implicit is that all top-level routines,
    like drmParallelize(...), look for it.
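    
    For illustration, here is a minimal self-contained Scala sketch of that
    implicit lookup. The trait and the signature below are simplified
    stand-ins, not the actual Mahout definitions:
    
        // Simplified stand-ins for illustration; not the real Mahout types.
        trait DistributedContext
        class SparkDistributedContext extends DistributedContext
    
        // Hypothetical, reduced signature: the real drmParallelize also takes
        // the in-core matrix and partitioning arguments; the point here is
        // only the implicit DistributedContext parameter.
        def drmParallelize(nrow: Int, ncol: Int)(implicit dc: DistributedContext): String =
          s"drm of ${nrow}x${ncol} on $dc"
    
        // Because sdc is implicit, the call compiles without passing the
        // context explicitly; with a plain val one would have to write
        // drmParallelize(2, 2)(sdc).
        @transient implicit val sdc: DistributedContext = new SparkDistributedContext
    
        drmParallelize(2, 2)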
    
    On Wed, Jul 8, 2015 at 1:35 PM, Andrew Palumbo <notificati...@github.com>
    wrote:
    
    > In
    > spark-shell/src/main/scala/org/apache/mahout/sparkbindings/shell/MahoutSparkILoop.scala
    > <https://github.com/apache/mahout/pull/146#discussion_r34195169>:
    >
    > >
    > > -         @transient implicit val sdc: org.apache.mahout.math.drm.DistributedContext =
    > > -            new org.apache.mahout.sparkbindings.SparkDistributedContext(
    > > -            org.apache.spark.repl.Main.interp.createSparkContext())
    > > +    _interp.beQuietDuring {
    > > +
    > > +      // get the spark context, at the same time create and store a mahout distributed context.
    > > +      _interp.interpret("""
    > > +         @transient val sc = {
    >
    > @dlyubimov <https://github.com/dlyubimov> I've made the necessary changes
    > to clean up the redundant creation of a SparkDistributedContext. Thanks
    > a lot for the input.
    >
    > You mentioned that we could declare the SparkContext as implicit val
    > sc = .... Is there a reason that it should be implicit?
    >
    > I've left it as val sc = ... for now since that is the way Spark declares
    > it in this method. Thx.
    >
    > —
    > Reply to this email directly or view it on GitHub
    > <https://github.com/apache/mahout/pull/146/files#r34195169>.
    >
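
A side note on the implicit question above: marking a val implicit only
matters when some method expects that type as an implicit parameter, and
having two implicit values of the same type in scope makes the lookup
ambiguous. A tiny standalone Scala illustration (the names here are made up
for the example):

    trait Ctx
    object A extends Ctx
    object B extends Ctx

    // A method that looks up its context implicitly, the way Mahout's
    // top-level routines look up the DistributedContext.
    def need(implicit c: Ctx): Ctx = c

    implicit val one: Ctx = A
    need                          // resolves to A

    // implicit val two: Ctx = B  // uncommenting this makes the call below
    // need                       // fail to compile: ambiguous implicit values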



> Spark 1.3
> ---------
>
>                 Key: MAHOUT-1653
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1653
>             Project: Mahout
>          Issue Type: Dependency upgrade
>            Reporter: Andrew Musselman
>            Assignee: Andrew Palumbo
>             Fix For: 0.11.0
>
>
> Support Spark 1.3



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
