[ https://issues.apache.org/jira/browse/MAHOUT-1778?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14982841#comment-14982841 ]

ASF GitHub Bot commented on MAHOUT-1778:
----------------------------------------

Github user pferrel commented on a diff in the pull request:

    https://github.com/apache/mahout/pull/164#discussion_r43524084
  
    --- Diff: bin/compute-classpath.sh ---
    @@ -0,0 +1,167 @@
    +#!/usr/bin/env bash
    +
    +#
    +# Licensed to the Apache Software Foundation (ASF) under one or more
    +# contributor license agreements.  See the NOTICE file distributed with
    +# this work for additional information regarding copyright ownership.
    +# The ASF licenses this file to You under the Apache License, Version 2.0
    +# (the "License"); you may not use this file except in compliance with
    +# the License.  You may obtain a copy of the License at
    +#
    +#    http://www.apache.org/licenses/LICENSE-2.0
    +#
    +# Unless required by applicable law or agreed to in writing, software
    +# distributed under the License is distributed on an "AS IS" BASIS,
    +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +# See the License for the specific language governing permissions and
    +# limitations under the License.
    +#
    +
    +# This script computes Spark's classpath and prints it to stdout; it's used by both the "run"
    +# script and the ExecutorRunner in standalone cluster mode.
    +
    +# Figure out where Spark is installed
    +#FWDIR="$(cd "`dirname "$0"`"/..; pwd)"
    +FWDIR="$SPARK_HOME"
    +
    +echo "Using FWDIR: $FWDIR"
    +
    +#. "$FWDIR"/bin/load-spark-env.sh # not executable by defult in 
$SPARK_HOME/bin
    +
    +"$MAHOUT_HOME"/bin/mahout-load-spark-env.sh
    --- End diff --
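
    For context: load-spark-env.sh only needs to be sourced, and sourcing a
    file with "." does not require the executable bit, so one workaround for
    the "not executable by default" note above is to source Spark's own
    script and fall back to Mahout's copy only when it is missing. A minimal
    sketch, assuming the standard Spark 1.x layout (this is not the code in
    the PR):

        # Source Spark's env script if present; "." works even when the
        # file lacks the executable bit. Otherwise fall back to the copy
        # that ships with Mahout.
        if [ -f "$SPARK_HOME/bin/load-spark-env.sh" ]; then
          . "$SPARK_HOME/bin/load-spark-env.sh"
        else
          . "$MAHOUT_HOME/bin/mahout-load-spark-env.sh"
        fi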
    
    After moving the Scala version check to compute-classpath.sh, the shell
    starts up without a glitch: no more warnings and no OOM. But it throws
    an exception on exit. BTW, I built Spark 1.5.1 from source for Hadoop
    2.4.1 (my version). compute-classpath.sh seems to produce the correct
    result, but mahout-spark-class.sh can't be used independently. I have
    not tested the Scala 2.11 case yet--yikes, this is getting complicated.
    
    Here is the exit error:
    
        mahout> :q
        Stopping spark context.
        15/10/30 09:29:49 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from 192.168.0.2/192.168.0.2:65385 is closed
        15/10/30 09:29:49 WARN NettyRpcEnv: Exception when sending RequestMessage(192.168.0.2:65376,NettyRpcEndpointRef(spark://[email protected]:65385),StopExecutor,false)
        java.io.IOException: Connection from 192.168.0.2/192.168.0.2:65385 closed
        at org.apache.spark.network.client.TransportResponseHandler.channelUnregistered(TransportResponseHandler.java:104)
        at org.apache.spark.network.server.TransportChannelHandler.channelUnregistered(TransportChannelHandler.java:91)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelUnregistered(AbstractChannelHandlerContext.java:158)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelUnregistered(AbstractChannelHandlerContext.java:144)
        at io.netty.channel.ChannelInboundHandlerAdapter.channelUnregistered(ChannelInboundHandlerAdapter.java:53)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelUnregistered(AbstractChannelHandlerContext.java:158)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelUnregistered(AbstractChannelHandlerContext.java:144)
        at io.netty.channel.ChannelInboundHandlerAdapter.channelUnregistered(ChannelInboundHandlerAdapter.java:53)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelUnregistered(AbstractChannelHandlerContext.java:158)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelUnregistered(AbstractChannelHandlerContext.java:144)
        at io.netty.channel.ChannelInboundHandlerAdapter.channelUnregistered(ChannelInboundHandlerAdapter.java:53)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelUnregistered(AbstractChannelHandlerContext.java:158)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelUnregistered(AbstractChannelHandlerContext.java:144)
        at io.netty.channel.DefaultChannelPipeline.fireChannelUnregistered(DefaultChannelPipeline.java:739)
        at io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:659)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at java.lang.Thread.run(Thread.java:745)
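
    For reference, here is a minimal sketch of the kind of Scala version
    check discussed above. It assumes that the spark-submit --version banner
    reports the Scala version and that Mahout is built for Scala 2.10;
    neither the variable names nor the approach are from this PR:

        # Ask Spark which Scala it was built with: the --version banner
        # includes a "Using Scala version x.y.z" line on stderr.
        SPARK_SCALA=$("$SPARK_HOME"/bin/spark-submit --version 2>&1 \
          | grep -o 'Scala version [0-9]*\.[0-9]*' | cut -d' ' -f3)
        MAHOUT_SCALA="2.10"  # assumed: the Scala binary version Mahout targets
        if [ "$SPARK_SCALA" != "$MAHOUT_SCALA" ]; then
          echo "Spark is built for Scala $SPARK_SCALA but Mahout expects" \
               "Scala $MAHOUT_SCALA" >&2
          exit 1
        fi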



> Mahout Spark Shell doesn't work with Spark > 1.3
> ------------------------------------------------
>
>                 Key: MAHOUT-1778
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1778
>             Project: Mahout
>          Issue Type: Improvement
>          Components: Mahout spark shell
>    Affects Versions: 0.11.0
>            Reporter: Suneel Marthi
>            Assignee: Pat Ferrel
>             Fix For: 0.12.0
>
>
> Mahout Spark Shell uses compute-classpath.sh from Spark to load up the
> shell. That script was removed in Spark 1.4 and above, so the Mahout Spark
> Shell code needs to be fixed to handle the changes in Spark 1.4.
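
For Spark 1.4 and later, one way to replace the removed script is to build
the classpath directly from the Spark install. A minimal sketch, assuming
the pre-2.0 binary layout where the assembly jar sits under $SPARK_HOME/lib
(not necessarily what the final fix does):

    # Build a Spark classpath without compute-classpath.sh: the conf
    # directory plus every jar under $SPARK_HOME/lib, which holds the
    # spark-assembly and datanucleus jars in Spark 1.4/1.5 binary builds.
    if [ -z "$SPARK_HOME" ]; then
      echo "SPARK_HOME must be set" >&2
      exit 1
    fi
    CLASSPATH="$SPARK_HOME/conf"
    for jar in "$SPARK_HOME"/lib/*.jar; do
      CLASSPATH="$CLASSPATH:$jar"
    done
    echo "$CLASSPATH"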


