[ https://issues.apache.org/jira/browse/MAHOUT-1950?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15896035#comment-15896035 ]
ASF GitHub Bot commented on MAHOUT-1950:
----------------------------------------

GitHub user rawkintrevo opened a pull request:

    https://github.com/apache/mahout/pull/291

    MAHOUT-1950 Fix block unread error in shell

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/rawkintrevo/mahout mahout-1950

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/mahout/pull/291.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #291

----
commit a30d241a67d3b90673924c4332ce432f335a7d05
Author: rawkintrevo <trevor.d.gr...@gmail.com>
Date:   2017-03-05T03:08:33Z

    MAHOUT-1950 Fix block unread error in shell

----

> Unread Block Data in Spark Shell Pseudo Cluster
> -----------------------------------------------
>
>                 Key: MAHOUT-1950
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1950
>             Project: Mahout
>          Issue Type: Bug
>          Components: Mahout spark shell
>    Affects Versions: 0.13.0
>         Environment: Spark 1.6.3 cluster / pseudo-cluster / YARN cluster (all observed)
>            Reporter: Trevor Grant
>            Assignee: Trevor Grant
>            Priority: Blocker
>
> When performing an operation in the Spark shell on a pseudo-cluster, a
> `java.lang.IllegalStateException: unread block data` error is thrown.
> Research and the stack trace imply an issue with serialization. Other
> issues with Spark in cluster mode hint that the Kryo jars aren't being
> shipped to the executors.
> Experimentation has shown that:
>
> `$SPARK_HOME/bin/spark-shell \
>   --jars "/opt/mahout/math-scala/target/mahout-math-scala_2.10-0.13.0-SNAPSHOT.jar,/opt/mahout/math/target/mahout-math-0.13.0-SNAPSHOT.jar,/opt/mahout/spark/target/mahout-spark_2.10-0.13.0-SNAPSHOT.jar,/opt/mahout/spark/target/mahout-spark_2.10-0.13.0-SNAPSHOT-dependency-reduced.jar" \
>   -i $MAHOUT_HOME/bin/load-shell.scala \
>   --conf spark.kryo.referenceTracking=false \
>   --conf spark.kryo.registrator=org.apache.mahout.sparkbindings.io.MahoutKryoRegistrator \
>   --conf spark.kryoserializer.buffer=32k \
>   --conf spark.kryoserializer.buffer.max=600m \
>   --conf spark.serializer=org.apache.spark.serializer.KryoSerializer`
>
> works, and should be used in place of:
>
> https://github.com/apache/mahout/blob/master/bin/mahout#L294

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
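As a side note for readers reproducing the workaround: the `--conf` flags in the working invocation above can also be placed in Spark's standard `conf/spark-defaults.conf` file, so that every `spark-shell` launch picks them up without repeating the flags. This is a sketch under the assumption that the stock defaults file is used; the property keys are exactly those passed on the command line in the issue.

```properties
# conf/spark-defaults.conf -- equivalent of the --conf flags from the
# working spark-shell invocation (Kryo serializer for Mahout bindings).
spark.serializer                org.apache.spark.serializer.KryoSerializer
spark.kryo.registrator          org.apache.mahout.sparkbindings.io.MahoutKryoRegistrator
spark.kryo.referenceTracking    false
spark.kryoserializer.buffer     32k
spark.kryoserializer.buffer.max 600m
```

The Mahout jars would still need to be supplied via `--jars` (or `spark.jars`) so the registrator class is on the executor classpath; otherwise the same "unread block data" deserialization failure can recur.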