[ 
https://issues.apache.org/jira/browse/SPARK-5798?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14320533#comment-14320533
 ] 

DeepakVohra commented on SPARK-5798:
------------------------------------

Thanks Sean for testing. 

Not all Spark/Scala code generates an error in the Spark shell. 

For example, run all the prerequisite import, var, and method code, and then 
run the following line to test:
model(sc, rawUserArtistData, rawArtistData, rawArtistAlias)

from:
https://github.com/sryza/aas/blob/master/ch03-recommender/src/main/scala/com/cloudera/datascience/recommender/RunRecommender.scala

The data files are on the local filesystem, not in HDFS. 
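
For reference, the prerequisite shell input would look roughly like the sketch 
below. The Audioscrobbler file names and the model() signature come from the 
linked RunRecommender.scala; the local base path is only an assumption for 
illustration:

  // Sketch only: sc is the SparkContext that spark-shell already provides.
  // The base path is an assumption; point it at wherever the data files
  // actually sit on the local filesystem.
  val base = "file:///user/ds/"
  val rawUserArtistData = sc.textFile(base + "user_artist_data.txt")
  val rawArtistData = sc.textFile(base + "artist_data.txt")
  val rawArtistAlias = sc.textFile(base + "artist_alias.txt")

  // After pasting the model(...) definition from RunRecommender.scala
  // into the shell, run the call under test:
  model(sc, rawUserArtistData, rawArtistData, rawArtistAlias)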

The environment is different (Oracle Linux 6.5), but that shouldn't be a factor. 

If the preceding test also does not generate an error, I would agree it is some 
other factor and not a bug. 

> Spark shell issue
> -----------------
>
>                 Key: SPARK-5798
>                 URL: https://issues.apache.org/jira/browse/SPARK-5798
>             Project: Spark
>          Issue Type: Bug
>          Components: Input/Output
>    Affects Versions: 1.2.0
>         Environment: Spark 1.2
> Scala 2.10.4
>            Reporter: DeepakVohra
>
> The Spark shell terminates when Spark code is run, indicating an issue with 
> the Spark shell.
> The error comes from the spark-shell script:
>  
>   /apachespark/spark-1.2.0-bin-cdh4/bin/spark-shell: line 48
>  
>   "$FWDIR"/bin/spark-submit --class org.apache.spark.repl.Main
>   "${SUBMISSION_OPTS[@]}" spark-shell "${APPLICATION_OPTS[@]}"



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
