[
https://issues.apache.org/jira/browse/PIG-5246?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
liyunzhang_intel updated PIG-5246:
----------------------------------
Attachment: PIG-5246.patch
[~nkollar], [~szita]: please help review.
In Spark 2, spark-assembly*.jar no longer exists, so we need to append all jars under
$SPARK_HOME/jars/ to the Pig classpath.
{code}
+    if [ "$sparkversion" == "21" ]; then
+        if [ -n "$SPARK_HOME" ]; then
+            echo "Using Spark Home: " ${SPARK_HOME}
+            for f in $SPARK_HOME/jars/*.jar; do
+                CLASSPATH=${CLASSPATH}:$f
+            done
+        fi
+    fi
{code}
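For illustration, the jar-appending loop from the patch can be exercised standalone. This is a minimal sketch, not the actual bin/pig code: the demo directory and jar names below are made up, but the loop has the same shape as the one above.

```shell
#!/bin/sh
# Sketch: build a colon-separated classpath from every jar in a directory,
# the same loop shape the patch uses for $SPARK_HOME/jars.
# The demo directory and jar names are hypothetical.
demo=$(mktemp -d)
touch "$demo/spark-core.jar" "$demo/spark-sql.jar"

CLASSPATH=""
for f in "$demo"/*.jar; do
    CLASSPATH=${CLASSPATH}:$f
done

# Prints a leading-colon classpath listing both jars, matching the
# ${CLASSPATH}:$f accumulation style used in bin/pig.
echo "$CLASSPATH"

rm -rf "$demo"
```

Note the result keeps the leading colon, which is harmless to the JVM and matches how bin/pig already accumulates CLASSPATH.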
How to use it:
1. Build Pig with Spark 2.1:
{noformat}
ant clean -v -Dsparkversion=21 -Dhadoopversion=2 jar
{noformat}
2. Run Pig with Spark 2.1:
{noformat}
/pig -x $mode -sparkversion 21 -log4jconf $PIG_HOME/conf/log4j.properties -logfile $PIG_HOME/logs/pig.log $PIG_HOME/bin/testJoin.pig
{noformat}
> Modify bin/pig about SPARK_HOME, SPARK_ASSEMBLY_JAR after upgrading spark to 2
> ------------------------------------------------------------------------------
>
> Key: PIG-5246
> URL: https://issues.apache.org/jira/browse/PIG-5246
> Project: Pig
> Issue Type: Bug
> Reporter: liyunzhang_intel
> Assignee: liyunzhang_intel
> Attachments: PIG-5246.patch
>
>
> In bin/pig, for Spark 1.6 we copy the assembly jar onto Pig's classpath:
> {code}
> # For spark mode:
> # Please specify SPARK_HOME first so that we can locate $SPARK_HOME/lib/spark-assembly*.jar;
> # we will add spark-assembly*.jar to the classpath.
> if [ "$isSparkMode" == "true" ]; then
>     if [ -z "$SPARK_HOME" ]; then
>         echo "Error: SPARK_HOME is not set!"
>         exit 1
>     fi
>     # Please specify SPARK_JAR, the hdfs path of spark-assembly*.jar, to allow
>     # YARN to cache spark-assembly*.jar on nodes so that it doesn't need to be
>     # distributed each time an application runs.
>     if [ -z "$SPARK_JAR" ]; then
>         echo "Error: SPARK_JAR is not set. SPARK_JAR stands for the hdfs location of spark-assembly*.jar. This allows YARN to cache spark-assembly*.jar on nodes so that it doesn't need to be distributed each time an application runs."
>         exit 1
>     fi
>     if [ -n "$SPARK_HOME" ]; then
>         echo "Using Spark Home: " ${SPARK_HOME}
>         SPARK_ASSEMBLY_JAR=`ls ${SPARK_HOME}/lib/spark-assembly*`
>         CLASSPATH=${CLASSPATH}:$SPARK_ASSEMBLY_JAR
>     fi
> fi
> {code}
> After upgrading to Spark 2.0, we need to modify this accordingly.
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)