Hi all,

I built Spark with:

./make-distribution.sh --name "hadoop2.7.1" --tgz
"-Pyarn,hadoop-2.6,parquet-provided,hive,hive-thriftserver" -DskipTests
-Dhadoop.version=2.7.1

I can run this example:
./bin/spark-submit --class org.apache.spark.examples.SparkPi \
    --master spark://master1:7077 \
    --driver-memory 1g \
    --executor-memory 512m \
    --executor-cores 1 \
    lib/spark-examples*.jar \
    10

but I can't run this one:
org.apache.spark.examples.sql.RDDRelation

*I get this error:*
16/07/07 18:28:45 INFO client.AppClient$ClientEndpoint: Executor updated: app-20160707182845-0003/2 is now RUNNING
16/07/07 18:28:45 INFO client.AppClient$ClientEndpoint: Executor updated: app-20160707182845-0003/4 is now RUNNING
16/07/07 18:28:45 INFO client.AppClient$ClientEndpoint: Executor updated: app-20160707182845-0003/3 is now RUNNING
16/07/07 18:28:45 INFO client.AppClient$ClientEndpoint: Executor updated: app-20160707182845-0003/0 is now RUNNING
16/07/07 18:28:45 INFO client.AppClient$ClientEndpoint: Executor updated: app-20160707182845-0003/1 is now RUNNING
16/07/07 18:28:45 INFO client.AppClient$ClientEndpoint: Executor updated: app-20160707182845-0003/5 is now RUNNING
16/07/07 18:28:46 INFO cluster.SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/parquet/hadoop/ParquetOutputCommitter
    at org.apache.spark.sql.SQLConf$.<init>(SQLConf.scala:319)
    at org.apache.spark.sql.SQLConf$.<clinit>(SQLConf.scala)
    at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:85)
    at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:77)
    at main.RDDRelation$.main(RDDRelation.scala:13)
    at main.RDDRelation.main(RDDRelation.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.parquet.hadoop.ParquetOutputCommitter
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 15 more
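My guess is that this is related to the parquet-provided profile in my build, which (as I understand it) leaves the Parquet jars out of the Spark assembly so that the cluster is expected to provide them. A quick check I would try, and a possible workaround passing the Parquet jar explicitly (the jar paths and version below are assumptions, not my actual layout):

```shell
# Check whether the missing class is actually in the built assembly.
# With parquet-provided it should print nothing, matching the
# NoClassDefFoundError above.
jar tf lib/spark-assembly-*.jar | grep ParquetOutputCommitter

# Possible workaround: supply the Parquet jars on the classpath at
# submit time via --jars (path/version are assumed for illustration).
./bin/spark-submit --class org.apache.spark.examples.sql.RDDRelation \
    --master spark://master1:7077 \
    --jars /opt/jars/parquet-hadoop-1.7.0.jar \
    lib/spark-examples*.jar
```

Does that sound right, or should I just rebuild without parquet-provided?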
