Spark submit running spark-sql-perf and additional jar

2016-10-23 Thread Mr rty ff
Hi
I run the following command:

home/spark-2.0.1-bin-hadoop2.7/bin/spark-submit --conf "someconf" \
  --jars /home/user/workspace/auxdriver/target/auxdriver.jar,/media/sf_VboxShared/tpc-ds/spark-sql-perf-v.0.2.4/spark-sql-perf-assembly-0.2.4.jar \
  --benchmark DatabasePerformance --iterations 1 --sparkMaster somemaster \
  --location "location" --scaleFactor 50 --appName "nvm"
I get the following error:
Error: Cannot load main class from JAR 
file:/home/irina/workspace/stocator-s3/target/stocator-s3-1.0-jar-with-dependencies.jar,/media/sf_VboxShared/tpc-ds/spark-sql-perf-v.0.2.4/spark-sql-perf-assembly-0.2.4.jar
If I change the order of the jars I get:
Error: Unknown argument '/home/user/workspace/auxdriver/target/auxdriver.jar'

If I run it without auxdriver it works fine, but I need to include it in order to test 
the driver.

What is the proper way to run spark-submit with two jars, when one of them takes 
parameters?
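For reference, spark-submit expects all of its own options (including --jars, which takes a comma-separated list) before the application jar; the application jar is the last positional argument, and everything after it is passed through to the application. A sketch of the command above rearranged along those lines — the paths, "someconf", "somemaster", and the benchmark arguments are the question's own placeholders, not verified values:

```shell
# spark-submit options first; the application jar is the last positional
# argument, and everything after it goes to the application itself.
home/spark-2.0.1-bin-hadoop2.7/bin/spark-submit \
  --conf "someconf" \
  --jars /home/user/workspace/auxdriver/target/auxdriver.jar \
  /media/sf_VboxShared/tpc-ds/spark-sql-perf-v.0.2.4/spark-sql-perf-assembly-0.2.4.jar \
  --benchmark DatabasePerformance --iterations 1 --sparkMaster somemaster \
  --location "location" --scaleFactor 50 --appName "nvm"
```

If both jars are handed to --jars and no primary application jar follows, spark-submit treats the comma-joined list itself as the primary resource, which would explain the "Cannot load main class from JAR file:...jar,...jar" error above.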




Using explain plan to optimize sql query

2016-02-14 Thread Mr rty ff
Hi,
I have some queries that take a long time to execute, so I used df.explain(true) to 
print the logical and physical plans and find the bottlenecks. As the query is very 
complicated, the output is very hard to read. How can I turn it into something more 
readable and analyze it? And another question: when analyzing an execution plan, what 
should I pay attention to?
Thanks
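The unreadable dump usually comes from explain(true) printing every plan stage at once. A minimal Spark 2.x Scala sketch (the query string here is a stand-in for the real query, and "plan-inspect" is a made-up app name) that uses queryExecution to print each stage separately, which tends to be easier to read:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.appName("plan-inspect").getOrCreate()
// Stand-in for the long query from the question.
val df = spark.sql("SELECT id, count(*) AS c FROM t GROUP BY id")

// df.explain(true) prints all of these at once; queryExecution exposes
// each stage of the plan individually.
val qe = df.queryExecution
println(qe.logical)       // parsed logical plan
println(qe.analyzed)      // with references resolved
println(qe.optimizedPlan) // after Catalyst optimizer rules
println(qe.executedPlan)  // physical plan that will actually run
```

When reading the physical plan, the parts usually worth attention are Exchange nodes (shuffles), the join strategy chosen (e.g. BroadcastHashJoin vs SortMergeJoin), and whether filters are pushed down into the scans; shuffles and unfiltered full scans are the most common bottlenecks.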