Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-04 Thread Cheng Lian
I think this is a Spark SQL bug that dates back to at least 1.1.0. The json_tuple function is implemented as org.apache.hadoop.hive.ql.udf.generic.GenericUDTFJSONTuple. The ClassNotFoundException should complain with the class name rather than the UDTF function name. The problematic line
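
A minimal way to reproduce this from spark-shell is sketched below. This is a hedged sketch, not taken from the thread: the table name, column name, and sample JSON are hypothetical, and it assumes a Spark build with Hive support.

    // Build a one-column table of raw JSON strings (hypothetical data),
    // then call json_tuple on it. In Spark 1.2/1.3, importing hiveContext._
    // brings in both sql() and the implicit RDD-to-SchemaRDD conversion.
    val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
    import hiveContext._
    case class Record(json: String)
    sc.parallelize(Seq(Record("""{"name":"todd","age":40}""")))
      .registerTempTable("t")
    sql("SELECT json_tuple(json, 'name', 'age') FROM t").collect()
    // When the function lookup fails, the ClassNotFoundException names
    // "json_tuple" instead of the implementing class
    // org.apache.hadoop.hive.ql.udf.generic.GenericUDTFJSONTuple.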

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-04 Thread Cheng Lian
Filed https://issues.apache.org/jira/browse/SPARK-6708 to track this. Cheng On 4/4/15 10:21 PM, Cheng Lian wrote: I think this is a Spark SQL bug that dates back to at least 1.1.0. The json_tuple function is implemented as org.apache.hadoop.hive.ql.udf.generic.GenericUDTFJSONTuple. The

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-03 Thread Akhil Das
How did you build Spark? Which version of Spark are you running? Doesn't this thread already explain it? https://www.mail-archive.com/user@spark.apache.org/msg25505.html Thanks Best Regards On Thu, Apr 2, 2015 at 11:10 PM, Todd Nist tsind...@gmail.com wrote: Hi Akhil, Tried your suggestion
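
Since json_tuple goes through Spark SQL's Hive support, the build matters. A quick check from inside spark-shell (a sketch, not an official diagnostic) is shown below; it only tests whether HiveContext, which exists solely in builds made with the -Phive profile, is on the classpath.

    // Probe for Hive support in the running Spark build.
    try {
      Class.forName("org.apache.spark.sql.hive.HiveContext")
      println("Hive support present")
    } catch {
      case _: ClassNotFoundException =>
        println("No Hive support; rebuild Spark with -Phive")
    }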

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-03 Thread Todd Nist
I placed it there. It was downloaded from the MySQL site. On Fri, Apr 3, 2015 at 6:25 AM, ÐΞ€ρ@Ҝ (๏̯͡๏) deepuj...@gmail.com wrote: Akhil, you mentioned /usr/local/spark/lib/mysql-connector-java-5.1.34-bin.jar. How did this lib get into the spark/lib folder? 1) Did you place it there? 2) What

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-03 Thread Todd Nist
Hi Deepujain, I did include the jar file, I believe it is hive-exec.jar, through the --jars option: ./bin/spark-shell --master spark://radtech.io:7077 --total-executor-cores 2 --driver-class-path /usr/local/spark/lib/mysql-connector-java-5.1.34-bin.jar --jars

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-03 Thread ๏̯͡๏
I think you need to include, through the --jars option, the jar file that contains the Hive definition (code) of the json_tuple UDF. That should solve your problem. On Fri, Apr 3, 2015 at 3:57 PM, Todd Nist tsind...@gmail.com wrote: I placed it there. It was downloaded from the MySQL site. On Fri, Apr 3,

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-03 Thread Todd Nist
Started the spark shell with the single Hive jar that was suggested: ./bin/spark-shell --master spark://radtech.io:7077 --total-executor-cores 2 --driver-class-path /usr/local/spark/lib/mysql-connector-java-5.1.34-bin.jar --jars /opt/apache-hive-0.13.1-bin/lib/hive-exec-0.13.1.jar Results in the same
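
One way to narrow this down from inside the shell: check whether the implementing class (the one Cheng Lian names above) is visible to the driver at all. A minimal sketch:

    // If this throws ClassNotFoundException, hive-exec is not on the driver
    // classpath. If it succeeds but the query still fails, the problem is on
    // the executors or in Spark SQL's function lookup (see Cheng Lian's post).
    Class.forName(
      "org.apache.hadoop.hive.ql.udf.generic.GenericUDTFJSONTuple")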

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-03 Thread Akhil Das
I copy-pasted his command in the same thread. Thanks Best Regards On Fri, Apr 3, 2015 at 3:55 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) deepuj...@gmail.com wrote: Akhil, you mentioned /usr/local/spark/lib/mysql-connector-java-5.1.34-bin.jar. How did this lib get into the spark/lib folder? 1) Did you place it there

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-03 Thread ๏̯͡๏
Akhil, you mentioned /usr/local/spark/lib/mysql-connector-java-5.1.34-bin.jar. How did this lib get into the spark/lib folder? 1) Did you place it there? 2) What is the download location? On Fri, Apr 3, 2015 at 3:42 PM, Todd Nist tsind...@gmail.com wrote: Started the spark shell with the single

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-02 Thread Akhil Das
Try adding all the jars in your $HIVE/lib directory. If you want the specific jar, you could look for jackson or a json serde in it. Thanks Best Regards On Thu, Apr 2, 2015 at 12:49 AM, Todd Nist tsind...@gmail.com wrote: I have a feeling I’m missing a Jar that provides the support or could this
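
One way to try this without restarting the shell is sketched below (the path is the Hive install mentioned elsewhere in this thread). Note that sc.addJar ships jars to the executors but does not change the driver's own classpath, so it may not be enough on its own.

    // Register every jar under the Hive lib directory with the running
    // SparkContext; executors fetch these on demand.
    new java.io.File("/opt/apache-hive-0.13.1-bin/lib")
      .listFiles
      .filter(_.getName.endsWith(".jar"))
      .foreach(jar => sc.addJar(jar.getAbsolutePath))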

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-02 Thread Todd Nist
Hi Akhil, Tried your suggestion to no avail. I actually do not see any jackson or json serde jars in the $HIVE/lib directory. This is Hive 0.13.1 and Spark 1.2.1. Here is what I did: I added the lib folder to the --jars option when starting the spark-shell, but the job fails. The
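
One caveat worth checking here: --jars expects a comma-separated list of jar files, not a directory, so passing the lib folder itself will not pick up the jars inside it. A small sketch for producing such a list (path taken from this thread):

    // Print a comma-separated list of the Hive jars, suitable for pasting
    // into spark-shell's --jars option.
    println(
      new java.io.File("/opt/apache-hive-0.13.1-bin/lib")
        .listFiles
        .filter(_.getName.endsWith(".jar"))
        .map(_.getAbsolutePath)
        .mkString(","))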