I am trying to run the Hive query below through Spark on YARN. I am using Cloudera 5.1. What can I do to make this work?
Here is the query:

    SELECT * FROM table_name DISTRIBUTE BY GEO_REGION, GEO_COUNTRY SORT BY IP_ADDRESS, COOKIE_ID

Below is the stack trace:

    Exception in thread "Thread-4" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:187)
    Caused by: java.lang.NoClassDefFoundError: org/apache/spark/sql/hive/api/java/JavaHiveContext
        at HiveContextExample.main(HiveContextExample.java:57)
        ... 5 more
    Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.hive.api.java.JavaHiveContext
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 6 more

Here is the code invoking it (imports included for completeness):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.api.java.JavaSchemaRDD;
    import org.apache.spark.sql.hive.api.java.JavaHiveContext;

    SparkConf conf = new SparkConf().setAppName("PartitionData");
    JavaSparkContext ctx = new JavaSparkContext(conf);
    JavaHiveContext hiveContext = new JavaHiveContext(ctx);
    String sql = "SELECT * FROM table_name DISTRIBUTE BY GEO_REGION, GEO_COUNTRY SORT BY IP_ADDRESS, COOKIE_ID";
    JavaSchemaRDD partitionedRDD = hiveContext.sql(sql);

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/JavaHiveContext-class-not-found-error-Help-tp17149.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
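For what it's worth, my understanding is that JavaHiveContext lives in the separate spark-hive module, so I suspect that artifact is missing from my application's classpath at runtime. A sketch of the Maven dependency I believe is needed (the version shown is an assumption on my part, chosen to match the Spark that ships with CDH 5.1; the jar must also actually reach the executors, e.g. by building a fat jar or passing it via --jars):

    <!-- Assumption: spark-hive provides org.apache.spark.sql.hive.api.java.JavaHiveContext.
         Version 1.0.0 is a guess matching CDH 5.1's bundled Spark. -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-hive_2.10</artifactId>
      <version>1.0.0</version>
    </dependency>

Is this the right way to get the class onto the cluster, or is there a CDH-specific jar I should be shipping instead?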