Hello Yin,

Additional note:
In ./bin/spark-shell --jars "s3n://mybucket/myudf.jar" I got the following 
message in the console:
Warning: skipped external jar ...
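Should I copy the jar to the local filesystem first? Something like this 
(a rough sketch; it assumes the S3 credentials are set in the Hadoop 
config, and the paths are placeholders):

  # pull the jar out of S3, then pass the local path to --jars
  hadoop fs -get s3n://mybucket/myudf.jar /tmp/myudf.jar
  ./bin/spark-shell --jars /tmp/myudf.jar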
 
Thanks and Regards,
Sankar S.  



On , S Malligarjunan <smalligarju...@yahoo.com> wrote:
 


Hello Yin,

I have tried sc.addJar, hiveContext.sparkContext.addJar, and the 
./bin/spark-shell --jars option.

In all three cases, when I try to create a temporary function, I get a 
ClassNotFoundException. What would be the issue here?
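For reference, the three attempts look roughly like this (the jar path, 
function name, and UDF class here are placeholders):

  // attempt 1: add the jar from inside the shell
  sc.addJar("/tmp/myudf.jar")
  // attempt 2: the same call, via the HiveContext's SparkContext
  hiveContext.sparkContext.addJar("/tmp/myudf.jar")
  // attempt 3: pass the jar at launch instead
  //   ./bin/spark-shell --jars /tmp/myudf.jar

  // the step that fails in all three cases
  hiveContext.hql("CREATE TEMPORARY FUNCTION my_udf AS 'com.example.MyUDF'")
  // -> java.lang.ClassNotFoundException: com.example.MyUDF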
 
Thanks and Regards,
Sankar S.  



On Saturday, 23 August 2014, 0:53, Yin Huai <huaiyin....@gmail.com> wrote:
 


Hello Sankar,

"Add JAR" in SQL is not supported at the moment. We are working on it 
(https://issues.apache.org/jira/browse/SPARK-2219). For now, can you try 
SparkContext.addJar or using "--jars <your-jar>" to launch the Spark shell?
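For example, something along these lines (a sketch; the jar path is taken 
from your earlier mail, and whether a remote s3n:// path works here depends 
on your Hadoop/S3 setup, so a local copy may be safer):

  // inside the Spark shell, before registering the function
  sc.addJar("s3n://hadoop.anonymous.com/lib/myudf.jar")

  // or, with a local copy of the jar, at launch
  ./bin/spark-shell --jars /path/to/myudf.jar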

Thanks,

Yin 



On Fri, Aug 22, 2014 at 2:01 PM, S Malligarjunan <smalligarju...@yahoo.com> 
wrote:

Hello Yin/All.
>
>
>@Yin - Thanks for helping. I solved the SQL parser error. I am now getting 
>the following exception:
>
>
>scala> hiveContext.hql("ADD JAR s3n://hadoop.anonymous.com/lib/myudf.jar");
>warning: there were 1 deprecation warning(s); re-run with -deprecation for 
>details
>14/08/22 17:58:55 INFO SessionState: converting to local 
>s3n://hadoop.anonymous.com/lib/myudf.jar
>14/08/22 17:58:56 ERROR SessionState: Unable to register 
>/tmp/3d273a4c-0494-4bec-80fe-86aa56f11684_resources/myudf.jar
>Exception: org.apache.spark.repl.SparkIMain$TranslatingClassLoader cannot be 
>cast to java.net.URLClassLoader
>java.lang.ClassCastException: 
>org.apache.spark.repl.SparkIMain$TranslatingClassLoader cannot be cast to 
>java.net.URLClassLoader
>at org.apache.hadoop.hive.ql.exec.Utilities.addToClassPath(Utilities.java:1680)
>
>
> 
>Thanks and Regards,
>Sankar S.  
>
>
>
>
>
>On Friday, 22 August 2014, 22:53, S Malligarjunan 
><smalligarju...@yahoo.com.INVALID> wrote:
> 
>
>
>Hello Yin,
>
>
>Forgot to mention one thing: the same query works fine in Hive and Shark.
> 
>Thanks and Regards,
>Sankar S.  
>
>
>
>
>
>On , S Malligarjunan <smalligarju...@yahoo.com> wrote:
> 
>
>
>Hello Yin,
>
>
>I have tried the CREATE EXTERNAL TABLE command as well and get the same error.
>Please help me find the root cause.
> 
>Thanks and Regards,
>Sankar S.  
>
>
>
>
>
>On Friday, 22 August 2014, 22:43, Yin Huai <huaiyin....@gmail.com> wrote:
> 
>
>
>Hi Sankar,
>
>
>You need to create an external table in order to specify the location of the 
>data (i.e., using CREATE EXTERNAL TABLE user1 .... LOCATION). You can take a 
>look at this page for reference. 
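>For example, based on the CREATE TABLE statement in your mail, it would 
>look roughly like this:
>
>  hiveContext.hql("""
>    CREATE EXTERNAL TABLE user1 (time string, id string, u_id string,
>      c_ip string, user_agent string)
>    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n'
>    STORED AS TEXTFILE
>    LOCATION 's3n://hadoop.anonymous.com/output/qa/cnv_px_ip_gnc/ds=2014-06-14/'""")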
>
>
>Thanks,
>
>
>Yin
>
>
>
>On Thu, Aug 21, 2014 at 11:12 PM, S Malligarjunan 
><smalligarju...@yahoo.com.invalid> wrote:
>
>Hello All,
>>
>>
>>When I execute the following query:
>>
>>
>>
>>
>>val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
>>
>>
>>CREATE TABLE user1 (time string, id string, u_id string, c_ip string, 
>>user_agent string) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' LINES 
>>TERMINATED BY '\n' STORED AS TEXTFILE LOCATION 
>>'s3n://hadoop.anonymous.com/output/qa/cnv_px_ip_gnc/ds=2014-06-14/')
>>
>>
>>I am getting the following error: 
>>org.apache.spark.sql.hive.HiveQl$ParseException: Failed to parse: CREATE 
>>TABLE user1 (time string, id string, u_id string, c_ip string, user_agent 
>>string) ROW FORMAT DELIMITED FIELDS TERMINATED BY '' LINES TERMINATED BY '
>>' STORED AS TEXTFILE LOCATION 
>>'s3n://hadoop.anonymous.com/output/qa/cnv_px_ip_gnc/ds=2014-06-14/')
>>at org.apache.spark.sql.hive.HiveQl$.parseSql(HiveQl.scala:215)
>>at org.apache.spark.sql.hive.HiveContext.hiveql(HiveContext.scala:98)
>>at org.apache.spark.sql.hive.HiveContext.hql(HiveContext.scala:102)
>>at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:22)
>>at $iwC$$iwC$$iwC$$iwC.<init>(<console>:27)
>>at $iwC$$iwC$$iwC.<init>(<console>:29)
>>at $iwC$$iwC.<init>(<console>:31)
>>at $iwC.<init>(<console>:33)
>>at <init>(<console>:35)
>>
>>
>>Kindly let me know what could be the issue here.
>>
>>
>>I have cloned Spark from GitHub and am using Hadoop 1.0.3.
>> 
>>Thanks and Regards,
>>Sankar S.  
