Re: SparkSQL : using Hive UDF returning Map throws error: scala.MatchError: interface java.util.Map (of class java.lang.Class) (state=,code=0)

2015-06-05 Thread Okehee Goh
I will. It would be great if simple UDFs could return complex types.
Thanks!

On Fri, Jun 5, 2015 at 12:17 AM, Cheng, Hao <hao.ch...@intel.com> wrote:
 Confirmed: with the latest master, we don't support complex data types for
 simple Hive UDFs. Do you mind filing an issue in JIRA?

 -Original Message-
 From: Cheng, Hao [mailto:hao.ch...@intel.com]
 Sent: Friday, June 5, 2015 12:35 PM
 To: ogoh; user@spark.apache.org
 Subject: RE: SparkSQL : using Hive UDF returning Map throws error: 
 scala.MatchError: interface java.util.Map (of class java.lang.Class) 
 (state=,code=0)

 Which version of Hive jar are you using? Hive 0.13.1 or Hive 0.12.0?

 -Original Message-
 From: ogoh [mailto:oke...@gmail.com]
 Sent: Friday, June 5, 2015 10:10 AM
 To: user@spark.apache.org
 Subject: SparkSQL : using Hive UDF returning Map throws error: 
 scala.MatchError: interface java.util.Map (of class java.lang.Class) 
 (state=,code=0)


 Hello,
 I tested some custom UDFs on SparkSQL's ThriftServer and Beeline (Spark 1.3.1).
 Some UDFs work fine (taking an array parameter and returning an int or string type).
 But my UDF returning a map type throws an error:
 Error: scala.MatchError: interface java.util.Map (of class java.lang.Class)
 (state=,code=0)

 I converted the code into Hive's GenericUDF because I suspected that a complex
 type parameter (array of map) and a complex return type (map) might be
 supported by Hive's GenericUDF but not by a simple UDF.
 But SparkSQL doesn't seem to support GenericUDF (error message: Error:
 java.lang.IllegalAccessException: Class
 org.apache.spark.sql.hive.HiveFunctionWrapper can not access ..).

 Below is my example UDF code returning a MAP type.
 I appreciate any advice.
 Thanks

 --

 import java.util.ArrayList;
 import java.util.HashMap;
 import java.util.Map;

 import org.apache.hadoop.hive.ql.exec.UDF;

 public final class ArrayToMap extends UDF {

     public Map<String, String> evaluate(ArrayList<String> arrayOfString) {
         // Map each element to its position, e.g. ["a", "b"] -> {"0": "a", "1": "b"}.
         // TODO: add code to handle index/null problems.
         Map<String, String> map = new HashMap<String, String>();

         int count = 0;
         for (String element : arrayOfString) {
             map.put(count + "", element);
             count++;
         }
         return map;
     }
 }
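
 For reference, here is a minimal sketch of how the same logic might look as a
 GenericUDF, declaring the map<string,string> return type through ObjectInspectors.
 This is not my exact converted code; the class name ArrayToMapGeneric and the
 argument and null checks here are illustrative only.

 import java.util.HashMap;
 import java.util.Map;

 import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
 import org.apache.hadoop.hive.ql.metadata.HiveException;
 import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
 import org.apache.hadoop.hive.serde2.objectinspector.ListObjectInspector;
 import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
 import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
 import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

 public class ArrayToMapGeneric extends GenericUDF {
     private ListObjectInspector listOI;

     @Override
     public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
         if (arguments.length != 1 || !(arguments[0] instanceof ListObjectInspector)) {
             throw new UDFArgumentException("expects a single array argument");
         }
         listOI = (ListObjectInspector) arguments[0];
         // Declare the return type as map<string,string>.
         return ObjectInspectorFactory.getStandardMapObjectInspector(
                 PrimitiveObjectInspectorFactory.javaStringObjectInspector,
                 PrimitiveObjectInspectorFactory.javaStringObjectInspector);
     }

     @Override
     public Object evaluate(DeferredObject[] arguments) throws HiveException {
         Object list = arguments[0].get();
         if (list == null) {
             return null;
         }
         Map<String, String> map = new HashMap<String, String>();
         int n = listOI.getListLength(list);
         for (int i = 0; i < n; i++) {
             Object element = listOI.getListElement(list, i);
             // toString() covers both String and Text element representations.
             map.put(String.valueOf(i), element == null ? null : element.toString());
         }
         return map;
     }

     @Override
     public String getDisplayString(String[] children) {
         return "array_to_map(" + (children.length > 0 ? children[0] : "") + ")";
     }
 }

 It would be registered from Beeline with CREATE TEMPORARY FUNCTION, just like
 a simple UDF.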






 --
 View this message in context: 
 http://apache-spark-user-list.1001560.n3.nabble.com/SparkSQL-using-Hive-UDF-returning-Map-throws-rror-scala-MatchError-interface-java-util-Map-of-class--tp23164.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional 
 commands, e-mail: user-h...@spark.apache.org


 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: SparkSQL : using Hive UDF returning Map throws error: scala.MatchError: interface java.util.Map (of class java.lang.Class) (state=,code=0)

2015-06-05 Thread Okehee Goh
It is Spark 1.3.1.e (an AWS release; I think it is close to Spark 1.3.1
with some bug fixes).

My report about GenericUDF not working in SparkSQL was wrong. I tested
with an open-source GenericUDF and it worked fine; only my GenericUDF
that returns a Map type didn't work. Sorry for the false report.



On Thu, Jun 4, 2015 at 9:35 PM, Cheng, Hao <hao.ch...@intel.com> wrote:
 Which version of Hive jar are you using? Hive 0.13.1 or Hive 0.12.0?

 -Original Message-
 From: ogoh [mailto:oke...@gmail.com]
 Sent: Friday, June 5, 2015 10:10 AM
 To: user@spark.apache.org
 Subject: SparkSQL : using Hive UDF returning Map throws error: 
 scala.MatchError: interface java.util.Map (of class java.lang.Class) 
 (state=,code=0)


 Hello,
 I tested some custom UDFs on SparkSQL's ThriftServer and Beeline (Spark 1.3.1).
 Some UDFs work fine (taking an array parameter and returning an int or string type).
 But my UDF returning a map type throws an error:
 Error: scala.MatchError: interface java.util.Map (of class java.lang.Class)
 (state=,code=0)

 I converted the code into Hive's GenericUDF because I suspected that a complex
 type parameter (array of map) and a complex return type (map) might be
 supported by Hive's GenericUDF but not by a simple UDF.
 But SparkSQL doesn't seem to support GenericUDF (error message: Error:
 java.lang.IllegalAccessException: Class
 org.apache.spark.sql.hive.HiveFunctionWrapper can not access ..).

 Below is my example UDF code returning a MAP type.
 I appreciate any advice.
 Thanks

 --

 import java.util.ArrayList;
 import java.util.HashMap;
 import java.util.Map;

 import org.apache.hadoop.hive.ql.exec.UDF;

 public final class ArrayToMap extends UDF {

     public Map<String, String> evaluate(ArrayList<String> arrayOfString) {
         // Map each element to its position, e.g. ["a", "b"] -> {"0": "a", "1": "b"}.
         // TODO: add code to handle index/null problems.
         Map<String, String> map = new HashMap<String, String>();

         int count = 0;
         for (String element : arrayOfString) {
             map.put(count + "", element);
             count++;
         }
         return map;
     }
 }






 --
 View this message in context: 
 http://apache-spark-user-list.1001560.n3.nabble.com/SparkSQL-using-Hive-UDF-returning-Map-throws-rror-scala-MatchError-interface-java-util-Map-of-class--tp23164.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional 
 commands, e-mail: user-h...@spark.apache.org


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



RE: SparkSQL : using Hive UDF returning Map throws error: scala.MatchError: interface java.util.Map (of class java.lang.Class) (state=,code=0)

2015-06-05 Thread Cheng, Hao
Confirmed: with the latest master, we don't support complex data types for
simple Hive UDFs. Do you mind filing an issue in JIRA?

-Original Message-
From: Cheng, Hao [mailto:hao.ch...@intel.com] 
Sent: Friday, June 5, 2015 12:35 PM
To: ogoh; user@spark.apache.org
Subject: RE: SparkSQL : using Hive UDF returning Map throws error: 
scala.MatchError: interface java.util.Map (of class java.lang.Class) 
(state=,code=0)

Which version of Hive jar are you using? Hive 0.13.1 or Hive 0.12.0?

-Original Message-
From: ogoh [mailto:oke...@gmail.com] 
Sent: Friday, June 5, 2015 10:10 AM
To: user@spark.apache.org
Subject: SparkSQL : using Hive UDF returning Map throws error: 
scala.MatchError: interface java.util.Map (of class java.lang.Class) 
(state=,code=0)


Hello,
I tested some custom UDFs on SparkSQL's ThriftServer and Beeline (Spark 1.3.1).
Some UDFs work fine (taking an array parameter and returning an int or string type).
But my UDF returning a map type throws an error:
Error: scala.MatchError: interface java.util.Map (of class java.lang.Class)
(state=,code=0)

I converted the code into Hive's GenericUDF because I suspected that a complex
type parameter (array of map) and a complex return type (map) might be
supported by Hive's GenericUDF but not by a simple UDF.
But SparkSQL doesn't seem to support GenericUDF (error message: Error:
java.lang.IllegalAccessException: Class
org.apache.spark.sql.hive.HiveFunctionWrapper can not access ..).

Below is my example UDF code returning a MAP type.
I appreciate any advice.
Thanks

--

import java.util.ArrayList;
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.hive.ql.exec.UDF;

public final class ArrayToMap extends UDF {

    public Map<String, String> evaluate(ArrayList<String> arrayOfString) {
        // Map each element to its position, e.g. ["a", "b"] -> {"0": "a", "1": "b"}.
        // TODO: add code to handle index/null problems.
        Map<String, String> map = new HashMap<String, String>();

        int count = 0;
        for (String element : arrayOfString) {
            map.put(count + "", element);
            count++;
        }
        return map;
    }
}






--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/SparkSQL-using-Hive-UDF-returning-Map-throws-rror-scala-MatchError-interface-java-util-Map-of-class--tp23164.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional 
commands, e-mail: user-h...@spark.apache.org


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



RE: SparkSQL : using Hive UDF returning Map throws error: scala.MatchError: interface java.util.Map (of class java.lang.Class) (state=,code=0)

2015-06-04 Thread Cheng, Hao
Which version of Hive jar are you using? Hive 0.13.1 or Hive 0.12.0?

-Original Message-
From: ogoh [mailto:oke...@gmail.com] 
Sent: Friday, June 5, 2015 10:10 AM
To: user@spark.apache.org
Subject: SparkSQL : using Hive UDF returning Map throws error: 
scala.MatchError: interface java.util.Map (of class java.lang.Class) 
(state=,code=0)


Hello,
I tested some custom UDFs on SparkSQL's ThriftServer and Beeline (Spark 1.3.1).
Some UDFs work fine (taking an array parameter and returning an int or string type).
But my UDF returning a map type throws an error:
Error: scala.MatchError: interface java.util.Map (of class java.lang.Class)
(state=,code=0)

I converted the code into Hive's GenericUDF because I suspected that a complex
type parameter (array of map) and a complex return type (map) might be
supported by Hive's GenericUDF but not by a simple UDF.
But SparkSQL doesn't seem to support GenericUDF (error message: Error:
java.lang.IllegalAccessException: Class
org.apache.spark.sql.hive.HiveFunctionWrapper can not access ..).

Below is my example UDF code returning a MAP type.
I appreciate any advice.
Thanks

--

import java.util.ArrayList;
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.hive.ql.exec.UDF;

public final class ArrayToMap extends UDF {

    public Map<String, String> evaluate(ArrayList<String> arrayOfString) {
        // Map each element to its position, e.g. ["a", "b"] -> {"0": "a", "1": "b"}.
        // TODO: add code to handle index/null problems.
        Map<String, String> map = new HashMap<String, String>();

        int count = 0;
        for (String element : arrayOfString) {
            map.put(count + "", element);
            count++;
        }
        return map;
    }
}






--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/SparkSQL-using-Hive-UDF-returning-Map-throws-rror-scala-MatchError-interface-java-util-Map-of-class--tp23164.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional 
commands, e-mail: user-h...@spark.apache.org


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org