) to add the custom UDF jar into SPARK_CLASSPATH (it works in AWS EMR).
Thanks,
On Tue, Jul 14, 2015 at 9:29 PM, Okehee Goh oke...@gmail.com wrote:
The LIST JAR command doesn't seem to be accepted in beeline with Spark's
ThriftServer in either Spark 1.3.1 or Spark 1.4.
0: jdbc:hive2://localhost:1 list jar;
Error: org.apache.spark.sql.AnalysisException: cannot recognize input
near 'list' 'jar' 'EOF'; line 1 pos 0 (state=,code=0)
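Since LIST JAR isn't recognized here, the session-scoped route that Hive's beeline normally supports is ADD JAR plus CREATE TEMPORARY FUNCTION. Whether Spark's ThriftServer in these versions accepts it I can't confirm from this thread, so treat the following as a hedged sketch; the jar path, function name, class name, and table are all placeholders:

```sql
-- Hedged sketch: register a UDF jar for the current beeline session
-- via HiveQL. Path, names, and table below are illustrative only.
ADD JAR /path/to/custom-udf.jar;
CREATE TEMPORARY FUNCTION my_map_udf AS 'com.example.MyMapUDF';
SELECT my_map_udf(col) FROM some_table;
```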
Thanks
On Tue,
I will. It would be great if a simple UDF could return a complex type.
Thanks!
On Fri, Jun 5, 2015 at 12:17 AM, Cheng, Hao hao.ch...@intel.com wrote:
Confirmed: with the latest master, we don't support complex data types for
simple Hive UDFs. Do you mind filing an issue in JIRA?
It is Spark 1.3.1.e (an AWS release; I think it is close to Spark
1.3.1 with some bug fixes).
My report about GenericUDF not working in Spark SQL was wrong. I tested
with an open-source GenericUDF and it worked fine; only my GenericUDF
that returns a Map type didn't work. Sorry about the false alarm.
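For context on what "a GenericUDF which returns a Map type" means, here is a hedged, Hive-free sketch of the kind of Map-building logic such a UDF would wrap. The class and method names are hypothetical; a real Hive GenericUDF would surround this with initialize()/evaluate() and ObjectInspectors, which are omitted here to keep the example self-contained:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hedged sketch: the core of a UDF that parses "k1=v1,k2=v2" into a Map.
// Only the Map-building logic is shown; the Hive GenericUDF wrapping
// (ObjectInspectors, DeferredObject arguments) is deliberately omitted.
public class MapUdfSketch {
    public static Map<String, String> toMap(String input) {
        Map<String, String> result = new LinkedHashMap<>();
        for (String pair : input.split(",")) {
            String[] kv = pair.split("=", 2);
            result.put(kv[0], kv.length > 1 ? kv[1] : null);
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(toMap("a=1,b=2"));  // {a=1, b=2}
    }
}
```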
Thanks, Michael and Akhil.
Yes, it worked with Spark 1.3.1 along with AWS EMR AMI 3.7.
Sorry I didn't update the status.
On Mon, Jun 1, 2015 at 5:17 AM, Michael Armbrust mich...@databricks.com wrote:
This sounds like a problem that was fixed in Spark 1.3.1. Do you perhaps
have multiple versions of Scala on your classpath?
On Thu, Apr 2, 2015 at 4:28 PM, Okehee Goh oke...@gmail.com wrote:
yes, below is the stacktrace.
Thanks,
Okehee
java.lang.NoSuchMethodError:
scala.reflect.NameTransformer$.LOCAL_SUFFIX_STRING()Ljava/lang/String;
at scala.reflect.internal.StdNames$CommonNames.init(StdNames.scala:97)
at
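A NoSuchMethodError on scala.reflect.NameTransformer like the one above typically means two incompatible Scala versions ended up on the classpath. One quick way to check is to split the classpath and look for scala-library entries; the classpath below is purely illustrative:

```shell
# Hedged sketch: detect conflicting Scala library versions on a classpath.
# The CLASSPATH value is a made-up example showing a 2.10/2.11 conflict.
CLASSPATH="/opt/app/lib/scala-library-2.10.4.jar:/opt/spark/lib/scala-library-2.11.6.jar"

# Split on ':', extract the versioned jar name, and deduplicate.
echo "$CLASSPATH" | tr ':' '\n' | grep -o 'scala-library-[0-9][0-9.]*[0-9]' | sort -u
```

If more than one version is printed, that mismatch is the likely cause of the error.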