The same issue (a custom UDF jar added through 'add jar' is not
recognized) is observed on Spark 1.4.1.
Instead of executing the following in beeline:

    add jar udf.jar

my workaround is either
1) to pass the udf.jar by using --jars while starting the ThriftServer
(this didn't work in AWS EMR's Spark 1.4.0.b),
or
2) to
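For workaround 1), a minimal sketch of how the Thrift server can be started with the jar passed via --jars (the jar path here is a placeholder, not a value from this thread, and the launch is guarded so the snippet is safe to run without Spark installed):

```shell
# Sketch of workaround 1): start the Spark Thrift server with the UDF jar
# passed via --jars so it lands on the server/executor classpath.
# The jar path is a placeholder; adjust it for your deployment.
UDF_JAR=/path/to/udf.jar

# Only attempt the launch when Spark is actually available.
if [ -n "${SPARK_HOME:-}" ] && [ -x "$SPARK_HOME/sbin/start-thriftserver.sh" ]; then
  "$SPARK_HOME/sbin/start-thriftserver.sh" --jars "$UDF_JAR"
else
  echo "SPARK_HOME not set or Spark not installed; skipping launch"
fi
```

start-thriftserver.sh forwards its options to spark-submit, so --jars behaves the same way it does for any Spark application.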
Hello,
I am using SparkSQL along with the ThriftServer so that we can access it
using Hive queries.
With Spark 1.3.1, I can register a UDF function, but Spark 1.4.0 doesn't
work for that. The jar of the UDF is the same.
Below are the logs:
I appreciate any advice.
== With Spark 1.4
Beeline version 1.4.0 by
The command 'list jar' doesn't seem to be accepted in beeline with Spark's
ThriftServer in either Spark 1.3.1 or Spark 1.4.
0: jdbc:hive2://localhost:1 list jar;
Error: org.apache.spark.sql.AnalysisException: cannot recognize input
near 'list' 'jar' 'EOF'; line 1 pos 0 (state=,code=0)
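For reference, a sketch of the flow that should work once the jar is visible to the server: connect with beeline, add the jar, and register the UDF as a temporary function. The connection URL, jar path, and class name (com.example.MyUDF) are placeholders for illustration, not values from this thread, and the call is guarded so the snippet is safe to run without beeline installed:

```shell
# Sketch: register the UDF through beeline once the jar is reachable.
# URL, jar path, and UDF class below are assumed placeholder values.
UDF_CLASS='com.example.MyUDF'

if command -v beeline >/dev/null 2>&1; then
  beeline -u jdbc:hive2://localhost:10000 -e "
    add jar /path/to/udf.jar;
    CREATE TEMPORARY FUNCTION my_udf AS '${UDF_CLASS}';
  "
else
  echo "beeline not available; commands shown for reference only"
fi
```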
Thanks
On Tue,