Thanks Rajeshbabu and James for your help! The problem is resolved. There was
an issue with the jar packaging. Instead of putting the UDF in the
org.apache.phoenix.expression.function package, I had to put it in my own,
custom package.
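For anyone who hits this later: the working statement ends up looking roughly like the sketch below. The package name is illustrative (use whatever custom package you chose); the point is that the class is no longer under org.apache.phoenix.expression.function, and 'path_to_jar' stays whatever URI your jar lives at.

```sql
-- Hypothetical custom package; only the fully qualified class name changed.
CREATE FUNCTION GetValue(VARBINARY) RETURNS UNSIGNED_LONG
AS 'com.mycompany.udf.GetValue' USING JAR 'path_to_jar';
```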
- Anchal
On Tuesday, July 28, 2015 4:05 AM, "[email protected]"
<[email protected]> wrote:
> Rajeshbabu, instead of HDFS, I have a custom NFS setup for which I have
> specified the fs.nfs.impl property in hbase-site. I've put the UDF jar in this
> NFS setup, and also in the directory specified in the hbase.dynamic.jars.dir
> property.
In that case it should work. You can check whether the jar is getting copied to
the local directory specified by the property below:
<property>
<name>hbase.local.dir</name>
<value>${hbase.tmp.dir}/local/</value>
<description>Directory on the local filesystem to be used
as a local storage.</description>
</property>
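A quick way to check is to list that local directory on the region server. The sketch below assumes the default hbase.tmp.dir of /tmp/hbase-<user> and the "jars" subdirectory the dynamic class loader copies into; adjust the path for your deployment.

```shell
#!/bin/sh
# Hypothetical local path: hbase.tmp.dir defaults to /tmp/hbase-${user.name},
# and dynamically loaded jars land under a "jars" subdirectory of hbase.local.dir.
LOCAL_JAR_DIR="/tmp/hbase-$(whoami)/local/jars"
echo "checking $LOCAL_JAR_DIR"
if [ -d "$LOCAL_JAR_DIR" ]; then
  # Jars listed here have been pulled down by the dynamic class loader.
  ls -l "$LOCAL_JAR_DIR"
else
  echo "directory missing - no jars have been copied locally yet"
fi
```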
If the jar is not being loaded, then you can try placing the jar in the folder
configured above. But this workaround will not help if you use the UDF in
WHERE or GROUP BY clauses, which will be converted to filters; on the server
side we try to load the same jar from the file system.
Thanks,
Rajeshbabu.
On Tue, Jul 28, 2015 at 1:13 AM, Anchal Agrawal <[email protected]> wrote:
Hi James and Rajeshbabu,
Thank you for your replies. My hbase-site confs are being picked up, I have
confirmed it by deliberately misconfiguring one of the properties.
Rajeshbabu, instead of HDFS, I have a custom NFS setup for which I have
specified the fs.nfs.impl property in hbase-site. I've put the UDF jar in this
NFS setup, and also in the directory specified in the hbase.dynamic.jars.dir
property. I've been using the same setup and confs with Pig and it works. I
want to use Phoenix because it is significantly faster for my use cases.
I'm still getting the same ClassNotFoundException for the UDF class. I've tried
putting the UDF jar in multiple places, and as recommended by James, only the
UDF class is in the jar. The UDF class dependencies are included in the
classpath. Does Phoenix look for HDFS explicitly or does it just look at the
fs.some_fs.impl property? The latter shouldn't be a problem for my setup. Does
Phoenix write logs to HDFS? If it does, I can test whether it is able to find
my NFS setup or not.
Related question: Is putting the UDF jar in the local filesystem not supported
by Phoenix?
Sincerely,
Anchal
On Monday, July 27, 2015 4:59 AM, "[email protected]"
<[email protected]> wrote:
Hi Anchal Agrawal,
Have you placed the jar in HDFS? And is the path_to_jar in the CREATE FUNCTION
statement the URI for the jar in HDFS?
Thanks,
Rajeshbabu.
On Sat, Jul 25, 2015 at 5:58 AM, James Taylor <[email protected]> wrote:
Are you sure your hbase-site.xml is being picked up on the client-side? I've
seen this happen numerous times. Maybe try setting something in there that
would cause an obvious issue to confirm.
I'm not aware of anything else you need to do, but I'm sure Rajeshbabu will
chime in if there is.
Thanks,
James
On Fri, Jul 24, 2015 at 5:25 PM, Anchal Agrawal <[email protected]> wrote:
Hi James,
Thanks for your email! I have set the hbase-site.xml configs. I tried removing
the dependent jars from the UDF jar and instead included the dependencies in
the classpath, but that didn't help.
Is there anything else that I could be missing, or could I try out some other
debug steps?
Thank you,
Anchal
On Friday, July 24, 2015 3:29 PM, James Taylor <[email protected]>
wrote:
I don't believe you'd want to bundle the dependent jars inside your jar - I
wasn't completely sure if that's what you've done. Also, there's a config you
need to enable in your client-side hbase-site.xml to use this feature.
Thanks,
James
On Friday, July 24, 2015, Anchal Agrawal <[email protected]> wrote:
Hi all,
I'm having issues getting a UDF to work. I've followed the instructions and
created a jar, and I've created a function with the CREATE FUNCTION command.
However, when I use the function in a SELECT statement, I get a
ClassNotFoundException for the custom class I wrote. I'm using v4.4.0.
Here's some debugging information:
1. The UDF jar includes the dependency jars (phoenix-core, hbase, hadoop-common,
etc.), in addition to the UDF class itself. There are no permission issues with
the jar.
2. I've tried putting the jar on the local FS, on my custom DFS, and also in
the HBase dynamic jar dir (as specified in hbase-site.xml).
3. I've tried the CREATE FUNCTION command without giving the jar path (the jar
is present in the HBase dynamic jar dir).
4. The Phoenix client doesn't report any syntax errors with the CREATE FUNCTION
command I'm using:
create function GetValue(VARBINARY) returns UNSIGNED_LONG as
'org.apache.phoenix.expression.function.GetValue' using jar 'path_to_jar';
5. Here's part of the stack trace for the query SELECT GetValue(pk) FROM
"table_name"; (full stack trace here)
Error: java.lang.reflect.InvocationTargetException (state=,code=0)
...
Caused by: java.lang.ClassNotFoundException:
org.apache.phoenix.expression.function.GetValue
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at
org.apache.hadoop.hbase.util.DynamicClassLoader.loadClass(DynamicClassLoader.java:147)
at
org.apache.phoenix.expression.function.UDFExpression.constructUDFFunction(UDFExpression.java:164)
... 28 more
Am I missing something? I've studied the UDF documentation and searched around
for my issue but to no avail. The GetValue class is present in the UDF jar, so
I'm not sure what the root problem is. I would greatly appreciate any help!
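If I'm reading the trace right, the ClassNotFoundException only says that the fully qualified name wasn't found on that class loader's search path, regardless of which jar the class actually sits in. A minimal standalone snippet along these lines shows the behavior (Phoenix is deliberately not on the classpath here, so the name from my trace fails to load while a stdlib class succeeds):

```java
public class LoaderDemo {
    // Returns true when the given fully qualified class name is visible
    // to the current class loader's search path.
    static boolean canLoad(String fqcn) {
        try {
            Class.forName(fqcn);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // A stdlib class is always on the search path:
        System.out.println(canLoad("java.lang.String"));
        // The name from the stack trace is not on the classpath here:
        System.out.println(canLoad("org.apache.phoenix.expression.function.GetValue"));
    }
}
```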
Thanks,
Anchal