Hello,

I am trying to run a Pig job in MapReduce mode that writes to HBase using
PhoenixHBaseStorage.

The reduce task fails with a "No suitable driver found" SQLException for the
connection:

AttemptID:attempt_1436998373852_1140_r_000000_1 Info:Error:
java.lang.RuntimeException: java.sql.SQLException: No suitable driver found for jdbc:phoenix:remoteclusterZkQuorum:2181;
        at org.apache.phoenix.mapreduce.PhoenixOutputFormat.getRecordWriter(PhoenixOutputFormat.java:58)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.getRecordWriter(PigOutputFormat.java:88)
        at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.<init>(ReduceTask.java:540)
        at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:614)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1576)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.sql.SQLException: No suitable driver found for jdbc:phoenix:jdbc:phoenix:remoteclusterZkQuorum:2181;
        at java.sql.DriverManager.getConnection(DriverManager.java:689)
        at java.sql.DriverManager.getConnection(DriverManager.java:208)
        at org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:93)
        at org.apache.phoenix.mapreduce.util.ConnectionUtil.getOutputConnection(ConnectionUtil.java:80)
        at org.apache.phoenix.mapreduce.util.ConnectionUtil.getOutputConnection(ConnectionUtil.java:68)
        at org.apache.phoenix.mapreduce.PhoenixRecordWriter.<init>(PhoenixRecordWriter.java:49)
        at org.apache.phoenix.mapreduce.PhoenixOutputFormat.getRecordWriter(PhoenixOutputFormat.java:55)


I verified that the PhoenixDriver is on the classpath for the reduce task by
adding a Class.forName("org.apache.phoenix.jdbc.PhoenixDriver") call, but the
write still fails.
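For reference, the check I added is roughly the sketch below (the class name
DriverCheck and the driverAccepts helper are just for illustration, not part of
my actual job). DriverManager.getDriver is a slightly stricter check than
Class.forName alone, since it also confirms a registered driver actually
accepts the JDBC URL prefix:

```java
import java.sql.DriverManager;
import java.sql.SQLException;

public class DriverCheck {
    // Returns true if some registered JDBC driver accepts the given URL.
    static boolean driverAccepts(String url) {
        try {
            DriverManager.getDriver(url);
            return true;
        } catch (SQLException e) {
            // DriverManager throws "No suitable driver" when no registered
            // driver recognizes the URL prefix.
            return false;
        }
    }

    public static void main(String[] args) throws Exception {
        // Load the Phoenix driver class so its static initializer registers
        // it with DriverManager, then probe the connection URL.
        Class.forName("org.apache.phoenix.jdbc.PhoenixDriver");
        System.out.println(
            driverAccepts("jdbc:phoenix:remoteclusterZkQuorum:2181"));
    }
}
```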

Has anyone else encountered this issue while writing to HBase via Pig in
MapReduce mode?

--Siddhi
