Hi Siddhi,
   Registering the phoenix-client jar alone is sufficient.
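
   For reference, a minimal Pig sketch (hypothetical; the jar path, table
   name EXAMPLE, column names, and ZooKeeper quorum below are placeholders,
   not values from this thread):

       -- the client jar bundles the Phoenix JDBC driver and its dependencies
       REGISTER /path/to/phoenix-4.5.0-client.jar;

       A = LOAD 'input.csv' USING PigStorage(',') AS (id:long, name:chararray);

       -- write through Phoenix; the first argument is the ZooKeeper quorum
       STORE A INTO 'hbase://EXAMPLE' USING
           org.apache.phoenix.pig.PhoenixHBaseStorage('zk1,zk2,zk3:2181', '-batchSize 500');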

Regards
Ravi

On Wednesday, August 5, 2015, Siddhi Mehta <sm26...@gmail.com> wrote:

> Hey Ravi,
>
> For PhoenixHBaseStorage to work, do we need to register the phoenix-client
> jar or the phoenix-server jar?
>
> --Siddhi
>
> On Tue, Aug 4, 2015 at 1:51 PM, Siddhi Mehta <sm26...@gmail.com> wrote:
>
> > Hey Ravi,
> >
> > I registered both the phoenix-core jar as well as the phoenix-pig jar.
> > Version: 4.5.0
> > I haven't tried LOAD. Will give it a try.
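> >
> > For the LOAD check, roughly something like this (hypothetical sketch; the
> > table name EXAMPLE and the quorum are placeholders):
> >
> >     A = LOAD 'hbase://table/EXAMPLE' USING
> >         org.apache.phoenix.pig.PhoenixHBaseLoader('zk1,zk2,zk3:2181');
> >     DUMP A;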
> >
> >
> > Thanks,
> > --Siddhi
> >
> >
> > On Tue, Aug 4, 2015 at 1:24 PM, Ravi Kiran <maghamraviki...@gmail.com>
> > wrote:
> >
> >> Hi Siddhi,
> >>
> >>     Which Phoenix jars did you register in your Pig script? Could you
> >> please share the version of Phoenix you are working with? Also, do you
> >> notice the same behavior with LOAD?
> >>
> >> Thanks
> >> Ravi
> >>
> >> On Tue, Aug 4, 2015 at 12:47 PM, Siddhi Mehta <sm26...@gmail.com> wrote:
> >>
> >> > Ah, sorry, ignore the typo in my stack trace; it crept in while I was
> >> > removing hostnames.
> >> >
> >> > The URL is jdbc:phoenix:remoteclusterZkQuorum:2181
> >> >
> >> > --Siddhi
> >> >
> >> > On Tue, Aug 4, 2015 at 11:35 AM, Samarth Jain <sama...@apache.org>
> >> > wrote:
> >> >
> >> > > The jdbc url doesn't look correct -
> >> > > jdbc:phoenix:jdbc:phoenix:remoteclusterZkQuorum:2181
> >> > >
> >> > > It should be jdbc:phoenix:remoteclusterZkQuorum:2181
> >> > >
> >> > > Do you have the phoneix.mapreduce.output.cluster.quorum configured
> >> > > (take note of the typo)? Or hbase.zookeeper.quorum? If yes, what are
> >> > > the values set as?
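> >> > >
> >> > > For reference, those properties can be set at the top of a Pig script
> >> > > (hypothetical sketch; the quorum values are placeholders, and the
> >> > > "phoneix" spelling is deliberate, matching the property name's typo):
> >> > >
> >> > >     SET phoneix.mapreduce.output.cluster.quorum 'zk1,zk2,zk3:2181';
> >> > >     SET hbase.zookeeper.quorum 'zk1,zk2,zk3';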
> >> > >
> >> > > On Tue, Aug 4, 2015 at 11:19 AM, Siddhi Mehta <sm26...@gmail.com>
> >> > > wrote:
> >> > >
> >> > > > Hello,
> >> > > >
> >> > > > I am trying to run a Pig job in mapreduce mode that writes to HBase
> >> > > > using PhoenixHBaseStorage.
> >> > > >
> >> > > > The reduce task fails with "No suitable driver found" for the
> >> > > > connection:
> >> > > >
> >> > > > AttemptID:attempt_1436998373852_1140_r_000000_1 Info:Error:
> >> > > > java.lang.RuntimeException: java.sql.SQLException: No suitable driver found for jdbc:phoenix:remoteclusterZkQuorum:2181;
> >> > > >         at org.apache.phoenix.mapreduce.PhoenixOutputFormat.getRecordWriter(PhoenixOutputFormat.java:58)
> >> > > >         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.getRecordWriter(PigOutputFormat.java:88)
> >> > > >         at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.<init>(ReduceTask.java:540)
> >> > > >         at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:614)
> >> > > >         at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
> >> > > >         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> >> > > >         at java.security.AccessController.doPrivileged(Native Method)
> >> > > >         at javax.security.auth.Subject.doAs(Subject.java:422)
> >> > > >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1576)
> >> > > >         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
> >> > > > Caused by: java.sql.SQLException: No suitable driver found for jdbc:phoenix:jdbc:phoenix:remoteclusterZkQuorum:2181;
> >> > > >         at java.sql.DriverManager.getConnection(DriverManager.java:689)
> >> > > >         at java.sql.DriverManager.getConnection(DriverManager.java:208)
> >> > > >         at org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:93)
> >> > > >         at org.apache.phoenix.mapreduce.util.ConnectionUtil.getOutputConnection(ConnectionUtil.java:80)
> >> > > >         at org.apache.phoenix.mapreduce.util.ConnectionUtil.getOutputConnection(ConnectionUtil.java:68)
> >> > > >         at org.apache.phoenix.mapreduce.PhoenixRecordWriter.<init>(PhoenixRecordWriter.java:49)
> >> > > >         at org.apache.phoenix.mapreduce.PhoenixOutputFormat.getRecordWriter(PhoenixOutputFormat.java:55)
> >> > > >
> >> > > > I checked that the PhoenixDriver is on the classpath for the reduce
> >> > > > task by adding Class.forName("org.apache.phoenix.jdbc.PhoenixDriver"),
> >> > > > but the write still fails.
> >> > > >
> >> > > > Has anyone else encountered an issue while trying HBase writes via
> >> > > > Pig in mapreduce mode?
> >> > > >
> >> > > > --Siddhi
> >> > > >
> >> > >
> >> >
> >>
> >
> >
>
