Hello

I can't connect with my mail. Can you help me?
I read your mail about your HBase use case, cool.

Thank you


@JBData31 <http://jbigdata.fr>


2015-10-30 11:20 GMT+01:00 Nicolae Marasoiu <[email protected]>:

> Hi,
>
> I tried to give it the jars containing those classes via the java
> classpath, via pig.additional.jars, and via REGISTER; all of them failed.
> Even stranger, it already finds the hbase client and zookeeper classes,
> but not slf4j.
>
> Please advise!
>
> Thanks,
>
> Class path:
>
> export CLASSPATH="$(~/tools/hbase-0.98.14-hadoop1/bin/hbase classpath)"
>
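> As a quick sanity check (just a diagnostic sketch, assuming the same
> hbase layout as above), grepping the generated classpath should show the
> slf4j jars:
>
> ~/tools/hbase-0.98.14-hadoop1/bin/hbase classpath | tr ':' '\n' | grep slf4j
>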
> Pig script:
>
> SET hbase.zookeeper.quorum localhost;
>
> -- pig.exec.reducers.max defines the upper bound on the number of reducers
> (default is 999)
> SET pig.exec.reducers.max 2;
>
> REGISTER
> /Users/dnmaras/tools/hbase-0.98.14-hadoop1/lib/slf4j-api-1.6.4.jar;
> REGISTER
> /Users/dnmaras/tools/hbase-0.98.14-hadoop1/lib/slf4j-log4j12-1.6.4.jar;
> REGISTER
> /Users/dnmaras/tools/hbase-0.98.14-hadoop1/lib/zookeeper-3.4.6.jar;
>
> daily_rows = LOAD 'hbase://dt_campaign'
>     USING org.apache.pig.backend.hadoop.hbase.HBaseStorage(
>       'v:clid,v:caid, v:dt, v:dth, v:imo, v:im, v:ex, v:ltr, v:clo, v:cl,
> v:imcl, v:rev',
>       '-loadKey true -gte ${client}2015${month} -lte
> ${client}2015${month}Z')
>     AS (rowkey:chararray, clid:chararray, caid:chararray, dt:chararray,
> imo:double, im:double, ex:double, ltr:double, clo:double, cl:double,
> imcl:double, rev:double);
> describe daily_rows;
> monthly_rows_grp = GROUP daily_rows BY SUBSTRING(dt, 0, 7);
> describe monthly_rows_grp;
> monthly_rows = FOREACH monthly_rows_grp GENERATE
>     group as ym,
>     COUNT(daily_rows) as inputrows,
>     CONCAT('M', SUBSTRING(MIN(daily_rows.rowkey), 1, 26)) AS rowkey,
>     MIN(daily_rows.clid) AS clid,
>     MIN(daily_rows.caid) AS caid,
>     SUM(daily_rows.imo) AS imo,
>     SUM(daily_rows.im) AS im,
>     SUM(daily_rows.ex) AS ex,
>     SUM(daily_rows.ltr) AS ltr,
>     SUM(daily_rows.clo) AS clo,
>     SUM(daily_rows.cl) AS cl,
>     SUM(daily_rows.imcl) AS imcl,
>     SUM(daily_rows.rev) AS rev;
> describe monthly_rows;
> STORE monthly_rows INTO 'dt_campaign' USING
> org.apache.pig.backend.hadoop.hbase.HBaseStorage(
>      'v:ym, v:irows, v:clid, v:caid, v:imo, v:im, v:ex, v:ltr, v:clo,
> v:cl, v:imcl, v:rev'
> );
>
>
> So the slf4j LoggerFactory is in:
>
> REGISTER
> /Users/dnmaras/tools/hbase-0.98.14-hadoop1/lib/slf4j-api-1.6.4.jar;
>
> which is also on the classpath, and running java org.slf4j.LoggerFactory
> complains that the class has no main method (not that it does not exist).
>
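> For reference, that check amounts to something like (assuming the jar
> path registered above):
>
> java -cp /Users/dnmaras/tools/hbase-0.98.14-hadoop1/lib/slf4j-api-1.6.4.jar org.slf4j.LoggerFactory
>
> which fails with a missing-main-method error rather than a
> ClassNotFoundException, so the class is resolvable from that jar.
>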
> Pig run:
>
> rm -f pig*; pig -Dpig.additional.jars=slf4j-api-1.6.4.jar -x local -p
> month=01 -p client=29
> /Users/dnmaras/ds/bcs-batch/modules/bcs-pig/src/main/resources/scripts/granularities.pig;
> cat pig*
> 15/10/30 12:13:10 INFO pig.ExecTypeProvider: Trying ExecType : LOCAL
> 15/10/30 12:13:10 INFO pig.ExecTypeProvider: Picked LOCAL as the ExecType
> 2015-10-30 12:13:10,381 [main] INFO  org.apache.pig.Main - Apache Pig
> version 0.14.0 (r1640057) compiled Nov 16 2014, 18:01:24
> 2015-10-30 12:13:10,381 [main] INFO  org.apache.pig.Main - Logging error
> messages to: /private/tmp/pig_1446199990380.log
> 2015-10-30 12:13:10,443 [main] INFO  org.apache.pig.impl.util.Utils -
> Default bootup file /Users/dnmaras/.pigbootup not found
> 2015-10-30 12:13:10,495 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting
> to hadoop file system at: file:///
> daily_rows: {rowkey: chararray,clid: chararray,caid: chararray,dt:
> chararray,imo: double,im: double,ex: double,ltr: double,clo: double,cl:
> double,imcl: double,rev: double}
> monthly_rows_grp: {group: chararray,daily_rows: {(rowkey: chararray,clid:
> chararray,caid: chararray,dt: chararray,imo: double,im: double,ex:
> double,ltr: double,clo: double,cl: double,imcl: double,rev: double)}}
> monthly_rows: {ym: chararray,inputrows: long,rowkey: chararray,clid:
> chararray,caid: chararray,imo: double,im: double,ex: double,ltr:
> double,clo: double,cl: double,imcl: double,rev: double}
> 2015-10-30 12:13:11,331 [main] ERROR
> org.apache.hadoop.hbase.mapreduce.TableOutputFormat - java.io.IOException:
> java.lang.reflect.InvocationTargetException
> 2015-10-30 12:13:11,334 [main] ERROR org.apache.pig.tools.grunt.Grunt -
> ERROR 2999: Unexpected internal error. java.io.IOException:
> java.lang.reflect.InvocationTargetException
> Details at logfile: /private/tmp/pig_1446199990380.log
> 2015-10-30 12:13:11,349 [main] INFO  org.apache.pig.Main - Pig script
> completed in 3 seconds and 213 milliseconds (3213 ms)
> Pig Stack Trace
> ---------------
> ERROR 2999: Unexpected internal error. java.io.IOException:
> java.lang.reflect.InvocationTargetException
>
> java.lang.RuntimeException: java.io.IOException:
> java.lang.reflect.InvocationTargetException
>         at
> org.apache.hadoop.hbase.mapreduce.TableOutputFormat.setConf(TableOutputFormat.java:211)
>         at
> org.apache.pig.backend.hadoop.hbase.HBaseStorage.getOutputFormat(HBaseStorage.java:904)
>         at
> org.apache.pig.newplan.logical.visitor.InputOutputFileValidatorVisitor.visit(InputOutputFileValidatorVisitor.java:69)
>         at
> org.apache.pig.newplan.logical.relational.LOStore.accept(LOStore.java:66)
>         at
> org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:64)
>         at
> org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:66)
>         at
> org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:66)
>         at
> org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:66)
>         at
> org.apache.pig.newplan.DepthFirstWalker.walk(DepthFirstWalker.java:53)
>         at org.apache.pig.newplan.PlanVisitor.visit(PlanVisitor.java:52)
>         at
> org.apache.pig.newplan.logical.relational.LogicalPlan.validate(LogicalPlan.java:212)
>         at org.apache.pig.PigServer$Graph.compile(PigServer.java:1767)
>         at org.apache.pig.PigServer$Graph.access$300(PigServer.java:1443)
>         at org.apache.pig.PigServer.execute(PigServer.java:1356)
>         at org.apache.pig.PigServer.executeBatch(PigServer.java:415)
>         at org.apache.pig.PigServer.executeBatch(PigServer.java:398)
>         at
> org.apache.pig.tools.grunt.GruntParser.executeBatch(GruntParser.java:171)
>         at
> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:234)
>         at
> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:205)
>         at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:81)
>         at org.apache.pig.Main.run(Main.java:624)
>         at org.apache.pig.Main.main(Main.java:170)
> Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
>         at
> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:383)
>         at
> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:360)
>         at
> org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:244)
>         at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:187)
>         at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:149)
>         at
> org.apache.hadoop.hbase.mapreduce.TableOutputFormat.setConf(TableOutputFormat.java:206)
>         ... 21 more
> Caused by: java.lang.reflect.InvocationTargetException
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)
>         at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
>         at
> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:381)
>         ... 26 more
> Caused by: java.lang.NoClassDefFoundError: org/slf4j/LoggerFactory
>         at org.apache.zookeeper.ZooKeeper.<clinit>(ZooKeeper.java:94)
>         at
> org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.<init>(RecoverableZooKeeper.java:112)
>         at
> org.apache.hadoop.hbase.zookeeper.ZKUtil.connect(ZKUtil.java:132)
>         at
> org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.<init>(ZooKeeperWatcher.java:165)
>         at
> org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.<init>(ZooKeeperWatcher.java:134)
>         at
> org.apache.hadoop.hbase.client.ZooKeeperKeepAliveConnection.<init>(ZooKeeperKeepAliveConnection.java:43)
>         at
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getKeepAliveZooKeeperWatcher(HConnectionManager.java:1722)
>         at
> org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:82)
>         at
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:794)
>         at
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:627)
>         ... 31 more
> Caused by: java.lang.ClassNotFoundException: org.slf4j.LoggerFactory
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>         ... 41 more
>
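> (For reference, the equivalent invocation with absolute jar paths, i.e.
> the same paths used in the REGISTER statements above, would be:
>
> pig -Dpig.additional.jars=/Users/dnmaras/tools/hbase-0.98.14-hadoop1/lib/slf4j-api-1.6.4.jar:/Users/dnmaras/tools/hbase-0.98.14-hadoop1/lib/slf4j-log4j12-1.6.4.jar
> -x local -p month=01 -p client=29
> /Users/dnmaras/ds/bcs-batch/modules/bcs-pig/src/main/resources/scripts/granularities.pig
>
> in case the relative jar name in the run above matters.)
>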
