Hi Bogi,

When I run

ant -Dthirdparty=true test

I get an error while initializing the mini HBase cluster.

I am working on a Windows system from the command line.

2017-03-20 23:47:59,643 (main) [ERROR -
org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:233)]
Error starting cluster
java.lang.RuntimeException: Failed construction of Master: class
org.apache.hadoop.hbase.master.HMasterIllegal character in authority at
index 7:
file://C:\Users\arfu1\AppData\Local\Temp\29d3fccc-b968-4ed1-84f3-651277bba0ca\hbase
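For context on that exception: a Windows path naively prefixed with "file://" puts the drive letter and the backslashes into the URI authority component, which java.net.URI rejects. The sketch below reproduces the failure with a shortened, hypothetical path (the real one in the log contains a temp-directory UUID) and shows the well-formed alternative with an empty authority and forward slashes.

```java
import java.net.URI;
import java.net.URISyntaxException;

public class WindowsFileUriDemo {
    /** Returns the parser's complaint for a URI string, or null if it parses. */
    static String parseError(String s) {
        try {
            new URI(s);
            return null;
        } catch (URISyntaxException e) {
            return e.getReason();
        }
    }

    public static void main(String[] args) {
        // Backslashes and the drive letter land in the authority component,
        // so parsing fails with "Illegal character in authority".
        System.out.println(parseError("file://C:\\Users\\arfu1\\Temp\\hbase"));
        // The well-formed form uses an empty authority and forward slashes.
        System.out.println(parseError("file:///C:/Users/arfu1/Temp/hbase"));
    }
}
```

This suggests the failure is in how the test harness builds the local root-directory URI from a Windows path, rather than in the patch itself.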


I am attaching the test result file for the same. Please let me know what
I am missing here.

Thanks,
Jilani

On Sun, Mar 19, 2017 at 6:24 AM, Boglarka Egyed <egyedb...@gmail.com> wrote:

>
>
> > On March 18, 2017, 11:06 a.m., Boglarka Egyed wrote:
> > > Hi Jilani,
> > >
> > > Thanks for your contribution!
> > >
> > > I have applied your patch and got some warnings because of the
> incorrect indentation and trailing whitespace, which I mentioned in my
> review below; please find my further findings there too. I could run
> 'ant clean test' successfully with your patch.
> > >
> > > On top of the findings, could you please add a test case for your
> change?
> > >
> > > Thanks,
> > > Bogi
>
> Hi Jilani,
>
> I think you could add test(s) to the com.cloudera.sqoop.hbase.HBaseImportNullTest
> class, as it covers the null import into HBase cases.
>
> Thanks,
> Bogi
>
>
> - Boglarka
>
>
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/57499/#review169361
> -----------------------------------------------------------
>
>
> On March 11, 2017, 6:16 a.m., Jilani Shaik wrote:
> >
> > -----------------------------------------------------------
> > This is an automatically generated e-mail. To reply, visit:
> > https://reviews.apache.org/r/57499/
> > -----------------------------------------------------------
> >
> > (Updated March 11, 2017, 6:16 a.m.)
> >
> >
> > Review request for Sqoop.
> >
> >
> > Bugs: SQOOP-3149
> >     https://issues.apache.org/jira/browse/SQOOP-3149
> >
> >
> > Repository: sqoop-trunk
> >
> >
> > Description
> > -------
> >
> > HBase delete is added as part of incremental data import. The return
> type of the method responsible for holding the list of Put objects is
> changed to a List of Mutation objects, and at execution time the concrete
> type of each Mutation determines whether the actual operation performed is
> an insert or a delete.
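The dispatch described above can be sketched as follows. The class names Put and Delete mirror the HBase client types (org.apache.hadoop.hbase.client), but they are defined here as minimal stand-ins so the sketch is self-contained; the routing method and its return values are hypothetical, not the patch's actual code.

```java
// Minimal stand-ins for the HBase client Mutation hierarchy, defined here
// only so the sketch runs without the HBase jars on the classpath.
abstract class Mutation {}
class Put extends Mutation {}
class Delete extends Mutation {}

public class MutationDispatchSketch {
    /** Routes a mutation the way the described change would: by concrete type. */
    static String route(Mutation m) {
        if (m instanceof Put) {
            return "insert";
        } else if (m instanceof Delete) {
            return "delete";
        }
        return "skip";
    }

    public static void main(String[] args) {
        // A single List<Mutation> can now carry inserts and deletes together.
        java.util.List<Mutation> batch =
            java.util.Arrays.asList(new Put(), new Delete(), new Put());
        for (Mutation m : batch) {
            System.out.println(route(m));
        }
    }
}
```

Widening List&lt;Put&gt; to List&lt;Mutation&gt; lets one code path batch both operations, since Put and Delete share the Mutation superclass in the HBase client API.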
> >
> > Jira ticket for the same: https://issues.apache.org/
> jira/browse/SQOOP-3149
> >
> > Similar ticket to above:  SQOOP-3125
> >
> >
> > Diffs
> > -----
> >
> >   src/java/org/apache/sqoop/hbase/HBasePutProcessor.java fdbe1276
> >   src/java/org/apache/sqoop/hbase/PutTransformer.java 533467e5
> >   src/java/org/apache/sqoop/hbase/ToStringPutTransformer.java 363e1456
> >   src/java/org/apache/sqoop/mapreduce/HBaseBulkImportMapper.java
> 58ccee7b
> >
> >
> > Diff: https://reviews.apache.org/r/57499/diff/1/
> >
> >
> > Testing
> > -------
> >
> > Executed a jar built with these changes on a Hadoop cluster and
> tested both the initial import and then the incremental import.
> >
> >
> > File Attachments
> > ----------------
> >
> > HBase delete support for incremental import
> >   https://reviews.apache.org/media/uploaded/files/2017/03/
> 11/5b1895fd-4c6b-42fa-8a92-4e093153e370__hbase_delete_
> support_in_incremental_import
> >
> >
> > Thanks,
> >
> > Jilani Shaik
> >
> >
>
>
Testsuite: com.cloudera.sqoop.hbase.HBaseImportNullTest
Tests run: 2, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 6.112 sec
------------- Standard Error -----------------
2017-03-20 23:47:54,274 (main) [INFO - 
org.apache.zookeeper.Environment.logEnv(Environment.java:100)] Server 
environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
2017-03-20 23:47:54,283 (main) [INFO - 
org.apache.zookeeper.Environment.logEnv(Environment.java:100)] Server 
environment:host.name=arfu1-PC
2017-03-20 23:47:54,285 (main) [INFO - 
org.apache.zookeeper.Environment.logEnv(Environment.java:100)] Server 
environment:java.version=1.8.0_111
2017-03-20 23:47:54,285 (main) [INFO - 
org.apache.zookeeper.Environment.logEnv(Environment.java:100)] Server 
environment:java.vendor=Oracle Corporation
2017-03-20 23:47:54,286 (main) [INFO - 
org.apache.zookeeper.Environment.logEnv(Environment.java:100)] Server 
environment:java.home=C:\Java\jdk1.8.0_111\jre
2017-03-20 23:47:54,287 (main) [INFO - 
org.apache.zookeeper.Environment.logEnv(Environment.java:100)] Server 
environment:java.class.path=D:\work\projects\sqoop-trunk\build\cobertura\classes;D:\work\projects\sqoop-trunk\testdata\hcatalog\conf;D:\work\projects\sqoop-trunk\build\test\classes;D:\work\projects\sqoop-trunk\build\test\extraconf;C:\Users\arfu1\.ivy2\cache\org.apache.avro\avro\bundles\avro-1.8.1.jar;C:\Users\arfu1\.ivy2\cache\org.codehaus.jackson\jackson-core-asl\jars\jackson-core-asl-1.9.13.jar;C:\Users\arfu1\.ivy2\cache\org.codehaus.jackson\jackson-mapper-asl\jars\jackson-mapper-asl-1.9.13.jar;C:\Users\arfu1\.ivy2\cache\com.thoughtworks.paranamer\paranamer\bundles\paranamer-2.7.jar;C:\Users\arfu1\.ivy2\cache\org.xerial.snappy\snappy-java\bundles\snappy-java-1.1.1.3.jar;C:\Users\arfu1\.ivy2\cache\org.apache.commons\commons-compress\jars\commons-compress-1.8.1.jar;C:\Users\arfu1\.ivy2\cache\org.tukaani\xz\jars\xz-1.5.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-common\jars\hadoop-common-2.6.0-tests.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-common\jars\hadoop-common-2.6.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-annotations\jars\hadoop-annotations-2.6.0.jar;C:\Users\arfu1\.ivy2\cache\commons-cli\commons-cli\jars\commons-cli-1.2.jar;C:\Users\arfu1\.ivy2\cache\org.apache.commons\commons-math3\jars\commons-math3-3.1.1.jar;C:\Users\arfu1\.ivy2\cache\xmlenc\xmlenc\jars\xmlenc-0.52.jar;C:\Users\arfu1\.ivy2\cache\commons-httpclient\commons-httpclient\jars\commons-httpclient-3.1.jar;C:\Users\arfu1\.ivy2\cache\commons-io\commons-io\jars\commons-io-2.4.jar;C:\Users\arfu1\.ivy2\cache\commons-net\commons-net\jars\commons-net-3.1.jar;C:\Users\arfu1\.ivy2\cache\javax.servlet\servlet-api\jars\servlet-api-2.5.jar;C:\Users\arfu1\.ivy2\cache\org.mortbay.jetty\jetty\jars\jetty-6.1.26.jar;C:\Users\arfu1\.ivy2\cache\org.mortbay.jetty\jetty-util\jars\jetty-util-6.1.26.jar;C:\Users\arfu1\.ivy2\cache\org.codehaus.jettison\jettison\bundles\jettison-1.1.jar;C:\Users\arfu1\.ivy2\cache\com.sun.xml.bind\jaxb-impl\jars\jaxb-impl-2.2.3-1.jar;C:\Users\arfu1\
.ivy2\cache\javax.xml.bind\jaxb-api\jars\jaxb-api-2.2.2.jar;C:\Users\arfu1\.ivy2\cache\javax.xml.stream\stax-api\jars\stax-api-1.0-2.jar;C:\Users\arfu1\.ivy2\cache\javax.activation\activation\jars\activation-1.1.jar;C:\Users\arfu1\.ivy2\cache\org.codehaus.jackson\jackson-jaxrs\jars\jackson-jaxrs-1.9.13.jar;C:\Users\arfu1\.ivy2\cache\org.codehaus.jackson\jackson-xc\jars\jackson-xc-1.9.13.jar;C:\Users\arfu1\.ivy2\cache\com.sun.jersey\jersey-server\bundles\jersey-server-1.9.jar;C:\Users\arfu1\.ivy2\cache\asm\asm\jars\asm-3.2.jar;C:\Users\arfu1\.ivy2\cache\log4j\log4j\bundles\log4j-1.2.17.jar;C:\Users\arfu1\.ivy2\cache\net.java.dev.jets3t\jets3t\jars\jets3t-0.9.0.jar;C:\Users\arfu1\.ivy2\cache\com.jamesmurty.utils\java-xmlbuilder\jars\java-xmlbuilder-0.4.jar;C:\Users\arfu1\.ivy2\cache\commons-lang\commons-lang\jars\commons-lang-2.6.jar;C:\Users\arfu1\.ivy2\cache\commons-configuration\commons-configuration\jars\commons-configuration-1.6.jar;C:\Users\arfu1\.ivy2\cache\commons-digester\commons-digester\jars\commons-digester-1.8.jar;C:\Users\arfu1\.ivy2\cache\commons-beanutils\commons-beanutils\jars\commons-beanutils-1.7.0.jar;C:\Users\arfu1\.ivy2\cache\commons-beanutils\commons-beanutils-core\jars\commons-beanutils-core-1.8.0.jar;C:\Users\arfu1\.ivy2\cache\com.google.protobuf\protobuf-java\bundles\protobuf-java-2.5.0.jar;C:\Users\arfu1\.ivy2\cache\com.google.code.gson\gson\jars\gson-2.2.4.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-auth\jars\hadoop-auth-2.6.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.directory.server\apacheds-kerberos-codec\bundles\apacheds-kerberos-codec-2.0.0-M15.jar;C:\Users\arfu1\.ivy2\cache\org.apache.directory.server\apacheds-i18n\bundles\apacheds-i18n-2.0.0-M15.jar;C:\Users\arfu1\.ivy2\cache\org.apache.directory.api\api-asn1-api\bundles\api-asn1-api-1.0.0-M20.jar;C:\Users\arfu1\.ivy2\cache\org.apache.directory.api\api-util\bundles\api-util-1.0.0-M20.jar;C:\Users\arfu1\.ivy2\cache\org.apache.curator\curator-framework\bundles\curator-frame
work-2.6.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.curator\curator-client\bundles\curator-client-2.6.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.zookeeper\zookeeper\jars\zookeeper-3.4.6.jar;C:\Users\arfu1\.ivy2\cache\org.slf4j\slf4j-log4j12\jars\slf4j-log4j12-1.7.5.jar;C:\Users\arfu1\.ivy2\cache\org.hamcrest\hamcrest-core\jars\hamcrest-core-1.3.jar;C:\Users\arfu1\.ivy2\cache\io.netty\netty\bundles\netty-3.6.2.Final.jar;C:\Users\arfu1\.ivy2\cache\com.jcraft\jsch\jars\jsch-0.1.42.jar;C:\Users\arfu1\.ivy2\cache\org.apache.curator\curator-recipes\bundles\curator-recipes-2.6.0.jar;C:\Users\arfu1\.ivy2\cache\org.htrace\htrace-core\jars\htrace-core-3.0.4.jar;C:\Users\arfu1\.ivy2\cache\tomcat\jasper-compiler\jars\jasper-compiler-5.5.23.jar;C:\Users\arfu1\.ivy2\cache\tomcat\jasper-runtime\jars\jasper-runtime-5.5.23.jar;C:\Users\arfu1\.ivy2\cache\commons-el\commons-el\jars\commons-el-1.0.jar;C:\Users\arfu1\.ivy2\cache\javax.servlet.jsp\jsp-api\jars\jsp-api-2.1.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-hdfs\jars\hadoop-hdfs-2.6.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-hdfs\jars\hadoop-hdfs-2.6.0-tests.jar;C:\Users\arfu1\.ivy2\cache\xerces\xercesImpl\jars\xercesImpl-2.9.1.jar;C:\Users\arfu1\.ivy2\cache\xml-apis\xml-apis\jars\xml-apis-1.3.04.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-mapreduce-client-common\jars\hadoop-mapreduce-client-common-2.6.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-yarn-common\jars\hadoop-yarn-common-2.6.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-yarn-api\jars\hadoop-yarn-api-2.6.0.jar;C:\Users\arfu1\.ivy2\cache\com.sun.jersey\jersey-client\bundles\jersey-client-1.9.jar;C:\Users\arfu1\.ivy2\cache\com.google.inject.extensions\guice-servlet\jars\guice-servlet-3.0.jar;C:\Users\arfu1\.ivy2\cache\com.google.inject\guice\jars\guice-3.0.jar;C:\Users\arfu1\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar;C:\Users\arfu1\.ivy2\cache\aopalliance\aopalliance\jars\aopalliance-1.0.
jar;C:\Users\arfu1\.ivy2\cache\org.sonatype.sisu.inject\cglib\jars\cglib-2.2.1-v20090111.jar;C:\Users\arfu1\.ivy2\cache\com.sun.jersey.contribs\jersey-guice\jars\jersey-guice-1.9.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-yarn-client\jars\hadoop-yarn-client-2.6.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-mapreduce-client-core\jars\hadoop-mapreduce-client-core-2.6.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-yarn-server-common\jars\hadoop-yarn-server-common-2.6.0.jar;C:\Users\arfu1\.ivy2\cache\org.fusesource.leveldbjni\leveldbjni-all\bundles\leveldbjni-all-1.8.jar;C:\Users\arfu1\.ivy2\cache\org.aspectj\aspectjtools\jars\aspectjtools-1.7.4.jar;C:\Users\arfu1\.ivy2\cache\org.aspectj\aspectjrt\jars\aspectjrt-1.7.4.jar;C:\Users\arfu1\.ivy2\cache\hsqldb\hsqldb\jars\hsqldb-1.8.0.10.jar;C:\Users\arfu1\.ivy2\cache\org.apache.commons\commons-lang3\jars\commons-lang3-3.4.jar;C:\Users\arfu1\.ivy2\cache\org.kitesdk\kite-data-mapreduce\jars\kite-data-mapreduce-1.0.0.jar;C:\Users\arfu1\.ivy2\cache\org.kitesdk\kite-data-core\jars\kite-data-core-1.0.0.jar;C:\Users\arfu1\.ivy2\cache\org.kitesdk\kite-hadoop-compatibility\jars\kite-hadoop-compatibility-1.0.0.jar;C:\Users\arfu1\.ivy2\cache\com.twitter\parquet-avro\jars\parquet-avro-1.4.1.jar;C:\Users\arfu1\.ivy2\cache\com.twitter\parquet-column\jars\parquet-column-1.4.1.jar;C:\Users\arfu1\.ivy2\cache\com.twitter\parquet-common\jars\parquet-common-1.4.1.jar;C:\Users\arfu1\.ivy2\cache\com.twitter\parquet-encoding\jars\parquet-encoding-1.4.1.jar;C:\Users\arfu1\.ivy2\cache\com.twitter\parquet-generator\jars\parquet-generator-1.4.1.jar;C:\Users\arfu1\.ivy2\cache\com.twitter\parquet-hadoop\jars\parquet-hadoop-1.4.1.jar;C:\Users\arfu1\.ivy2\cache\com.twitter\parquet-format\jars\parquet-format-2.0.0.jar;C:\Users\arfu1\.ivy2\cache\com.twitter\parquet-jackson\jars\parquet-jackson-1.4.1.jar;C:\Users\arfu1\.ivy2\cache\net.sf.opencsv\opencsv\jars\opencsv-2.3.jar;C:\Users\arfu1\.ivy2\cache\org.apache.commons\co
mmons-jexl\jars\commons-jexl-2.1.1.jar;C:\Users\arfu1\.ivy2\cache\com.fasterxml.jackson.core\jackson-databind\bundles\jackson-databind-2.3.1.jar;C:\Users\arfu1\.ivy2\cache\com.fasterxml.jackson.core\jackson-annotations\bundles\jackson-annotations-2.3.0.jar;C:\Users\arfu1\.ivy2\cache\com.fasterxml.jackson.core\jackson-core\bundles\jackson-core-2.3.1.jar;C:\Users\arfu1\.ivy2\cache\org.kitesdk\kite-data-hive\jars\kite-data-hive-1.0.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.accumulo\accumulo-core\jars\accumulo-core-1.6.2.jar;C:\Users\arfu1\.ivy2\cache\com.beust\jcommander\jars\jcommander-1.32.jar;C:\Users\arfu1\.ivy2\cache\com.google.guava\guava\bundles\guava-14.0.1.jar;C:\Users\arfu1\.ivy2\cache\org.apache.accumulo\accumulo-fate\jars\accumulo-fate-1.6.2.jar;C:\Users\arfu1\.ivy2\cache\org.apache.accumulo\accumulo-start\jars\accumulo-start-1.6.2.jar;C:\Users\arfu1\.ivy2\cache\org.apache.commons\commons-vfs2\jars\commons-vfs2-2.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.maven.scm\maven-scm-api\jars\maven-scm-api-1.4.jar;C:\Users\arfu1\.ivy2\cache\org.codehaus.plexus\plexus-utils\jars\plexus-utils-1.5.6.jar;C:\Users\arfu1\.ivy2\cache\org.apache.maven.scm\maven-scm-provider-svnexe\jars\maven-scm-provider-svnexe-1.4.jar;C:\Users\arfu1\.ivy2\cache\org.apache.maven.scm\maven-scm-provider-svn-commons\jars\maven-scm-provider-svn-commons-1.4.jar;C:\Users\arfu1\.ivy2\cache\regexp\regexp\jars\regexp-1.3.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-client\jars\hadoop-client-2.2.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-mapreduce-client-app\jars\hadoop-mapreduce-client-app-2.2.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-mapreduce-client-shuffle\jars\hadoop-mapreduce-client-shuffle-2.2.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-mapreduce-client-jobclient\jars\hadoop-mapreduce-client-jobclient-2.2.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-yarn-server-nodemanager\jars\hadoop-yarn-server-nodemanager-2.2.0.jar;C:\U
sers\arfu1\.ivy2\cache\com.sun.jersey.jersey-test-framework\jersey-test-framework-grizzly2\jars\jersey-test-framework-grizzly2-1.9.jar;C:\Users\arfu1\.ivy2\cache\org.apache.accumulo\accumulo-trace\jars\accumulo-trace-1.6.2.jar;C:\Users\arfu1\.ivy2\cache\org.apache.accumulo\accumulo-minicluster\jars\accumulo-minicluster-1.6.2.jar;C:\Users\arfu1\.ivy2\cache\org.apache.accumulo\accumulo-gc\jars\accumulo-gc-1.6.2.jar;C:\Users\arfu1\.ivy2\cache\org.apache.accumulo\accumulo-server-base\jars\accumulo-server-base-1.6.2.jar;C:\Users\arfu1\.ivy2\cache\org.apache.accumulo\accumulo-master\jars\accumulo-master-1.6.2.jar;C:\Users\arfu1\.ivy2\cache\org.apache.accumulo\accumulo-monitor\jars\accumulo-monitor-1.6.2.jar;C:\Users\arfu1\.ivy2\cache\javax.servlet\javax.servlet-api\jars\javax.servlet-api-3.0.1.jar;C:\Users\arfu1\.ivy2\cache\org.eclipse.jetty\jetty-http\jars\jetty-http-8.1.15.v20140411.jar;C:\Users\arfu1\.ivy2\cache\org.eclipse.jetty\jetty-io\jars\jetty-io-8.1.15.v20140411.jar;C:\Users\arfu1\.ivy2\cache\org.eclipse.jetty\jetty-util\jars\jetty-util-8.1.15.v20140411.jar;C:\Users\arfu1\.ivy2\cache\org.eclipse.jetty\jetty-security\jars\jetty-security-8.1.15.v20140411.jar;C:\Users\arfu1\.ivy2\cache\org.eclipse.jetty\jetty-server\jars\jetty-server-8.1.15.v20140411.jar;C:\Users\arfu1\.ivy2\cache\org.eclipse.jetty.orbit\javax.servlet\orbits\javax.servlet-3.0.0.v201112011016.jar;C:\Users\arfu1\.ivy2\cache\org.eclipse.jetty\jetty-continuation\jars\jetty-continuation-8.1.15.v20140411.jar;C:\Users\arfu1\.ivy2\cache\org.eclipse.jetty\jetty-servlet\jars\jetty-servlet-8.1.15.v20140411.jar;C:\Users\arfu1\.ivy2\cache\org.apache.accumulo\accumulo-tracer\jars\accumulo-tracer-1.6.2.jar;C:\Users\arfu1\.ivy2\cache\org.apache.accumulo\accumulo-tserver\jars\accumulo-tserver-1.6.2.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-minicluster\jars\hadoop-minicluster-2.2.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-yarn-server-tests\test-jars\hadoop-yarn-server-tests-2.2.0-tests.j
ar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-mapreduce-client-hs\jars\hadoop-mapreduce-client-hs-2.2.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hbase\hbase-client\test-jars\hbase-client-1.2.4-tests.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hbase\hbase-client\jars\hbase-client-1.2.4.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hbase\hbase-annotations\jars\hbase-annotations-1.2.4.jar;C:\Users\arfu1\.ivy2\cache\com.github.stephenc.findbugs\findbugs-annotations\jars\findbugs-annotations-1.3.9-1.jar;C:\Users\arfu1\.ivy2\cache\junit\junit\jars\junit-4.12.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hbase\hbase-protocol\jars\hbase-protocol-1.2.4.jar;C:\Users\arfu1\.ivy2\cache\commons-logging\commons-logging\jars\commons-logging-1.2.jar;C:\Users\arfu1\.ivy2\cache\commons-codec\commons-codec\jars\commons-codec-1.9.jar;C:\Users\arfu1\.ivy2\cache\io.netty\netty-all\jars\netty-all-4.0.23.Final.jar;C:\Users\arfu1\.ivy2\cache\org.apache.htrace\htrace-core\jars\htrace-core-3.1.0-incubating.jar;C:\Users\arfu1\.ivy2\cache\org.jruby.jcodings\jcodings\jars\jcodings-1.0.8.jar;C:\Users\arfu1\.ivy2\cache\org.jruby.joni\joni\jars\joni-2.1.2.jar;C:\Users\arfu1\.ivy2\cache\com.yammer.metrics\metrics-core\jars\metrics-core-2.2.0.jar;C:\Users\arfu1\.ivy2\cache\org.slf4j\slf4j-api\jars\slf4j-api-1.7.7.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hbase\hbase-common\jars\hbase-common-1.2.4.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hbase\hbase-common\jars\hbase-common-1.2.4-tests.jar;C:\Users\arfu1\.ivy2\cache\commons-collections\commons-collections\jars\commons-collections-3.2.2.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hbase\hbase-server\jars\hbase-server-1.2.4.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hbase\hbase-server\jars\hbase-server-1.2.4-tests.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hbase\hbase-procedure\jars\hbase-procedure-1.2.4.jar;C:\Users\arfu1\.ivy2\cache\org.apache.commons\commons-math\jars\commons-math-2.2.jar;C:\Users\arfu1\.ivy2\cache\org.mortbay.jetty\jetty-sslengine\jars\
jetty-sslengine-6.1.26.jar;C:\Users\arfu1\.ivy2\cache\org.mortbay.jetty\jsp-2.1\jars\jsp-2.1-6.1.14.jar;C:\Users\arfu1\.ivy2\cache\org.mortbay.jetty\jsp-api-2.1\jars\jsp-api-2.1-6.1.14.jar;C:\Users\arfu1\.ivy2\cache\org.mortbay.jetty\servlet-api-2.5\jars\servlet-api-2.5-6.1.14.jar;C:\Users\arfu1\.ivy2\cache\org.jamon\jamon-runtime\jars\jamon-runtime-2.4.1.jar;C:\Users\arfu1\.ivy2\cache\com.lmax\disruptor\jars\disruptor-3.3.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hbase\hbase-prefix-tree\jars\hbase-prefix-tree-1.2.4.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hbase\hbase-hadoop-compat\test-jars\hbase-hadoop-compat-1.2.4-tests.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hbase\hbase-hadoop-compat\jars\hbase-hadoop-compat-1.2.4.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hbase\hbase-hadoop2-compat\jars\hbase-hadoop2-compat-1.2.4.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hbase\hbase-hadoop2-compat\test-jars\hbase-hadoop2-compat-1.2.4-tests.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hive.hcatalog\hive-hcatalog-core\jars\hive-hcatalog-core-1.2.1.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hive\hive-cli\jars\hive-cli-1.2.1.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hive\hive-common\jars\hive-common-1.2.1.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hive\hive-shims\jars\hive-shims-1.2.1.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hive.shims\hive-shims-common\jars\hive-shims-common-1.2.1.jar;C:\Users\arfu1\.ivy2\cache\log4j\apache-log4j-extras\bundles\apache-log4j-extras-1.2.17.jar;C:\Users\arfu1\.ivy2\cache\org.apache.thrift\libthrift\jars\libthrift-0.9.2.jar;C:\Users\arfu1\.ivy2\cache\org.apache.httpcomponents\httpclient\jars\httpclient-4.4.jar;C:\Users\arfu1\.ivy2\cache\org.apache.httpcomponents\httpcore\jars\httpcore-4.4.jar;C:\Users\arfu1\.ivy2\cache\joda-time\joda-time\jars\joda-time-2.5.jar;C:\Users\arfu1\.ivy2\cache\org.apache.ant\ant\jars\ant-1.9.1.jar;C:\Users\arfu1\.ivy2\cache\org.apache.ant\ant-launcher\jars\ant-launcher-1.9.1.jar;C:\Users\arfu1\.ivy2\cache\org.json\json\jars
\json-20090211.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hive.shims\hive-shims-0.20S\jars\hive-shims-0.20S-1.2.1.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hive.shims\hive-shims-0.23\jars\hive-shims-0.23-1.2.1.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-yarn-server-resourcemanager\jars\hadoop-yarn-server-resourcemanager-2.6.0.jar;C:\Users\arfu1\.ivy2\cache\com.sun.jersey\jersey-json\jars\jersey-json-1.14.jar;C:\Users\arfu1\.ivy2\cache\com.sun.jersey\jersey-core\jars\jersey-core-1.14.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-yarn-server-applicationhistoryservice\jars\hadoop-yarn-server-applicationhistoryservice-2.6.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hadoop\hadoop-yarn-server-web-proxy\jars\hadoop-yarn-server-web-proxy-2.6.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hive.shims\hive-shims-scheduler\jars\hive-shims-scheduler-1.2.1.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hive\hive-metastore\jars\hive-metastore-1.2.1.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hive\hive-serde\jars\hive-serde-1.2.1.jar;C:\Users\arfu1\.ivy2\cache\com.google.code.findbugs\jsr305\jars\jsr305-3.0.0.jar;C:\Users\arfu1\.ivy2\cache\com.twitter\parquet-hadoop-bundle\jars\parquet-hadoop-bundle-1.6.0.jar;C:\Users\arfu1\.ivy2\cache\com.jolbox\bonecp\bundles\bonecp-0.8.0.RELEASE.jar;C:\Users\arfu1\.ivy2\cache\org.apache.derby\derby\jars\derby-10.10.2.0.jar;C:\Users\arfu1\.ivy2\cache\org.datanucleus\datanucleus-api-jdo\jars\datanucleus-api-jdo-3.2.6.jar;C:\Users\arfu1\.ivy2\cache\org.datanucleus\datanucleus-core\jars\datanucleus-core-3.2.10.jar;C:\Users\arfu1\.ivy2\cache\org.datanucleus\datanucleus-rdbms\jars\datanucleus-rdbms-3.2.9.jar;C:\Users\arfu1\.ivy2\cache\commons-pool\commons-pool\jars\commons-pool-1.5.4.jar;C:\Users\arfu1\.ivy2\cache\commons-dbcp\commons-dbcp\jars\commons-dbcp-1.4.jar;C:\Users\arfu1\.ivy2\cache\javax.jdo\jdo-api\jars\jdo-api-3.0.1.jar;C:\Users\arfu1\.ivy2\cache\javax.transaction\jta\jars\jta-1.1.jar;C:\Users\arfu1\.ivy2\cache\org.antlr\antlr-r
untime\jars\antlr-runtime-3.4.jar;C:\Users\arfu1\.ivy2\cache\org.antlr\stringtemplate\jars\stringtemplate-3.2.1.jar;C:\Users\arfu1\.ivy2\cache\antlr\antlr\jars\antlr-2.7.7.jar;C:\Users\arfu1\.ivy2\cache\org.apache.thrift\libfb303\jars\libfb303-0.9.2.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hive\hive-service\jars\hive-service-1.2.1.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hive\hive-exec\jars\hive-exec-1.2.1.jar;C:\Users\arfu1\.ivy2\cache\org.apache.hive\hive-ant\jars\hive-ant-1.2.1.jar;C:\Users\arfu1\.ivy2\cache\org.apache.velocity\velocity\jars\velocity-1.5.jar;C:\Users\arfu1\.ivy2\cache\oro\oro\jars\oro-2.0.8.jar;C:\Users\arfu1\.ivy2\cache\org.antlr\ST4\jars\ST4-4.0.4.jar;C:\Users\arfu1\.ivy2\cache\org.apache.ivy\ivy\jars\ivy-2.4.0.jar;C:\Users\arfu1\.ivy2\cache\org.codehaus.groovy\groovy-all\jars\groovy-all-2.1.6.jar;C:\Users\arfu1\.ivy2\cache\org.apache.calcite\calcite-core\jars\calcite-core-1.2.0-incubating.jar;C:\Users\arfu1\.ivy2\cache\org.apache.calcite\calcite-avatica\jars\calcite-avatica-1.2.0-incubating.jar;C:\Users\arfu1\.ivy2\cache\org.apache.calcite\calcite-linq4j\jars\calcite-linq4j-1.2.0-incubating.jar;C:\Users\arfu1\.ivy2\cache\net.hydromatic\eigenbase-properties\bundles\eigenbase-properties-1.1.5.jar;C:\Users\arfu1\.ivy2\cache\org.codehaus.janino\janino\jars\janino-2.7.6.jar;C:\Users\arfu1\.ivy2\cache\org.codehaus.janino\commons-compiler\jars\commons-compiler-2.7.6.jar;C:\Users\arfu1\.ivy2\cache\org.pentaho\pentaho-aggdesigner-algorithm\jars\pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar;C:\Users\arfu1\.ivy2\cache\stax\stax-api\jars\stax-api-1.0.1.jar;C:\Users\arfu1\.ivy2\cache\jline\jline\jars\jline-2.12.jar;C:\Users\arfu1\.ivy2\cache\net.sf.jpam\jpam\jars\jpam-1.1.jar;C:\Users\arfu1\.ivy2\cache\org.eclipse.jetty.aggregate\jetty-all\jars\jetty-all-7.6.0.v20120127.jar;C:\Users\arfu1\.ivy2\cache\org.apache.geronimo.specs\geronimo-jta_1.1_spec\jars\geronimo-jta_1.1_spec-1.1.1.jar;C:\Users\arfu1\.ivy2\cache\javax.mail\mail\jars\mail-1.4.1.jar;C:\Users
\arfu1\.ivy2\cache\org.apache.geronimo.specs\geronimo-jaspic_1.0_spec\bundles\geronimo-jaspic_1.0_spec-1.0.jar;C:\Users\arfu1\.ivy2\cache\org.apache.geronimo.specs\geronimo-annotation_1.0_spec\jars\geronimo-annotation_1.0_spec-1.1.1.jar;C:\Users\arfu1\.ivy2\cache\asm\asm-commons\jars\asm-commons-3.1.jar;C:\Users\arfu1\.ivy2\cache\asm\asm-tree\jars\asm-tree-3.1.jar;C:\Users\arfu1\.ivy2\cache\org.postgresql\postgresql\jars\postgresql-9.2-1003-jdbc4.jar;C:\Users\arfu1\.ivy2\cache\org.apache.avro\avro-mapred\jars\avro-mapred-1.8.1-hadoop2.jar;C:\Users\arfu1\.ivy2\cache\org.mockito\mockito-all\jars\mockito-all-1.9.5.jar;C:\Users\arfu1\.ivy2\cache\com.h2database\h2\jars\h2-1.3.170.jar;D:\work\projects\sqoop-trunk\build\classes;D:\work\projects\sqoop-trunk\lib\ant-contrib-1.0b3.jar;D:\work\projects\sqoop-trunk\lib\ant-eclipse-1.0-jvm1.2.jar;D:\work\projects\sqoop-trunk\lib\ivy-2.3.0.jar;D:\work\apache-ant-1.10.0\lib\ant-launcher.jar;D:\work\apache-ant-1.10.0\lib\ant.jar;D:\work\apache-ant-1.10.0\lib\ant-junit.jar;D:\work\apache-ant-1.10.0\lib\ant-junit4.jar
2017-03-20 23:47:54,292 (main) [INFO - 
org.apache.zookeeper.Environment.logEnv(Environment.java:100)] Server 
environment:java.library.path=C:\Java\jdk1.8.0_111\jre\bin;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;C:\ProgramData\Oracle\Java\javapath;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\work\TortoiseGit\bin;C:\work\Git\cmd;C:\Program
 Files 
(x86)\Skype\Phone\;D:\work\programs\ProgramData\Anaconda2;D:\work\programs\ProgramData\Anaconda2\Scripts;D:\work\programs\ProgramData\Anaconda2\Library\bin;C:\Java\jdk1.8.0_111\bin;D:\work\apache-maven-3.3.9\bin;D:\work\apache-ant-1.10.0\bin;C:\Program
 Files (x86)\GnuWin32\bin;D:\work\programs\spark-2.1.0-bin-hadoop2.7\bin;.
2017-03-20 23:47:54,294 (main) [INFO - 
org.apache.zookeeper.Environment.logEnv(Environment.java:100)] Server 
environment:java.io.tmpdir=C:\Users\arfu1\AppData\Local\Temp\
2017-03-20 23:47:54,295 (main) [INFO - 
org.apache.zookeeper.Environment.logEnv(Environment.java:100)] Server 
environment:java.compiler=<NA>
2017-03-20 23:47:54,295 (main) [INFO - 
org.apache.zookeeper.Environment.logEnv(Environment.java:100)] Server 
environment:os.name=Windows 7
2017-03-20 23:47:54,296 (main) [INFO - 
org.apache.zookeeper.Environment.logEnv(Environment.java:100)] Server 
environment:os.arch=amd64
2017-03-20 23:47:54,297 (main) [INFO - 
org.apache.zookeeper.Environment.logEnv(Environment.java:100)] Server 
environment:os.version=6.1
2017-03-20 23:47:54,297 (main) [INFO - 
org.apache.zookeeper.Environment.logEnv(Environment.java:100)] Server 
environment:user.name=arfu1
2017-03-20 23:47:54,297 (main) [INFO - 
org.apache.zookeeper.Environment.logEnv(Environment.java:100)] Server 
environment:user.home=C:\Users\arfu1
2017-03-20 23:47:54,298 (main) [INFO - 
org.apache.zookeeper.Environment.logEnv(Environment.java:100)] Server 
environment:user.dir=D:\work\projects\sqoop-trunk\build\test/data
2017-03-20 23:47:54,305 (main) [DEBUG - 
org.apache.zookeeper.server.persistence.FileTxnSnapLog.<init>(FileTxnSnapLog.java:79)]
 Opening 
datadir:C:\Users\arfu1\AppData\Local\Temp\29d3fccc-b968-4ed1-84f3-651277bba0ca\zk\zookeeper_0
 
snapDir:C:\Users\arfu1\AppData\Local\Temp\29d3fccc-b968-4ed1-84f3-651277bba0ca\zk\zookeeper_0
2017-03-20 23:47:54,333 (main) [INFO - 
org.apache.zookeeper.server.ZooKeeperServer.<init>(ZooKeeperServer.java:162)] 
Created server with tickTime 2000 minSessionTimeout 4000 maxSessionTimeout 
40000 datadir 
C:\Users\arfu1\AppData\Local\Temp\29d3fccc-b968-4ed1-84f3-651277bba0ca\zk\zookeeper_0\version-2
 snapdir 
C:\Users\arfu1\AppData\Local\Temp\29d3fccc-b968-4ed1-84f3-651277bba0ca\zk\zookeeper_0\version-2
2017-03-20 23:47:56,482 (main) [INFO - 
org.apache.zookeeper.server.NIOServerCnxnFactory.configure(NIOServerCnxnFactory.java:94)]
 binding to port 0.0.0.0/0.0.0.0:51781
2017-03-20 23:47:56,803 (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:51781) [INFO - 
org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:197)]
 Accepted socket connection from /127.0.0.1:65229
2017-03-20 23:47:56,808 (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:51781) [INFO - 
org.apache.zookeeper.server.NIOServerCnxn.checkFourLetterWord(NIOServerCnxn.java:827)]
 Processing stat command from /127.0.0.1:65229
2017-03-20 23:47:56,812 (Thread-1) [INFO - 
org.apache.zookeeper.server.NIOServerCnxn$StatCommand.commandRun(NIOServerCnxn.java:663)]
 Stat command output
2017-03-20 23:47:56,813 (main) [INFO - 
org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster.startup(MiniZooKeeperCluster.java:273)]
 Started MiniZooKeeperCluster and ran successful 'stat' on client port=51781
2017-03-20 23:47:56,814 (Thread-1) [INFO - 
org.apache.zookeeper.server.NIOServerCnxn.closeSock(NIOServerCnxn.java:1007)] 
Closed socket connection for client /127.0.0.1:65229 (no session established 
for client)
2017-03-20 23:47:58,156 (main) [DEBUG - 
org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:278)]
  Creating new Groups object
2017-03-20 23:47:58,195 (main) [DEBUG - 
org.apache.hadoop.util.NativeCodeLoader.<clinit>(NativeCodeLoader.java:46)] 
Trying to load the custom-built native-hadoop library...
2017-03-20 23:47:58,197 (main) [DEBUG - 
org.apache.hadoop.util.NativeCodeLoader.<clinit>(NativeCodeLoader.java:55)] 
Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no 
hadoop in java.library.path
2017-03-20 23:47:58,199 (main) [DEBUG - 
org.apache.hadoop.util.NativeCodeLoader.<clinit>(NativeCodeLoader.java:56)] 
java.library.path=C:\Java\jdk1.8.0_111\jre\bin;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;C:\ProgramData\Oracle\Java\javapath;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\work\TortoiseGit\bin;C:\work\Git\cmd;C:\Program
 Files 
(x86)\Skype\Phone\;D:\work\programs\ProgramData\Anaconda2;D:\work\programs\ProgramData\Anaconda2\Scripts;D:\work\programs\ProgramData\Anaconda2\Library\bin;C:\Java\jdk1.8.0_111\bin;D:\work\apache-maven-3.3.9\bin;D:\work\apache-ant-1.10.0\bin;C:\Program
 Files (x86)\GnuWin32\bin;D:\work\programs\spark-2.1.0-bin-hadoop2.7\bin;.
2017-03-20 23:47:58,199 (main) [WARN - 
org.apache.hadoop.util.NativeCodeLoader.<clinit>(NativeCodeLoader.java:62)] 
Unable to load native-hadoop library for your platform... using builtin-java 
classes where applicable
2017-03-20 23:47:58,200 (main) [DEBUG - 
org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.<init>(JniBasedUnixGroupsMappingWithFallback.java:41)]
 Falling back to shell based
2017-03-20 23:47:58,201 (main) [DEBUG - 
org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.<init>(JniBasedUnixGroupsMappingWithFallback.java:45)]
 Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
2017-03-20 23:47:58,207 (main) [DEBUG - 
org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:320)] Failed to detect 
a valid hadoop home directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
        at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:302)
        at org.apache.hadoop.util.Shell.<clinit>(Shell.java:327)
        at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
        at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:104)
        at org.apache.hadoop.security.Groups.<init>(Groups.java:86)
        at org.apache.hadoop.security.Groups.<init>(Groups.java:66)
        at 
org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:280)
        at 
org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:265)
        at 
org.apache.hadoop.hbase.security.UserProvider.<clinit>(UserProvider.java:56)
        at 
org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:504)
        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:381)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at 
org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:139)
        at 
org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:220)
        at 
org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:155)
        at 
org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:217)
        at 
org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:97)
        at 
org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:81)
        at 
org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:68)
        at com.cloudera.sqoop.hbase.HBaseTestCase.setUp(HBaseTestCase.java:148)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
        at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
        at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
        at 
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
        at 
org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
        at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
        at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
        at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
        at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
        at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
        at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
        at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
        at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
        at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
        at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:38)
        at 
org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:535)
        at 
org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:1182)
        at 
org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:1033)
2017-03-20 23:47:58,219 (main) [ERROR - org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:373)] Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
        at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:355)
        at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:370)
        at org.apache.hadoop.util.Shell.<clinit>(Shell.java:363)
        at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
        at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:104)
        at org.apache.hadoop.security.Groups.<init>(Groups.java:86)
        at org.apache.hadoop.security.Groups.<init>(Groups.java:66)
        at 
org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:280)
        at 
org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:265)
        at 
org.apache.hadoop.hbase.security.UserProvider.<clinit>(UserProvider.java:56)
        at 
org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:504)
        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:381)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at 
org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:139)
        at 
org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:220)
        at 
org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:155)
        at 
org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:217)
        at 
org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:97)
        at 
org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:81)
        at 
org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:68)
        at com.cloudera.sqoop.hbase.HBaseTestCase.setUp(HBaseTestCase.java:148)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
        at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
        at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
        at 
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
        at 
org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
        at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
        at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
        at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
        at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
        at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
        at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
        at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
        at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
        at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
        at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:38)
        at 
org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:535)
        at 
org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:1182)
        at 
org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:1033)
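[Editor's note: the two stack traces above are both side effects of Hadoop's Shell class failing to locate a Hadoop home on Windows — HADOOP_HOME and hadoop.home.dir are unset, so the winutils lookup degenerates to null\bin\winutils.exe. A minimal sketch of the commonly used workaround, assuming a hypothetical local directory C:\hadoop that actually contains bin\winutils.exe (the path is made up, not from this log):]

```java
public class WinutilsSetup {
    public static void main(String[] args) {
        // Hypothetical location: must contain bin\winutils.exe.
        // Setting the property before the first Hadoop class loads keeps
        // Shell.<clinit> from emitting the HADOOP_HOME / winutils errors above;
        // exporting HADOOP_HOME in the environment works the same way.
        System.setProperty("hadoop.home.dir", "C:\\hadoop");
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```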
2017-03-20 23:47:58,225 (main) [DEBUG - 
org.apache.hadoop.security.Groups.<init>(Groups.java:91)] Group mapping 
impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; 
cacheTimeout=300000; warningDeltaMs=5000
2017-03-20 23:47:58,620 (main) [INFO - 
org.apache.hadoop.hbase.client.ConnectionUtils.setServerSideHConnectionRetriesConfig(ConnectionUtils.java:108)]
 master/arfu1-PC/10.0.0.53:0 server-side HConnection retries=350
2017-03-20 23:47:58,848 (main) [INFO - 
org.apache.hadoop.hbase.ipc.SimpleRpcScheduler.<init>(SimpleRpcScheduler.java:128)]
 Using deadline as user call queue, count=3
2017-03-20 23:47:58,874 (main) [INFO - 
org.apache.hadoop.hbase.ipc.RpcServer$Listener.<init>(RpcServer.java:592)] 
master/arfu1-PC/10.0.0.53:0: started 10 reader(s) listening on port=65230
2017-03-20 23:47:58,930 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSystemImpl.initMode(MetricsSystemImpl.java:619)]
 from system property: null
2017-03-20 23:47:58,931 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSystemImpl.initMode(MetricsSystemImpl.java:620)]
 from environment variable: null
2017-03-20 23:47:58,976 (main) [DEBUG - 
org.apache.commons.configuration.ConfigurationUtils.locate(ConfigurationUtils.java:447)]
 ConfigurationUtils.locate(): base is null, name is 
hadoop-metrics2-hbase.properties
2017-03-20 23:47:58,983 (main) [DEBUG - 
org.apache.commons.configuration.ConfigurationUtils.locate(ConfigurationUtils.java:447)]
 ConfigurationUtils.locate(): base is null, name is hadoop-metrics2.properties
2017-03-20 23:47:58,985 (main) [WARN - 
org.apache.hadoop.metrics2.impl.MetricsConfig.loadFirst(MetricsConfig.java:124)]
 Cannot locate configuration: tried 
hadoop-metrics2-hbase.properties,hadoop-metrics2.properties
2017-03-20 23:47:58,997 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsConfig.getProperty(MetricsConfig.java:179)]
 poking parent 'PropertiesConfiguration' for key: period
2017-03-20 23:47:59,037 (main) [DEBUG - 
org.apache.hadoop.metrics2.lib.MutableMetricsFactory.newForField(MutableMetricsFactory.java:42)]
 field org.apache.hadoop.metrics2.lib.MutableStat 
org.apache.hadoop.metrics2.impl.MetricsSystemImpl.snapshotStat with annotation 
@org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, 
about=, type=DEFAULT, value=[Snapshot, Snapshot stats], valueName=Time)
2017-03-20 23:47:59,046 (main) [DEBUG - 
org.apache.hadoop.metrics2.lib.MutableMetricsFactory.newForField(MutableMetricsFactory.java:42)]
 field org.apache.hadoop.metrics2.lib.MutableStat 
org.apache.hadoop.metrics2.impl.MetricsSystemImpl.publishStat with annotation 
@org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, 
about=, type=DEFAULT, value=[Publish, Publishing stats], valueName=Time)
2017-03-20 23:47:59,047 (main) [DEBUG - 
org.apache.hadoop.metrics2.lib.MutableMetricsFactory.newForField(MutableMetricsFactory.java:42)]
 field org.apache.hadoop.metrics2.lib.MutableCounterLong 
org.apache.hadoop.metrics2.impl.MetricsSystemImpl.droppedPubAll with annotation 
@org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, 
about=, type=DEFAULT, value=[Dropped updates by all sinks], valueName=Time)
2017-03-20 23:47:59,053 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsConfig.getProperty(MetricsConfig.java:179)]
 poking parent 'PropertiesConfiguration' for key: source.source.start_mbeans
2017-03-20 23:47:59,054 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsConfig.getProperty(MetricsConfig.java:179)]
 poking parent 'MetricsConfig' for key: source.start_mbeans
2017-03-20 23:47:59,054 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsConfig.getProperty(MetricsConfig.java:179)]
 poking parent 'PropertiesConfiguration' for key: *.source.start_mbeans
2017-03-20 23:47:59,063 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateAttrCache(MetricsSourceAdapter.java:245)]
 Updating attr cache...
2017-03-20 23:47:59,064 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateAttrCache(MetricsSourceAdapter.java:259)]
 Done. # tags & metrics=10
2017-03-20 23:47:59,064 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateInfoCache(MetricsSourceAdapter.java:239)]
 Updating info cache...
2017-03-20 23:47:59,064 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MBeanInfoBuilder.get(MBeanInfoBuilder.java:109)]
 [javax.management.MBeanAttributeInfo[description=Metrics context, 
name=tag.Context, type=java.lang.String, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Number of active metrics 
sources, name=NumActiveSources, type=java.lang.Integer, read-only, 
descriptor={}], javax.management.MBeanAttributeInfo[description=Number of all 
registered metrics sources, name=NumAllSources, type=java.lang.Integer, 
read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Number of active metrics sinks, 
name=NumActiveSinks, type=java.lang.Integer, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Number of all registered 
metrics sinks, name=NumAllSinks, type=java.lang.Integer, read-only, 
descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops 
for snapshot stats, name=SnapshotNumOps, type=java.lang.Long, read-only, 
descriptor={}], javax.management.MBeanAttributeInfo[description=Average time 
for snapshot stats, name=SnapshotAvgTime, type=java.lang.Double, read-only, 
descriptor={}], javax.management.MBeanAttributeInfo[description=Number of ops 
for publishing stats, name=PublishNumOps, type=java.lang.Long, read-only, 
descriptor={}], javax.management.MBeanAttributeInfo[description=Average time 
for publishing stats, name=PublishAvgTime, type=java.lang.Double, read-only, 
descriptor={}], javax.management.MBeanAttributeInfo[description=Dropped updates 
by all sinks, name=DroppedPubAll, type=java.lang.Long, read-only, 
descriptor={}]]
2017-03-20 23:47:59,065 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateInfoCache(MetricsSourceAdapter.java:241)]
 Done
2017-03-20 23:47:59,065 (main) [DEBUG - 
org.apache.hadoop.metrics2.util.MBeans.register(MBeans.java:58)] Registered 
Hadoop:service=HBase,name=MetricsSystem,sub=Stats
2017-03-20 23:47:59,066 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.startMBeans(MetricsSourceAdapter.java:222)]
 MBean for source MetricsSystem,sub=Stats registered.
2017-03-20 23:47:59,067 (main) [INFO - 
org.apache.hadoop.metrics2.impl.MetricsSystemImpl.startTimer(MetricsSystemImpl.java:376)]
 Scheduled snapshot period at 10 second(s).
2017-03-20 23:47:59,067 (main) [INFO - 
org.apache.hadoop.metrics2.impl.MetricsSystemImpl.start(MetricsSystemImpl.java:191)]
 HBase metrics system started
2017-03-20 23:47:59,069 (main) [DEBUG - 
org.apache.hadoop.metrics2.util.MBeans.register(MBeans.java:58)] Registered 
Hadoop:service=HBase,name=MetricsSystem,sub=Control
2017-03-20 23:47:59,073 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:231)]
 JvmMetrics, JVM related metrics etc.
2017-03-20 23:47:59,073 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsConfig.getProperty(MetricsConfig.java:179)]
 poking parent 'PropertiesConfiguration' for key: source.source.start_mbeans
2017-03-20 23:47:59,073 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsConfig.getProperty(MetricsConfig.java:179)]
 poking parent 'MetricsConfig' for key: source.start_mbeans
2017-03-20 23:47:59,074 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsConfig.getProperty(MetricsConfig.java:179)]
 poking parent 'PropertiesConfiguration' for key: *.source.start_mbeans
2017-03-20 23:47:59,088 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateAttrCache(MetricsSourceAdapter.java:245)]
 Updating attr cache...
2017-03-20 23:47:59,088 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateAttrCache(MetricsSourceAdapter.java:259)]
 Done. # tags & metrics=27
2017-03-20 23:47:59,089 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateInfoCache(MetricsSourceAdapter.java:239)]
 Updating info cache...
2017-03-20 23:47:59,089 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MBeanInfoBuilder.get(MBeanInfoBuilder.java:109)]
 [javax.management.MBeanAttributeInfo[description=Metrics context, 
name=tag.Context, type=java.lang.String, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Process name, 
name=tag.ProcessName, type=java.lang.String, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Session ID, name=tag.SessionId, 
type=java.lang.String, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Local hostname, 
name=tag.Hostname, type=java.lang.String, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Non-heap memory used in MB, 
name=MemNonHeapUsedM, type=java.lang.Float, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Non-heap memory committed in 
MB, name=MemNonHeapCommittedM, type=java.lang.Float, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Non-heap memory max in MB, 
name=MemNonHeapMaxM, type=java.lang.Float, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Heap memory used in MB, 
name=MemHeapUsedM, type=java.lang.Float, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Heap memory committed in MB, 
name=MemHeapCommittedM, type=java.lang.Float, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Heap memory max in MB, 
name=MemHeapMaxM, type=java.lang.Float, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Max memory size in MB, 
name=MemMaxM, type=java.lang.Float, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=GC Count for Copy, 
name=GcCountCopy, type=java.lang.Long, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=GC Time for Copy, 
name=GcTimeMillisCopy, type=java.lang.Long, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=GC Count for MarkSweepCompact, 
name=GcCountMarkSweepCompact, type=java.lang.Long, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=GC Time for MarkSweepCompact, 
name=GcTimeMillisMarkSweepCompact, type=java.lang.Long, read-only, 
descriptor={}], javax.management.MBeanAttributeInfo[description=Total GC count, 
name=GcCount, type=java.lang.Long, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Total GC time in milliseconds, 
name=GcTimeMillis, type=java.lang.Long, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Number of new threads, 
name=ThreadsNew, type=java.lang.Integer, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Number of runnable threads, 
name=ThreadsRunnable, type=java.lang.Integer, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Number of blocked threads, 
name=ThreadsBlocked, type=java.lang.Integer, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Number of waiting threads, 
name=ThreadsWaiting, type=java.lang.Integer, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Number of timed waiting 
threads, name=ThreadsTimedWaiting, type=java.lang.Integer, read-only, 
descriptor={}], javax.management.MBeanAttributeInfo[description=Number of 
terminated threads, name=ThreadsTerminated, type=java.lang.Integer, read-only, 
descriptor={}], javax.management.MBeanAttributeInfo[description=Total number of 
fatal log events, name=LogFatal, type=java.lang.Long, read-only, 
descriptor={}], javax.management.MBeanAttributeInfo[description=Total number of 
error log events, name=LogError, type=java.lang.Long, read-only, 
descriptor={}], javax.management.MBeanAttributeInfo[description=Total number of 
warning log events, name=LogWarn, type=java.lang.Long, read-only, 
descriptor={}], javax.management.MBeanAttributeInfo[description=Total number of 
info log events, name=LogInfo, type=java.lang.Long, read-only, descriptor={}]]
2017-03-20 23:47:59,090 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateInfoCache(MetricsSourceAdapter.java:241)]
 Done
2017-03-20 23:47:59,091 (main) [DEBUG - 
org.apache.hadoop.metrics2.util.MBeans.register(MBeans.java:58)] Registered 
Hadoop:service=HBase,name=JvmMetrics
2017-03-20 23:47:59,091 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.startMBeans(MetricsSourceAdapter.java:222)]
 MBean for source JvmMetrics registered.
2017-03-20 23:47:59,091 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSystemImpl.registerSource(MetricsSystemImpl.java:270)]
 Registered source JvmMetrics
2017-03-20 23:47:59,127 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:231)]
 Master,sub=IPC, Metrics about HBase Server IPC
2017-03-20 23:47:59,128 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsConfig.getProperty(MetricsConfig.java:179)]
 poking parent 'PropertiesConfiguration' for key: source.source.start_mbeans
2017-03-20 23:47:59,128 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsConfig.getProperty(MetricsConfig.java:179)]
 poking parent 'MetricsConfig' for key: source.start_mbeans
2017-03-20 23:47:59,128 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsConfig.getProperty(MetricsConfig.java:179)]
 poking parent 'PropertiesConfiguration' for key: *.source.start_mbeans
2017-03-20 23:47:59,129 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateAttrCache(MetricsSourceAdapter.java:245)]
 Updating attr cache...
2017-03-20 23:47:59,130 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateAttrCache(MetricsSourceAdapter.java:259)]
 Done. # tags & metrics=2
2017-03-20 23:47:59,153 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateInfoCache(MetricsSourceAdapter.java:239)]
 Updating info cache...
2017-03-20 23:47:59,154 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MBeanInfoBuilder.get(MBeanInfoBuilder.java:109)]
 [javax.management.MBeanAttributeInfo[description=Metrics context, 
name=tag.Context, type=java.lang.String, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Local hostname, 
name=tag.Hostname, type=java.lang.String, read-only, descriptor={}]]
2017-03-20 23:47:59,160 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateInfoCache(MetricsSourceAdapter.java:241)]
 Done
2017-03-20 23:47:59,161 (main) [DEBUG - 
org.apache.hadoop.metrics2.util.MBeans.register(MBeans.java:58)] Registered 
Hadoop:service=HBase,name=Master,sub=IPC
2017-03-20 23:47:59,161 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.startMBeans(MetricsSourceAdapter.java:222)]
 MBean for source Master,sub=IPC registered.
2017-03-20 23:47:59,162 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSystemImpl.registerSource(MetricsSystemImpl.java:270)]
 Registered source Master,sub=IPC
2017-03-20 23:47:59,246 (main) [DEBUG - 
org.apache.hadoop.metrics2.lib.MutableMetricsFactory.newForField(MutableMetricsFactory.java:42)]
 field org.apache.hadoop.metrics2.lib.MutableRate 
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with 
annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, 
always=false, about=, type=DEFAULT, value=[Rate of successful kerberos logins 
and latency (milliseconds)], valueName=Time)
2017-03-20 23:47:59,247 (main) [DEBUG - 
org.apache.hadoop.metrics2.lib.MutableMetricsFactory.newForField(MutableMetricsFactory.java:42)]
 field org.apache.hadoop.metrics2.lib.MutableRate 
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with 
annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, 
always=false, about=, type=DEFAULT, value=[Rate of failed kerberos logins and 
latency (milliseconds)], valueName=Time)
2017-03-20 23:47:59,248 (main) [DEBUG - 
org.apache.hadoop.metrics2.lib.MutableMetricsFactory.newForField(MutableMetricsFactory.java:42)]
 field org.apache.hadoop.metrics2.lib.MutableRate 
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with 
annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, 
always=false, about=, type=DEFAULT, value=[GetGroups], valueName=Time)
2017-03-20 23:47:59,248 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:231)]
 UgiMetrics, User and group related metrics
2017-03-20 23:47:59,249 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsConfig.getProperty(MetricsConfig.java:179)]
 poking parent 'PropertiesConfiguration' for key: source.source.start_mbeans
2017-03-20 23:47:59,249 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsConfig.getProperty(MetricsConfig.java:179)]
 poking parent 'MetricsConfig' for key: source.start_mbeans
2017-03-20 23:47:59,249 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsConfig.getProperty(MetricsConfig.java:179)]
 poking parent 'PropertiesConfiguration' for key: *.source.start_mbeans
2017-03-20 23:47:59,249 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateAttrCache(MetricsSourceAdapter.java:245)]
 Updating attr cache...
2017-03-20 23:47:59,250 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateAttrCache(MetricsSourceAdapter.java:259)]
 Done. # tags & metrics=8
2017-03-20 23:47:59,250 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateInfoCache(MetricsSourceAdapter.java:239)]
 Updating info cache...
2017-03-20 23:47:59,251 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MBeanInfoBuilder.get(MBeanInfoBuilder.java:109)]
 [javax.management.MBeanAttributeInfo[description=Metrics context, 
name=tag.Context, type=java.lang.String, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Local hostname, 
name=tag.Hostname, type=java.lang.String, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Number of ops for rate of 
successful kerberos logins and latency (milliseconds), name=LoginSuccessNumOps, 
type=java.lang.Long, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Average time for rate of 
successful kerberos logins and latency (milliseconds), 
name=LoginSuccessAvgTime, type=java.lang.Double, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Number of ops for rate of 
failed kerberos logins and latency (milliseconds), name=LoginFailureNumOps, 
type=java.lang.Long, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Average time for rate of failed 
kerberos logins and latency (milliseconds), name=LoginFailureAvgTime, 
type=java.lang.Double, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Number of ops for getGroups, 
name=GetGroupsNumOps, type=java.lang.Long, read-only, descriptor={}], 
javax.management.MBeanAttributeInfo[description=Average time for getGroups, 
name=GetGroupsAvgTime, type=java.lang.Double, read-only, descriptor={}]]
2017-03-20 23:47:59,251 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateInfoCache(MetricsSourceAdapter.java:241)]
 Done
2017-03-20 23:47:59,252 (main) [DEBUG - 
org.apache.hadoop.metrics2.util.MBeans.register(MBeans.java:58)] Registered 
Hadoop:service=HBase,name=UgiMetrics
2017-03-20 23:47:59,252 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.startMBeans(MetricsSourceAdapter.java:222)]
 MBean for source UgiMetrics registered.
2017-03-20 23:47:59,252 (main) [DEBUG - 
org.apache.hadoop.metrics2.impl.MetricsSystemImpl.registerSource(MetricsSystemImpl.java:270)]
 Registered source UgiMetrics
2017-03-20 23:47:59,468 (main) [DEBUG - 
org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.login(UserGroupInformation.java:209)]
 hadoop login
2017-03-20 23:47:59,469 (main) [DEBUG - 
org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:144)]
 hadoop login commit
2017-03-20 23:47:59,476 (main) [DEBUG - 
org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:174)]
 using local user:NTUserPrincipal: arfu1
2017-03-20 23:47:59,476 (main) [DEBUG - 
org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:180)]
 Using user: "NTUserPrincipal: arfu1" with name arfu1
2017-03-20 23:47:59,477 (main) [DEBUG - 
org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:190)]
 User entry: "arfu1"
2017-03-20 23:47:59,478 (main) [DEBUG - 
org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:799)]
 UGI loginUser:arfu1 (auth:SIMPLE)
2017-03-20 23:47:59,643 (main) [ERROR - org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:233)] Error starting cluster
java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMasterIllegal character in authority at index 7: file://C:\Users\arfu1\AppData\Local\Temp\29d3fccc-b968-4ed1-84f3-651277bba0ca\hbase
        at 
org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:143)
        at 
org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:220)
        at 
org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:155)
        at 
org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:217)
        at 
org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:97)
        at 
org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:81)
        at 
org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:68)
        at com.cloudera.sqoop.hbase.HBaseTestCase.setUp(HBaseTestCase.java:148)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
        at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
        at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
        at 
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
        at 
org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
        at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
        at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
        at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
        at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
        at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
        at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
        at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
        at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
        at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
        at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:38)
        at 
org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:535)
        at 
org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:1182)
        at 
org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:1033)
Caused by: java.lang.IllegalArgumentException: Illegal character in authority at index 7: file://C:\Users\arfu1\AppData\Local\Temp\29d3fccc-b968-4ed1-84f3-651277bba0ca\hbase
        at java.net.URI.create(URI.java:852)
        at org.apache.hadoop.fs.FileSystem.getDefaultUri(FileSystem.java:177)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
        at org.apache.hadoop.hbase.fs.HFileSystem.<init>(HFileSystem.java:80)
        at 
org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:574)
        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:381)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at 
org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:139)
        ... 29 more
Caused by: java.net.URISyntaxException: Illegal character in authority at index 
7: 
file://C:\Users\arfu1\AppData\Local\Temp\29d3fccc-b968-4ed1-84f3-651277bba0ca\hbase
        at java.net.URI$Parser.fail(URI.java:2848)
        at java.net.URI$Parser.parseAuthority(URI.java:3186)
        at java.net.URI$Parser.parseHierarchical(URI.java:3097)
        at java.net.URI$Parser.parse(URI.java:3053)
        at java.net.URI.<init>(URI.java:588)
        at java.net.URI.create(URI.java:850)
        ... 39 more
2017-03-20 23:47:59,645 (main) [INFO - com.cloudera.sqoop.hbase.HBaseTestCase.shutdown(HBaseTestCase.java:183)] In shutdown() method
2017-03-20 23:47:59,657 (main) [INFO - com.cloudera.sqoop.hbase.HBaseTestCase.shutdown(HBaseTestCase.java:191)] shutdown() method returning.
2017-03-20 23:47:59,661 (main) [WARN - com.cloudera.sqoop.testutil.BaseSqoopTestCase.guaranteeCleanWarehouse(BaseSqoopTestCase.java:265)] Cannot delete D:\work\projects\sqoop-trunk\build\test\data\sqoop\warehouse
2017-03-20 23:47:59,681 (main) [DEBUG - org.apache.zookeeper.server.persistence.FileTxnSnapLog.<init>(FileTxnSnapLog.java:79)] Opening datadir:C:\Users\arfu1\AppData\Local\Temp\1a8daa1a-7126-49b2-9edb-9efd92f1ff40\zk\zookeeper_0 snapDir:C:\Users\arfu1\AppData\Local\Temp\1a8daa1a-7126-49b2-9edb-9efd92f1ff40\zk\zookeeper_0
2017-03-20 23:47:59,682 (main) [INFO - org.apache.zookeeper.server.ZooKeeperServer.<init>(ZooKeeperServer.java:162)] Created server with tickTime 2000 minSessionTimeout 4000 maxSessionTimeout 40000 datadir C:\Users\arfu1\AppData\Local\Temp\1a8daa1a-7126-49b2-9edb-9efd92f1ff40\zk\zookeeper_0\version-2 snapdir C:\Users\arfu1\AppData\Local\Temp\1a8daa1a-7126-49b2-9edb-9efd92f1ff40\zk\zookeeper_0\version-2
2017-03-20 23:48:00,019 (main) [INFO - org.apache.zookeeper.server.NIOServerCnxnFactory.configure(NIOServerCnxnFactory.java:94)] binding to port 0.0.0.0/0.0.0.0:53690
2017-03-20 23:48:00,021 (main) [WARN - org.apache.zookeeper.jmx.MBeanRegistry.register(MBeanRegistry.java:100)] Failed to register MBean StandaloneServer_port-1
2017-03-20 23:48:00,021 (main) [WARN - org.apache.zookeeper.server.ZooKeeperServer.registerJMX(ZooKeeperServer.java:387)] Failed to register with JMX
javax.management.InstanceAlreadyExistsException: org.apache.ZooKeeperService:name0=StandaloneServer_port-1
        at com.sun.jmx.mbeanserver.Repository.addMBean(Repository.java:437)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerWithRepository(DefaultMBeanServerInterceptor.java:1898)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerDynamicMBean(DefaultMBeanServerInterceptor.java:966)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerObject(DefaultMBeanServerInterceptor.java:900)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(DefaultMBeanServerInterceptor.java:324)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(JmxMBeanServer.java:522)
        at org.apache.zookeeper.jmx.MBeanRegistry.register(MBeanRegistry.java:96)
        at org.apache.zookeeper.server.ZooKeeperServer.registerJMX(ZooKeeperServer.java:377)
        at org.apache.zookeeper.server.ZooKeeperServer.startup(ZooKeeperServer.java:410)
        at org.apache.zookeeper.server.NIOServerCnxnFactory.startup(NIOServerCnxnFactory.java:123)
        at org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster.startup(MiniZooKeeperCluster.java:250)
        at org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster.startup(MiniZooKeeperCluster.java:185)
        at com.cloudera.sqoop.hbase.HBaseTestCase.setUp(HBaseTestCase.java:135)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
        at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
        at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
        at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
        at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
        at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
        at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
        at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
        at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
        at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
        at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
        at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
        at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
        at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
        at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:38)
        at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:535)
        at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:1182)
        at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:1033)
2017-03-20 23:48:00,024 (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:53690) [INFO - org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:197)] Accepted socket connection from /127.0.0.1:65257
2017-03-20 23:48:00,025 (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:53690) [INFO - org.apache.zookeeper.server.NIOServerCnxn.checkFourLetterWord(NIOServerCnxn.java:827)] Processing stat command from /127.0.0.1:65257
2017-03-20 23:48:00,026 (Thread-6) [INFO - org.apache.zookeeper.server.NIOServerCnxn$StatCommand.commandRun(NIOServerCnxn.java:663)] Stat command output
2017-03-20 23:48:00,029 (main) [INFO - org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster.startup(MiniZooKeeperCluster.java:273)] Started MiniZooKeeperCluster and ran successful 'stat' on client port=53690
2017-03-20 23:48:00,035 (Thread-6) [INFO - org.apache.zookeeper.server.NIOServerCnxn.closeSock(NIOServerCnxn.java:1007)] Closed socket connection for client /127.0.0.1:65257 (no session established for client)
2017-03-20 23:48:00,071 (main) [INFO - org.apache.hadoop.hbase.client.ConnectionUtils.setServerSideHConnectionRetriesConfig(ConnectionUtils.java:108)] master/arfu1-PC/10.0.0.53:0 server-side HConnection retries=350
2017-03-20 23:48:00,080 (main) [INFO - org.apache.hadoop.hbase.ipc.SimpleRpcScheduler.<init>(SimpleRpcScheduler.java:128)] Using deadline as user call queue, count=3
2017-03-20 23:48:00,092 (main) [INFO - org.apache.hadoop.hbase.ipc.RpcServer$Listener.<init>(RpcServer.java:592)] master/arfu1-PC/10.0.0.53:0: started 10 reader(s) listening on port=65258
2017-03-20 23:48:00,104 (main) [ERROR - org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:233)] Error starting cluster
java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMasterIllegal character in authority at index 7: file://C:\Users\arfu1\AppData\Local\Temp\1a8daa1a-7126-49b2-9edb-9efd92f1ff40\hbase
        at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:143)
        at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:220)
        at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:155)
        at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:217)
        at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:97)
        at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:81)
        at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:68)
        at com.cloudera.sqoop.hbase.HBaseTestCase.setUp(HBaseTestCase.java:148)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
        at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
        at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
        at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
        at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
        at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
        at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
        at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
        at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
        at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
        at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
        at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
        at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
        at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
        at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:38)
        at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:535)
        at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:1182)
        at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:1033)
Caused by: java.lang.IllegalArgumentException: Illegal character in authority at index 7: file://C:\Users\arfu1\AppData\Local\Temp\1a8daa1a-7126-49b2-9edb-9efd92f1ff40\hbase
        at java.net.URI.create(URI.java:852)
        at org.apache.hadoop.fs.FileSystem.getDefaultUri(FileSystem.java:177)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
        at org.apache.hadoop.hbase.fs.HFileSystem.<init>(HFileSystem.java:80)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:574)
        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:381)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:139)
        ... 29 more
Caused by: java.net.URISyntaxException: Illegal character in authority at index 7: file://C:\Users\arfu1\AppData\Local\Temp\1a8daa1a-7126-49b2-9edb-9efd92f1ff40\hbase
        at java.net.URI$Parser.fail(URI.java:2848)
        at java.net.URI$Parser.parseAuthority(URI.java:3186)
        at java.net.URI$Parser.parseHierarchical(URI.java:3097)
        at java.net.URI$Parser.parse(URI.java:3053)
        at java.net.URI.<init>(URI.java:588)
        at java.net.URI.create(URI.java:850)
        ... 39 more
2017-03-20 23:48:00,130 (main) [INFO - com.cloudera.sqoop.hbase.HBaseTestCase.shutdown(HBaseTestCase.java:183)] In shutdown() method
2017-03-20 23:48:00,132 (main) [INFO - com.cloudera.sqoop.hbase.HBaseTestCase.shutdown(HBaseTestCase.java:191)] shutdown() method returning.
2017-03-20 23:48:00,133 (main) [WARN - com.cloudera.sqoop.testutil.BaseSqoopTestCase.guaranteeCleanWarehouse(BaseSqoopTestCase.java:265)] Cannot delete D:\work\projects\sqoop-trunk\build\test\data\sqoop\warehouse
------------- ---------------- ---------------

Testcase: testNulls took 5.478 sec
        Caused an ERROR
java.io.IOException: Shutting down
java.lang.RuntimeException: java.io.IOException: Shutting down
        at com.cloudera.sqoop.hbase.HBaseTestCase.setUp(HBaseTestCase.java:178)
Caused by: java.io.IOException: Shutting down
        at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:235)
        at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:97)
        at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:81)
        at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:68)
        at com.cloudera.sqoop.hbase.HBaseTestCase.setUp(HBaseTestCase.java:148)
Caused by: java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMasterIllegal character in authority at index 7: file://C:\Users\arfu1\AppData\Local\Temp\29d3fccc-b968-4ed1-84f3-651277bba0ca\hbase
        at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:143)
        at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:220)
        at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:155)
        at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:217)
Caused by: java.lang.IllegalArgumentException: Illegal character in authority at index 7: file://C:\Users\arfu1\AppData\Local\Temp\29d3fccc-b968-4ed1-84f3-651277bba0ca\hbase
        at java.net.URI.create(URI.java:852)
        at org.apache.hadoop.fs.FileSystem.getDefaultUri(FileSystem.java:177)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
        at org.apache.hadoop.hbase.fs.HFileSystem.<init>(HFileSystem.java:80)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:574)
        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:381)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:139)
Caused by: java.net.URISyntaxException: Illegal character in authority at index 7: file://C:\Users\arfu1\AppData\Local\Temp\29d3fccc-b968-4ed1-84f3-651277bba0ca\hbase
        at java.net.URI$Parser.fail(URI.java:2848)
        at java.net.URI$Parser.parseAuthority(URI.java:3186)
        at java.net.URI$Parser.parseHierarchical(URI.java:3097)
        at java.net.URI$Parser.parse(URI.java:3053)
        at java.net.URI.<init>(URI.java:588)
        at java.net.URI.create(URI.java:850)

Testcase: testNullRow took 0.453 sec
        Caused an ERROR
java.io.IOException: Shutting down
java.lang.RuntimeException: java.io.IOException: Shutting down
        at com.cloudera.sqoop.hbase.HBaseTestCase.setUp(HBaseTestCase.java:178)
Caused by: java.io.IOException: Shutting down
        at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:235)
        at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:97)
        at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:81)
        at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:68)
        at com.cloudera.sqoop.hbase.HBaseTestCase.setUp(HBaseTestCase.java:148)
Caused by: java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMasterIllegal character in authority at index 7: file://C:\Users\arfu1\AppData\Local\Temp\1a8daa1a-7126-49b2-9edb-9efd92f1ff40\hbase
        at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:143)
        at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:220)
        at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:155)
        at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:217)
Caused by: java.lang.IllegalArgumentException: Illegal character in authority at index 7: file://C:\Users\arfu1\AppData\Local\Temp\1a8daa1a-7126-49b2-9edb-9efd92f1ff40\hbase
        at java.net.URI.create(URI.java:852)
        at org.apache.hadoop.fs.FileSystem.getDefaultUri(FileSystem.java:177)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
        at org.apache.hadoop.hbase.fs.HFileSystem.<init>(HFileSystem.java:80)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:574)
        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:381)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:139)
Caused by: java.net.URISyntaxException: Illegal character in authority at index 7: file://C:\Users\arfu1\AppData\Local\Temp\1a8daa1a-7126-49b2-9edb-9efd92f1ff40\hbase
        at java.net.URI$Parser.fail(URI.java:2848)
        at java.net.URI$Parser.parseAuthority(URI.java:3186)
        at java.net.URI$Parser.parseHierarchical(URI.java:3097)
        at java.net.URI$Parser.parse(URI.java:3053)
        at java.net.URI.<init>(URI.java:588)
        at java.net.URI.create(URI.java:850)
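A note on the root cause repeated throughout the traces above: `java.net.URI` treats everything between `//` and the next `/` as the authority (host) component, so a raw Windows path glued onto a `file://` prefix puts `C:` and backslashes into the authority, which the parser rejects. The sketch below (a hypothetical `FileUriDemo` class, using a shortened `C:\Temp\hbase` path in place of the temp directory from the log) reproduces the exact error message; it only illustrates the parse failure, not the Sqoop/HBase fix itself.

```java
import java.net.URI;

public class FileUriDemo {
    public static void main(String[] args) {
        // Gluing a raw Windows path onto "file://" puts "C:\..." into the
        // URI's authority component, reproducing the failure from the log.
        try {
            URI.create("file://C:\\Temp\\hbase");
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
        // The well-formed equivalent uses an empty authority (three slashes)
        // and forward slashes, which java.net.URI accepts.
        URI ok = URI.create("file:///C:/Temp/hbase");
        System.out.println("accepted: " + ok);
    }
}
```

This suggests the mini-cluster's root-dir property is being built by string concatenation from `File.getAbsolutePath()` rather than converted to a proper `file:///` URI before reaching `FileSystem.getDefaultUri`, which is why the failure only shows up on Windows.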
