Hi,
For the Sqoop-1.4.3 project, I updated the build.xml and ivy.xml files to try to
build and run the unit tests against the Hadoop-2.1.0-beta jar files. Compilation
succeeds, but I hit the following errors when running the unit tests. Any comments? Thanks!
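For reference, the ivy.xml change essentially points the Hadoop dependencies at the new 2.x artifacts. A simplified sketch (module names follow Hadoop 2.x's Maven layout; the conf mapping is illustrative, not my exact diff):

```xml
<!-- Sketch only: replace the Hadoop 1.x/0.2x dependencies with the 2.x split artifacts.
     The conf="common->default" mapping is illustrative and may differ in the real ivy.xml. -->
<dependency org="org.apache.hadoop" name="hadoop-common"
            rev="2.1.0-beta" conf="common->default"/>
<dependency org="org.apache.hadoop" name="hadoop-hdfs"
            rev="2.1.0-beta" conf="common->default"/>
<dependency org="org.apache.hadoop" name="hadoop-mapreduce-client-core"
            rev="2.1.0-beta" conf="common->default"/>
```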
[Server@260f260f]: [Thread[HSQLDB Server@260f260f,5,main]]: run()/openServerSocket():
java.net.BindException: Address already in use
at java.net.PlainSocketImpl.socketBind(Native Method)
at java.net.PlainSocketImpl.bind(PlainSocketImpl.java:413)
at java.net.ServerSocket.bind(ServerSocket.java:339)
at java.net.ServerSocket.<init>(ServerSocket.java:205)
at java.net.ServerSocket.<init>(ServerSocket.java:117)
at org.hsqldb.HsqlSocketFactory.createServerSocket(Unknown Source)
at org.hsqldb.Server.openServerSocket(Unknown Source)
at org.hsqldb.Server.run(Unknown Source)
at org.hsqldb.Server.access$000(Unknown Source)
at org.hsqldb.Server$ServerThread.run(Unknown Source)
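The BindException itself just means something is already listening on the port HSQLDB wants (for example, a leftover HSQLDB server from an earlier test run). A small standalone sketch reproduces the same failure mode; the port handling here is illustrative, not taken from the Sqoop tests:

```java
import java.io.IOException;
import java.net.ServerSocket;

public class PortCheck {
    // Returns true if the port can currently be bound (i.e. it is free).
    static boolean isFree(int port) {
        try (ServerSocket s = new ServerSocket(port)) {
            return true;
        } catch (IOException e) {
            // Same failure as in the trace: java.net.BindException "Address already in use".
            return false;
        }
    }

    public static void main(String[] args) throws IOException {
        // Bind an ephemeral port so the demo never collides with a real service.
        try (ServerSocket first = new ServerSocket(0)) {
            int port = first.getLocalPort();
            System.out.println("while held: " + isFree(port));   // false: second bind fails
        }
        // After the holder closes, the same port is bindable again.
    }
}
```

In practice this suggests checking (e.g. with netstat) whether a stale HSQLDB server process is still holding the port before the tests start.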
Note: /home/hadoop/sqoop/build/test/data/sqoop-hadoop/compile/cd361bd4917aa57041c693c16e0b1ff0/IMPORT_TABLE_1.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.
at com.google.protobuf.GeneratedMessage.getUnknownFields(GeneratedMessage.java:180)
at org.apache.hadoop.yarn.proto.YarnProtos$URLProto.hashCode(YarnProtos.java:5487)
at org.apache.hadoop.yarn.proto.YarnProtos$LocalResourceProto.hashCode(YarnProtos.java:6167)
at org.apache.hadoop.yarn.api.records.impl.pb.LocalResourcePBImpl.hashCode(LocalResourcePBImpl.java:62)
at java.util.HashMap.hash(HashMap.java:132)
at java.util.HashMap.putImpl(HashMap.java:695)
at java.util.HashMap.put(HashMap.java:680)
at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:139)
at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:155)
at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:634)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:415)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
at java.security.AccessController.doPrivileged(AccessController.java:310)
at javax.security.auth.Subject.doAs(Subject.java:573)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1494)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:173)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:151)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:221)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:548)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:403)
at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:64)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:45)
at com.cloudera.sqoop.testutil.ImportJobTestCase.runImport(ImportJobTestCase.java:215)
at com.cloudera.sqoop.TestAllTables.testMultiTableImport(TestAllTables.java:110)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
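One thing I want to rule out is a protobuf version mismatch on the test classpath, since the failure originates in com.google.protobuf.GeneratedMessage. A small sketch I use to print which jar a class was actually loaded from (the helper class is mine; only the class name comes from the trace):

```java
import java.security.CodeSource;

public class WhichJar {
    // Returns the jar/directory a class was loaded from, or "<bootstrap>" for JDK core classes.
    static String locate(Class<?> c) {
        CodeSource src = c.getProtectionDomain().getCodeSource();
        return src == null ? "<bootstrap>" : src.getLocation().toString();
    }

    public static void main(String[] args) {
        try {
            // Assumption: run with the same classpath as the failing unit tests.
            System.out.println(locate(Class.forName("com.google.protobuf.GeneratedMessage")));
        } catch (ClassNotFoundException e) {
            System.out.println("protobuf-java not on classpath");
        }
    }
}
```

If this points at a protobuf-java jar whose version differs from the one Hadoop-2.1.0-beta pulls in, that would explain the getUnknownFields failure.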
--
Sam Liu