I use Sqoop to import data from a MySQL database into HDFS with the following command:

$ sqoop import --connect jdbc:mysql://172.11.12.6/hadooptest --username hadoopuser --password password --table employees
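
As far as I understand, sqoop import copies the employees table out of MySQL into files under HDFS; with the defaults the output should land in the running user's HDFS home directory. I can list it with something like the following (the path is just my assumption, based on the default target directory and the hadoop user):

$ hdfs dfs -ls /user/hadoop/employees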

But it raises the following errors in the yarn-hadoop-resourcemanager-master.log and hadoop-hadoop-secondarynamenode-master.log files:


yarn-hadoop-resourcemanager-master.log
org.apache.hadoop.util.Shell$ExitCodeException: /home/hadoop/rack-script.sh: line 3: unexpected EOF while looking for matching `''
/home/hadoop/rack-script.sh: line 9: syntax error: unexpected end of file
 at org.apache.hadoop.util.Shell.runCommand(Shell.java:464)
 at org.apache.hadoop.util.Shell.run(Shell.java:379)
 at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
 at org.apache.hadoop.net.ScriptBasedMapping$RawScriptBasedMapping.runResolveCommand(ScriptBasedMapping.java:241)
 at org.apache.hadoop.net.ScriptBasedMapping$RawScriptBasedMapping.resolve(ScriptBasedMapping.java:179)
 at org.apache.hadoop.net.CachedDNSToSwitchMapping.resolve(CachedDNSToSwitchMapping.java:119)
 at org.apache.hadoop.yarn.util.RackResolver.coreResolve(RackResolver.java:101)
 at org.apache.hadoop.yarn.util.RackResolver.resolve(RackResolver.java:95)
 at org.apache.hadoop.yarn.server.resourcemanager.ResourceTrackerService.resolve(ResourceTrackerService.java:348)
 at org.apache.hadoop.yarn.server.resourcemanager.ResourceTrackerService.registerNodeManager(ResourceTrackerService.java:208)
 at org.apache.hadoop.yarn.server.api.impl.pb.service.ResourceTrackerPBServiceImpl.registerNodeManager(ResourceTrackerPBServiceImpl.java:54)
 at org.apache.hadoop.yarn.proto.ResourceTracker$ResourceTrackerService$2.callBlockingMethod(ResourceTracker.java:79)
 at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
 at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:415)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)


rack-script.sh is as follows:
#!  /bin/bash

if [$1='172.11.12.6"] then
    echo -n "/rack1 "
else
    echo -n "/default-racks "
fi
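
Looking at the shell error, I suspect the problem is the mismatched quotes around the IP, the missing spaces inside the test brackets, and the missing semicolon before then. A corrected version that keeps the same mapping would look roughly like this (just my reconstruction of the intended logic):

#!/bin/bash
# Map the master's IP to /rack1, everything else to the default rack.
if [ "$1" = "172.11.12.6" ]; then
    echo -n "/rack1 "
else
    echo -n "/default-racks "
fi

Running it by hand, e.g. bash /home/hadoop/rack-script.sh 172.11.12.6, should then print /rack1 without any shell errors.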


hadoop-hadoop-secondarynamenode-master.log
java.io.IOException: Inconsistent checkpoint fields.
LV = -47 namespaceID = 1008917896 cTime = 0 ; clusterId = CID-e47f6b2e-ca1f-4eda-b6f7-dd9d5314c251 ; blockpoolId = BP-1227723652-172.11.12.6-1390358446283.
Expecting respectively: -47; 1482569253; 0; CID-d097c1ad-c6fb-4343-a87d-87a06bed0ccd; BP-19399663-172.11.12.6-1397098797922.
 at org.apache.hadoop.hdfs.server.namenode.CheckpointSignature.validateStorageInfo(CheckpointSignature.java:133)
 at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.doCheckpoint(SecondaryNameNode.java:519)
 at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.doWork(SecondaryNameNode.java:380)
 at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$2.run(SecondaryNameNode.java:346)
 at org.apache.hadoop.security.SecurityUtil.doAsLoginUserOrFatal(SecurityUtil.java:456)
 at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.run(SecondaryNameNode.java:342)
 at java.lang.Thread.run(Thread.java:722)
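
If I read the message right, the SecondaryNameNode's stored namespaceID / clusterID / blockpoolID no longer match what the NameNode reports, which I believe can happen when the NameNode is reformatted without clearing the SecondaryNameNode's old checkpoint data. I compared the two VERSION files roughly like this (the paths are only examples; the real locations come from dfs.namenode.name.dir and dfs.namenode.checkpoint.dir in hdfs-site.xml):

# NameNode's current namespace metadata
cat /home/hadoop/dfs/name/current/VERSION
# SecondaryNameNode's last checkpoint
cat /home/hadoop/dfs/namesecondary/current/VERSION
# If namespaceID, clusterID or blockpoolID differ, the old checkpoint is stale.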


Despite these errors, the Sqoop job eventually finishes and the data is imported successfully. I want to know why these errors are raised. My setup is an ordinary machine with an AMD A4-3305M CPU and 4 GB of RAM running Windows XP, on which I installed VMware and created two virtual machines. Is the cause simply that my hardware configuration is too low, or is it something else? Any idea will be appreciated. Thanks.