Hi folks,

I am running the Hive 0.12 UTs on top of Hadoop 2.2.0. A bunch of UTs throw the
following exception.

I am wondering why the scheme is "pfile" instead of "file" in the Hive UT cases.
If I change pfile to file, the UTs pass.
Alternatively, if I provide an implementation of listLocatedStatus in
ProxyFileSystem, the UTs pass too.
But I still cannot understand why we translate pfile to file in the shims.
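For reference, the scheme translation in question boils down to rewriting the URI scheme of a path before handing it to the underlying local filesystem. Below is a minimal, self-contained sketch of that swizzle; the helper name and signature are illustrative only, not the actual ProxyFileSystem code.

```java
import java.net.URI;
import java.net.URISyntaxException;

public class SchemeSwizzle {

    // Hypothetical helper mirroring what the proxy filesystem does:
    // replace the test scheme (e.g. "pfile") with the scheme of the
    // wrapped filesystem (e.g. "file"), leaving the rest of the URI intact.
    static URI swizzle(URI path, String fromScheme, String toScheme)
            throws URISyntaxException {
        if (!fromScheme.equals(path.getScheme())) {
            // Path is not under the proxy scheme; pass it through unchanged.
            return path;
        }
        return new URI(toScheme, path.getAuthority(), path.getPath(),
                       path.getQuery(), path.getFragment());
    }

    public static void main(String[] args) throws URISyntaxException {
        URI in = new URI("pfile:/data/warehouse/src");
        URI out = swizzle(in, "pfile", "file");
        System.out.println(out); // file:/data/warehouse/src
    }
}
```

The failure above suggests that the default FileSystem.listLocatedStatus path reaches RawLocalFileSystem with the pfile URI still unswizzled, which is why checkPath rejects it with "Wrong FS".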


  [junit] java.lang.IllegalArgumentException: Wrong FS: pfile:/home/hadoop/jenkins/workspace/HudsonHD2_2_0Hive_UT_JDK7/hive/build/hbase-handler/test/data/warehouse/src, expected: file:///
    [junit]     at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:642)
    [junit]     at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:69)
    [junit]     at org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:375)
    [junit]     at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1482)
    [junit]     at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1522)
    [junit]     at org.apache.hadoop.fs.FileSystem$4.<init>(FileSystem.java:1798)
    [junit]     at org.apache.hadoop.fs.FileSystem.listLocatedStatus(FileSystem.java:1797)
    [junit]     at org.apache.hadoop.fs.ChecksumFileSystem.listLocatedStatus(ChecksumFileSystem.java:579)
    [junit]     at org.apache.hadoop.fs.FilterFileSystem.listLocatedStatus(FilterFileSystem.java:235)
    [junit]     at org.apache.hadoop.fs.FilterFileSystem.listLocatedStatus(FilterFileSystem.java:235)
    [junit]     at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:264)
    [junit]     at org.apache.hadoop.mapreduce.lib.input.CombineFileInputFormat.getSplits(CombineFileInputFormat.java:217)
    [junit]     at org.apache.hadoop.mapred.lib.CombineFileInputFormat.getSplits(CombineFileInputFormat.java:75)
    [junit]     at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getSplits(HadoopShimsSecure.java:385)
    [junit]     at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getSplits(HadoopShimsSecure.java:351)
    [junit]     at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplits(CombineHiveInputFormat.java:400)
    [junit]     at org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:518)
    [junit]     at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
    [junit]     at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:392)
    [junit]     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
    [junit]     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
    [junit]     at java.security.AccessController.doPrivileged(Native Method)
    [junit]     at javax.security.auth.Subject.doAs(Subject.java:415)
    [junit]     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    [junit]     at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
    [junit]     at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
    [junit]     at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
    [junit]     at java.security.AccessController.doPrivileged(Native Method)
    [junit]     at javax.security.auth.Subject.doAs(Subject.java:415)
    [junit]     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    [junit]     at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
    [junit]     at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
    [junit]     at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:425)
    [junit]     at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.main(ExecDriver.java:727)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:601)
    [junit]     at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
    [junit] Job Submission failed with exception 'java.lang.IllegalArgumentException(Wrong FS: pfile:/home/hadoop/jenkins/workspace/HudsonHD2_2_0Hive_UT_JDK7/hive/build/hbase-handler/test/data/warehouse/src, expected: file:///)'
    [junit] Exception: Client Execution failed with error code = 1
    [junit] Failed query: hbase_stats.q


-- 
Regards
Gordon Wang
