[
https://issues.apache.org/jira/browse/HIVE-1083?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13772440#comment-13772440
]
Jean-Marc Spaggiari commented on HIVE-1083:
-------------------------------------------
Hi, I don't think MAPREDUCE-1501 fixes this issue. I am still getting errors like:
{code}
java.io.IOException: Not a file: hdfs://namenode/user/abcd/data/efgh/logs/2013/07/02
    at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:212)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:292)
    at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplits(CombineHiveInputFormat.java:329)
    at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1090)
    at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1082)
{code}
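For anyone hitting this: the workaround usually suggested (assuming these properties exist in your Hive/Hadoop build; check your version's configuration reference before relying on them) is to enable recursive input directories at the session level:
{code}
SET hive.mapred.supports.subdirectories=true;
SET mapred.input.dir.recursive=true;
{code}
With both set, the split computation is expected to descend into sub-directories instead of throwing "Not a file" on a directory entry.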
> allow sub-directories for an external table/partition
> -----------------------------------------------------
>
> Key: HIVE-1083
> URL: https://issues.apache.org/jira/browse/HIVE-1083
> Project: Hive
> Issue Type: Improvement
> Components: Query Processor
> Affects Versions: 0.6.0
> Reporter: Namit Jain
> Assignee: Zheng Shao
> Labels: inputformat
>
> Sometimes users want to define an external table/partition based on all files
> (recursively) inside a directory.
> Currently, most Hadoop InputFormat classes do not support this. We should
> extract all files recursively from the directory and add them to the
> input paths of the job.
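The recursive extraction described above can be sketched in plain Java. This is only an illustration of the technique, not Hive code: java.nio.file stands in for Hadoop's FileSystem API, the class name RecursiveInputPaths is made up, and the 2013/07/02 layout just mirrors the partition-style paths from the error above.
{code:java}
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class RecursiveInputPaths {

    // Collect every regular file under root, skipping the directory
    // entries that make FileInputFormat.getSplits() throw "Not a file".
    static List<Path> collectFiles(Path root) throws IOException {
        try (Stream<Path> walk = Files.walk(root)) {
            return walk.filter(Files::isRegularFile)
                       .sorted()
                       .collect(Collectors.toList());
        }
    }

    public static void main(String[] args) throws IOException {
        // Build a small date-partitioned tree like logs/2013/07/02/...
        Path root = Files.createTempDirectory("logs");
        Files.createDirectories(root.resolve("2013/07/02"));
        Files.createFile(root.resolve("2013/07/02/part-00000"));
        Files.createFile(root.resolve("2013/07/02/part-00001"));

        List<Path> files = collectFiles(root);
        System.out.println(files.size()); // prints 2
    }
}
{code}
Each leaf file (rather than the top-level directory) would then be added as a job input path, which is what keeps getSplits() from seeing a directory.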
--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira