MetaException(message:hdfs://h1.vgs.mypoints.com:8020/user/flume/events/request_logs/ar1.vgs.mypoints.com/13-06-13/FlumeData.1371144648033
is not a directory or unable to create one)


It clearly says it's not a directory. Point LOCATION to the directory and it will work.
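If it helps, here is a sketch of the fix (untested on your cluster, paths taken from your error message): the same table definition as in the quoted mail, but with LOCATION set to the directory containing the Flume file rather than the file itself, since Hive's LOCATION clause must name a directory.

```sql
-- Sketch only: identical to the quoted DDL except for the LOCATION clause,
-- which now points at the containing directory instead of the data file.
CREATE EXTERNAL TABLE access(
  host STRING,
  identity STRING,
  user STRING,
  time STRING,
  request STRING,
  status STRING,
  size STRING,
  referer STRING,
  agent STRING)
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe'
WITH SERDEPROPERTIES (
  "input.regex" = "([^ ]*) ([^ ]*) ([^ ]*) (-|\\[[^\\]]*\\]) ([^ \"]*|\"[^\"]*\") (-|[0-9]*) (-|[0-9]*)(?: ([^ \"]*|\"[^\"]*\") ([^ \"]*|\"[^\"]*\"))?",
  "output.format.string" = "%1$s %2$s %3$s %4$s %5$s %6$s %7$s %8$s %9$s"
)
STORED AS TEXTFILE
LOCATION '/user/flume/events/request_logs/ar1.vgs.mypoints.com/13-06-13/';
```

Hive will then read every file under that directory, including FlumeData.1371144648033.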


On Thu, Jun 20, 2013 at 10:52 PM, sanjeev sagar <sanjeev.sa...@gmail.com> wrote:

> Hello Everyone, I'm running into the following Hive external table issue.
>
>
>
> hive> CREATE EXTERNAL TABLE access(
>      >       host STRING,
>      >       identity STRING,
>      >       user STRING,
>      >       time STRING,
>      >       request STRING,
>      >       status STRING,
>      >       size STRING,
>      >       referer STRING,
>      >       agent STRING)
>      >       ROW FORMAT SERDE
>      >       'org.apache.hadoop.hive.contrib.serde2.RegexSerDe'
>      >       WITH SERDEPROPERTIES (
>      >       "input.regex" = "([^ ]*) ([^ ]*) ([^ ]*) (-|\\[[^\\]]*\\]) ([^ \"]*|\"[^\"]*\") (-|[0-9]*) (-|[0-9]*)(?: ([^ \"]*|\"[^\"]*\") ([^ \"]*|\"[^\"]*\"))?",
>      >       "output.format.string" = "%1$s %2$s %3$s %4$s %5$s %6$s %7$s %8$s %9$s"
>      >       )
>      >       STORED AS TEXTFILE
>      >       LOCATION
>      >       '/user/flume/events/request_logs/ar1.vgs.mypoints.com/13-06-13/FlumeData.1371144648033';
>
> FAILED: Error in metadata:
> MetaException(message:hdfs://h1.vgs.mypoints.com:8020/user/flume/events/request_logs/ar1.vgs.mypoints.com/13-06-13/FlumeData.1371144648033
> is not a directory or unable to create one)
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask
>
>
>
>
>
> In HDFS, the file exists:
>
> hadoop fs -ls /user/flume/events/request_logs/ar1.vgs.mypoints.com/13-06-13/FlumeData.1371144648033
> Found 1 items
> -rw-r--r--   3 hdfs supergroup 2242037226 2013-06-13 11:14 /user/flume/events/request_logs/ar1.vgs.mypoints.com/13-06-13/FlumeData.1371144648033
>
>
>
> I've downloaded the serde2 jar file too, installed it at
> /usr/lib/hive/lib/hive-json-serde-0.2.jar, and bounced all the Hadoop
> services after that.
>
>
>
> I even added the jar file manually in Hive and ran the above SQL, but it
> still fails:
>
> hive> add jar /usr/lib/hive/lib/hive-json-serde-0.2.jar;
> Added /usr/lib/hive/lib/hive-json-serde-0.2.jar to class path
> Added resource: /usr/lib/hive/lib/hive-json-serde-0.2.jar
>
>
>
> Any help would be highly appreciated.
>
>
>
> -Sanjeev
>
> --
> Sanjeev Sagar
>
> *"**Separate yourself from everything that separates you from others !" - 
> Nirankari
> Baba Hardev Singh ji *
>
>



-- 
Nitin Pawar
