[
https://issues.apache.org/jira/browse/PHOENIX-1539?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14252417#comment-14252417
]
Gabriel Reid commented on PHOENIX-1539:
---------------------------------------
This seems to be brought on by the fact that you're running on a (somewhat)
case-insensitive file system (the OS X default), but I think there's something
we can do about it in how the Phoenix client jar is packaged, along the lines
of this:
http://stackoverflow.com/questions/10522835/hadoop-java-io-ioexception-mkdirs-failed-to-create-some-path/13181099#13181099
> Unable to Bulk Load into HBase 0.98.8
> -------------------------------------
>
> Key: PHOENIX-1539
> URL: https://issues.apache.org/jira/browse/PHOENIX-1539
> Project: Phoenix
> Issue Type: Bug
> Affects Versions: 4.2
> Environment: Running Hadoop 2.6.0, HBase 0.98.8 and Phoenix 4.2.2 on
> a Mac OS X.
> Reporter: Murali T
> Fix For: 4.2
>
>
> I get the following error when trying to bulk load data into HBase with this
> command:
> HADOOP_CLASSPATH=/Users/TM/hbase-0.98.8-hadoop2/lib/hbase-protocol-0.98.8-hadoop2.jar:/Users/TM/hbase-0.98.8-hadoop2/conf \
> hadoop jar /Users/TM/phoenix-4.2.2-bin/phoenix-4.2.2-client.jar \
> org.apache.phoenix.mapreduce.CsvBulkLoadTool --table SAMPLE \
> --input hdfs://localhost:9000/etl/input/sample.csv
> Exception in thread "main" java.io.FileNotFoundException:
> /var/folders/wd/c7qfgncn76x5lkvczkdjgvym0000gn/T/hadoop-unjar514359376416696397/META-INF/LICENSE
> (Is a directory)
> at java.io.FileOutputStream.open(Native Method)
> at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
> at java.io.FileOutputStream.<init>(FileOutputStream.java:171)
> at org.apache.hadoop.util.RunJar.unJar(RunJar.java:105)
> at org.apache.hadoop.util.RunJar.unJar(RunJar.java:81)
> at org.apache.hadoop.util.RunJar.run(RunJar.java:209)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)