[ https://issues.apache.org/jira/browse/HADOOP-16180?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16798638#comment-16798638 ]

Yuming Wang commented on HADOOP-16180:
--------------------------------------

I'm not sure. Maybe it's the way we use it.

> LocalFileSystem throw Malformed input or input contains unmappable characters
> -----------------------------------------------------------------------------
>
>                 Key: HADOOP-16180
>                 URL: https://issues.apache.org/jira/browse/HADOOP-16180
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: fs
>    Affects Versions: 2.8.0, 3.2.0
>            Reporter: Yuming Wang
>            Priority: Major
>
> How to reproduce:
> {code:java}
> export LANG=
> export LC_CTYPE="POSIX"
> export LC_NUMERIC="POSIX"
> export LC_TIME="POSIX"
> export LC_COLLATE="POSIX"
> export LC_MONETARY="POSIX"
> export LC_MESSAGES="POSIX"
> export LC_PAPER="POSIX"
> export LC_NAME="POSIX"
> export LC_ADDRESS="POSIX"
> export LC_TELEPHONE="POSIX"
> export LC_MEASUREMENT="POSIX"
> export LC_IDENTIFICATION="POSIX"
> git clone https://github.com/apache/spark.git && cd spark && git checkout v2.4.0
> build/sbt "hive/testOnly *.HiveDDLSuite" -Phive -Phadoop-2.7 -Dhadoop.version=2.8.0
> {code}
> Stack trace:
> {noformat}
> Caused by: sbt.ForkMain$ForkError: java.nio.file.InvalidPathException: Malformed input or input contains unmappable characters: /home/jenkins/workspace/SparkPullRequestBuilder@2/target/tmp/warehouse-15474fdf-0808-40ab-946d-1309fb05bf26/DaTaBaSe_I.db/tab_ı
>       at sun.nio.fs.UnixPath.encode(UnixPath.java:147)
>       at sun.nio.fs.UnixPath.<init>(UnixPath.java:71)
>       at sun.nio.fs.UnixFileSystem.getPath(UnixFileSystem.java:281)
>       at java.io.File.toPath(File.java:2234)
>       at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getLastAccessTime(RawLocalFileSystem.java:683)
>       at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.<init>(RawLocalFileSystem.java:694)
>       at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:664)
>       at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:987)
>       at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:656)
>       at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:454)
>       at org.apache.hadoop.hive.metastore.Warehouse.isDir(Warehouse.java:520)
>       at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1436)
>       at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1503)
>       ... 112 more{noformat}
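> The failure originates in sun.nio.fs.UnixPath.encode, which encodes path strings to bytes using the JVM-level encoding derived from the locale (the internal sun.jnu.encoding property, fixed at JVM startup). Under a POSIX/C locale that encoding is ASCII, so any non-ASCII character in the path (here U+0131, Latin small dotless i) is unmappable. A minimal sketch of the failure, assuming a JVM started under LANG=POSIX (the class name and path below are illustrative, not from the Hadoop source):
> {code:java}
> import java.io.File;
> import java.nio.file.InvalidPathException;
>
> public class PathEncodeRepro {
>     public static void main(String[] args) {
>         // Illustrative path; U+0131 is the dotless i from the failing test.
>         File f = new File("/tmp/DaTaBaSe_I.db/tab_\u0131");
>         try {
>             // The same conversion RawLocalFileSystem performs; under an ASCII
>             // locale the encode step throws before any filesystem access.
>             f.toPath();
>         } catch (InvalidPathException e) {
>             // Prints: Malformed input or input contains unmappable characters: ...
>             System.out.println(e.getMessage());
>         }
>     }
> }
> {code}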
> [https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/103328/testReport/org.apache.spark.sql.hive.execution/HiveCatalogedDDLSuite/basic_DDL_using_locale_tr___caseSensitive_true/]
> [https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/103328/testReport/org.apache.spark.sql.hive.execution/HiveDDLSuite/create_Hive_serde_table_and_view_with_unicode_columns_and_comment/]
>  
> It worked before https://issues.apache.org/jira/browse/HADOOP-12045.
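> HADOOP-12045 changed DeprecatedRawLocalFileStatus to read the access time through java.nio, which is where File.toPath() (and therefore the locale-sensitive encode step) entered this code path; the stack trace above shows the new call. Roughly, assuming the post-HADOOP-12045 shape of the code (a simplified sketch, not the exact Hadoop source):
> {code:java}
> import java.io.File;
> import java.io.IOException;
> import java.nio.file.Files;
> import java.nio.file.attribute.BasicFileAttributes;
>
> public class LastAccessTimeSketch {
>     // Simplified sketch of getLastAccessTime after HADOOP-12045:
>     // file.toPath() throws InvalidPathException under an ASCII locale
>     // before readAttributes ever touches the filesystem.
>     static long getLastAccessTime(File file) throws IOException {
>         return Files.readAttributes(file.toPath(), BasicFileAttributes.class)
>                     .lastAccessTime().toMillis();
>     }
>
>     public static void main(String[] args) throws IOException {
>         System.out.println(getLastAccessTime(new File("/tmp")));
>     }
> }
> {code}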
> We can work around it by resetting the locale to a UTF-8 one:
> {code:java}
> export LANG=en_US.UTF-8
> export LC_CTYPE="en_US.UTF-8"
> export LC_NUMERIC="en_US.UTF-8"
> export LC_TIME="en_US.UTF-8"
> export LC_COLLATE="en_US.UTF-8"
> export LC_MONETARY="en_US.UTF-8"
> export LC_MESSAGES="en_US.UTF-8"
> export LC_PAPER="en_US.UTF-8"
> export LC_NAME="en_US.UTF-8"
> export LC_ADDRESS="en_US.UTF-8"
> export LC_TELEPHONE="en_US.UTF-8"
> export LC_MEASUREMENT="en_US.UTF-8"
> export LC_IDENTIFICATION="en_US.UTF-8"
> {code}
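> Note that the locale has to be set in the environment that launches the JVM; changing it after startup has no effect, because the JVM fixes the path encoding (sun.jnu.encoding) when it starts.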


