[ 
https://issues.apache.org/jira/browse/HDFS-5072?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Colin Patrick McCabe resolved HDFS-5072.
----------------------------------------

    Resolution: Invalid

JIRA isn't for troubleshooting.  Try asking your question on 
u...@hadoop.apache.org
                
> fuse_dfs: ERROR: could not connect open file fuse_impls_open.c:54
> -----------------------------------------------------------------
>
>                 Key: HDFS-5072
>                 URL: https://issues.apache.org/jira/browse/HDFS-5072
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: fuse-dfs
>    Affects Versions: 1.1.2
>         Environment: CentOS 6.4 amd64
>            Reporter: Guoen Yong
>
> Here are the command lines I ran on CentOS 6.4:
> sudo ./fuse_dfs_wrapper.sh dfs://172.16.0.80:9000 /mnt/hdfs
> sudo -u hadoop bin/hadoop dfs -mkdir /test
> sudo -u hadoop bin/hadoop dfs -chown -R root:root /test
> I can create files and directories with the following command lines:
> sudo bin/hadoop dfs -copyFromLocal /tmp/vod/* /test
> sudo touch /mnt/hdfs/test/test.txt
> Then I created a Samba share \\172.16.0.80\hdfs for /mnt/hdfs.
> On a Windows system, I went to the shared folder \\172.16.0.80\hdfs\test as 
> the root user.
> I can create directories, copy files from the Samba share, and rename files 
> on the share,
> but when I copy a file into the share, a window pops up reporting an I/O 
> error.
> I checked /var/log/messages and found:
> fuse_dfs: ERROR: could not connect open file fuse_impls_open.c:54
> I'm guessing it's a bad build, but I'm wondering if there might be another 
> cause.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators.
For more information on JIRA, see: http://www.atlassian.com/software/jira
