[
https://issues.apache.org/jira/browse/HADOOP-7199?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Steve Loughran resolved HADOOP-7199.
------------------------------------
Resolution: Not A Problem
> Crash due to reuse of checksum files
> ------------------------------------
>
> Key: HADOOP-7199
> URL: https://issues.apache.org/jira/browse/HADOOP-7199
> Project: Hadoop Common
> Issue Type: Bug
> Components: fs
> Affects Versions: 0.20.2
> Environment: Cloudera CDH3B4 in pseudo-distributed mode on a Linux
> 2.6.32-28-generic #55-Ubuntu SMP x86_64 kernel, with Java HotSpot 64-Bit
> Server VM (build 19.1-b02, mixed mode)
> Reporter: Lars Ailo Bongo
> Priority: Minor
>
> copyFromLocalFile crashes if a checksum file exists on the local filesystem
> and the checksum does not match the file content. For example, this makes
> "hadoop fs -put ./foo ./foo" crash with a non-descriptive error.
> It is therefore not possible to do:
> 1. copyToLocalFile(hdfsFile, localFile) // creates checksum file
> 2. modify localFile
> 3. copyFromLocalFile(localFile, hdfsFile) // uses old checksum
> Solution: do not reuse checksum files, or add a parameter to
> copyFromLocalFile that specifies that checksum files should not be reused.
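
The three-step sequence above can be expressed against the FileSystem API roughly as follows. This is a minimal sketch, assuming a pseudo-distributed cluster reachable through the default configuration; the class name, paths, and the .crc clean-up at the end are illustrative only, not a fix shipped by the project.

{code:java}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class StaleCrcExample {                     // hypothetical class name
  public static void main(String[] args) throws Exception {
    FileSystem hdfs = FileSystem.get(new Configuration());

    Path hdfsFile  = new Path("/user/lars/foo");   // hypothetical paths
    Path localFile = new Path("file:///tmp/foo");

    // 1. Writes /tmp/foo plus a hidden /tmp/.foo.crc checksum file,
    //    following ChecksumFileSystem's "." + name + ".crc" convention.
    hdfs.copyToLocalFile(hdfsFile, localFile);

    // 2. Modify /tmp/foo outside of Hadoop (editor, script, ...);
    //    /tmp/.foo.crc is now stale.

    // 3. Reading the local file for upload verifies it against the stale
    //    .crc file and fails, typically with a ChecksumException.
    hdfs.copyFromLocalFile(localFile, hdfsFile);
  }
}
{code}

One workaround, under the same assumptions, is to delete the stale checksum file before re-uploading, e.g. new java.io.File("/tmp/.foo.crc").delete(); the upload then recomputes the checksum from the current file content.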
--
This message was sent by Atlassian Jira
(v8.3.4#803005)