[jira] [Commented] (HADOOP-16836) Bug in widely-used helper function caused valid configuration value to fail on multiple tests, causing build failure

2020-03-18 Thread Ctest (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16836?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17062165#comment-17062165
 ] 

Ctest commented on HADOOP-16836:


I updated the description to make it more readable. 

> Bug in widely-used helper function caused valid configuration value to fail 
> on multiple tests, causing build failure
> 
>
> Key: HADOOP-16836
> URL: https://issues.apache.org/jira/browse/HADOOP-16836
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: common
>Affects Versions: 3.3.0, 3.2.1
>Reporter: Ctest
>Priority: Blocker
>  Labels: configuration, easyfix, patch, test
> Attachments: HADOOP-16836-000.patch, HADOOP-16836-000.patch
>
>
> {code:java}
> org.apache.hadoop.io.file.tfile.TestTFileStreams#testOneEntryMixedLengths1
> org.apache.hadoop.io.file.tfile.TestTFileStreams#testOneEntryUnknownLength
> org.apache.hadoop.io.file.tfile.TestTFileLzoCodecsStreams#testOneEntryMixedLengths1
> org.apache.hadoop.io.file.tfile.TestTFileLzoCodecsStreams#testOneEntryUnknownLength{code}
>  
> The 4 actively-used tests above call the helper function 
> `TestTFileStreams#writeRecords()` to write key-value (kv) pairs, then 
> call `TestTFileByteArrays#readRecords()` to assert that the key and the value 
> part (v) of these kv pairs match what was written. All values of the kv pairs 
> are hardcoded strings with a length of 6.
>  
> `readRecords()` uses 
> `org.apache.hadoop.io.file.tfile.TFile.Reader.Scanner.Entry#getValueLength()` 
> to get the full length of the v of these kv pairs. But `getValueLength()` can 
> only return the full length of v when that length is less than the value of 
> the configuration parameter `tfile.io.chunk.size`; otherwise `readRecords()` 
> throws an exception. So, *when `tfile.io.chunk.size` is configured/set to a 
> value less than 6, these 4 tests fail because of the exception from 
> `readRecords()`, even though 6 is a valid value for `tfile.io.chunk.size`.*
> The definition of `tfile.io.chunk.size` is "Value chunk size in bytes. 
> Default to 1MB. Values of the length less than the chunk size is guaranteed 
> to have known value length in read time (See also 
> TFile.Reader.Scanner.Entry.isValueLengthKnown())". 
> *Fixes*
> `readRecords()` should call 
> `org.apache.hadoop.io.file.tfile.TFile.Reader.Scanner.Entry#getValue(byte[])` 
> instead, which returns the correct full length of the `value` part regardless 
> of whether the value's length is larger than `tfile.io.chunk.size`.
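A minimal, self-contained Java analogue of the behavior described above (the `ChunkedValue` class and its methods are hypothetical stand-ins, not the real Hadoop TFile API): a length query only succeeds when the whole value fits within a single chunk, while reading the value out in full always yields the true length:

```java
import java.io.ByteArrayOutputStream;

// Hypothetical analogue of the TFile chunking behavior; NOT the real
// Hadoop API, just an illustration of the failure mode.
class ChunkedValue {
    private final byte[] data;
    private final int chunkSize;

    ChunkedValue(byte[] data, int chunkSize) {
        this.data = data;
        this.chunkSize = chunkSize;
    }

    // Analogue of getValueLength(): the length is only "known" when the
    // whole value fits within a single chunk; otherwise it throws.
    int getValueLength() {
        if (data.length >= chunkSize) {
            throw new RuntimeException("value length is not known");
        }
        return data.length;
    }

    // Analogue of reading the value in full: copies it out chunk by chunk
    // and returns the true total length, regardless of the chunk size.
    int getValue(ByteArrayOutputStream out) {
        int read = 0;
        while (read < data.length) {
            int n = Math.min(chunkSize, data.length - read);
            out.write(data, read, n);
            read += n;
        }
        return read;
    }
}
```

With a 6-byte value and a chunk size below 6, the length query throws (mirroring the test failures), while the full read still recovers all 6 bytes.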



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16836) Bug in widely-used helper function caused valid configuration value to fail on multiple tests, causing build failure

2020-03-16 Thread Wei-Chiu Chuang (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16836?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17060426#comment-17060426
 ] 

Wei-Chiu Chuang commented on HADOOP-16836:
--

Updated target version. Unless I am wrong, this is also a blocker for Hadoop 
3.3.0.

[~brahma] fyi.

> Bug in widely-used helper function caused valid configuration value to fail 
> on multiple tests, causing build failure
> 
>
> Key: HADOOP-16836
> URL: https://issues.apache.org/jira/browse/HADOOP-16836
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: common
>Affects Versions: 3.3.0, 3.2.1
>Reporter: Ctest
>Priority: Blocker
>  Labels: configuration, easyfix, patch, test
> Attachments: HADOOP-16836-000.patch, HADOOP-16836-000.patch
>
>
> The test helper function 
> `org.apache.hadoop.io.file.tfile.TestTFileByteArrays#readRecords(org.apache.hadoop.fs.FileSystem,
>  org.apache.hadoop.fs.Path, int, org.apache.hadoop.conf.Configuration)` 
> (abbreviated as `readRecords()` below) is called by the 4 actively-used tests 
> below:
>  
> {code:java}
> org.apache.hadoop.io.file.tfile.TestTFileStreams#testOneEntryMixedLengths1
> org.apache.hadoop.io.file.tfile.TestTFileStreams#testOneEntryUnknownLength
> org.apache.hadoop.io.file.tfile.TestTFileLzoCodecsStreams#testOneEntryMixedLengths1
> org.apache.hadoop.io.file.tfile.TestTFileLzoCodecsStreams#testOneEntryUnknownLength{code}
>  
> These tests first call 
> `org.apache.hadoop.io.file.tfile.TestTFileStreams#writeRecords(int count, 
> boolean knownKeyLength, boolean knownValueLength, boolean close)` to write 
> `key-value` pair records into a `TFile` object, then call the helper function 
> `readRecords()` to assert that the `key` part and the `value` part of the 
> stored `key-value` pair records match what was written previously. The `value` 
> parts of the `key-value` pairs in these tests are hardcoded strings with a 
> length of 6.
> Assertions in `readRecords()` are directly related to the value of the 
> configuration parameter `tfile.io.chunk.size`. The formal definition of 
> `tfile.io.chunk.size` is "Value chunk size in bytes. Default to 1MB. Values 
> of the length less than the chunk size is guaranteed to have known value 
> length in read time (See also 
> TFile.Reader.Scanner.Entry.isValueLengthKnown())".
> When `tfile.io.chunk.size` is configured to a value less than the length of 
> the `value` part of the `key-value` pairs from these 4 tests, these tests 
> fail, even though the configured value for `tfile.io.chunk.size` is 
> semantically valid.
>  
> *Consequence*
> At least the 4 actively-used tests above fail on a correctly configured 
> parameter. Any test that uses `readRecords()` can fail if the length of the 
> hardcoded `value` part it checks is larger than the configured value of 
> `tfile.io.chunk.size`. This causes a build failure of Hadoop-Common if these 
> tests are not skipped.
>  
> *Root Cause*
> `readRecords()` uses 
> `org.apache.hadoop.io.file.tfile.TFile.Reader.Scanner.Entry#getValueLength()` 
> (abbreviated as `getValueLength()` below) to get the full length of the 
> `value` part of each `key-value` pair. But `getValueLength()` can only return 
> the full length of the `value` part when that length is less than 
> `tfile.io.chunk.size`; otherwise `getValueLength()` throws an exception, 
> causing `readRecords()` to fail and, in turn, the aforementioned 4 tests. 
> This is because `getValueLength()` does not know the full length of the 
> `value` part when the `value` part is larger than `tfile.io.chunk.size`.
>  
> *Fixes*
> `readRecords()` should instead call 
> `org.apache.hadoop.io.file.tfile.TFile.Reader.Scanner.Entry#getValue(byte[])` 
> (abbreviated as `getValue()` below), which returns the correct full length of 
> the `value` part regardless of whether the `value` length is larger than 
> `tfile.io.chunk.size`.
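A hedged sketch of the fix pattern described above, using plain java.io streams as stand-ins (`ReadRecordsSketch` and its methods are illustrative, not the actual TFile classes): instead of asking for the value length up front, read the value fully into a buffer and assert on the number of bytes actually read:

```java
import java.io.IOException;
import java.io.InputStream;

// Illustrative stand-in for the corrected readRecords() logic; not the
// actual Hadoop test helper.
class ReadRecordsSketch {
    // Reads the whole value from the stream into buf and returns the true
    // number of bytes read, independent of any chunk-size limit.
    static int readFully(InputStream valueStream, byte[] buf) throws IOException {
        int total = 0;
        int n;
        while (total < buf.length
                && (n = valueStream.read(buf, total, buf.length - total)) > 0) {
            total += n;
        }
        return total;
    }

    // Asserts that the stored value matches the expected string. The buffer
    // is sized one byte larger than expected so an overlong value is detected.
    static void assertValueMatches(InputStream valueStream, String expected)
            throws IOException {
        byte[] buf = new byte[expected.length() + 1];
        int len = readFully(valueStream, buf);
        if (len != expected.length()
                || !new String(buf, 0, len).equals(expected)) {
            throw new AssertionError("value mismatch, read " + len + " bytes");
        }
    }
}
```

The key design point is that the comparison depends only on bytes actually read, so it holds whether or not the value's length exceeded the writer-side chunk size.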






[jira] [Commented] (HADOOP-16836) Bug in widely-used helper function caused valid configuration value to fail on multiple tests, causing build failure

2020-02-11 Thread Ctest (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16836?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17034938#comment-17034938
 ] 

Ctest commented on HADOOP-16836:


Hi, what do you think of this issue?

> Bug in widely-used helper function caused valid configuration value to fail 
> on multiple tests, causing build failure
> 
>
> Key: HADOOP-16836
> URL: https://issues.apache.org/jira/browse/HADOOP-16836
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: common
>Affects Versions: 3.2.1
>Reporter: Ctest
>Priority: Blocker
>  Labels: configuration, easyfix, patch, test
> Attachments: HADOOP-16836-000.patch, HADOOP-16836-000.patch
>
>
> (quoted issue description elided; identical to the description quoted above)






[jira] [Commented] (HADOOP-16836) Bug in widely-used helper function caused valid configuration value to fail on multiple tests, causing build failure

2020-02-02 Thread Hadoop QA (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16836?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17028343#comment-17028343
 ] 

Hadoop QA commented on HADOOP-16836:


| (/) *+1 overall* |

|| Vote || Subsystem || Runtime || Comment ||
| 0 | reexec | 0m 34s | Docker mode activated. |
|| || || || Prechecks ||
| +1 | @author | 0m 0s | The patch does not contain any @author tags. |
| +1 | test4tests | 0m 0s | The patch appears to include 1 new or modified test files. |
|| || || || trunk Compile Tests ||
| +1 | mvninstall | 22m 27s | trunk passed |
| +1 | compile | 18m 4s | trunk passed |
| +1 | checkstyle | 0m 42s | trunk passed |
| +1 | mvnsite | 1m 12s | trunk passed |
| +1 | shadedclient | 14m 49s | branch has no errors when building and testing our client artifacts. |
| +1 | findbugs | 1m 46s | trunk passed |
| +1 | javadoc | 1m 16s | trunk passed |
|| || || || Patch Compile Tests ||
| +1 | mvninstall | 0m 48s | the patch passed |
| +1 | compile | 15m 21s | the patch passed |
| +1 | javac | 15m 21s | the patch passed |
| +1 | checkstyle | 0m 43s | the patch passed |
| +1 | mvnsite | 1m 20s | the patch passed |
| +1 | whitespace | 0m 0s | The patch has no whitespace issues. |
| +1 | shadedclient | 13m 11s | patch has no errors when building and testing our client artifacts. |
| +1 | findbugs | 1m 47s | the patch passed |
| +1 | javadoc | 1m 18s | the patch passed |
|| || || || Other Tests ||
| +1 | unit | 9m 13s | hadoop-common in the patch passed. |
| +1 | asflicense | 0m 40s | The patch does not generate ASF License warnings. |
| | | 105m 7s | |

|| Subsystem || Report/Notes ||
| Docker | Client=19.03.5 Server=19.03.5 Image:yetus/hadoop:c44943d1fc3 |
| JIRA Issue | HADOOP-16836 |
| JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/12992438/HADOOP-16836-000.patch |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux 321410062d1f 4.15.0-74-generic #84-Ubuntu SMP Thu Dec 19 08:06:28 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | /testptch/patchprocess/precommit/personality/provided.sh |
| git revision | trunk / 1e3a0b0 |
| maven | version: Apache Maven 3.3.9 |
| Default Java | 1.8.0_242 |
| findbugs | v3.1.0-RC1 |
| Test Results | https://builds.apache.org/job/PreCommit-HADOOP-Build/16748/testReport/ |
| Max. process+thread count | 1376 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common |
| Console output | https://builds.apache.org/job/PreCommit-HADOOP-Build/16748/console |
| Powered by | Apache Yetus 0.8.0   http://yetus.apache.org |


This message was automatically generated.


