alamb commented on code in PR #7096:
URL: https://github.com/apache/arrow-rs/pull/7096#discussion_r1949956394


##########
arrow-ipc/src/reader.rs:
##########
@@ -2492,4 +2539,109 @@ mod tests {
                 assert_eq!(decoded_batch.expect("Failed to read RecordBatch"), batch);
             });
     }
+
+    #[test]
+    fn test_validation_of_invalid_list_array() {

Review Comment:
   I tried a more invasive change to add validation tests for PrimitiveArrays and concluded that the current code will panic rather than error.
   
   Specifically, this code, which generates invalid IPC data, panics on decode rather than returning an `Err`:
   
   ```rust
       #[test]
       fn test_validation_of_invalid_primitive_array() {
           // Int32Array with not enough nulls
           let array = unsafe {
               let nulls = NullBuffer::from(&[true, false, true, false]);
   
               let buffer = ScalarBuffer::<i32>::from_iter(0..8000);
               let data = ArrayDataBuilder::new(DataType::Int32)
                   .len(8000)
                   .add_buffer(buffer.into())
                   .nulls(Some(nulls)) // too few nulls
                   .build_unchecked();
   
               Int32Array::from(data)
           };
   
           expect_ipc_validation_error(
               Arc::new(array),
               "Invalid argument error: Nulls do not match",
           );
       }
   ```
   
   Since this PR is not trying to change or improve validation in the project, I do not plan to change how it works here (though I will file a ticket reporting the panic-vs-error behavior).
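   
   For reference, the panicking behavior could still be pinned down in a test by wrapping the decode step in `std::panic::catch_unwind`. A minimal stdlib-only sketch; the `decode` closure here is a stand-in placeholder for the actual IPC decode call, not the real arrow-rs API:
   
   ```rust
   use std::panic;
   
   fn main() {
       // Placeholder for the IPC decode step; the real call currently
       // panics on invalid array data instead of returning an Err.
       let decode = || {
           panic!("Nulls do not match");
       };
   
       // Silence the default panic message so output stays clean.
       panic::set_hook(Box::new(|_| {}));
   
       // catch_unwind converts the panic into an Err(...) that a test
       // can assert on until the decode path returns a Result itself.
       let result = panic::catch_unwind(decode);
       assert!(result.is_err());
       println!("panic caught: {}", result.is_err());
   }
   ```
   
   A test written this way would start failing (in a useful direction) once decode is changed to return an `Err` instead of panicking.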
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
