[ https://issues.apache.org/jira/browse/AVRO-3049?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17294104#comment-17294104 ]

ASF subversion and git services commented on AVRO-3049:
-------------------------------------------------------

Commit 552ee9e3c1679fec9ecb959c7c12ed7bd91827b0 in avro's branch 
refs/heads/dependabot/nuget/lang/csharp/Newtonsoft.Json-12.0.3 from John Karp
[ https://gitbox.apache.org/repos/asf?p=avro.git;h=552ee9e ]

AVRO-3049: Add checks to BinaryDecoder for bytes length (#1098)

* AVRO-3049: Handle negative length for bytes

* AVRO-3049: Don't try to allocate beyond VM limits for bytes

* AVRO-3049: Use clearer unit test names

* AVRO-3049: Use Assert.assertThrows for cleaner UTs

* AVRO-3049: Add org.apache.avro.limits.bytes.maxLength property

* AVRO-3049: mvn spotless:apply

* AVRO-3049: Call super() from BinaryDecoder ctor

* AVRO-3049: Add JavaDoc re BinaryDecoder limits
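
For context, the shape of the check these commits describe is roughly as
follows. This is a minimal sketch only: the property name
org.apache.avro.limits.bytes.maxLength is taken from the commit messages
above, but the class, method names, default value, and the use of a plain
RuntimeException (the real fix reports errors as
org.apache.avro.AvroRuntimeException) are illustrative assumptions, not the
actual BinaryDecoder code.

    import java.nio.ByteBuffer;

    public class BoundedBytesReader {

        // Assumed reading of the property added by AVRO-3049; the default and
        // parsing here are illustrative, not copied from Avro.
        private static final long MAX_BYTES_LENGTH =
            Long.getLong("org.apache.avro.limits.bytes.maxLength", Long.MAX_VALUE);

        // Rough ceiling on what a JVM array can hold, per the
        // "Don't try to allocate beyond VM limits" commit.
        private static final long MAX_ARRAY_SIZE = Integer.MAX_VALUE - 8;

        // Validate a decoded length before anything is allocated.
        static int checkedLength(long length) {
            if (length < 0) {
                // The real fix surfaces this as org.apache.avro.AvroRuntimeException
                // instead of letting IllegalArgumentException escape later.
                throw new RuntimeException("Malformed data. Length is negative: " + length);
            }
            if (length > MAX_BYTES_LENGTH || length > MAX_ARRAY_SIZE) {
                throw new RuntimeException("Bytes length " + length + " exceeds maximum allowed");
            }
            return (int) length;
        }

        // Allocate only after the length has passed validation.
        static ByteBuffer readBytes(long decodedLength) {
            return ByteBuffer.allocate(checkedLength(decodedLength));
        }
    }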

> Java: BinaryDecoder lacks checks on bytes array length
> ------------------------------------------------------
>
>                 Key: AVRO-3049
>                 URL: https://issues.apache.org/jira/browse/AVRO-3049
>             Project: Apache Avro
>          Issue Type: Bug
>          Components: java
>            Reporter: John Karp
>            Assignee: John Karp
>            Priority: Major
>             Fix For: 1.11.0, 1.10.2
>
>
> BinaryDecoder performs several checks on string lengths, but it lacks the
> same checks for byte arrays:
>  * Negative lengths lead to IllegalArgumentException instead of
>    org.apache.avro.AvroRuntimeException.
>  * Pathologically large (though legal) arrays can be allocated. Some
>    applications will suffer denial of service if they are forced to allocate
>    1 GB arrays repeatedly.
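
On the mitigation side, an application that wants to bound decoded byte
arrays can set the property introduced by this fix before any decoding
happens. A hedged sketch, assuming the property is honored as a JVM-wide
limit (passing it as a -D option on the command line is the more robust
route if it is read at class initialization); the 16 MiB cap is an arbitrary
example, not a recommended value.

    public class AvroLimitsExample {
        public static void main(String[] args) {
            // Cap "bytes" fields at 16 MiB so a stream declaring a 1 GB length
            // fails fast instead of forcing repeated huge allocations.
            System.setProperty("org.apache.avro.limits.bytes.maxLength",
                    Long.toString(16L * 1024 * 1024));
            // ... then build a BinaryDecoder via DecoderFactory and decode as usual ...
        }
    }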


