[ 
https://issues.apache.org/jira/browse/JCR-3799?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14085029#comment-14085029
 ] 

Harish Reddy commented on JCR-3799:
-----------------------------------

I don't think that's the same issue. The problem in this case appears to be a 
lossy conversion from long to int, which results in a bad value being passed 
to stream.read. As far as I can tell, there is no inherent limitation on 
streaming a large amount of data. As I mentioned earlier, it's probably a 
matter of checking the places where such casts are made and fixing the ones 
that fail when the value in length does not fit in an int.
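To illustrate the failure mode: casting (length - count) to int before taking the min overflows when the remainder exceeds Integer.MAX_VALUE, producing a negative read length. A minimal sketch of the buggy pattern and one possible fix (taking the min on longs first, then casting; this is an illustration, not the project's actual patch — the helper names are hypothetical):

```java
public class CastDemo {
    // Buggy pattern, as at line 187 of SerializableBinary.java:
    // the cast happens before Math.min, so a large remainder
    // wraps around to a negative int.
    static int buggyChunk(int bufferLength, long length, long count) {
        return Math.min(bufferLength, (int) (length - count));
    }

    // Sketch of a fix: take the min on longs first, then cast.
    // The result is at most bufferLength, which always fits in an int.
    static int fixedChunk(int bufferLength, long length, long count) {
        return (int) Math.min((long) bufferLength, length - count);
    }

    public static void main(String[] args) {
        long length = 3245027213L; // the failing size from the report
        int buffer = 8192;
        // buggyChunk yields a negative value, which stream.read rejects
        System.out.println(buggyChunk(buffer, length, 0L));
        // fixedChunk yields the expected chunk size, 8192
        System.out.println(fixedChunk(buffer, length, 0L));
    }
}
```

With a negative length, InputStream.read throws IndexOutOfBoundsException, which matches the reported failure.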

> Setting large binary value on jcr:data property fails over RMI
> --------------------------------------------------------------
>
>                 Key: JCR-3799
>                 URL: https://issues.apache.org/jira/browse/JCR-3799
>             Project: Jackrabbit Content Repository
>          Issue Type: Bug
>          Components: jackrabbit-jcr-rmi
>    Affects Versions: 2.6.5
>         Environment: Any
>            Reporter: Harish Reddy
>
> Setting a very large binary value on the jcr:data property fails and throws 
> an exception at line 187 in the 2.6.5 source 
> (org.apache.jackrabbit.rmi.value.SerializableBinary.java)
> This appears to be a problem with converting a long into an int at line 187.
> n = stream.read(buffer, 0, Math.min(
>     buffer.length, (int) (length - count)));
> The problem occurs only when the length of the binary exceeds a number that 
> can fit in an int (in my test case, length was 3245027213). 
> Other parts of SerializableBinary.java also appear to cast length to an 
> int, so I'm guessing all instances of this pattern will need to be fixed.
> I'm using 2.6.5, so I don't know whether the issue exists in earlier 2.x 
> versions as well. This used to work in v 1.5.4. 



--
This message was sent by Atlassian JIRA
(v6.2#6252)
