--- On Sat, 9/11/10, Scott Gray <scott.g...@hotwaxmedia.com> wrote:
> Currently
> JdbcValueHandler.BlobJdbcValueHandler will accept byte
> arrays and Blobs (reluctantly) but not ByteBuffers. 
> The problem is that the service engine uses ByteBuffer as
> the attribute type for attributes created from blob
> entity fields, and CRUD services fail because they
> don't attempt to convert the value before using something
> like <create-value/>.
> 
> So what is the correct behavior?  Should the service
> engine be using a different type or should the
> JdbcValueHandler be more flexible?
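Just to illustrate the mismatch Scott describes: a minimal sketch of the conversion the CRUD services skip, assuming all that's needed is the ByteBuffer's remaining bytes as a byte[]. The helper name here is hypothetical, not existing OFBiz code:

```java
import java.nio.ByteBuffer;

public class ByteBufferConversionDemo {
    // Hypothetical helper: copy a ByteBuffer's remaining bytes into a byte[]
    // that a BLOB handler could accept.
    static byte[] toByteArray(ByteBuffer buf) {
        byte[] bytes = new byte[buf.remaining()];
        // Read from a duplicate so the caller's buffer position is untouched
        buf.duplicate().get(bytes);
        return bytes;
    }

    public static void main(String[] args) {
        ByteBuffer buf = ByteBuffer.wrap(new byte[] {1, 2, 3});
        byte[] bytes = toByteArray(buf);
        System.out.println(bytes.length); // 3
    }
}
```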

I'm pretty sure the current JdbcValueHandler code contains the same logic as 
the original switch statement, but I might have missed something.

That code is a bit messy because it tries to convert multiple types to a BLOB in an 
effort to maintain backward compatibility. You can see remarks about that in the 
code.

I introduced new data types, Object and byte[], to make things less ambiguous and 
hopefully provide a path toward cleaning some of that up.

So, a discussion about which data type to use or support would be worthwhile. I 
think the answer lies in analyzing what is being stored as a BLOB and using the 
correct Java data type for it. If it's a serialized Object, then use an Object 
data type. If it's a byte array that holds the contents of a file, then use a 
byte array, and so on. I imagine such an analysis would break down in the Content 
component, where the Java data type might be unknown.
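To illustrate the distinction between the two cases, here's a hedged sketch using 
plain JDK streams; the helper names and payloads are illustrative, not OFBiz code:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

public class BlobTypeDemo {
    // Case 1: the BLOB holds a serialized Object, so an Object data type fits
    // and the bytes only make sense after deserialization.
    static byte[] serialize(Object obj) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        }
        return bos.toByteArray();
    }

    static Object deserialize(byte[] bytes) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] blob = serialize("some state"); // hypothetical payload
        System.out.println(deserialize(blob)); // some state

        // Case 2: the BLOB holds raw file contents, so a plain byte[] is the
        // right type and the bytes are used as-is.
        byte[] fileBytes = "file contents".getBytes("UTF-8");
        System.out.println(fileBytes.length); // 13
    }
}
```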

I know this doesn't answer your question, I'm just sharing things I've learned 
and thoughts I've had on the subject.

-Adrian



