Hi
That was my first thought, too: Nothing prevents the Binary implementation from
checking whether the InputStream is a FileInputStream and then accessing the
FileChannel from it.
In the concrete case of Sling, the Sling RequestParameter.getInputStream()
happens to call the Commons Upload
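The check Felix describes could be sketched roughly as follows. This is a minimal illustration, not Oak code; the helper name `channelOf` is made up for the example, and a real DataStore would use the returned channel for an efficient copy (e.g. `FileChannel.transferTo`) instead of reading the stream byte by byte.

```java
import java.io.ByteArrayInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;

public class BinaryTransfer {

    // Hypothetical helper: if the stream is file-backed, expose its
    // FileChannel so the caller can transfer the content efficiently;
    // otherwise return null and fall back to plain stream copying.
    static FileChannel channelOf(InputStream in) {
        if (in instanceof FileInputStream) {
            return ((FileInputStream) in).getChannel();
        }
        return null;
    }

    public static void main(String[] args) throws IOException {
        InputStream plain = new ByteArrayInputStream(new byte[] {1, 2, 3});
        System.out.println(channelOf(plain) == null);    // plain stream: no channel

        Path tmp = Files.createTempFile("oak-demo", ".bin");
        try (FileInputStream fin = new FileInputStream(tmp.toFile())) {
            System.out.println(channelOf(fin) != null);  // file-backed: channel available
        }
        Files.delete(tmp);
    }
}
```

As the follow-ups note, this only works when the FileInputStream reaches the Binary implementation unwrapped; any decorating layer in between defeats the instanceof check.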
Hi,
On Tue, Feb 18, 2014 at 2:25 AM, Chetan Mehrotra
chetan.mehro...@gmail.com wrote:
If we can have a way to create JCR Binary implementations which
enable the DataStore/BlobStore to efficiently transfer content, then that
would help.
ValueFactory.createBinary(InputStream stream)
The problem
Hi,
On Tue, Feb 18, 2014 at 3:50 AM, Felix Meschberger fmesc...@adobe.com wrote:
That was my first thought, too: Nothing prevents the Binary implementation
from checking
whether the InputStream is a FileInputStream and then accessing the FileChannel
from it.
Right, but then there's no easy
Hi,
On 18 February 2014 08:50, Felix Meschberger fmesc...@adobe.com wrote:
Hi
That was my first thought, too: Nothing prevents the Binary implementation
from checking whether the InputStream is a FileInputStream and then accessing
the FileChannel from it.
In the concrete case of Sling, the
On Tue, Feb 18, 2014 at 2:32 PM, Jukka Zitting jukka.zitt...@gmail.com wrote:
Good point. That use case would probably be best handled with a
specific InputStream subclass like suggested by Felix for files.
That mode is fine for cases like FileInputStream, but not for cases like
S3 where
hi
since we will sooner or later approach an official 1.0 release, i
would like us to invest a couple of hours thinking about maturing our
code base.
apart from providing missing functionality and working on scalability and
performance (which is mostly covered by JIRA issues), we may also
On Tue, Feb 18, 2014 at 3:48 PM, Jukka Zitting jukka.zitt...@gmail.com wrote:
Something like S3InputStream.getURL() should work just fine for that use case:
That would also work, with the caveat that the layers in between must not
decorate the InputStream in any form.
Chetan Mehrotra
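The `S3InputStream.getURL()` idea from the thread could look something like the sketch below. Note this is hypothetical: the class name comes from Jukka's suggestion, but the body here is a stand-in (a ByteArrayInputStream rather than a real S3 object stream) purely to make the shape of the API concrete.

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.net.MalformedURLException;
import java.net.URL;

// Hypothetical subclass, as suggested on the thread: a stream that also
// advertises where its content can be fetched directly (e.g. from S3),
// letting the DataStore hand off the transfer instead of copying bytes.
class S3InputStream extends ByteArrayInputStream {
    private final URL url;

    S3InputStream(byte[] data, URL url) {
        super(data); // stand-in for the real S3 object stream
        this.url = url;
    }

    URL getURL() {
        return url;
    }
}

public class S3Demo {
    public static void main(String[] args) throws MalformedURLException {
        InputStream in = new S3InputStream(new byte[0],
                new URL("http://example.com/blob"));
        // Chetan's caveat: this only works if no intermediate layer wraps
        // the stream, since the instanceof check would then no longer
        // recognize the S3InputStream underneath.
        if (in instanceof S3InputStream) {
            System.out.println(((S3InputStream) in).getURL());
        }
    }
}
```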
On 18.2.14 12:03 , Angela Schreiber wrote:
so, unless there is strong opposition to doing a bit of cleanup (some
may think it's still too early), i would like us to come up with some
improvements and ideas in dedicated issues or discussions on the list.
+1
Additionally we should go through
hi
i looked at the dependencies listed with the oak-core module
and was wondering why it depends on a specific mk implementation,
while i was assuming that oak-core should be independent of the
mk/node store implementation being used.
looking at it in more detail revealed that the dependency
On 18.2.14 12:03 , Angela Schreiber wrote:
so, unless there is strong opposition to doing a bit of cleanup (some
may think it's still too early), i would like us to come up with some
improvements and ideas in dedicated issues or discussions on the list.
Another point that came up over lunch
hi michael
sounds good to me.
regards
angela
On 18/02/14 14:17, Michael Dürig mdue...@apache.org wrote:
On 18.2.14 12:03 , Angela Schreiber wrote:
so, unless there is strong opposition to doing a bit of cleanup (some
may think it's still too early), i would like us to come up with some
Hi,
On Tue, Feb 18, 2014 at 7:14 AM, Angela Schreiber anch...@adobe.com wrote:
variant b:
only move the json and utility related code to oak-commons but create
a new dedicated module for the blob store related code. while this would
look more natural to me, i know that adding a new module has
hi jukka
On Tue, Feb 18, 2014 at 7:14 AM, Angela Schreiber anch...@adobe.com
wrote:
variant b:
only move the json and utility related code to oak-commons but create
a new dedicated module for the blob store related code. while this would
look more natural to me, i know that adding a new module
hi michael
Additionally we should go through the TODO/FIXME annotations and
- remove the ones that are obsolete,
- update/clarify the ones that are outdated/unclear,
- create issues for the ones we deem necessary and add the issue
reference to the TODO/FIXME.
right now we have 473 TODO/FIXMEs
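A count like the 473 above can be reproduced with a small scan over the sources. This is a minimal sketch, not the tool actually used on the thread; it just counts TODO/FIXME markers in a string of source text, and a walk over the tree (e.g. `Files.walk`) would apply it per file.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TodoCount {

    // Count TODO/FIXME markers in a chunk of source text.
    static long count(String source) {
        Matcher m = Pattern.compile("TODO|FIXME").matcher(source);
        long n = 0;
        while (m.find()) {
            n++;
        }
        return n;
    }

    public static void main(String[] args) {
        String sample = "// TODO clarify\n// FIXME remove\nint x; // TODO file issue\n";
        System.out.println(count(sample)); // prints 3
    }
}
```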
Hi,
On Tue, Feb 18, 2014 at 9:49 AM, Angela Schreiber anch...@adobe.com wrote:
in the regular code i got rid of the dependencies, but there are
still quite a few tests left that currently have a dependency on
oak-mk, namely the MicroKernelImpl. it would feel wrong to change that
just for the
hi jukka
that would be fine with me.
so, i would suggest we adjust the various test setup before starting
with the refactoring.
i will create a jira issue that allows us to track the progress of this
dependency cleanup.
kind regards
angela
On 18/02/14 16:55, Jukka Zitting
https://issues.apache.org/jira/browse/OAK-1434
On 18/02/14 17:00, Angela Schreiber anch...@adobe.com wrote:
hi jukka
that would be fine with me.
so, i would suggest we adjust the various test setup before starting
with the refactoring.
i will create a jira issue that allows us to track the
hi,
In CRX we solved this problem with CRX providing the blob factory,
i.e. CRX already creates the appropriate structures for exactly the
upload case, the multipart handler then just uses the blob for writing
the output, too. Unfortunately we never got this into the JCR spec
(IIRC).
Also,
The Buildbot has detected a new failure on builder oak-trunk while building ASF
Buildbot.
Full details are available at:
http://ci.apache.org/builders/oak-trunk/builds/4411
Buildbot URL: http://ci.apache.org/
Buildslave for this Build: osiris_ubuntu
Build Reason: scheduler
Build Source Stamp:
The Buildbot has detected a restored build on builder oak-trunk while building
ASF Buildbot.
Full details are available at:
http://ci.apache.org/builders/oak-trunk/builds/4413
Buildbot URL: http://ci.apache.org/
Buildslave for this Build: osiris_ubuntu
Build Reason: scheduler
Build Source
A candidate for the Jackrabbit Oak 0.17 release is available at:
https://dist.apache.org/repos/dist/dev/jackrabbit/oak/0.17/
The release candidate is a zip archive of the sources in:
https://svn.apache.org/repos/asf/jackrabbit/oak/tags/jackrabbit-oak-0.17/
The SHA1 checksum of the archive
Hi,
On Tue, Feb 18, 2014 at 6:16 AM, Chetan Mehrotra
chetan.mehro...@gmail.com wrote:
That would also work, with the caveat that the layers in between must not
decorate the InputStream in any form.
Right, good point. At the moment we don't decorate streams within Oak,
and I don't see any big reasons
Hi,
On Tue, Feb 18, 2014 at 1:25 PM, Michael Dürig mdue...@apache.org wrote:
The check fails for me due to a missing license header:
Unapproved licenses:
Build Update for apache/jackrabbit-oak
-
Build: #3398
Status: Passed
Duration: 2541 seconds
Commit: dcde7bfa6491e2141d4f323e2dba5c77d67b7c5f (trunk)
Author: Tobias Bocanegra
Message: @trivial fix license checks
git-svn-id:
Build Update for apache/jackrabbit-oak
-
Build: #3399
Status: Passed
Duration: 2702 seconds
Commit: 5400e18e5feefef3d74beb11f80ff77bc9faa644 (trunk)
Author: Tobias Bocanegra
Message: @trivial remove unused export annotations
git-svn-id: