[GH] (jackrabbit-oak): Workflow run "SonarCloud" failed!

2024-06-17 Thread GitBox


The GitHub Actions job "SonarCloud" on jackrabbit-oak.git has failed.
Run started by GitHub user nit0906 (triggered by nit0906).

Head commit for run:
cf51f98e999c82bca2e456c58e58dd77c0903d09 / nit0906 
Oak-10874 | Add jmx function for bringing forward a delayed async lane to a 
latest checkpoint. (#1522)

Report URL: https://github.com/apache/jackrabbit-oak/actions/runs/9559223702

With regards,
GitHub Actions via GitBox



Re: [PR] Oak-10874 | Add jmx function for bringing forward a delayed async lane to a latest checkpoint. [jackrabbit-oak]

2024-06-17 Thread via GitHub


nit0906 merged PR #1522:
URL: https://github.com/apache/jackrabbit-oak/pull/1522


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GH] (jackrabbit-oak): Workflow run "SonarCloud" is working again!

2024-06-17 Thread GitBox


The GitHub Actions job "SonarCloud" on jackrabbit-oak.git has succeeded.
Run started by GitHub user nit0906 (triggered by nit0906).

Head commit for run:
752f1f19760f51d58c3dc35869d2283c99f1b88b / Nitin Gupta 
Make default checkpoint time to 100 days from 1000

Report URL: https://github.com/apache/jackrabbit-oak/actions/runs/9559217127

With regards,
GitHub Actions via GitBox



Re: [PR] Oak-10874 | Add jmx function for bringing forward a delayed async lane to a latest checkpoint. [jackrabbit-oak]

2024-06-17 Thread via GitHub


nit0906 commented on code in PR #1522:
URL: https://github.com/apache/jackrabbit-oak/pull/1522#discussion_r1643657775


##
oak-core/src/main/java/org/apache/jackrabbit/oak/plugins/index/AsyncIndexUpdate.java:
##
@@ -1242,6 +1242,62 @@ public String getReferenceCheckpoint() {
         return referenceCp;
     }
 
+    @Override
+    public String forceIndexLaneCatchup(String confirmMessage) throws CommitFailedException {
+
+        if (!"CONFIRM".equals(confirmMessage)) {
+            String msg = "Please confirm that you want to force the lane catch-up by passing 'CONFIRM' as argument";
+            log.warn(msg);
+            return msg;
+        }
+
+        if (!this.isFailing()) {
+            String msg = "The lane is not failing. This operation should only be performed if the lane is failing, it should first be allowed to catch up on its own.";
+            log.warn(msg);
+            return msg;
+        }
+
+        try {
+            log.info("Running a forced catch-up for indexing lane [{}]. ", name);
+            // First we need to abort and pause the running indexing task
+            this.abortAndPause();
+            log.info("Aborted and paused async indexing for lane [{}]", name);
+            // Release lease for the paused lane
+            this.releaseLeaseForPausedLane();
+            log.info("Released lease for paused lane [{}]", name);
+            String newReferenceCheckpoint = store.checkpoint(lifetime, Map.of(

Review Comment:
   done.
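
   (For context, not part of the PR: once deployed, the new operation can be called from any JMX client. The sketch below only takes the forceIndexLaneCatchup(String) signature and the "CONFIRM" argument from the hunk above; the JMX service URL and the ObjectName are assumptions and need to be checked against the actual registered IndexStats MBean.)

   import javax.management.MBeanServerConnection;
   import javax.management.ObjectName;
   import javax.management.remote.JMXConnector;
   import javax.management.remote.JMXConnectorFactory;
   import javax.management.remote.JMXServiceURL;

   public class ForceLaneCatchup {
       public static void main(String[] args) throws Exception {
           // assumed JMX endpoint of the Oak instance
           JMXServiceURL url = new JMXServiceURL("service:jmx:rmi:///jndi/rmi://localhost:9010/jmxrmi");
           JMXConnector connector = JMXConnectorFactory.connect(url);
           try {
               MBeanServerConnection mbs = connector.getMBeanServerConnection();
               // assumed ObjectName; look up the IndexStats MBean of the async lane in your setup
               ObjectName indexStats = new ObjectName("org.apache.jackrabbit.oak:type=IndexStats,name=async");
               // operation name and the "CONFIRM" argument come from the code under review
               Object result = mbs.invoke(indexStats, "forceIndexLaneCatchup",
                       new Object[] { "CONFIRM" }, new String[] { String.class.getName() });
               System.out.println(result);
           } finally {
               connector.close();
           }
       }
   }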



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: The import org.apache.jackrabbit.guava cannot be resolved

2024-06-17 Thread Raffaele Gambelli
OK Julian, it was enough to close the oak-shaded-guava project; there is no need for the exclusion.

Thanks again

Cordiali saluti / Best regards,

Raffaele Gambelli
Senior Java Developer
E  raffaele.gambe...@cegeka.com

[CEGEKA]Via Ettore Cristoni, 84
IT-40033 Bologna (IT), Italy
T +39 02 2544271
WWW.CEGEKA.COM

[http://signature.cegeka.com/SignatureRO/bannerRO.jpg]



From: Raffaele Gambelli 
Sent: Monday, June 17, 2024 6:03 PM
To: oak-dev@jackrabbit.apache.org 
Subject: Re: The import org.apache.jackrabbit.guava cannot be resolved

Thanks Julian, but unfortunately it didn't work:

I had already built oak-shaded-guava and everything else from the command line; anyway, I rebuilt oak-shaded-guava, and in its target folder I have its 1.64.0 jar. I then added the exclusion to the pom of my very simple hello-world project:

   <dependency>
       <groupId>org.apache.jackrabbit</groupId>
       <artifactId>oak-core</artifactId>
       <version>1.64.0</version>
       <exclusions>
           <exclusion>
               <groupId>org.apache.jackrabbit</groupId>
               <artifactId>oak-shaded-guava</artifactId>
           </exclusion>
       </exclusions>
   </dependency>
   <dependency>
       <groupId>org.apache.jackrabbit</groupId>
       <artifactId>oak-jcr</artifactId>
       <version>1.64.0</version>
       <exclusions>
           <exclusion>
               <groupId>org.apache.jackrabbit</groupId>
               <artifactId>oak-shaded-guava</artifactId>
           </exclusion>
       </exclusions>
   </dependency>

but when I run it from Eclipse I get the same errors again:

Exception in thread "main" java.lang.NoClassDefFoundError: 
org/apache/jackrabbit/guava/common/base/Predicate
at org.apache.jackrabbit.oak.plugins.memory.MemoryNodeStore.<init>(MemoryNodeStore.java:73)
at it.cegeka.oak.Hello.main(Hello.java:24)
Caused by: java.lang.ClassNotFoundException: 
org.apache.jackrabbit.guava.common.base.Predicate
at 
java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641)
at 
java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:525)
... 2 more

If you have working code that builds in Eclipse, could you share it with me?

Cordiali saluti / Best regards,

Raffaele Gambelli
Senior Java Developer
E  raffaele.gambe...@cegeka.com

[CEGEKA]Via Ettore Cristoni, 84
IT-40033 Bologna (IT), Italy
T +39 02 2544271
WWW.CEGEKA.COM

[http://signature.cegeka.com/SignatureRO/bannerRO.jpg]



From: Julian Reschke 
Sent: Monday, June 17, 2024 5:34 PM
To: oak-dev@jackrabbit.apache.org 
Subject: Re: The import org.apache.jackrabbit.guava cannot be resolved

[You don't often get email from julian.resc...@gmx.de.invalid. Learn why this 
is important at https://aka.ms/LearnAboutSenderIdentification ]

Am 17.06.2024 um 17:28 schrieb Raffaele Gambelli:
> Hi all,
>
> I would like to start exploring oak, I've imported full project in Eclipse 
> STS but everywhere there is an import to org.apache.jackrabbit.guava I have 
> errors.

Eclipse's Maven support doesn't understand the shade plugin, so

1) you need to build oak-shaded-guava with maven from the command line, and

2) *exclude* that module when you import the project (or remove it once
you imported "everything").

> I have this kind of problem even if I create an hello-world project depending 
> on oak-core, when I run the application it says:
>
> Exception in thread "main" java.lang.NoClassDefFoundError: 
> org/apache/jackrabbit/guava/common/base/Predicate
> at org.apache.jackrabbit.oak.plugins.memory.MemoryNodeStore.<init>(MemoryNodeStore.java:73)
> at it.cegeka.oak.Hello.main(Hello.java:15)
> Caused by: java.lang.ClassNotFoundException: 
> org.apache.jackrabbit.guava.common.base.Predicate
> at 
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641)
> at 
> 

Re: The import org.apache.jackrabbit.guava cannot be resolved

2024-06-17 Thread Raffaele Gambelli
Thanks Julian, but unfortunately it didn't work:

I had already built oak-shaded-guava and everything else from the command line; anyway, I rebuilt oak-shaded-guava, and in its target folder I have its 1.64.0 jar. I then added the exclusion to the pom of my very simple hello-world project:

   <dependency>
       <groupId>org.apache.jackrabbit</groupId>
       <artifactId>oak-core</artifactId>
       <version>1.64.0</version>
       <exclusions>
           <exclusion>
               <groupId>org.apache.jackrabbit</groupId>
               <artifactId>oak-shaded-guava</artifactId>
           </exclusion>
       </exclusions>
   </dependency>
   <dependency>
       <groupId>org.apache.jackrabbit</groupId>
       <artifactId>oak-jcr</artifactId>
       <version>1.64.0</version>
       <exclusions>
           <exclusion>
               <groupId>org.apache.jackrabbit</groupId>
               <artifactId>oak-shaded-guava</artifactId>
           </exclusion>
       </exclusions>
   </dependency>

but when I run it from Eclipse I get the same errors again:

Exception in thread "main" java.lang.NoClassDefFoundError: 
org/apache/jackrabbit/guava/common/base/Predicate
at org.apache.jackrabbit.oak.plugins.memory.MemoryNodeStore.<init>(MemoryNodeStore.java:73)
at it.cegeka.oak.Hello.main(Hello.java:24)
Caused by: java.lang.ClassNotFoundException: 
org.apache.jackrabbit.guava.common.base.Predicate
at 
java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641)
at 
java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:525)
... 2 more

If you have working code that builds in Eclipse, could you share it with me?

Cordiali saluti / Best regards,

Raffaele Gambelli
Senior Java Developer
E  raffaele.gambe...@cegeka.com

[CEGEKA]Via Ettore Cristoni, 84
IT-40033 Bologna (IT), Italy
T +39 02 2544271
WWW.CEGEKA.COM

[http://signature.cegeka.com/SignatureRO/bannerRO.jpg]



From: Julian Reschke 
Sent: Monday, June 17, 2024 5:34 PM
To: oak-dev@jackrabbit.apache.org 
Subject: Re: The import org.apache.jackrabbit.guava cannot be resolved

[You don't often get email from julian.resc...@gmx.de.invalid. Learn why this 
is important at https://aka.ms/LearnAboutSenderIdentification ]

Am 17.06.2024 um 17:28 schrieb Raffaele Gambelli:
> Hi all,
>
> I would like to start exploring oak, I've imported full project in Eclipse 
> STS but everywhere there is an import to org.apache.jackrabbit.guava I have 
> errors.

Eclipse's Maven support doesn't understand the shade plugin, so

1) you need to build oak-shaded-guava with maven from the command line, and

2) *exclude* that module when you import the project (or remove it once
you imported "everything").

> I have this kind of problem even if I create an hello-world project depending 
> on oak-core, when I run the application it says:
>
> Exception in thread "main" java.lang.NoClassDefFoundError: 
> org/apache/jackrabbit/guava/common/base/Predicate
> at org.apache.jackrabbit.oak.plugins.memory.MemoryNodeStore.<init>(MemoryNodeStore.java:73)
> at it.cegeka.oak.Hello.main(Hello.java:15)
> Caused by: java.lang.ClassNotFoundException: 
> org.apache.jackrabbit.guava.common.base.Predicate
> at 
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641)
> at 
> java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
> at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:525)
> ... 2 more
>
> I cannot find any class named 
> org.apache.jackrabbit.guava.common.base.Predicates, could you help me?

It's generated using the "shade" plugin within oak-shaded-guava.

> If I run mvn clean install or run my hello world main class from command line 
> it works, but I would like run it in eclipse to debug.
>
> Thanks
>
> Cordiali saluti / Best regards,
>
> Raffaele Gambelli
> Senior Java Developer
> E  raffaele.gambe...@cegeka.com
>
> [CEGEKA]Via Ettore Cristoni, 84
> IT-40033 Bologna (IT), Italy
> T +39 02 2544271
> WWW.CEGEKA.COM
>
> 

[GH] (jackrabbit-oak): Workflow run "SonarCloud" failed!

2024-06-17 Thread GitBox


The GitHub Actions job "SonarCloud" on jackrabbit-oak.git has failed.
Run started by GitHub user reschke (triggered by reschke).

Head commit for run:
95e2c03ac942c6fe71b006bb258f4788a628ad6f / Julian Reschke 
OAK-10882; fix ml target for oak-commits

Report URL: https://github.com/apache/jackrabbit-oak/actions/runs/9550138128

With regards,
GitHub Actions via GitBox



Re: The import org.apache.jackrabbit.guava cannot be resolved

2024-06-17 Thread Julian Reschke

Am 17.06.2024 um 17:28 schrieb Raffaele Gambelli:

Hi all,

I would like to start exploring oak, I've imported full project in Eclipse STS 
but everywhere there is an import to org.apache.jackrabbit.guava I have errors.


Eclipse's Maven support doesn't understand the shade plugin, so

1) you need to build oak-shaded-guava with maven from the command line, and

2) *exclude* that module when you import the project (or remove it once
you imported "everything").


I have this kind of problem even if I create an hello-world project depending 
on oak-core, when I run the application it says:

Exception in thread "main" java.lang.NoClassDefFoundError: 
org/apache/jackrabbit/guava/common/base/Predicate
at org.apache.jackrabbit.oak.plugins.memory.MemoryNodeStore.<init>(MemoryNodeStore.java:73)
at it.cegeka.oak.Hello.main(Hello.java:15)
Caused by: java.lang.ClassNotFoundException: 
org.apache.jackrabbit.guava.common.base.Predicate
at 
java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641)
at 
java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:525)
... 2 more

I cannot find any class named 
org.apache.jackrabbit.guava.common.base.Predicates, could you help me?


It's generated using the "shade" plugin within oak-shaded-guava.


If I run mvn clean install or run my hello world main class from command line 
it works, but I would like run it in eclipse to debug.

Thanks

Cordiali saluti / Best regards,

Raffaele Gambelli
Senior Java Developer
E  raffaele.gambe...@cegeka.com

[CEGEKA]Via Ettore Cristoni, 84
IT-40033 Bologna (IT), Italy
T +39 02 2544271
WWW.CEGEKA.COM

[http://signature.cegeka.com/SignatureRO/bannerRO.jpg]


Best regards, Julian


The import org.apache.jackrabbit.guava cannot be resolved

2024-06-17 Thread Raffaele Gambelli
Hi all,

I would like to start exploring Oak. I've imported the full project into Eclipse STS, but everywhere there is an import of org.apache.jackrabbit.guava I get errors.
I have the same kind of problem even if I create a hello-world project depending on oak-core; when I run the application it says:

Exception in thread "main" java.lang.NoClassDefFoundError: 
org/apache/jackrabbit/guava/common/base/Predicate
at org.apache.jackrabbit.oak.plugins.memory.MemoryNodeStore.<init>(MemoryNodeStore.java:73)
at it.cegeka.oak.Hello.main(Hello.java:15)
Caused by: java.lang.ClassNotFoundException: 
org.apache.jackrabbit.guava.common.base.Predicate
at 
java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641)
at 
java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:525)
... 2 more

I cannot find any class named 
org.apache.jackrabbit.guava.common.base.Predicates, could you help me?

If I run mvn clean install or run my hello-world main class from the command line it works, but I would like to run it in Eclipse to debug.
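
(For reference: a minimal main class along the lines below is enough to hit the shaded Predicate class. The it.cegeka.oak.Hello name comes from the stack trace above; the body is only an illustrative sketch using the standard Oak/JCR bootstrap, not the actual code.)

package it.cegeka.oak;

import javax.jcr.Repository;
import javax.jcr.Session;
import javax.jcr.SimpleCredentials;

import org.apache.jackrabbit.oak.Oak;
import org.apache.jackrabbit.oak.jcr.Jcr;
import org.apache.jackrabbit.oak.plugins.memory.MemoryNodeStore;

public class Hello {
    public static void main(String[] args) throws Exception {
        // per the stack trace, constructing the in-memory store is what triggers
        // loading of org.apache.jackrabbit.guava.common.base.Predicate
        MemoryNodeStore store = new MemoryNodeStore();
        Repository repository = new Jcr(new Oak(store)).createRepository();
        Session session = repository.login(new SimpleCredentials("admin", "admin".toCharArray()));
        System.out.println("Logged in as " + session.getUserID());
        session.logout();
    }
}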

Thanks

Cordiali saluti / Best regards,

Raffaele Gambelli
Senior Java Developer
E  raffaele.gambe...@cegeka.com

[CEGEKA]Via Ettore Cristoni, 84
IT-40033 Bologna (IT), Italy
T +39 02 2544271
WWW.CEGEKA.COM

[http://signature.cegeka.com/SignatureRO/bannerRO.jpg]




(jackrabbit-oak) branch trunk updated: OAK-10882; fix ml target for oak-commits

2024-06-17 Thread reschke
This is an automated email from the ASF dual-hosted git repository.

reschke pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/jackrabbit-oak.git


The following commit(s) were added to refs/heads/trunk by this push:
 new 95e2c03ac9 OAK-10882; fix ml target for oak-commits
95e2c03ac9 is described below

commit 95e2c03ac942c6fe71b006bb258f4788a628ad6f
Author: Julian Reschke 
AuthorDate: Mon Jun 17 17:08:29 2024 +0200

OAK-10882; fix ml target for oak-commits
---
 .asf.yaml | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/.asf.yaml b/.asf.yaml
index 86857d3332..f9422a3bb7 100644
--- a/.asf.yaml
+++ b/.asf.yaml
@@ -15,7 +15,7 @@
 
 # https://cwiki.apache.org/confluence/x/7guYBw
 notifications:
-  commits: oak-dev@jackrabbit.apache.org
+  commits: oak-comm...@jackrabbit.apache.org
   issues: oak-dev@jackrabbit.apache.org
   pullrequests: oak-dev@jackrabbit.apache.org
   jobs: oak-dev@jackrabbit.apache.org
@@ -34,4 +34,4 @@ github:
 - JCR
 - JCRVLT
 - SLING
-- FELIX
\ No newline at end of file
+- FELIX



[GH] (jackrabbit-oak): Workflow run "SonarCloud" is working again!

2024-06-17 Thread GitBox


The GitHub Actions job "SonarCloud" on jackrabbit-oak.git has succeeded.
Run started by GitHub user nfsantos (triggered by nfsantos).

Head commit for run:
a1d193cf78c454e19e9a322df5cb6a715fa5e41c / Nuno Santos 
Merge remote-tracking branch 'upstream/trunk' into OAK-10894

Report URL: https://github.com/apache/jackrabbit-oak/actions/runs/9550101884

With regards,
GitHub Actions via GitBox



[GH] (jackrabbit-oak): Workflow run "SonarCloud" failed!

2024-06-17 Thread GitBox


The GitHub Actions job "SonarCloud" on jackrabbit-oak.git has failed.
Run started by GitHub user reschke (triggered by reschke).

Head commit for run:
a159cfa979be8e8fd6441eaa68af3159f5999f21 / Julian Reschke 
OAK-10691: remove use of Guava Charsets class (#1538)

Report URL: https://github.com/apache/jackrabbit-oak/actions/runs/9549912970

With regards,
GitHub Actions via GitBox



Re: [PR] Issue/oak 8848 [jackrabbit-oak]

2024-06-17 Thread via GitHub


shodaaan commented on code in PR #1474:
URL: https://github.com/apache/jackrabbit-oak/pull/1474#discussion_r1642962427


##
oak-core/src/main/java/org/apache/jackrabbit/oak/plugins/version/VersionEditor.java:
##
@@ -145,27 +146,71 @@ public void propertyChanged(PropertyState before, PropertyState after)
             return;
         }
         String propName = after.getName();
+
+        // Updates the checked-out / checked-in state of the currently processed node when
+        // the JCR_ISCHECKEDOUT property change is processed.
         if (propName.equals(JCR_ISCHECKEDOUT)) {
             if (wasCheckedIn()) {
                 vMgr.checkout(node);
             } else {
                 vMgr.checkin(node);
             }
         } else if (propName.equals(JCR_BASEVERSION)) {
-            String baseVersion = after.getValue(Type.REFERENCE);
-            if (baseVersion.startsWith(RESTORE_PREFIX)) {
-                baseVersion = baseVersion.substring(RESTORE_PREFIX.length());
-                node.setProperty(JCR_BASEVERSION, baseVersion, Type.REFERENCE);
+
+            // Completes the restore of a version from version history.
+            //
+            // When the JCR_BASEVERSION property is processed, a check is made for the current
+            // base version property.
+            // If a restore is currently in progress for the current base version (the check for
+            // this is that the current base version name has the format "restore-[UUID of the
+            // version to restore to]"), then the restore is completed for the current node
+            // to the version specified by the UUID.
+            //
+            // If a node that was moved or copied to the location of a deleted node is currently
+            // being processed (see OAK-8848 for context), the restore operation must NOT be
+            // performed when the JCR_BASEVERSION property change is processed for the node.
+            if (!nodeWasMovedOrCopied()) {
+
+                String baseVersion = after.getValue(Type.REFERENCE);
+                if (baseVersion.startsWith(RESTORE_PREFIX)) {
+                    baseVersion = baseVersion.substring(RESTORE_PREFIX.length());
+                    node.setProperty(JCR_BASEVERSION, baseVersion, Type.REFERENCE);
+                }
+
+                vMgr.restore(node, baseVersion, null);
             }
-            vMgr.restore(node, baseVersion, null);
         } else if (isVersionProperty(after)) {
-            throwProtected(after.getName());
+            // Checks if a version property is being changed and throws a CommitFailedException
+            // with the message "Constraint Violation Exception" if this is not allowed.
+            // JCR_ISCHECKEDOUT and JCR_BASEVERSION properties should be ignored, since changes
+            // to them are allowed for specific use cases (for example, completing the check-in
+            // / check-out for a node or completing a node restore).
+            //
+            // The only situation when the update of a version property is allowed is when this
+            // occurs as a result of the current node being moved over a previously deleted node
+            // - see OAK-8848 for context.
+            //
+            // OAK-8848: moving a versionable node in the same location as a node deleted in the
+            // same session should be allowed.
+            // This check works because the only way that moving a node in a location is allowed
+            // is if there is no existing (undeleted) node in that location.
+            // Property comparison should not fail for two jcr:versionHistory properties in this case.
+            if (!nodeWasMovedOrCopied()) {
+                throwProtected(after.getName());
+            }
         } else if (isReadOnly && getOPV(after) != OnParentVersionAction.IGNORE) {
             throwCheckedIn("Cannot change property " + after.getName()
                     + " on checked in node");
         }
     }
 
+    /**
+     * Returns true if and only if the given node was moved or copied from another location.
+     */
+    private boolean nodeWasMovedOrCopied() {

Review Comment:
   I just tested the copy (instead of move) case, and it cannot be done in one session, because WorkspaceDelegate.copy throws a javax.jcr.ItemExistsException.
   Copy was not mentioned in the ticket requirement either; it was a wrong assumption on my part that copy would behave the same as move in this case.

   I will rename the check method to nodeWasMoved.
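
   (For context, not part of the PR: the OAK-8848 scenario discussed above can be written down with plain JCR calls. The paths and the way the Session is obtained are assumptions for illustration.)

   import javax.jcr.Node;
   import javax.jcr.Session;

   class Oak8848Scenario {
       static void moveOverDeletedNode(Session session) throws Exception {
           Node deleted = session.getNode("/content/a"); // assumed versionable node
           deleted.remove();                             // deleted in the same session
           session.move("/content/b", "/content/a");     // move another versionable node to the freed path
           // before this fix, the save failed with the "Constraint Violation Exception"
           // raised from VersionEditor.throwProtected(...)
           session.save();
       }
   }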



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GH] (jackrabbit-oak): Workflow run "SonarCloud" failed!

2024-06-17 Thread GitBox


The GitHub Actions job "SonarCloud" on jackrabbit-oak.git has failed.
Run started by GitHub user nfsantos (triggered by nfsantos).

Head commit for run:
759637eb3a7c6bf550227569e85c852a58d9762c / Nuno Santos 
OAK-10897 - Delete unused class: DocumentStoreSplitter (#1537)

Report URL: https://github.com/apache/jackrabbit-oak/actions/runs/9549731873

With regards,
GitHub Actions via GitBox



(jackrabbit-oak) branch OAK-10691 deleted (was 654e60d397)

2024-06-17 Thread reschke
This is an automated email from the ASF dual-hosted git repository.

reschke pushed a change to branch OAK-10691
in repository https://gitbox.apache.org/repos/asf/jackrabbit-oak.git


 was 654e60d397 OAK-10691: remove use of Guava Charsets class

The revisions that were on this branch are still contained in
other references; therefore, this change does not discard any commits
from the repository.



(jackrabbit-oak) branch trunk updated: OAK-10691: remove use of Guava Charsets class (#1538)

2024-06-17 Thread reschke
This is an automated email from the ASF dual-hosted git repository.

reschke pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/jackrabbit-oak.git


The following commit(s) were added to refs/heads/trunk by this push:
 new a159cfa979 OAK-10691: remove use of Guava Charsets class (#1538)
a159cfa979 is described below

commit a159cfa979be8e8fd6441eaa68af3159f5999f21
Author: Julian Reschke 
AuthorDate: Mon Jun 17 16:54:01 2024 +0200

OAK-10691: remove use of Guava Charsets class (#1538)
---
 .../jackrabbit/oak/scalability/ScalabilityRunner.java|  4 ++--
 .../cloud/azure/blobstorage/AzureBlobStoreBackend.java   |  8 
 .../oak/plugins/blob/MarkSweepGarbageCollector.java  |  8 
 .../oak/plugins/blob/datastore/BlobIdTracker.java|  4 ++--
 .../oak/plugins/blob/datastore/OakFileDataStore.java |  4 ++--
 .../directaccess/DataRecordDownloadOptions.java  |  3 +--
 .../oak/plugins/blob/ConsolidatedDataStoreStatsTest.java |  4 ++--
 .../oak/plugins/blob/UploadStagingCacheTest.java |  4 ++--
 .../AbstractDataRecordAccessProviderTest.java|  6 +++---
 .../directaccess/DataRecordDownloadOptionsTest.java  |  6 +++---
 .../plugins/blob/serializer/FSBlobSerializerTest.java|  4 ++--
 .../jackrabbit/oak/spi/blob/AbstractBlobStore.java   |  4 ++--
 .../apache/jackrabbit/oak/spi/blob/split/BlobIdSet.java  |  4 ++--
 .../oak/plugins/index/datastore/DataStoreTextWriter.java | 10 +-
 .../plugins/index/importer/IndexDefinitionUpdater.java   |  4 ++--
 .../oak/plugins/index/property/PropertyIndexUtil.java|  4 ++--
 .../oak/plugins/nodetype/write/NodeTypeRegistry.java |  4 ++--
 .../oak/plugins/index/importer/IndexImporterTest.java|  6 +++---
 .../apache/jackrabbit/oak/http/HtmlRepresentation.java   |  4 ++--
 .../org/apache/jackrabbit/oak/jcr/TestContentLoader.java |  4 ++--
 .../directory/ActiveDeletedBlobCollectorFactory.java |  4 ++--
 .../index/lucene/directory/IndexRootDirectory.java   |  4 ++--
 .../plugins/index/lucene/LucenePropertyIndexTest.java|  6 +++---
 .../apache/jackrabbit/oak/run/osgi/ConfigTracker.java|  4 ++--
 .../indexer/document/flatfile/NodeStateEntrySorter.java  |  4 ++--
 .../indexer/document/flatfile/StateInBytesHolder.java|  6 +++---
 .../document/flatfile/TraverseWithSortStrategy.java  |  4 ++--
 .../jackrabbit/oak/exporter/NodeStateSerializer.java |  6 +++---
 .../oak/index/IndexConsistencyCheckPrinter.java  |  4 ++--
 .../oak/plugins/tika/CSVFileBinaryResourceProvider.java  |  4 ++--
 .../jackrabbit/oak/plugins/tika/CSVFileGenerator.java|  4 ++--
 .../jackrabbit/oak/plugins/tika/TextPopulator.java   |  4 ++--
 .../apache/jackrabbit/oak/run/DataStoreCheckCommand.java |  8 
 .../org/apache/jackrabbit/oak/run/DataStoreCommand.java  |  8 
 .../jackrabbit/oak/exporter/NodeStateSerializerTest.java |  6 +++---
 .../java/org/apache/jackrabbit/oak/index/ReindexIT.java  |  4 ++--
 .../plugins/tika/CSVFileBinaryResourceProviderTest.java  |  6 +++---
 .../jackrabbit/oak/plugins/tika/TextPopulatorTest.java   |  4 ++--
 .../jackrabbit/oak/plugins/tika/TikaHelperTest.java  |  6 +++---
 .../apache/jackrabbit/oak/run/DataStoreCheckTest.java|  6 +++---
 .../jackrabbit/oak/segment/azure/AzureGCJournalFile.java |  4 ++--
 .../jackrabbit/oak/segment/DefaultSegmentWriter.java |  6 +++---
 .../java/org/apache/jackrabbit/oak/segment/Segment.java  |  4 ++--
 .../org/apache/jackrabbit/oak/segment/SegmentBlob.java   |  4 ++--
 .../jackrabbit/oak/segment/SegmentBufferWriter.java  |  4 ++--
 .../org/apache/jackrabbit/oak/segment/SegmentDump.java   |  4 ++--
 .../org/apache/jackrabbit/oak/segment/SegmentParser.java |  6 +++---
 .../org/apache/jackrabbit/oak/segment/SegmentStream.java |  6 +++---
 .../jackrabbit/oak/segment/data/SegmentDataV12.java  |  6 +++---
 .../jackrabbit/oak/segment/file/LocalGCJournalFile.java  |  6 +++---
 .../oak/segment/file/tar/SegmentTarManager.java  |  6 +++---
 .../oak/segment/file/tar/SegmentTarWriter.java   | 16 
 .../file/tar/binaries/BinaryReferencesIndexLoaderV1.java |  4 ++--
 .../file/tar/binaries/BinaryReferencesIndexLoaderV2.java |  4 ++--
 .../file/tar/binaries/BinaryReferencesIndexWriter.java   |  6 +++---
 .../standby/codec/GetReferencesResponseEncoder.java  |  5 +++--
 .../oak/segment/standby/codec/ResponseDecoder.java   |  8 
 .../jackrabbit/oak/segment/DefaultSegmentWriterTest.java | 10 +-
 .../jackrabbit/oak/segment/file/tar/TarFileTest.java |  8 
 .../jackrabbit/oak/segment/file/tar/TarWriterTest.java   |  4 ++--
 .../tar/binaries/BinaryReferencesIndexLoaderTest.java|  4 ++--
 .../tar/binaries/BinaryReferencesIndexLoaderV1Test.java  |  4 ++--
 .../tar/binaries/BinaryReferencesIndexLoaderV2Test.java  |  4 ++--
 .../jackrabbit/oak/segment/standby/StandbyTestUtils.java |  4 ++--
 

Re: [PR] OAK-10691: remove use of Guava Charsets class [jackrabbit-oak]

2024-06-17 Thread via GitHub


reschke merged PR #1538:
URL: https://github.com/apache/jackrabbit-oak/pull/1538


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] OAK-10897 - Delete unused class: DocumentStoreSplitter [jackrabbit-oak]

2024-06-17 Thread via GitHub


nfsantos merged PR #1537:
URL: https://github.com/apache/jackrabbit-oak/pull/1537


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



(jackrabbit-oak) branch trunk updated: OAK-10897 - Delete unused class: DocumentStoreSplitter (#1537)

2024-06-17 Thread nfsantos
This is an automated email from the ASF dual-hosted git repository.

nfsantos pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/jackrabbit-oak.git


The following commit(s) were added to refs/heads/trunk by this push:
 new 759637eb3a OAK-10897 - Delete unused class: DocumentStoreSplitter 
(#1537)
759637eb3a is described below

commit 759637eb3a7c6bf550227569e85c852a58d9762c
Author: Nuno Santos 
AuthorDate: Mon Jun 17 16:41:37 2024 +0200

OAK-10897 - Delete unused class: DocumentStoreSplitter (#1537)
---
 .../document/mongo/DocumentStoreSplitter.java  | 97 --
 1 file changed, 97 deletions(-)

diff --git 
a/oak-run-commons/src/main/java/org/apache/jackrabbit/oak/plugins/document/mongo/DocumentStoreSplitter.java
 
b/oak-run-commons/src/main/java/org/apache/jackrabbit/oak/plugins/document/mongo/DocumentStoreSplitter.java
deleted file mode 100644
index e5a2186c0a..00
--- 
a/oak-run-commons/src/main/java/org/apache/jackrabbit/oak/plugins/document/mongo/DocumentStoreSplitter.java
+++ /dev/null
@@ -1,97 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-package org.apache.jackrabbit.oak.plugins.document.mongo;
-
-import com.mongodb.BasicDBObject;
-import com.mongodb.client.MongoCollection;
-import org.apache.jackrabbit.oak.plugins.document.Collection;
-import org.apache.jackrabbit.oak.plugins.document.Document;
-import org.apache.jackrabbit.oak.plugins.document.NodeDocument;
-import org.bson.BsonDocument;
-import org.bson.BsonInt64;
-import org.bson.BsonNull;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.util.ArrayList;
-import java.util.Collections;
-import java.util.Iterator;
-import java.util.List;
-
-public class DocumentStoreSplitter {
-
-private static final Logger log = 
LoggerFactory.getLogger(DocumentStoreSplitter.class);
-
-MongoDocumentStore mongoStore;
-
-public DocumentStoreSplitter(MongoDocumentStore mongoStore) {
-this.mongoStore = mongoStore;
-}
-
-public <T extends Document> List<Long> split(Collection<T> collection, long modifiedSinceLowerLimit, int parts) {
-MongoCollection<BasicDBObject> dbCollection = mongoStore.getDBCollection(collection);
-BsonDocument query = new BsonDocument();
-query.append(NodeDocument.MODIFIED_IN_SECS, new 
BsonDocument().append("$ne", new BsonNull()));
-long oldest;
-Iterator<BasicDBObject> cursor;
-if (modifiedSinceLowerLimit <= 0) {
-cursor = dbCollection.find(query).sort(new 
BsonDocument(NodeDocument.MODIFIED_IN_SECS,
-new BsonInt64(1))).limit(1).iterator();
-if (!cursor.hasNext()) {
-return Collections.emptyList();
-}
-oldest = cursor.next().getLong(NodeDocument.MODIFIED_IN_SECS);
-} else {
-oldest = modifiedSinceLowerLimit;
-}
-cursor = dbCollection.find(query).sort(new 
BsonDocument(NodeDocument.MODIFIED_IN_SECS,
-new BsonInt64(-1))).limit(1).iterator();
-if (!cursor.hasNext()) {
-return Collections.emptyList();
-}
-long latest = cursor.next().getLong(NodeDocument.MODIFIED_IN_SECS);
-return simpleSplit(oldest, latest, parts);
-}
-
-public static List<Long> simpleSplit(long start, long end, int parts) {
-if (end < start) {
-throw new IllegalArgumentException("start(" + start + ") can't be 
greater than end (" + end + ")");
-}
-if (start == end) {
-return Collections.singletonList(start);
-}
-if (parts > end - start) {
-log.debug("Adjusting parts according to given range {} - {}", 
start, end);
-parts = (int)(end - start);
-}
-long stepSize = (end - start)/parts;
-List<Long> steps = new ArrayList<>();
-StringBuilder splitPoints = new StringBuilder();
-for (long i = start; i <= end; i+=stepSize) {
-steps.add(i);
-splitPoints.append(" ").append(i);
-}
-if (steps.size() > 0 && steps.get(steps.size() - 1) != end) {
-steps.add(end);
-splitPoints.append(" ").append(end);
-}
-log.info("Split points of 

Re: [PR] OAK-10803 -- compress/uncompress property [jackrabbit-oak]

2024-06-17 Thread via GitHub


ionutzpi commented on code in PR #1526:
URL: https://github.com/apache/jackrabbit-oak/pull/1526#discussion_r1642878592


##
oak-store-document/src/test/java/org/apache/jackrabbit/oak/plugins/document/DocumentPropertyStateTest.java:
##
@@ -81,4 +100,179 @@ public void multiValuedBinarySize() throws Exception {
 assertEquals(0, reads.size());
 }
 
-}
+@Test
+public void multiValuedAboveThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+List<Blob> blobs = newArrayList();
+for (int i = 0; i < 13; i++) {
+blobs.add(builder.createBlob(new RandomStream(BLOB_SIZE, i)));
+}
+builder.child(TEST_NODE).setProperty("p", blobs, Type.BINARIES);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.BINARIES, Objects.requireNonNull(p).getType());
+assertEquals(13, p.count());
+
+reads.clear();
+assertEquals(BLOB_SIZE, p.size(0));
+// must not read the blob via stream
+assertEquals(0, reads.size());
+}
+
+@Test
+public void stringBelowThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+builder.child(TEST_NODE).setProperty("p", "dummy", Type.STRING);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.STRING, Objects.requireNonNull(p).getType());
+assertEquals(1, p.count());
+
+reads.clear();
+assertEquals(5, p.size(0));
+// must not read the string via stream
+assertEquals(0, reads.size());
+}
+
+@Test
+public void stringAboveThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+builder.child(TEST_NODE).setProperty("p", STRING_HUGEVALUE, 
Type.STRING);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.STRING, Objects.requireNonNull(p).getType());
+assertEquals(1, p.count());
+
+reads.clear();
+assertEquals(10050, p.size(0));
+// must not read the string via streams
+assertEquals(0, reads.size());
+}
+
+@Test
+public void compressValueThrowsException() throws IOException, 
NoSuchFieldException, IllegalAccessException {
+DocumentNodeStore mockDocumentStore = mock(DocumentNodeStore.class);
+Compression mockCompression = mock(Compression.class);
+
when(mockCompression.getOutputStream(any(OutputStream.class))).thenThrow(new 
IOException("Compression failed"));
+
+Field compressionThreshold = 
DocumentPropertyState.class.getDeclaredField("DEFAULT_COMPRESSION_THRESHOLD");
+compressionThreshold.setAccessible(true);
+Field modifiersField = Field.class.getDeclaredField("modifiers");
+modifiersField.setAccessible(true);
+modifiersField.setInt(compressionThreshold, 
compressionThreshold.getModifiers() & ~Modifier.FINAL);
+
+compressionThreshold.set(null, DEFAULT_COMPRESSION_THRESHOLD);
+
+DocumentPropertyState documentPropertyState = new 
DocumentPropertyState(mockDocumentStore, "p", "\"" + STRING_HUGEVALUE + "\"", 
mockCompression);
+
+assertEquals(documentPropertyState.getValue(Type.STRING), 
STRING_HUGEVALUE);
+
+verify(mockCompression, 
times(1)).getOutputStream(any(OutputStream.class));
+
+compressionThreshold.set(null, -1);
+
+}
+
+@Test
+public void uncompressValueThrowsException() throws IOException, 
NoSuchFieldException, IllegalAccessException {
+
+DocumentNodeStore mockDocumentStore = mock(DocumentNodeStore.class);
+Compression mockCompression = mock(Compression.class);
+OutputStream mockOutputStream= mock(OutputStream.class);
+
when(mockCompression.getOutputStream(any(OutputStream.class))).thenReturn(mockOutputStream);
+
when(mockCompression.getInputStream(any(InputStream.class))).thenThrow(new 
IOException("Compression failed"));
+
+Field compressionThreshold = 
DocumentPropertyState.class.getDeclaredField("DEFAULT_COMPRESSION_THRESHOLD");
+compressionThreshold.setAccessible(true);
+Field modifiersField = Field.class.getDeclaredField("modifiers");
+modifiersField.setAccessible(true);
+modifiersField.setInt(compressionThreshold, 
compressionThreshold.getModifiers() & ~Modifier.FINAL);
+
+compressionThreshold.set(null, DEFAULT_COMPRESSION_THRESHOLD);
+
+DocumentPropertyState documentPropertyState = new 
DocumentPropertyState(mockDocumentStore, "p", STRING_HUGEVALUE, 
mockCompression);
+
+assertEquals(documentPropertyState.getValue(Type.STRING), "{}");
+
+verify(mockCompression, 
times(1)).getInputStream(any(InputStream.class));
+
+
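
(For context, not part of the PR: the tests above mock Compression.getOutputStream/getInputStream; the sketch below shows the plain round trip that the compression-threshold feature relies on. Compression.GZIP is assumed to be the implementation used; everything else is illustrative.)

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

import org.apache.jackrabbit.oak.commons.Compression;

public class CompressionRoundTrip {
    public static void main(String[] args) throws IOException {
        String value = "a large property value ...";
        Compression compression = Compression.GZIP;

        // compress
        ByteArrayOutputStream compressedBytes = new ByteArrayOutputStream();
        try (OutputStream out = compression.getOutputStream(compressedBytes)) {
            out.write(value.getBytes(StandardCharsets.UTF_8));
        }

        // uncompress
        ByteArrayOutputStream restoredBytes = new ByteArrayOutputStream();
        try (InputStream in = compression.getInputStream(new ByteArrayInputStream(compressedBytes.toByteArray()))) {
            byte[] buffer = new byte[8192];
            int n;
            while ((n = in.read(buffer)) != -1) {
                restoredBytes.write(buffer, 0, n);
            }
        }

        String restored = new String(restoredBytes.toByteArray(), StandardCharsets.UTF_8);
        System.out.println(value.equals(restored)); // expected: true
    }
}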

[PR] OAK-10896: - Add feature toggle for removal of deleted properties and orphaned nodes [jackrabbit-oak]

2024-06-17 Thread via GitHub


shodaaan opened a new pull request, #1539:
URL: https://github.com/apache/jackrabbit-oak/pull/1539

   - create class FullGCOptions which will hold the different parameters used for running full GC
   - added variables to Configuration.java for 2 full GC modes: GAP_ORPHANS and EMPTY_PROPERTIES
   - updated VersionGarbageCollector constructor, as well as DocumentNodeStore Builder / Service and DocumentNodeStore to get values received from Configuration via the FullGCOptions class

   - applied the new full GC mode options in VersionGarbageCollector via the fullGCOptions parameter passed in the constructor
   - added unit test methods for testing feature toggle and configuration options set in DocumentNodeStoreBuilder, based on existing testing of feature toggles and configuration settings in unit tests
   - fixed CommitBuilderTest test failures by changing expected exception type
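
   (For context, not part of this PR: Oak feature toggles are usually created via org.apache.jackrabbit.oak.spi.toggle.Feature and checked with isEnabled(). The toggle name and the wiring below are assumptions for illustration only; the actual toggle name and configuration keys are the ones defined in the PR itself.)

   import org.apache.jackrabbit.oak.spi.toggle.Feature;
   import org.apache.jackrabbit.oak.spi.whiteboard.DefaultWhiteboard;
   import org.apache.jackrabbit.oak.spi.whiteboard.Whiteboard;

   class FullGcToggleSketch {
       static void example() {
           Whiteboard whiteboard = new DefaultWhiteboard();
           // "FT_FULL_GC" is an assumed name, not the toggle introduced by the PR
           Feature fullGc = Feature.newFeature("FT_FULL_GC", whiteboard);
           if (fullGc.isEnabled()) {
               // run the garbage-collection modes gated by the toggle
           }
       }
   }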


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] OAK-10890 - Added log warning [jackrabbit-oak]

2024-06-17 Thread via GitHub


stefan-egli commented on PR #1535:
URL: https://github.com/apache/jackrabbit-oak/pull/1535#issuecomment-2173430866

   ok - just checking why the build fails - restarted it now at 
https://ci-builds.apache.org/job/Jackrabbit/job/oak-trunk-pr/job/PR-1535/3/


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GH] (jackrabbit-oak): Workflow run "SonarCloud" is working again!

2024-06-17 Thread GitBox


The GitHub Actions job "SonarCloud" on jackrabbit-oak.git has succeeded.
Run started by GitHub user ionutzpi (triggered by stefan-egli).

Head commit for run:
b642725cbf9181f2ebe3281f4423b9ea0a4e920c / pirlogea 
OAK-10890 - Added prefix to log warning

Report URL: https://github.com/apache/jackrabbit-oak/actions/runs/9547463727

With regards,
GitHub Actions via GitBox



[GH] (jackrabbit-oak): Workflow run "SonarCloud" failed!

2024-06-17 Thread GitBox


The GitHub Actions job "SonarCloud" on jackrabbit-oak.git has failed.
Run started by GitHub user reschke (triggered by reschke).

Head commit for run:
654e60d397523fb3dba4fdff35990b72a233bb2d / Julian Reschke 
OAK-10691: remove use of Guava Charsets class

Report URL: https://github.com/apache/jackrabbit-oak/actions/runs/9548334572

With regards,
GitHub Actions via GitBox



Re: [PR] OAK-10803 -- compress/uncompress property [jackrabbit-oak]

2024-06-17 Thread via GitHub


reschke commented on code in PR #1526:
URL: https://github.com/apache/jackrabbit-oak/pull/1526#discussion_r1642813805


##
oak-store-document/src/test/java/org/apache/jackrabbit/oak/plugins/document/DocumentPropertyStateTest.java:
##
@@ -81,4 +100,179 @@ public void multiValuedBinarySize() throws Exception {
 assertEquals(0, reads.size());
 }
 
-}
+@Test
+public void multiValuedAboveThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+List<Blob> blobs = newArrayList();
+for (int i = 0; i < 13; i++) {
+blobs.add(builder.createBlob(new RandomStream(BLOB_SIZE, i)));
+}
+builder.child(TEST_NODE).setProperty("p", blobs, Type.BINARIES);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.BINARIES, Objects.requireNonNull(p).getType());
+assertEquals(13, p.count());
+
+reads.clear();
+assertEquals(BLOB_SIZE, p.size(0));
+// must not read the blob via stream
+assertEquals(0, reads.size());
+}
+
+@Test
+public void stringBelowThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+builder.child(TEST_NODE).setProperty("p", "dummy", Type.STRING);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.STRING, Objects.requireNonNull(p).getType());
+assertEquals(1, p.count());
+
+reads.clear();
+assertEquals(5, p.size(0));
+// must not read the string via stream
+assertEquals(0, reads.size());
+}
+
+@Test
+public void stringAboveThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+builder.child(TEST_NODE).setProperty("p", STRING_HUGEVALUE, 
Type.STRING);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.STRING, Objects.requireNonNull(p).getType());
+assertEquals(1, p.count());
+
+reads.clear();
+assertEquals(10050, p.size(0));
+// must not read the string via streams
+assertEquals(0, reads.size());
+}
+
+@Test
+public void compressValueThrowsException() throws IOException, 
NoSuchFieldException, IllegalAccessException {
+DocumentNodeStore mockDocumentStore = mock(DocumentNodeStore.class);
+Compression mockCompression = mock(Compression.class);
+
when(mockCompression.getOutputStream(any(OutputStream.class))).thenThrow(new 
IOException("Compression failed"));
+
+Field compressionThreshold = 
DocumentPropertyState.class.getDeclaredField("DEFAULT_COMPRESSION_THRESHOLD");
+compressionThreshold.setAccessible(true);
+Field modifiersField = Field.class.getDeclaredField("modifiers");
+modifiersField.setAccessible(true);
+modifiersField.setInt(compressionThreshold, 
compressionThreshold.getModifiers() & ~Modifier.FINAL);
+
+compressionThreshold.set(null, DEFAULT_COMPRESSION_THRESHOLD);
+
+DocumentPropertyState documentPropertyState = new 
DocumentPropertyState(mockDocumentStore, "p", "\"" + STRING_HUGEVALUE + "\"", 
mockCompression);
+
+assertEquals(documentPropertyState.getValue(Type.STRING), 
STRING_HUGEVALUE);
+
+verify(mockCompression, 
times(1)).getOutputStream(any(OutputStream.class));
+
+compressionThreshold.set(null, -1);
+
+}
+
+@Test
+public void uncompressValueThrowsException() throws IOException, 
NoSuchFieldException, IllegalAccessException {
+
+DocumentNodeStore mockDocumentStore = mock(DocumentNodeStore.class);
+Compression mockCompression = mock(Compression.class);
+OutputStream mockOutputStream= mock(OutputStream.class);
+
when(mockCompression.getOutputStream(any(OutputStream.class))).thenReturn(mockOutputStream);
+
when(mockCompression.getInputStream(any(InputStream.class))).thenThrow(new 
IOException("Compression failed"));
+
+Field compressionThreshold = 
DocumentPropertyState.class.getDeclaredField("DEFAULT_COMPRESSION_THRESHOLD");
+compressionThreshold.setAccessible(true);
+Field modifiersField = Field.class.getDeclaredField("modifiers");
+modifiersField.setAccessible(true);
+modifiersField.setInt(compressionThreshold, 
compressionThreshold.getModifiers() & ~Modifier.FINAL);
+
+compressionThreshold.set(null, DEFAULT_COMPRESSION_THRESHOLD);
+
+DocumentPropertyState documentPropertyState = new 
DocumentPropertyState(mockDocumentStore, "p", STRING_HUGEVALUE, 
mockCompression);
+
+assertEquals(documentPropertyState.getValue(Type.STRING), "{}");
+
+verify(mockCompression, 
times(1)).getInputStream(any(InputStream.class));
+
+

[GH] (jackrabbit-oak): Workflow run "SonarCloud" is working again!

2024-06-17 Thread GitBox


The GitHub Actions job "SonarCloud" on jackrabbit-oak.git has succeeded.
Run started by GitHub user nfsantos (triggered by nfsantos).

Head commit for run:
10b4bbab0f252193d0d60087c052b46878ea62fc / Nuno Santos 
Delete unused class

Report URL: https://github.com/apache/jackrabbit-oak/actions/runs/9548324858

With regards,
GitHub Actions via GitBox



[PR] OAK-10691: remove use of Guava Charsets class [jackrabbit-oak]

2024-06-17 Thread via GitHub


reschke opened a new pull request, #1538:
URL: https://github.com/apache/jackrabbit-oak/pull/1538

   (no comment)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[PR] OAK-10897 - Delete unused class: DocumentStoreSplitter [jackrabbit-oak]

2024-06-17 Thread via GitHub


nfsantos opened a new pull request, #1537:
URL: https://github.com/apache/jackrabbit-oak/pull/1537

   (no comment)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GH] (jackrabbit-oak): Workflow run "SonarCloud" failed!

2024-06-17 Thread GitBox


The GitHub Actions job "SonarCloud" on jackrabbit-oak.git has failed.
Run started by GitHub user stefan-egli (triggered by stefan-egli).

Head commit for run:
bb4c359788723665fd5adc985fed87b74127c89a / stefan-egli 
OAK-10845 : exclude another flaky test combination

Report URL: https://github.com/apache/jackrabbit-oak/actions/runs/9547548656

With regards,
GitHub Actions via GitBox



Re: [PR] OAK-10803 -- compress/uncompress property [jackrabbit-oak]

2024-06-17 Thread via GitHub


ionutzpi commented on code in PR #1526:
URL: https://github.com/apache/jackrabbit-oak/pull/1526#discussion_r1642766461


##
oak-store-document/src/test/java/org/apache/jackrabbit/oak/plugins/document/DocumentPropertyStateTest.java:
##
@@ -81,4 +100,179 @@ public void multiValuedBinarySize() throws Exception {
 assertEquals(0, reads.size());
 }
 
-}
+@Test
+public void multiValuedAboveThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+List<Blob> blobs = newArrayList();
+for (int i = 0; i < 13; i++) {
+blobs.add(builder.createBlob(new RandomStream(BLOB_SIZE, i)));
+}
+builder.child(TEST_NODE).setProperty("p", blobs, Type.BINARIES);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.BINARIES, Objects.requireNonNull(p).getType());
+assertEquals(13, p.count());
+
+reads.clear();
+assertEquals(BLOB_SIZE, p.size(0));
+// must not read the blob via stream
+assertEquals(0, reads.size());
+}
+
+@Test
+public void stringBelowThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+builder.child(TEST_NODE).setProperty("p", "dummy", Type.STRING);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.STRING, Objects.requireNonNull(p).getType());
+assertEquals(1, p.count());
+
+reads.clear();
+assertEquals(5, p.size(0));
+// must not read the string via stream
+assertEquals(0, reads.size());
+}
+
+@Test
+public void stringAboveThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+builder.child(TEST_NODE).setProperty("p", STRING_HUGEVALUE, 
Type.STRING);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.STRING, Objects.requireNonNull(p).getType());
+assertEquals(1, p.count());
+
+reads.clear();
+assertEquals(10050, p.size(0));
+// must not read the string via streams
+assertEquals(0, reads.size());
+}
+
+@Test
+public void compressValueThrowsException() throws IOException, 
NoSuchFieldException, IllegalAccessException {
+DocumentNodeStore mockDocumentStore = mock(DocumentNodeStore.class);
+Compression mockCompression = mock(Compression.class);
+
when(mockCompression.getOutputStream(any(OutputStream.class))).thenThrow(new 
IOException("Compression failed"));
+
+Field compressionThreshold = 
DocumentPropertyState.class.getDeclaredField("DEFAULT_COMPRESSION_THRESHOLD");
+compressionThreshold.setAccessible(true);
+Field modifiersField = Field.class.getDeclaredField("modifiers");
+modifiersField.setAccessible(true);
+modifiersField.setInt(compressionThreshold, 
compressionThreshold.getModifiers() & ~Modifier.FINAL);
+
+compressionThreshold.set(null, DEFAULT_COMPRESSION_THRESHOLD);
+
+DocumentPropertyState documentPropertyState = new 
DocumentPropertyState(mockDocumentStore, "p", "\"" + STRING_HUGEVALUE + "\"", 
mockCompression);
+
+assertEquals(documentPropertyState.getValue(Type.STRING), 
STRING_HUGEVALUE);
+
+verify(mockCompression, 
times(1)).getOutputStream(any(OutputStream.class));
+
+compressionThreshold.set(null, -1);
+
+}
+
+@Test
+public void uncompressValueThrowsException() throws IOException, 
NoSuchFieldException, IllegalAccessException {
+
+DocumentNodeStore mockDocumentStore = mock(DocumentNodeStore.class);
+Compression mockCompression = mock(Compression.class);
+OutputStream mockOutputStream= mock(OutputStream.class);
+
when(mockCompression.getOutputStream(any(OutputStream.class))).thenReturn(mockOutputStream);
+
when(mockCompression.getInputStream(any(InputStream.class))).thenThrow(new 
IOException("Compression failed"));
+
+Field compressionThreshold = 
DocumentPropertyState.class.getDeclaredField("DEFAULT_COMPRESSION_THRESHOLD");
+compressionThreshold.setAccessible(true);
+Field modifiersField = Field.class.getDeclaredField("modifiers");
+modifiersField.setAccessible(true);
+modifiersField.setInt(compressionThreshold, 
compressionThreshold.getModifiers() & ~Modifier.FINAL);
+
+compressionThreshold.set(null, DEFAULT_COMPRESSION_THRESHOLD);
+
+DocumentPropertyState documentPropertyState = new 
DocumentPropertyState(mockDocumentStore, "p", STRING_HUGEVALUE, 
mockCompression);
+
+assertEquals(documentPropertyState.getValue(Type.STRING), "{}");
+
+verify(mockCompression, 
times(1)).getInputStream(any(InputStream.class));
+
+

(jackrabbit-oak) branch OAK-10691 deleted (was c1889bcc4e)

2024-06-17 Thread reschke
This is an automated email from the ASF dual-hosted git repository.

reschke pushed a change to branch OAK-10691
in repository https://gitbox.apache.org/repos/asf/jackrabbit-oak.git


 was c1889bcc4e OAK-10691: merge trunk

The revisions that were on this branch are still contained in
other references; therefore, this change does not discard any commits
from the repository.



(jackrabbit-oak) 01/01: OAK-10691: remove use of Guava Charsets class

2024-06-17 Thread reschke
This is an automated email from the ASF dual-hosted git repository.

reschke pushed a commit to branch OAK-10691
in repository https://gitbox.apache.org/repos/asf/jackrabbit-oak.git

commit 654e60d397523fb3dba4fdff35990b72a233bb2d
Author: Julian Reschke 
AuthorDate: Mon Jun 17 13:43:14 2024 +0100

OAK-10691: remove use of Guava Charsets class
---
 .../jackrabbit/oak/scalability/ScalabilityRunner.java|  4 ++--
 .../cloud/azure/blobstorage/AzureBlobStoreBackend.java   |  8 
 .../oak/plugins/blob/MarkSweepGarbageCollector.java  |  8 
 .../oak/plugins/blob/datastore/BlobIdTracker.java|  4 ++--
 .../oak/plugins/blob/datastore/OakFileDataStore.java |  4 ++--
 .../directaccess/DataRecordDownloadOptions.java  |  3 +--
 .../oak/plugins/blob/ConsolidatedDataStoreStatsTest.java |  4 ++--
 .../oak/plugins/blob/UploadStagingCacheTest.java |  4 ++--
 .../AbstractDataRecordAccessProviderTest.java|  6 +++---
 .../directaccess/DataRecordDownloadOptionsTest.java  |  6 +++---
 .../plugins/blob/serializer/FSBlobSerializerTest.java|  4 ++--
 .../jackrabbit/oak/spi/blob/AbstractBlobStore.java   |  4 ++--
 .../apache/jackrabbit/oak/spi/blob/split/BlobIdSet.java  |  4 ++--
 .../oak/plugins/index/datastore/DataStoreTextWriter.java | 10 +-
 .../plugins/index/importer/IndexDefinitionUpdater.java   |  4 ++--
 .../oak/plugins/index/property/PropertyIndexUtil.java|  4 ++--
 .../oak/plugins/nodetype/write/NodeTypeRegistry.java |  4 ++--
 .../oak/plugins/index/importer/IndexImporterTest.java|  6 +++---
 .../apache/jackrabbit/oak/http/HtmlRepresentation.java   |  4 ++--
 .../org/apache/jackrabbit/oak/jcr/TestContentLoader.java |  4 ++--
 .../directory/ActiveDeletedBlobCollectorFactory.java |  4 ++--
 .../index/lucene/directory/IndexRootDirectory.java   |  4 ++--
 .../plugins/index/lucene/LucenePropertyIndexTest.java|  6 +++---
 .../apache/jackrabbit/oak/run/osgi/ConfigTracker.java|  4 ++--
 .../indexer/document/flatfile/NodeStateEntrySorter.java  |  4 ++--
 .../indexer/document/flatfile/StateInBytesHolder.java|  6 +++---
 .../document/flatfile/TraverseWithSortStrategy.java  |  4 ++--
 .../jackrabbit/oak/exporter/NodeStateSerializer.java |  6 +++---
 .../oak/index/IndexConsistencyCheckPrinter.java  |  4 ++--
 .../oak/plugins/tika/CSVFileBinaryResourceProvider.java  |  4 ++--
 .../jackrabbit/oak/plugins/tika/CSVFileGenerator.java|  4 ++--
 .../jackrabbit/oak/plugins/tika/TextPopulator.java   |  4 ++--
 .../apache/jackrabbit/oak/run/DataStoreCheckCommand.java |  8 
 .../org/apache/jackrabbit/oak/run/DataStoreCommand.java  |  8 
 .../jackrabbit/oak/exporter/NodeStateSerializerTest.java |  6 +++---
 .../java/org/apache/jackrabbit/oak/index/ReindexIT.java  |  4 ++--
 .../plugins/tika/CSVFileBinaryResourceProviderTest.java  |  6 +++---
 .../jackrabbit/oak/plugins/tika/TextPopulatorTest.java   |  4 ++--
 .../jackrabbit/oak/plugins/tika/TikaHelperTest.java  |  6 +++---
 .../apache/jackrabbit/oak/run/DataStoreCheckTest.java|  6 +++---
 .../jackrabbit/oak/segment/azure/AzureGCJournalFile.java |  4 ++--
 .../jackrabbit/oak/segment/DefaultSegmentWriter.java |  6 +++---
 .../java/org/apache/jackrabbit/oak/segment/Segment.java  |  4 ++--
 .../org/apache/jackrabbit/oak/segment/SegmentBlob.java   |  4 ++--
 .../jackrabbit/oak/segment/SegmentBufferWriter.java  |  4 ++--
 .../org/apache/jackrabbit/oak/segment/SegmentDump.java   |  4 ++--
 .../org/apache/jackrabbit/oak/segment/SegmentParser.java |  6 +++---
 .../org/apache/jackrabbit/oak/segment/SegmentStream.java |  6 +++---
 .../jackrabbit/oak/segment/data/SegmentDataV12.java  |  6 +++---
 .../jackrabbit/oak/segment/file/LocalGCJournalFile.java  |  6 +++---
 .../oak/segment/file/tar/SegmentTarManager.java  |  6 +++---
 .../oak/segment/file/tar/SegmentTarWriter.java   | 16 
 .../file/tar/binaries/BinaryReferencesIndexLoaderV1.java |  4 ++--
 .../file/tar/binaries/BinaryReferencesIndexLoaderV2.java |  4 ++--
 .../file/tar/binaries/BinaryReferencesIndexWriter.java   |  6 +++---
 .../standby/codec/GetReferencesResponseEncoder.java  |  5 +++--
 .../oak/segment/standby/codec/ResponseDecoder.java   |  8 
 .../jackrabbit/oak/segment/DefaultSegmentWriterTest.java | 10 +-
 .../jackrabbit/oak/segment/file/tar/TarFileTest.java |  8 
 .../jackrabbit/oak/segment/file/tar/TarWriterTest.java   |  4 ++--
 .../tar/binaries/BinaryReferencesIndexLoaderTest.java|  4 ++--
 .../tar/binaries/BinaryReferencesIndexLoaderV1Test.java  |  4 ++--
 .../tar/binaries/BinaryReferencesIndexLoaderV2Test.java  |  4 ++--
 .../jackrabbit/oak/segment/standby/StandbyTestUtils.java |  4 ++--
 .../standby/codec/GetHeadResponseEncoderTest.java|  5 +++--
 .../standby/codec/GetReferencesResponseEncoderTest.java  |  5 +++--
 .../oak/segment/standby/codec/ResponseDecoderTest.java   | 14 

(jackrabbit-oak) branch OAK-10691 created (now 654e60d397)

2024-06-17 Thread reschke
This is an automated email from the ASF dual-hosted git repository.

reschke pushed a change to branch OAK-10691
in repository https://gitbox.apache.org/repos/asf/jackrabbit-oak.git


  at 654e60d397 OAK-10691: remove use of Guava Charsets class

This branch includes the following new commits:

 new 654e60d397 OAK-10691: remove use of Guava Charsets class

The 1 revision listed above as "new" is entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.
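
For context on what such a change typically looks like: Guava's `Charsets` constants are replaced, presumably with the JDK's `java.nio.charset.StandardCharsets` (available since Java 7). A minimal sketch of the usual before/after, not necessarily the exact diff of this commit:

```java
import java.nio.charset.StandardCharsets;

final class CharsetsMigrationSketch {

    // before (Guava): value.getBytes(Charsets.UTF_8)
    // after (JDK):    value.getBytes(StandardCharsets.UTF_8)
    static byte[] utf8Bytes(String value) {
        return value.getBytes(StandardCharsets.UTF_8);
    }
}
```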




Re: [PR] OAK-10691: remove use of Guava Charsets class [jackrabbit-oak]

2024-06-17 Thread via GitHub


reschke closed pull request #1342: OAK-10691: remove use of Guava Charsets class
URL: https://github.com/apache/jackrabbit-oak/pull/1342


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] OAK-10890 - Added log warning [jackrabbit-oak]

2024-06-17 Thread via GitHub


ionutzpi commented on PR #1535:
URL: https://github.com/apache/jackrabbit-oak/pull/1535#issuecomment-2173297991

   > @ionutzpi should I go ahead and merge, i.e. is the PR ready?
   
   yes, the PR is ready


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] OAK-10803 -- compress/uncompress property [jackrabbit-oak]

2024-06-17 Thread via GitHub


ionutzpi commented on code in PR #1526:
URL: https://github.com/apache/jackrabbit-oak/pull/1526#discussion_r1642746647


##
oak-store-document/src/test/java/org/apache/jackrabbit/oak/plugins/document/DocumentPropertyStateTest.java:
##
@@ -81,4 +100,179 @@ public void multiValuedBinarySize() throws Exception {
 assertEquals(0, reads.size());
 }
 
-}
+@Test
+public void multiValuedAboveThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+List<Blob> blobs = newArrayList();
+for (int i = 0; i < 13; i++) {
+blobs.add(builder.createBlob(new RandomStream(BLOB_SIZE, i)));
+}
+builder.child(TEST_NODE).setProperty("p", blobs, Type.BINARIES);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.BINARIES, Objects.requireNonNull(p).getType());
+assertEquals(13, p.count());
+
+reads.clear();
+assertEquals(BLOB_SIZE, p.size(0));
+// must not read the blob via stream
+assertEquals(0, reads.size());
+}
+
+@Test
+public void stringBelowThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+builder.child(TEST_NODE).setProperty("p", "dummy", Type.STRING);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.STRING, Objects.requireNonNull(p).getType());
+assertEquals(1, p.count());
+
+reads.clear();
+assertEquals(5, p.size(0));
+// must not read the string via stream
+assertEquals(0, reads.size());
+}
+
+@Test
+public void stringAboveThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+builder.child(TEST_NODE).setProperty("p", STRING_HUGEVALUE, 
Type.STRING);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.STRING, Objects.requireNonNull(p).getType());
+assertEquals(1, p.count());
+
+reads.clear();
+assertEquals(10050, p.size(0));
+// must not read the string via streams
+assertEquals(0, reads.size());
+}
+
+@Test
+public void compressValueThrowsException() throws IOException, 
NoSuchFieldException, IllegalAccessException {
+DocumentNodeStore mockDocumentStore = mock(DocumentNodeStore.class);
+Compression mockCompression = mock(Compression.class);
+
when(mockCompression.getOutputStream(any(OutputStream.class))).thenThrow(new 
IOException("Compression failed"));
+
+Field compressionThreshold = 
DocumentPropertyState.class.getDeclaredField("DEFAULT_COMPRESSION_THRESHOLD");
+compressionThreshold.setAccessible(true);
+Field modifiersField = Field.class.getDeclaredField("modifiers");
+modifiersField.setAccessible(true);
+modifiersField.setInt(compressionThreshold, 
compressionThreshold.getModifiers() & ~Modifier.FINAL);
+
+compressionThreshold.set(null, DEFAULT_COMPRESSION_THRESHOLD);
+
+DocumentPropertyState documentPropertyState = new 
DocumentPropertyState(mockDocumentStore, "p", "\"" + STRING_HUGEVALUE + "\"", 
mockCompression);
+
+assertEquals(documentPropertyState.getValue(Type.STRING), 
STRING_HUGEVALUE);
+
+verify(mockCompression, 
times(1)).getOutputStream(any(OutputStream.class));
+
+compressionThreshold.set(null, -1);
+
+}
+
+@Test
+public void uncompressValueThrowsException() throws IOException, 
NoSuchFieldException, IllegalAccessException {
+
+DocumentNodeStore mockDocumentStore = mock(DocumentNodeStore.class);
+Compression mockCompression = mock(Compression.class);
+OutputStream mockOutputStream= mock(OutputStream.class);
+
when(mockCompression.getOutputStream(any(OutputStream.class))).thenReturn(mockOutputStream);
+
when(mockCompression.getInputStream(any(InputStream.class))).thenThrow(new 
IOException("Compression failed"));
+
+Field compressionThreshold = 
DocumentPropertyState.class.getDeclaredField("DEFAULT_COMPRESSION_THRESHOLD");
+compressionThreshold.setAccessible(true);
+Field modifiersField = Field.class.getDeclaredField("modifiers");
+modifiersField.setAccessible(true);
+modifiersField.setInt(compressionThreshold, 
compressionThreshold.getModifiers() & ~Modifier.FINAL);
+
+compressionThreshold.set(null, DEFAULT_COMPRESSION_THRESHOLD);
+
+DocumentPropertyState documentPropertyState = new 
DocumentPropertyState(mockDocumentStore, "p", STRING_HUGEVALUE, 
mockCompression);
+
+assertEquals(documentPropertyState.getValue(Type.STRING), "{}");
+
+verify(mockCompression, 
times(1)).getInputStream(any(InputStream.class));
+
+

Re: [PR] OAK-10803 -- compress/uncompress property [jackrabbit-oak]

2024-06-17 Thread via GitHub


ionutzpi commented on code in PR #1526:
URL: https://github.com/apache/jackrabbit-oak/pull/1526#discussion_r1642740716


##
oak-store-document/src/main/java/org/apache/jackrabbit/oak/plugins/document/DocumentPropertyState.java:
##
@@ -38,24 +45,64 @@
 import org.apache.jackrabbit.oak.plugins.memory.StringPropertyState;
 import org.apache.jackrabbit.oak.plugins.value.Conversions;
 import org.jetbrains.annotations.NotNull;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 
 /**
  * PropertyState implementation with lazy parsing of the JSOP encoded value.
  */
 final class DocumentPropertyState implements PropertyState {
 
+private static final Logger LOG = 
LoggerFactory.getLogger(DocumentPropertyState.class);
+
 private final DocumentNodeStore store;
 
 private final String name;
 
 private final String value;
 
 private PropertyState parsed;
+private final byte[] compressedValue;
+private final Compression compression;
+
+private static final int DEFAULT_COMPRESSION_THRESHOLD =
+SystemPropertySupplier.create("oak.mongo.compressionThreshold", 
-1).get();
 
 DocumentPropertyState(DocumentNodeStore store, String name, String value) {
+this(store, name, value, Compression.GZIP);
+}
+
+DocumentPropertyState(DocumentNodeStore store, String name, String value, 
Compression compression) {
 this.store = store;
 this.name = name;
-this.value = value;
+if (DEFAULT_COMPRESSION_THRESHOLD == -1) {
+this.value = value;
+this.compression = null;
+this.compressedValue = null;
+} else {
+this.compression = compression;
+int size = value.length();
+String localValue = value;
+byte[] localCompressedValue = null;
+if (compression != null && size > DEFAULT_COMPRESSION_THRESHOLD) {

Review Comment:
   Done.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
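
For readers following the thread, here is a minimal sketch of the compress/uncompress mechanism under review, using the `Compression` stream API exercised by the mocked tests quoted earlier in this thread (`getOutputStream`/`getInputStream`). This is an illustration only, not the PR's actual code; the UTF-8 encoding, the helper names, and the oak-commons import are assumptions.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

import org.apache.jackrabbit.oak.commons.Compression;

final class ValueCompressionSketch {

    // compress the JSOP-encoded value once, e.g. at construction time
    static byte[] compress(String value, Compression compression) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (OutputStream out = compression.getOutputStream(bos)) {
            out.write(value.getBytes(StandardCharsets.UTF_8));
        }
        return bos.toByteArray();
    }

    // decompress lazily, only when the value is actually read
    static String uncompress(byte[] compressed, Compression compression) throws IOException {
        try (InputStream in = compression.getInputStream(new ByteArrayInputStream(compressed))) {
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        }
    }
}
```

With `Compression.GZIP` the two calls round-trip the value; the interesting part of the PR is when to compress (the threshold) and how to degrade gracefully when either stream throws an `IOException`, which is what the mocked tests above probe.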



Re: [PR] OAK-10803 -- compress/uncompress property [jackrabbit-oak]

2024-06-17 Thread via GitHub


ionutzpi commented on code in PR #1526:
URL: https://github.com/apache/jackrabbit-oak/pull/1526#discussion_r1642736298


##
oak-store-document/src/test/java/org/apache/jackrabbit/oak/plugins/document/DocumentPropertyStateTest.java:
##
@@ -81,4 +100,179 @@ public void multiValuedBinarySize() throws Exception {
 assertEquals(0, reads.size());
 }
 
-}
+@Test
+public void multiValuedAboveThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+List<Blob> blobs = newArrayList();
+for (int i = 0; i < 13; i++) {
+blobs.add(builder.createBlob(new RandomStream(BLOB_SIZE, i)));
+}
+builder.child(TEST_NODE).setProperty("p", blobs, Type.BINARIES);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.BINARIES, Objects.requireNonNull(p).getType());
+assertEquals(13, p.count());
+
+reads.clear();
+assertEquals(BLOB_SIZE, p.size(0));
+// must not read the blob via stream
+assertEquals(0, reads.size());
+}
+
+@Test
+public void stringBelowThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+builder.child(TEST_NODE).setProperty("p", "dummy", Type.STRING);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.STRING, Objects.requireNonNull(p).getType());
+assertEquals(1, p.count());
+
+reads.clear();
+assertEquals(5, p.size(0));
+// must not read the string via stream
+assertEquals(0, reads.size());
+}
+
+@Test
+public void stringAboveThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+builder.child(TEST_NODE).setProperty("p", STRING_HUGEVALUE, 
Type.STRING);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.STRING, Objects.requireNonNull(p).getType());
+assertEquals(1, p.count());
+
+reads.clear();
+assertEquals(10050, p.size(0));
+// must not read the string via streams
+assertEquals(0, reads.size());
+}
+
+@Test
+public void compressValueThrowsException() throws IOException, 
NoSuchFieldException, IllegalAccessException {
+DocumentNodeStore mockDocumentStore = mock(DocumentNodeStore.class);
+Compression mockCompression = mock(Compression.class);
+
when(mockCompression.getOutputStream(any(OutputStream.class))).thenThrow(new 
IOException("Compression failed"));
+
+Field compressionThreshold = 
DocumentPropertyState.class.getDeclaredField("DEFAULT_COMPRESSION_THRESHOLD");
+compressionThreshold.setAccessible(true);
+Field modifiersField = Field.class.getDeclaredField("modifiers");
+modifiersField.setAccessible(true);
+modifiersField.setInt(compressionThreshold, 
compressionThreshold.getModifiers() & ~Modifier.FINAL);
+
+compressionThreshold.set(null, DEFAULT_COMPRESSION_THRESHOLD);
+
+DocumentPropertyState documentPropertyState = new 
DocumentPropertyState(mockDocumentStore, "p", "\"" + STRING_HUGEVALUE + "\"", 
mockCompression);
+
+assertEquals(documentPropertyState.getValue(Type.STRING), 
STRING_HUGEVALUE);
+
+verify(mockCompression, 
times(1)).getOutputStream(any(OutputStream.class));
+
+compressionThreshold.set(null, -1);
+
+}
+
+@Test
+public void uncompressValueThrowsException() throws IOException, 
NoSuchFieldException, IllegalAccessException {
+
+DocumentNodeStore mockDocumentStore = mock(DocumentNodeStore.class);
+Compression mockCompression = mock(Compression.class);
+OutputStream mockOutputStream= mock(OutputStream.class);
+
when(mockCompression.getOutputStream(any(OutputStream.class))).thenReturn(mockOutputStream);
+
when(mockCompression.getInputStream(any(InputStream.class))).thenThrow(new 
IOException("Compression failed"));
+
+Field compressionThreshold = 
DocumentPropertyState.class.getDeclaredField("DEFAULT_COMPRESSION_THRESHOLD");
+compressionThreshold.setAccessible(true);
+Field modifiersField = Field.class.getDeclaredField("modifiers");
+modifiersField.setAccessible(true);
+modifiersField.setInt(compressionThreshold, 
compressionThreshold.getModifiers() & ~Modifier.FINAL);
+
+compressionThreshold.set(null, DEFAULT_COMPRESSION_THRESHOLD);
+
+DocumentPropertyState documentPropertyState = new 
DocumentPropertyState(mockDocumentStore, "p", STRING_HUGEVALUE, 
mockCompression);
+
+assertEquals(documentPropertyState.getValue(Type.STRING), "{}");
+
+verify(mockCompression, 
times(1)).getInputStream(any(InputStream.class));
+
+

Re: [PR] OAK-10803 -- compress/uncompress property [jackrabbit-oak]

2024-06-17 Thread via GitHub


ionutzpi commented on code in PR #1526:
URL: https://github.com/apache/jackrabbit-oak/pull/1526#discussion_r1642735284


##
oak-store-document/src/main/java/org/apache/jackrabbit/oak/plugins/document/DocumentPropertyState.java:
##
@@ -38,24 +45,64 @@
 import org.apache.jackrabbit.oak.plugins.memory.StringPropertyState;
 import org.apache.jackrabbit.oak.plugins.value.Conversions;
 import org.jetbrains.annotations.NotNull;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 
 /**
  * PropertyState implementation with lazy parsing of the JSOP encoded value.
  */
 final class DocumentPropertyState implements PropertyState {
 
+private static final Logger LOG = 
LoggerFactory.getLogger(DocumentPropertyState.class);
+
 private final DocumentNodeStore store;
 
 private final String name;
 
 private final String value;
 
 private PropertyState parsed;
+private final byte[] compressedValue;
+private final Compression compression;
+
+private static final int DEFAULT_COMPRESSION_THRESHOLD =
+SystemPropertySupplier.create("oak.mongo.compressionThreshold", 
-1).get();
 
 DocumentPropertyState(DocumentNodeStore store, String name, String value) {
+this(store, name, value, Compression.GZIP);
+}
+
+DocumentPropertyState(DocumentNodeStore store, String name, String value, 
Compression compression) {
 this.store = store;
 this.name = name;
-this.value = value;
+if (DEFAULT_COMPRESSION_THRESHOLD == -1) {
+this.value = value;
+this.compression = null;
+this.compressedValue = null;
+} else {
+this.compression = compression;
+int size = value.length();
+String localValue = value;
+byte[] localCompressedValue = null;
+if (compression != null && size > DEFAULT_COMPRESSION_THRESHOLD) {

Review Comment:
   Done.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] OAK-10890 - Added log warning [jackrabbit-oak]

2024-06-17 Thread via GitHub


stefan-egli commented on PR #1535:
URL: https://github.com/apache/jackrabbit-oak/pull/1535#issuecomment-2173265117

   @ionutzpi should I go ahead and merge, i.e. is the PR ready?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] OAK-10803 -- compress/uncompress property [jackrabbit-oak]

2024-06-17 Thread via GitHub


ionutzpi commented on code in PR #1526:
URL: https://github.com/apache/jackrabbit-oak/pull/1526#discussion_r1642734832


##
oak-store-document/src/main/java/org/apache/jackrabbit/oak/plugins/document/DocumentPropertyState.java:
##
@@ -38,24 +45,64 @@
 import org.apache.jackrabbit.oak.plugins.memory.StringPropertyState;
 import org.apache.jackrabbit.oak.plugins.value.Conversions;
 import org.jetbrains.annotations.NotNull;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 
 /**
  * PropertyState implementation with lazy parsing of the JSOP encoded value.
  */
 final class DocumentPropertyState implements PropertyState {
 
+private static final Logger LOG = 
LoggerFactory.getLogger(DocumentPropertyState.class);
+
 private final DocumentNodeStore store;
 
 private final String name;
 
 private final String value;
 
 private PropertyState parsed;
+private final byte[] compressedValue;
+private final Compression compression;
+
+private static final int DEFAULT_COMPRESSION_THRESHOLD =
+SystemPropertySupplier.create("oak.mongo.compressionThreshold", 
-1).get();

Review Comment:
   Done.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] OAK-10803 -- compress/uncompress property [jackrabbit-oak]

2024-06-17 Thread via GitHub


ionutzpi commented on code in PR #1526:
URL: https://github.com/apache/jackrabbit-oak/pull/1526#discussion_r1642734255


##
oak-store-document/src/main/java/org/apache/jackrabbit/oak/plugins/document/DocumentPropertyState.java:
##
@@ -128,7 +188,7 @@ public boolean equals(Object object) {
 } else if (object instanceof DocumentPropertyState) {
 DocumentPropertyState other = (DocumentPropertyState) object;
 return this.name.equals(other.name)
-&& this.value.equals(other.value);
+&& this.getValue().equals(other.getValue());

Review Comment:
   Yes.



##
oak-store-document/src/main/java/org/apache/jackrabbit/oak/plugins/document/DocumentPropertyState.java:
##
@@ -38,24 +45,64 @@
 import org.apache.jackrabbit.oak.plugins.memory.StringPropertyState;
 import org.apache.jackrabbit.oak.plugins.value.Conversions;
 import org.jetbrains.annotations.NotNull;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 
 /**
  * PropertyState implementation with lazy parsing of the JSOP encoded value.
  */
 final class DocumentPropertyState implements PropertyState {
 
+private static final Logger LOG = 
LoggerFactory.getLogger(DocumentPropertyState.class);
+
 private final DocumentNodeStore store;
 
 private final String name;
 
 private final String value;
 
 private PropertyState parsed;
+private final byte[] compressedValue;
+private final Compression compression;
+
+private static final int DEFAULT_COMPRESSION_THRESHOLD =

Review Comment:
   Done.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
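
A side note on the `equals` change quoted above: once the same logical value may be stored either plain or compressed, equality has to be defined on the uncompressed value, and `hashCode` has to stay consistent with it. A simplified, illustrative sketch of that invariant (not the real class):

```java
import java.util.Objects;

// illustrative only: a property-like holder whose value may be stored compressed
final class PropertyEqualitySketch {

    private final String name;
    private final String value; // logical value; the real class may hold compressed bytes instead

    PropertyEqualitySketch(String name, String value) {
        this.name = name;
        this.value = value;
    }

    String getValue() {
        return value; // in the real class this would lazily uncompress
    }

    @Override
    public boolean equals(Object object) {
        if (this == object) {
            return true;
        }
        if (!(object instanceof PropertyEqualitySketch)) {
            return false;
        }
        PropertyEqualitySketch other = (PropertyEqualitySketch) object;
        // compare the logical value, never the stored (possibly compressed) representation
        return name.equals(other.name) && getValue().equals(other.getValue());
    }

    @Override
    public int hashCode() {
        // must be derived from the same fields that equals() uses
        return Objects.hash(name, getValue());
    }
}
```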



(jackrabbit-oak) branch OAK-10845-2 created (now bb4c359788)

2024-06-17 Thread stefanegli
This is an automated email from the ASF dual-hosted git repository.

stefanegli pushed a change to branch OAK-10845-2
in repository https://gitbox.apache.org/repos/asf/jackrabbit-oak.git


  at bb4c359788 OAK-10845 : exclude another flaky test combination

No new revisions were added by this update.



[PR] OAK-10845 : exclude another flaky test combination [jackrabbit-oak]

2024-06-17 Thread via GitHub


stefan-egli opened a new pull request, #1536:
URL: https://github.com/apache/jackrabbit-oak/pull/1536

   (no comment)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] OAK-10803 -- compress/uncompress property [jackrabbit-oak]

2024-06-17 Thread via GitHub


stefan-egli commented on code in PR #1526:
URL: https://github.com/apache/jackrabbit-oak/pull/1526#discussion_r1642721700


##
oak-store-document/src/test/java/org/apache/jackrabbit/oak/plugins/document/DocumentPropertyStateTest.java:
##
@@ -81,4 +100,179 @@ public void multiValuedBinarySize() throws Exception {
 assertEquals(0, reads.size());
 }
 
-}
+@Test
+public void multiValuedAboveThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+List<Blob> blobs = newArrayList();
+for (int i = 0; i < 13; i++) {
+blobs.add(builder.createBlob(new RandomStream(BLOB_SIZE, i)));
+}
+builder.child(TEST_NODE).setProperty("p", blobs, Type.BINARIES);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.BINARIES, Objects.requireNonNull(p).getType());
+assertEquals(13, p.count());
+
+reads.clear();
+assertEquals(BLOB_SIZE, p.size(0));
+// must not read the blob via stream
+assertEquals(0, reads.size());
+}
+
+@Test
+public void stringBelowThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+builder.child(TEST_NODE).setProperty("p", "dummy", Type.STRING);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.STRING, Objects.requireNonNull(p).getType());
+assertEquals(1, p.count());
+
+reads.clear();
+assertEquals(5, p.size(0));
+// must not read the string via stream
+assertEquals(0, reads.size());
+}
+
+@Test
+public void stringAboveThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+builder.child(TEST_NODE).setProperty("p", STRING_HUGEVALUE, 
Type.STRING);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.STRING, Objects.requireNonNull(p).getType());
+assertEquals(1, p.count());
+
+reads.clear();
+assertEquals(10050, p.size(0));
+// must not read the string via streams
+assertEquals(0, reads.size());
+}
+
+@Test
+public void compressValueThrowsException() throws IOException, 
NoSuchFieldException, IllegalAccessException {
+DocumentNodeStore mockDocumentStore = mock(DocumentNodeStore.class);
+Compression mockCompression = mock(Compression.class);
+
when(mockCompression.getOutputStream(any(OutputStream.class))).thenThrow(new 
IOException("Compression failed"));
+
+Field compressionThreshold = 
DocumentPropertyState.class.getDeclaredField("DEFAULT_COMPRESSION_THRESHOLD");
+compressionThreshold.setAccessible(true);
+Field modifiersField = Field.class.getDeclaredField("modifiers");
+modifiersField.setAccessible(true);
+modifiersField.setInt(compressionThreshold, 
compressionThreshold.getModifiers() & ~Modifier.FINAL);
+
+compressionThreshold.set(null, DEFAULT_COMPRESSION_THRESHOLD);
+
+DocumentPropertyState documentPropertyState = new 
DocumentPropertyState(mockDocumentStore, "p", "\"" + STRING_HUGEVALUE + "\"", 
mockCompression);
+
+assertEquals(documentPropertyState.getValue(Type.STRING), 
STRING_HUGEVALUE);
+
+verify(mockCompression, 
times(1)).getOutputStream(any(OutputStream.class));
+
+compressionThreshold.set(null, -1);
+
+}
+
+@Test
+public void uncompressValueThrowsException() throws IOException, 
NoSuchFieldException, IllegalAccessException {
+
+DocumentNodeStore mockDocumentStore = mock(DocumentNodeStore.class);
+Compression mockCompression = mock(Compression.class);
+OutputStream mockOutputStream= mock(OutputStream.class);
+
when(mockCompression.getOutputStream(any(OutputStream.class))).thenReturn(mockOutputStream);
+
when(mockCompression.getInputStream(any(InputStream.class))).thenThrow(new 
IOException("Compression failed"));
+
+Field compressionThreshold = 
DocumentPropertyState.class.getDeclaredField("DEFAULT_COMPRESSION_THRESHOLD");
+compressionThreshold.setAccessible(true);
+Field modifiersField = Field.class.getDeclaredField("modifiers");
+modifiersField.setAccessible(true);
+modifiersField.setInt(compressionThreshold, 
compressionThreshold.getModifiers() & ~Modifier.FINAL);
+
+compressionThreshold.set(null, DEFAULT_COMPRESSION_THRESHOLD);
+
+DocumentPropertyState documentPropertyState = new 
DocumentPropertyState(mockDocumentStore, "p", STRING_HUGEVALUE, 
mockCompression);
+
+assertEquals(documentPropertyState.getValue(Type.STRING), "{}");
+
+verify(mockCompression, 
times(1)).getInputStream(any(InputStream.class));
+
+

Re: [PR] OAK-10890 - Added log warning [jackrabbit-oak]

2024-06-17 Thread via GitHub


ionutzpi commented on code in PR #1535:
URL: https://github.com/apache/jackrabbit-oak/pull/1535#discussion_r1642721575


##
oak-core/src/main/java/org/apache/jackrabbit/oak/plugins/index/property/PropertyIndexEditor.java:
##
@@ -328,6 +332,7 @@ private void checkUniquenessConstraints() throws 
CommitFailedException {
 String msg = String.format(
 "Uniqueness constraint violated property %s having 
value %s",
 propertyNames, failed);
+log.warn(msg);

Review Comment:
   Done
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] OAK-10803 -- compress/uncompress property [jackrabbit-oak]

2024-06-17 Thread via GitHub


stefan-egli commented on code in PR #1526:
URL: https://github.com/apache/jackrabbit-oak/pull/1526#discussion_r1642713859


##
oak-store-document/src/main/java/org/apache/jackrabbit/oak/plugins/document/DocumentPropertyState.java:
##
@@ -38,24 +45,64 @@
 import org.apache.jackrabbit.oak.plugins.memory.StringPropertyState;
 import org.apache.jackrabbit.oak.plugins.value.Conversions;
 import org.jetbrains.annotations.NotNull;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 
 /**
  * PropertyState implementation with lazy parsing of the JSOP encoded value.
  */
 final class DocumentPropertyState implements PropertyState {
 
+private static final Logger LOG = 
LoggerFactory.getLogger(DocumentPropertyState.class);
+
 private final DocumentNodeStore store;
 
 private final String name;
 
 private final String value;
 
 private PropertyState parsed;
+private final byte[] compressedValue;
+private final Compression compression;
+
+private static final int DEFAULT_COMPRESSION_THRESHOLD =
+SystemPropertySupplier.create("oak.mongo.compressionThreshold", 
-1).get();
 
 DocumentPropertyState(DocumentNodeStore store, String name, String value) {
+this(store, name, value, Compression.GZIP);
+}
+
+DocumentPropertyState(DocumentNodeStore store, String name, String value, 
Compression compression) {
 this.store = store;
 this.name = name;
-this.value = value;
+if (DEFAULT_COMPRESSION_THRESHOLD == -1) {
+this.value = value;
+this.compression = null;
+this.compressedValue = null;
+} else {
+this.compression = compression;
+int size = value.length();
+String localValue = value;
+byte[] localCompressedValue = null;
+if (compression != null && size > DEFAULT_COMPRESSION_THRESHOLD) {

Review Comment:
   I think the `compression != null` part could be moved to the `if` above?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
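
To make the suggestion concrete, here is a sketch of the constructor with the null check folded into the outer condition, using the field and constant names from the quoted diff. It is illustrative only: the compress step, the handling of the plain value after compression, and the simplified system-property read are assumptions not shown in the diff.

```java
import org.apache.jackrabbit.oak.commons.Compression;

final class ConstructorRestructureSketch {

    // simplified; the PR reads this via SystemPropertySupplier
    private static final int DEFAULT_COMPRESSION_THRESHOLD =
            Integer.getInteger("oak.mongo.compressionThreshold", -1);

    private final String value;
    private final byte[] compressedValue;
    private final Compression compression;

    ConstructorRestructureSketch(String value, Compression compression) {
        // a null compression is treated like a disabled feature, so the
        // inner branch no longer needs its own null check
        if (compression == null || DEFAULT_COMPRESSION_THRESHOLD == -1) {
            this.value = value;
            this.compression = null;
            this.compressedValue = null;
        } else {
            this.compression = compression;
            byte[] localCompressedValue = null;
            if (value.length() > DEFAULT_COMPRESSION_THRESHOLD) {
                // localCompressedValue = compress(value, compression);
                // on IOException, leave localCompressedValue null and keep the plain value
            }
            this.compressedValue = localCompressedValue;
            // whether the plain value is retained once compressed is a design
            // choice not visible in the quoted diff; assumed here to be dropped
            this.value = localCompressedValue == null ? value : null;
        }
    }
}
```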



Re: [PR] OAK-10803 -- compress/uncompress property [jackrabbit-oak]

2024-06-17 Thread via GitHub


stefan-egli commented on code in PR #1526:
URL: https://github.com/apache/jackrabbit-oak/pull/1526#discussion_r1642708995


##
oak-store-document/src/main/java/org/apache/jackrabbit/oak/plugins/document/DocumentPropertyState.java:
##
@@ -38,24 +45,64 @@
 import org.apache.jackrabbit.oak.plugins.memory.StringPropertyState;
 import org.apache.jackrabbit.oak.plugins.value.Conversions;
 import org.jetbrains.annotations.NotNull;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 
 /**
  * PropertyState implementation with lazy parsing of the JSOP encoded value.
  */
 final class DocumentPropertyState implements PropertyState {
 
+private static final Logger LOG = 
LoggerFactory.getLogger(DocumentPropertyState.class);
+
 private final DocumentNodeStore store;
 
 private final String name;
 
 private final String value;
 
 private PropertyState parsed;
+private final byte[] compressedValue;
+private final Compression compression;
+
+private static final int DEFAULT_COMPRESSION_THRESHOLD =
+SystemPropertySupplier.create("oak.mongo.compressionThreshold", 
-1).get();

Review Comment:
   The property isn't Mongo-specific. This isn't handled entirely consistently in 
the existing code base either - perhaps it should be `oak.documentMK.`?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
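
For illustration only: reading the threshold under a store-agnostic name could look like the sketch below. The property name here is just a placeholder following the `oak.documentMK.` suggestion, not a settled choice, and `SystemPropertySupplier` is the helper already used in the quoted diff (assumed to live in oak-commons).

```java
import org.apache.jackrabbit.oak.commons.properties.SystemPropertySupplier;

final class ThresholdConfigSketch {

    // -1 keeps compression disabled by default, as in the PR;
    // a logger can also be attached to the supplier (see the related review comment)
    private static final int COMPRESSION_THRESHOLD = SystemPropertySupplier
            .create("oak.documentMK.stringCompressionThreshold", -1)
            .get();
}
```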



Re: [PR] OAK-10890 - Added log warning [jackrabbit-oak]

2024-06-17 Thread via GitHub


stefan-egli commented on PR #1535:
URL: https://github.com/apache/jackrabbit-oak/pull/1535#issuecomment-2173190927

   +1 other than the minor comment. 
   PS: there's another case of 0030 in 
[UniquenessConstraintValidator](https://github.com/apache/jackrabbit-oak/blob/0e327a27eb988a0f7656535275f3735ef8474e5d/oak-lucene/src/main/java/org/apache/jackrabbit/oak/plugins/index/lucene/property/UniquenessConstraintValidator.java#L81)
 but I'd go with this PropertyIndexEditor as a start. I was also wondering if the 
fact that it contains the affected paths is a leak of sorts - but given the 
CommitFailedException is frequently logged anyway, that doesn't seem to be a 
problem.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] OAK-10890 - Added log warning [jackrabbit-oak]

2024-06-17 Thread via GitHub


stefan-egli commented on code in PR #1535:
URL: https://github.com/apache/jackrabbit-oak/pull/1535#discussion_r1642692392


##
oak-core/src/main/java/org/apache/jackrabbit/oak/plugins/index/property/PropertyIndexEditor.java:
##
@@ -328,6 +332,7 @@ private void checkUniquenessConstraints() throws 
CommitFailedException {
 String msg = String.format(
 "Uniqueness constraint violated property %s having 
value %s",
 propertyNames, failed);
+log.warn(msg);

Review Comment:
   I'm wondering if we should add a prefix to the log. Without the prefix it 
might be hard for downstream logging to distinguish between the two sources of 
this message (the log.warn and the exception), as they contain a large common 
part. What about something like below (or maybe a better variant of it)?
   ```suggestion
   log.warn("checkUniquenessConstraints: " + msg);
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
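
Putting the suggestion together, the warn-then-throw shape with a distinguishing prefix might look roughly like this. It is a sketch only, with a generic exception standing in for the `CommitFailedException` the real editor throws as "OakConstraint0030":

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

final class UniquenessWarningSketch {

    private static final Logger log = LoggerFactory.getLogger(UniquenessWarningSketch.class);

    static void reportViolation(Iterable<String> propertyNames, String failed) throws Exception {
        String msg = String.format(
                "Uniqueness constraint violated property %s having value %s",
                propertyNames, failed);
        // the prefix lets downstream log analysis tell the warning apart from the
        // otherwise near-identical logged exception message
        log.warn("checkUniquenessConstraints: {}", msg);
        throw new Exception(msg);
    }
}
```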



Re: Using oak run to compact older versions

2024-06-17 Thread Konrad Windszus
I did the same in the past without issues.
Using the newest oak-run is usually beneficial, as newer versions have got rid of some bugs.

Konrad

> On 17. Jun 2024, at 12:45, Roy Teeuwen  wrote:
> 
> Hey,
> 
> I am using an oak-core 1.22 based application (AEM 6.5) but I’d like to use 
> the features of the current oak-run jar to do parallel offline compaction. Is 
> this feasible? I tried it on a local instance and everything seems to start 
> after compaction and seems to work, but I’d like to see if there is a way to 
> be more certain that there weren’t any hidden bugs introduced.
> 
> Seeing as the segment store hasn’t needed any upgrade in the last iterations 
> of AEM, I expect we should be fine?
> 
> Groeten,
> Roy



Re: Using oak run to compact older versions

2024-06-17 Thread Julian Sedding
Hi Roy

Looking at 
https://jackrabbit.apache.org/oak/docs/nodestore/segment/changes.html,
there don't seem to be any changes in the storage format of the TAR
segment store. Therefore, I would expect no issues. I may be missing
something, however.

Regards
Julian

On Mon, Jun 17, 2024 at 12:47 PM Roy Teeuwen  wrote:
>
> Hey,
>
> I am using an oak-core 1.22 based application (AEM 6.5) but I’d like to use 
> the features of the current oak-run jar to do parallel offline compaction. Is 
> this feasible? I tried it on a local instance and everything seems to start 
> after compaction and seems to work, but I’d like to see if there is a way to 
> be more certain that there weren’t any hidden bugs introduced.
>
> Seeing as the segment store hasn’t needed any upgrade in the last iterations 
> of AEM, I expect we should be fine?
>
> Groeten,
> Roy


[PR] OAK-10890 - Added log warning [jackrabbit-oak]

2024-06-17 Thread via GitHub


ionutzpi opened a new pull request, #1535:
URL: https://github.com/apache/jackrabbit-oak/pull/1535

   (no comment)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] Oak 10890 -- Logging for constraint violations (UUID already exists) [jackrabbit-oak]

2024-06-17 Thread via GitHub


ionutzpi closed pull request #1534: Oak 10890 -- Logging for constraint 
violations (UUID already exists)
URL: https://github.com/apache/jackrabbit-oak/pull/1534


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Using oak run to compact older versions

2024-06-17 Thread Roy Teeuwen
Hey,

I am using an oak-core 1.22 based application (AEM 6.5) but I’d like to use the 
features of the current oak-run jar to do parallel offline compaction. Is this 
feasible? I tried it on a local instance and everything seems to start after 
compaction and seems to work, but I’d like to see if there is a way to be more 
certain that there weren’t any hidden bugs introduced.

Seeing as the segment store hasn’t needed any upgrade in the last iterations of 
AEM, I expect we should be fine?

Groeten,
Roy


Re: [PR] OAK-10803 -- compress/uncompress property [jackrabbit-oak]

2024-06-17 Thread via GitHub


reschke commented on code in PR #1526:
URL: https://github.com/apache/jackrabbit-oak/pull/1526#discussion_r1642564726


##
oak-store-document/src/main/java/org/apache/jackrabbit/oak/plugins/document/DocumentPropertyState.java:
##
@@ -38,24 +45,64 @@
 import org.apache.jackrabbit.oak.plugins.memory.StringPropertyState;
 import org.apache.jackrabbit.oak.plugins.value.Conversions;
 import org.jetbrains.annotations.NotNull;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 
 /**
  * PropertyState implementation with lazy parsing of the JSOP encoded value.
  */
 final class DocumentPropertyState implements PropertyState {
 
+private static final Logger LOG = 
LoggerFactory.getLogger(DocumentPropertyState.class);
+
 private final DocumentNodeStore store;
 
 private final String name;
 
 private final String value;
 
 private PropertyState parsed;
+private final byte[] compressedValue;
+private final Compression compression;
+
+private static final int DEFAULT_COMPRESSION_THRESHOLD =

Review Comment:
   I'd set the logger here as well ("logTo()")



##
oak-store-document/src/test/java/org/apache/jackrabbit/oak/plugins/document/DocumentPropertyStateTest.java:
##
@@ -81,4 +100,179 @@ public void multiValuedBinarySize() throws Exception {
 assertEquals(0, reads.size());
 }
 
-}
+@Test
+public void multiValuedAboveThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+List<Blob> blobs = newArrayList();
+for (int i = 0; i < 13; i++) {
+blobs.add(builder.createBlob(new RandomStream(BLOB_SIZE, i)));
+}
+builder.child(TEST_NODE).setProperty("p", blobs, Type.BINARIES);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.BINARIES, Objects.requireNonNull(p).getType());
+assertEquals(13, p.count());
+
+reads.clear();
+assertEquals(BLOB_SIZE, p.size(0));
+// must not read the blob via stream
+assertEquals(0, reads.size());
+}
+
+@Test
+public void stringBelowThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+builder.child(TEST_NODE).setProperty("p", "dummy", Type.STRING);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.STRING, Objects.requireNonNull(p).getType());
+assertEquals(1, p.count());
+
+reads.clear();
+assertEquals(5, p.size(0));
+// must not read the string via stream
+assertEquals(0, reads.size());
+}
+
+@Test
+public void stringAboveThresholdSize() throws Exception {
+NodeBuilder builder = ns.getRoot().builder();
+builder.child(TEST_NODE).setProperty("p", STRING_HUGEVALUE, 
Type.STRING);
+TestUtils.merge(ns, builder);
+
+PropertyState p = 
ns.getRoot().getChildNode(TEST_NODE).getProperty("p");
+assertEquals(Type.STRING, Objects.requireNonNull(p).getType());
+assertEquals(1, p.count());
+
+reads.clear();
+assertEquals(10050, p.size(0));
+// must not read the string via streams
+assertEquals(0, reads.size());
+}
+
+@Test
+public void compressValueThrowsException() throws IOException, 
NoSuchFieldException, IllegalAccessException {
+DocumentNodeStore mockDocumentStore = mock(DocumentNodeStore.class);
+Compression mockCompression = mock(Compression.class);
+
when(mockCompression.getOutputStream(any(OutputStream.class))).thenThrow(new 
IOException("Compression failed"));
+
+Field compressionThreshold = 
DocumentPropertyState.class.getDeclaredField("DEFAULT_COMPRESSION_THRESHOLD");
+compressionThreshold.setAccessible(true);
+Field modifiersField = Field.class.getDeclaredField("modifiers");
+modifiersField.setAccessible(true);
+modifiersField.setInt(compressionThreshold, 
compressionThreshold.getModifiers() & ~Modifier.FINAL);
+
+compressionThreshold.set(null, DEFAULT_COMPRESSION_THRESHOLD);
+
+DocumentPropertyState documentPropertyState = new 
DocumentPropertyState(mockDocumentStore, "p", "\"" + STRING_HUGEVALUE + "\"", 
mockCompression);
+
+assertEquals(documentPropertyState.getValue(Type.STRING), 
STRING_HUGEVALUE);
+
+verify(mockCompression, 
times(1)).getOutputStream(any(OutputStream.class));
+
+compressionThreshold.set(null, -1);
+
+}
+
+@Test
+public void uncompressValueThrowsException() throws IOException, 
NoSuchFieldException, IllegalAccessException {
+
+DocumentNodeStore mockDocumentStore = mock(DocumentNodeStore.class);
+Compression mockCompression = mock(Compression.class);
+OutputStream mockOutputStream= mock(OutputStream.class);
+

[PR] Oak 10890 -- Logging for constraint violations (UUID already exists) [jackrabbit-oak]

2024-06-17 Thread via GitHub


ionutzpi opened a new pull request, #1534:
URL: https://github.com/apache/jackrabbit-oak/pull/1534

   Assets sync - ConstraintViolationException: OakConstraint0030: Uniqueness 
constraint violated property [jcr:uuid]


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[PR] OAK-10894 - MongoDocumentStore: add a method to retrieve a node bypassing the DocumentNodeState cache. [jackrabbit-oak]

2024-06-17 Thread via GitHub


nfsantos opened a new pull request, #1533:
URL: https://github.com/apache/jackrabbit-oak/pull/1533

   (no comment)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: oak-dev-unsubscr...@jackrabbit.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GH] (jackrabbit-oak): Workflow run "SonarCloud" is working again!

2024-06-17 Thread GitBox


The GitHub Actions job "SonarCloud" on jackrabbit-oak.git has succeeded.
Run started by GitHub user nfsantos (triggered by nfsantos).

Head commit for run:
08cd6cc57fa06e943fb67cfda752e64ec14684a4 / Nuno Santos 
Merge remote-tracking branch 'upstream/trunk' into OAK-10889

Report URL: https://github.com/apache/jackrabbit-oak/actions/runs/9542813786

With regards,
GitHub Actions via GitBox