Re: Error in S3 Garbage Collection

2020-10-19 Thread Julian Reschke

Am 19.10.2020 um 16:17 schrieb Julian Reschke:

Am 19.10.2020 um 08:37 schrieb Tanvi Shah:

Hi,

We have started S3 Garbage Collection and the S3 size is more than 2 TB.

We are facing memory issues while executing GC, even though we have given
11 GB of memory to the application.


Code is:

final MarkSweepGarbageCollector gc = documentNodeStore.createBlobGarbageCollector(
        seconds,
        repository.toString(),
        wb,
        new DefaultStatisticsProvider(Executors.newScheduledThreadPool(1)));

gc.collectGarbage(markOnly);
final OperationsStatsMBean stats = gc.getOperationStats();
log.info("number deleted: " + stats.numDeleted()
        + ", size deleted: " + stats.sizeDeleted());

The exceptions are:
...


It might be necessary to set the fetch size (see , around ).


Sorry, can't test that right now myself.


Best regards, Julian


...or you could try to set defaultRowFetchSize as a parameter in the
JDBC URL (see ).
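For reference, pgjdbc's defaultRowFetchSize can be appended to the connection URL. A minimal sketch of doing that (the helper class and method names are my own, not part of the code discussed here), assuming the standard PostgreSQL JDBC driver:

```java
// Sketch: append pgjdbc's defaultRowFetchSize parameter to a JDBC URL.
// Note that the PostgreSQL driver only streams rows in batches (instead
// of buffering the entire result set in memory) when autocommit is off
// and a non-zero fetch size is in effect; the same setting is available
// programmatically via Statement#setFetchSize(int).
public class JdbcUrlFetchSize {

    static String withFetchSize(String jdbcUrl, int rows) {
        // Use '?' for the first URL parameter, '&' if one is already present.
        char sep = jdbcUrl.contains("?") ? '&' : '?';
        return jdbcUrl + sep + "defaultRowFetchSize=" + rows;
    }

    public static void main(String[] args) {
        System.out.println(withFetchSize("jdbc:postgresql://host:5432/db", 1000));
    }
}
```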

Best regards, Julian



Re: Error while migrating the Datastores from PostgreSQL to MongoDB

2020-10-19 Thread Mayur Barge
Thanks Julian,
Finally I could resolve it with your help. The command below worked for me:

java -verbose -cp "oak-upgrade-1.34.0.jar:postgresql-42.2.17.jar" \
  org.apache.jackrabbit.oak.upgrade.cli.OakUpgrade \
  --src-user=*** --src-password='***' \
  'jdbc:postgresql://SRC-DB:5432/acdc_cms_dev' \
  mongodb://USER:PASS@HOST:27017/migration-poc=admin

Thanks & Regards,
Mayur
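Worth noting why the -cp variant works where the earlier attempts failed: when java is started with -jar, both -cp and the CLASSPATH environment variable are ignored, so the PostgreSQL driver jar was never visible and DriverManager reported "No suitable driver". A small diagnostic sketch (class and method names are my own) for checking that a driver class is on the classpath before launching the migration:

```java
// Sketch: check whether a class (e.g. a JDBC driver) is visible on the
// current classpath. "No suitable driver" from java.sql.DriverManager
// usually means the driver jar is missing from the classpath -- for
// example because `java -jar ...` ignores -cp and CLASSPATH entirely.
public class DriverCheck {

    static boolean onClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Prints true only when postgresql-*.jar is on the classpath.
        System.out.println(onClasspath("org.postgresql.Driver"));
    }
}
```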

On Fri, Oct 16, 2020 at 4:10 PM Mayur Barge  wrote:

> Tried with both the "upgrade" option and without it:
>
> java -verbose -cp "oak-upgrade-1.34.0.jar:postgresql-42.2.17.jar" -jar
> oak-upgrade-1.34.0.jar --src-user=acdc_cms --src-password='***'
> --user=mayur --password=*** 'jdbc:postgresql://SRC_HOST:5432/acdc_cms_dev'
> 'mongodb://TARGET_HOST:27017/migration-test'
>
> On Fri, Oct 16, 2020 at 4:09 PM Mayur Barge 
> wrote:
>
>> Hello,
>> I tried setting up the classpath as suggested by Julian:
>>
>> java -verbose -cp "oak-upgrade-1.34.0.jar:postgresql-42.2.17.jar"
>> -jar oak-upgrade-1.34.0.jar upgrade --src-user=acdc_cms
>> --src-password='***' --user=mayur --password=***
>> 'jdbc:postgresql://SRC_HOST:5432/acdc_cms_dev'
>> 'mongodb://TARGET_HOST:27017/migration-test'
>>
>> Still I am facing the issue:
>>
>> [Loaded java.sql.SQLWarning from
>> /Library/Java/JavaVirtualMachines/jdk1.8.0_261.jdk/Contents/Home/jre/lib/rt.jar]
>> 16.10.2020 16:08:30.873 [main] INFO
>> org.apache.jackrabbit.oak.plugins.document.rdb.RDBDataSourceFactory -
>> trying to obtain driver for
>> jdbc:postgresql://op-ampp-dev-pgsql-01.springernature.com:5432/acdc_cms_dev
>> java.sql.SQLException: No suitable driver
>>     at java.sql.DriverManager.getDriver(DriverManager.java:315)
>>     at org.apache.jackrabbit.oak.plugins.document.rdb.RDBDataSourceFactory.forJdbcUrl(RDBDataSourceFactory.java:71)
>>     at org.apache.jackrabbit.oak.plugins.document.rdb.RDBDataSourceFactory.forJdbcUrl(RDBDataSourceFactory.java:102)
>>     at org.apac
>>
>> Kindly let us know the right way to sidegrade from Postgres to Mongo.
>>
>> Thanks,
>> Mayur
>>
>> On Thu, Oct 15, 2020 at 3:19 PM Mayur Barge 
>> wrote:
>>
>>> Hello,
>>> I am trying to migrate my datastore from Postgres to Mongo. I have
>>> added the driver jar to the CLASSPATH:
>>>
>>> CLASSPATH=/Users/mbn2671/Downloads/jar_files/oak-upgrade-1.8.7.jar:/Users/mbn2671/Downloads/jar_files/postgresql-42.2.17.jar:.
>>>
>>> I am using a Mac machine. What I want to achieve is to migrate the
>>> datastore from Postgres to MongoDB. I am trying the command below:
>>>
>>> java -verbose -jar oak-upgrade-1.8.7.jar --src-password='***'
>>> --src-user=ABC --user=XYZ --password=***
>>> 'jdbc:postgresql://POSTGRESHOST:5432/DB'
>>> 'mongodb://MONGOHOST:27017/migration-test'
>>>
>>> It says:
>>>
>>> 15.10.2020 15:17:44.041 [main] INFO
>>> org.apache.jackrabbit.oak.plugins.document.rdb.RDBDataSourceFactory -
>>> trying to obtain driver for
>>> jdbc:postgresql://op-ampp-dev-pgsql-01.springernature.com:5432/acdc_cms_dev
>>> java.sql.SQLException: No suitable driver
>>>     at java.sql.DriverManager.getDriver(DriverManager.java:315) ~[na:1.8.0_261]
>>>
>>> Kindly let me know what could be the issue.
>>>
>>


Error in S3 Garbage Collection

2020-10-19 Thread Tanvi Shah
Hi,

We have started S3 Garbage Collection and the S3 size is more than 2 TB.

We are facing memory issues while executing GC, even though we have given
11 GB of memory to the application.


Code is:

final MarkSweepGarbageCollector gc = documentNodeStore.createBlobGarbageCollector(
        seconds,
        repository.toString(),
        wb,
        new DefaultStatisticsProvider(Executors.newScheduledThreadPool(1)));

gc.collectGarbage(markOnly);
final OperationsStatsMBean stats = gc.getOperationStats();
log.info("number deleted: " + stats.numDeleted()
        + ", size deleted: " + stats.sizeDeleted());

The exceptions are:

java.lang.RuntimeException: org.postgresql.util.PSQLException: Ran out of memory retrieving query results.
    at org.apache.jackrabbit.oak.plugins.document.rdb.RDBDocumentStore$3.iterator(RDBDocumentStore.java:1876)
    at org.apache.jackrabbit.oak.plugins.document.rdb.RDBBlobReferenceIterator.getIteratorOverDocsWithBinaries(RDBBlobReferenceIterator.java:48)
    at org.apache.jackrabbit.oak.plugins.document.BlobReferenceIterator.loadBatch(BlobReferenceIterator.java:64)
    at org.apache.jackrabbit.oak.plugins.document.BlobReferenceIterator.computeNext(BlobReferenceIterator.java:52)
    at org.apache.jackrabbit.oak.plugins.document.BlobReferenceIterator.computeNext(BlobReferenceIterator.java:36)
    at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143)
    at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138)
    at org.apache.jackrabbit.oak.plugins.document.DocumentBlobReferenceRetriever.collectReferences(DocumentBlobReferenceRetriever.java:49)
    at org.apache.jackrabbit.oak.plugins.blob.MarkSweepGarbageCollector.iterateNodeTree(MarkSweepGarbageCollector.java:600)
    at org.apache.jackrabbit.oak.plugins.blob.MarkSweepGarbageCollector.mark(MarkSweepGarbageCollector.java:397)
    at org.apache.jackrabbit.oak.plugins.blob.MarkSweepGarbageCollector.markAndSweep(MarkSweepGarbageCollector.java:347)
    at org.apache.jackrabbit.oak.plugins.blob.MarkSweepGarbageCollector.collectGarbage(MarkSweepGarbageCollector.java:235)
    at com.springernature.mango.cms.common.helper.RepositoryUtil.s3GarbageCollection(RepositoryUtil.java:324)
    at com.springernature.mango.cms.services.rest.service.ServicesController.lambda$runS3GC$1(ServicesController.java:460)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

org.postgresql.util.PSQLException: Ran out of memory retrieving query results.
    at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2145)
    at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:306)
    at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:441)
    at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:365)
    at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:155)
    at org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:118)
    at sun.reflect.GeneratedMethodAccessor49.invoke
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.tomcat.jdbc.pool.interceptor.StatementDecoratorInterceptor$StatementProxy.invoke(StatementDecoratorInterceptor.java:237)
    at com.sun.proxy.$Proxy93.executeQuery
    at sun.reflect.GeneratedMethodAccessor49.invoke
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.tomcat.jdbc.pool.interceptor.AbstractQueryReport$StatementProxy.invoke(AbstractQueryReport.java:210)
    at com.sun.proxy.$Proxy93.executeQuery
    at sun.reflect.GeneratedMethodAccessor49.invoke
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.tomcat.jdbc.pool.StatementFacade$StatementProxy.invoke(StatementFacade.java:114)
    at com.sun.proxy.$Proxy93.executeQuery
    at org.apache.jackrabbit.oak.plugins.document.rdb.RDBDocumentStoreJDBC$ResultSetIterator.<init>(RDBDocumentStoreJDBC.java:591)
    at org.apache.jackrabbit.oak.plugins.document.rdb.RDBDocumentStoreJDBC.queryAsIterator(RDBDocumentStoreJDBC.java:558)
    at