[jira] [Assigned] (OAK-6452) IllegalStateException: too much data for a segment during oak-upgrade from segment to segment-tar

2017-08-14 Thread JIRA

[ https://issues.apache.org/jira/browse/OAK-6452?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Dürig reassigned OAK-6452:
--

Assignee: Valentin Olteanu  (was: Michael Dürig)

> IllegalStateException: too much data for a segment during oak-upgrade from 
> segment to segment-tar
> -
>
> Key: OAK-6452
> URL: https://issues.apache.org/jira/browse/OAK-6452
> Project: Jackrabbit Oak
>  Issue Type: Bug
>  Components: segment-tar, upgrade
>Affects Versions: 1.7.3
>Reporter: Valentin Olteanu
>Assignee: Valentin Olteanu
>Priority: Critical
> Fix For: 1.7.6
>
>
> During the migration of a big repo from the {{old-segment}} format to 
> {{segment-tar}} using {{oak-upgrade-1.7.3}}, I got the following error:
> {code}
> 14.07.2017 09:05:51.920 [main] *INFO* org.apache.jackrabbit.oak.upgrade.RepositorySidegrade - Copying node #89333: /oak:index/uuid/:index/a9f9a3ed-6183-4e9e-9480-1b4fd196a829
> 14.07.2017 10:00:27.957 [TarMK flush [extracted/crx-quickstart/repository-oak-upgrade/segmentstore]] *ERROR* org.apache.jackrabbit.oak.segment.file.SafeRunnable - Uncaught exception in TarMK flush [extracted/crx-quickstart/repository-oak-upgrade/segmentstore]
> java.lang.IllegalStateException: too much data for a segment
>     at org.apache.jackrabbit.oak.segment.SegmentBufferWriter.flush(SegmentBufferWriter.java:322) ~[oak-upgrade-1.7.3.jar:1.7.3]
>     at org.apache.jackrabbit.oak.segment.SegmentBufferWriterPool.flush(SegmentBufferWriterPool.java:142) ~[oak-upgrade-1.7.3.jar:1.7.3]
>     at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter.flush(DefaultSegmentWriter.java:138) ~[oak-upgrade-1.7.3.jar:1.7.3]
>     at org.apache.jackrabbit.oak.segment.file.FileStore$8.call(FileStore.java:345) ~[oak-upgrade-1.7.3.jar:1.7.3]
>     at org.apache.jackrabbit.oak.segment.file.FileStore$8.call(FileStore.java:342) ~[oak-upgrade-1.7.3.jar:1.7.3]
>     at org.apache.jackrabbit.oak.segment.file.TarRevisions.doFlush(TarRevisions.java:213) ~[oak-upgrade-1.7.3.jar:1.7.3]
>     at org.apache.jackrabbit.oak.segment.file.TarRevisions.flush(TarRevisions.java:201) ~[oak-upgrade-1.7.3.jar:1.7.3]
>     at org.apache.jackrabbit.oak.segment.file.FileStore.flush(FileStore.java:342) ~[oak-upgrade-1.7.3.jar:1.7.3]
>     at org.apache.jackrabbit.oak.segment.file.FileStore$3.run(FileStore.java:242) ~[oak-upgrade-1.7.3.jar:1.7.3]
>     at org.apache.jackrabbit.oak.segment.file.SafeRunnable.run(SafeRunnable.java:67) ~[oak-upgrade-1.7.3.jar:1.7.3]
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_131]
>     at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_131]
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_131]
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_131]
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_131]
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_131]
>     at java.lang.Thread.run(Thread.java:748) [na:1.8.0_131]
> 14.07.2017 10:00:28.448 [main] *INFO* org.apache.jackrabbit.oak.segment.file.FileStore - TarMK closed: extracted/crx-quickstart/repository-oak-upgrade/segmentstore
> 14.07.2017 10:00:28.658 [main] *INFO* org.apache.jackrabbit.oak.plugins.segment.file.FileStore - TarMK closed: extracted/crx-quickstart/repository/segmentstore
> Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
>     at org.apache.jackrabbit.oak.segment.RecordType.values(RecordType.java:24)
>     at org.apache.jackrabbit.oak.segment.ImmutableRecordNumbers$1$1.getType(ImmutableRecordNumbers.java:86)
>     at org.apache.jackrabbit.oak.segment.Segment.forEachRecord(Segment.java:703)
>     at org.apache.jackrabbit.oak.segment.file.AbstractFileStore.readBinaryReferences(AbstractFileStore.java:277)
>     at org.apache.jackrabbit.oak.segment.file.FileStore.writeSegment(FileStore.java:511)
>     at org.apache.jackrabbit.oak.segment.SegmentBufferWriter.flush(SegmentBufferWriter.java:356)
>     at org.apache.jackrabbit.oak.segment.SegmentBufferWriter.prepare(SegmentBufferWriter.java:423)
>     at org.apache.jackrabbit.oak.segment.RecordWriters$RecordWriter.write(RecordWriters.java:70)
>     at 

[jira] [Assigned] (OAK-6452) IllegalStateException: too much data for a segment during oak-upgrade from segment to segment-tar

2017-08-02 Thread JIRA

[ https://issues.apache.org/jira/browse/OAK-6452?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Dürig reassigned OAK-6452:
--

Assignee: Michael Dürig
