Values in /etc/security/limits.d/cassandra.conf

# Provided by the cassandra package
cassandra  -  memlock  unlimited
cassandra  -  nofile   100000
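
To double-check what the running JVM actually picked up (the pgrep pattern below assumes the stock CassandraDaemon startup class; adjust it if your init script differs):

# Effective limits of the live Cassandra process
pid=$(pgrep -f CassandraDaemon)
grep -E 'open files|locked memory|processes' /proc/$pid/limits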


On Mon, Apr 20, 2015 at 12:21 PM, Kiran mk <coolkiran2...@gmail.com> wrote:

> Hi,
>
> Thanks for the info,
>
> Are the nproc, nofile, and memlock settings in
> /etc/security/limits.d/cassandra.conf set to optimum values?
>
> What is the consistency level?
>
> Best Regards,
> Kiran.M.K.
>
>
> On Mon, Apr 20, 2015 at 11:55 AM, Neha Trivedi <nehajtriv...@gmail.com>
> wrote:
>
>> hi,
>>
>> What is the count of records in the column family?
>>       We have about 38,000 rows in the column family that we are trying
>> to export.
>> What is the Cassandra version?
>>       We are using Cassandra 2.0.11.
>>
>> MAX_HEAP_SIZE and HEAP_NEWSIZE are left at the defaults.
>> The server has 8 GB of RAM.
>>
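(Note: with the heap auto-calculation in the stock cassandra-env.sh, an 8 GB box usually ends up with only about a 2 GB heap. To rule the heap out you could pin the sizes explicitly; the figures below are purely illustrative, not a tuned recommendation, and the script expects the two variables to be set as a pair.)

# cassandra-env.sh (path depends on the package layout); illustrative values only
MAX_HEAP_SIZE="4G"
HEAP_NEWSIZE="400M"

The node has to be restarted for a heap change to take effect.
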
>> regards
>> Neha
>>
>> On Mon, Apr 20, 2015 at 11:39 AM, Kiran mk <coolkiran2...@gmail.com>
>> wrote:
>>
>>> Hi,
>>>
>>> Check the MAX_HEAP_SIZE setting in the cassandra-env.sh environment
>>> file.
>>>
>>> Also check HEAP_NEWSIZE.
>>>
>>> What is the consistency level you are using?
>>>
>>> Best Regards,
>>> Kiran.M.K.
>>>
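(A quick way to see what a node is actually running with, rather than what the config files say:)

# Reports current heap usage / capacity as seen by the JVM
nodetool info | grep -i heap
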
>>> On Mon, Apr 20, 2015 at 11:13 AM, Kiran mk <coolkiran2...@gmail.com>
>>> wrote:
>>>
>>>> Seems like this is related to Java heap memory.
>>>>
>>>> What is the count of records in the column family?
>>>>
>>>> What is the Cassandra version?
>>>>
>>>> Best Regards,
>>>> Kiran.M.K.
>>>>
>>>> On Mon, Apr 20, 2015 at 11:08 AM, Neha Trivedi <nehajtriv...@gmail.com>
>>>> wrote:
>>>>
>>>>> Hello all,
>>>>>
>>>>> We are getting an OutOfMemoryError on one of the nodes, and that node
>>>>> goes down when we run the export command to get all the data from a
>>>>> table.
>>>>>
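(For illustration only, assuming the export is a plain full-table cqlsh COPY; the keyspace and table names below are placeholders:)

$ cqlsh <node_ip>
cqlsh> USE my_keyspace;
cqlsh:my_keyspace> COPY my_table TO '/tmp/my_table.csv' WITH HEADER = true;

A whole-table export like this turns into long sequential slice reads on the serving node, which is consistent with the read-stage heap exhaustion in the trace below.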
>>>>>
>>>>> Regards
>>>>> Neha
>>>>>
>>>>> ERROR [ReadStage:532074] 2015-04-09 01:04:00,603 CassandraDaemon.java (line 199) Exception in thread Thread[ReadStage:532074,5,main]
>>>>> java.lang.OutOfMemoryError: Java heap space
>>>>>         at org.apache.cassandra.io.util.RandomAccessReader.readBytes(RandomAccessReader.java:347)
>>>>>         at org.apache.cassandra.utils.ByteBufferUtil.read(ByteBufferUtil.java:392)
>>>>>         at org.apache.cassandra.utils.ByteBufferUtil.readWithLength(ByteBufferUtil.java:355)
>>>>>         at org.apache.cassandra.db.ColumnSerializer.deserializeColumnBody(ColumnSerializer.java:124)
>>>>>         at org.apache.cassandra.db.OnDiskAtom$Serializer.deserializeFromSSTable(OnDiskAtom.java:85)
>>>>>         at org.apache.cassandra.db.Column$1.computeNext(Column.java:75)
>>>>>         at org.apache.cassandra.db.Column$1.computeNext(Column.java:64)
>>>>>         at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143)
>>>>>         at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138)
>>>>>         at org.apache.cassandra.db.columniterator.SimpleSliceReader.computeNext(SimpleSliceReader.java:88)
>>>>>         at org.apache.cassandra.db.columniterator.SimpleSliceReader.computeNext(SimpleSliceReader.java:37)
>>>>>         at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143)
>>>>>         at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138)
>>>>>         at org.apache.cassandra.db.columniterator.SSTableSliceIterator.hasNext(SSTableSliceIterator.java:82)
>>>>>         at org.apache.cassandra.db.columniterator.LazyColumnIterator.computeNext(LazyColumnIterator.java:82)
>>>>>         at org.apache.cassandra.db.columniterator.LazyColumnIterator.computeNext(LazyColumnIterator.java:59)
>>>>>         at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143)
>>>>>         at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138)
>>>>>         at org.apache.cassandra.db.filter.QueryFilter$2.getNext(QueryFilter.java:157)
>>>>>         at org.apache.cassandra.db.filter.QueryFilter$2.hasNext(QueryFilter.java:140)
>>>>>         at org.apache.cassandra.utils.MergeIterator$OneToOne.computeNext(MergeIterator.java:200)
>>>>>         at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143)
>>>>>         at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138)
>>>>>         at org.apache.cassandra.db.filter.SliceQueryFilter.collectReducedColumns(SliceQueryFilter.java:185)
>>>>>         at org.apache.cassandra.db.filter.QueryFilter.collateColumns(QueryFilter.java:122)
>>>>>         at org.apache.cassandra.db.filter.QueryFilter.collateOnDiskAtom(QueryFilter.java:80)
>>>>>         at org.apache.cassandra.db.RowIteratorFactory$2.getReduced(RowIteratorFactory.java:101)
>>>>>         at org.apache.cassandra.db.RowIteratorFactory$2.getReduced(RowIteratorFactory.java:75)
>>>>>         at org.apache.cassandra.utils.MergeIterator$ManyToOne.consume(MergeIterator.java:115)
>>>>>         at org.apache.cassandra.utils.MergeIterator$ManyToOne.computeNext(MergeIterator.java:98)
>>>>>         at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143)
>>>>>         at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138)
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Best Regards,
>>>> Kiran.M.K.
>>>>
>>>
>>>
>>>
>>> --
>>> Best Regards,
>>> Kiran.M.K.
>>>
>>
>>
>
>
> --
> Best Regards,
> Kiran.M.K.
>
