Re: Batch too large exception

2018-03-07 Thread Goutham reddy
Marek,
Sorry for the late reply, and thanks for the insight. I was unknowingly
using batch inserts: with Spring Data Cassandra, calling repository.save
with a list of objects inserts the whole list in one go, and Cassandra
treats that as a batch, aborting it because of the size limit and write
timeout exceptions. I have now changed the logic to insert each object
individually, so the partition keys are distributed across all the
coordinators in the cluster, whereas a batch redirects the whole set of
inserts to a single coordinator node. I hope this helps somebody else avoid
the mistake of saving a list of objects in one call.
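
Concretely, the change looked roughly like this. A minimal sketch with
Spring Data Cassandra 2.x package names; the entity, table, and repository
names are invented for illustration, and on the version I was running,
saving a whole collection went out as one CQL batch:

    import org.springframework.data.cassandra.core.mapping.PrimaryKey;
    import org.springframework.data.cassandra.core.mapping.Table;
    import org.springframework.data.repository.CrudRepository;

    import java.util.List;

    @Table("events")
    class Event {
        @PrimaryKey
        private String id;
        private String payload;
        // getters and setters omitted for brevity
    }

    interface EventRepository extends CrudRepository<Event, String> {}

    class EventWriter {
        private final EventRepository repo;

        EventWriter(EventRepository repo) {
            this.repo = repo;
        }

        void writeAll(List<Event> events) {
            // Before: one call with the whole list (save(Iterable) in older
            // versions, saveAll in newer ones). This went out as a single
            // CQL batch and tripped the "Batch too large" threshold.
            // repo.saveAll(events);

            // After: save each object individually, so each write is routed
            // to the coordinator that owns its partition.
            for (Event e : events) {
                repo.save(e);
            }
        }
    }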

http://christopher-batey.blogspot.com/2015/02/cassandra-anti-pattern-misuse-of.html?m=1

The linked article clearly explains how to perform large volumes of writes
into Cassandra.
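
And for reference, here is the same idea one level down, using the DataStax
Java driver 3.x directly; the contact point, keyspace, table, and column
names are placeholders, not from my application:

    import com.datastax.driver.core.Cluster;
    import com.datastax.driver.core.PreparedStatement;
    import com.datastax.driver.core.ResultSetFuture;
    import com.datastax.driver.core.Session;

    import java.util.ArrayList;
    import java.util.List;

    public class PerPartitionInserts {
        public static void main(String[] args) {
            try (Cluster cluster = Cluster.builder()
                    .addContactPoint("127.0.0.1").build();
                 Session session = cluster.connect("my_keyspace")) {

                PreparedStatement insert = session.prepare(
                        "INSERT INTO my_table (partition_key, payload) VALUES (?, ?)");

                List<ResultSetFuture> futures = new ArrayList<>();
                for (int i = 0; i < 10; i++) {
                    // With the driver's default token-aware policy, each
                    // statement goes to a replica that owns its partition,
                    // instead of one coordinator receiving the whole batch.
                    futures.add(session.executeAsync(
                            insert.bind("key-" + i, "payload-" + i)));
                }

                // Block until every write has completed (or failed).
                for (ResultSetFuture f : futures) {
                    f.getUninterruptibly();
                }
            }
        }
    }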

Thanks and Regards,
Goutham Reddy Aenugu.

On Wed, Feb 28, 2018 at 5:05 AM Marek Kadek -T (mkadek - CONSOL PARTNERS
LTD at Cisco) <mka...@cisco.com> wrote:

> Hi,
>
>
>
> Are you writing the batch to the same partition? If not, there is a much
> stricter limit (I think 50 KB).
>
> Check https://docs.datastax.com/en/cql/3.3/cql/cql_using/useBatch.html and
> the follow-ups.
>
>
>
> From: Goutham reddy <goutham.chiru...@gmail.com>
> Reply-To: "user@cassandra.apache.org" <user@cassandra.apache.org>
> Date: Tuesday, February 27, 2018 at 9:55 PM
> To: "user@cassandra.apache.org" <user@cassandra.apache.org>
> Subject: Batch too large exception
>
>
>
> Hi,
>
> I have been getting a "batch too large" exception when performing writes
> from my client application. My insert size is 5 MB, so I have to split it
> into 10 insert objects that are written in one go. It saves some of the
> inserts and then closes after an indeterminate time. It is a wide table;
> we have 113 columns. Can anyone kindly explain what is going wrong with my
> execution? Appreciate your help.
>
>
> Regards
>
> Goutham Reddy
>
>
>
-- 
Regards
Goutham Reddy


Re: Batch too large exception

2018-02-28 Thread Marek Kadek -T (mkadek - CONSOL PARTNERS LTD at Cisco)
Hi,

Are you writing the batch to the same partition? If not, there is a much
stricter limit (I think 50 KB).
Check https://docs.datastax.com/en/cql/3.3/cql/cql_using/useBatch.html and the
follow-ups.
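
For completeness: the 50 KB figure is the batch_size_fail_threshold_in_kb
setting in cassandra.yaml (default 50), and exceeding it is exactly what
raises the "Batch too large" error. Batches are still fine when every
statement targets the same partition, since they are then applied as a
single mutation. A minimal sketch with the DataStax Java driver 3.x; the
keyspace, table, and key names are made up:

    import com.datastax.driver.core.BatchStatement;
    import com.datastax.driver.core.Cluster;
    import com.datastax.driver.core.PreparedStatement;
    import com.datastax.driver.core.Session;

    public class SamePartitionBatch {
        public static void main(String[] args) {
            try (Cluster cluster = Cluster.builder()
                    .addContactPoint("127.0.0.1").build();
                 Session session = cluster.connect("my_keyspace")) {

                PreparedStatement insert = session.prepare(
                        "INSERT INTO readings (device_id, ts, value) VALUES (?, ?, ?)");

                // UNLOGGED skips the batch log; safe here because all rows
                // share one partition key and are applied atomically anyway.
                BatchStatement batch =
                        new BatchStatement(BatchStatement.Type.UNLOGGED);
                for (int ts = 0; ts < 100; ts++) {
                    batch.add(insert.bind("device-42", ts, 3.14 * ts));
                }
                session.execute(batch);
            }
        }
    }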

From: Goutham reddy <goutham.chiru...@gmail.com>
Reply-To: "user@cassandra.apache.org" <user@cassandra.apache.org>
Date: Tuesday, February 27, 2018 at 9:55 PM
To: "user@cassandra.apache.org" <user@cassandra.apache.org>
Subject: Batch too large exception

Hi,
I have been getting a "batch too large" exception when performing writes from
my client application. My insert size is 5 MB, so I have to split it into 10
insert objects that are written in one go. It saves some of the inserts and
then closes after an indeterminate time. It is a wide table; we have 113
columns. Can anyone kindly explain what is going wrong with my execution?
Appreciate your help.

Regards
Goutham Reddy



Batch too large exception

2018-02-27 Thread Goutham reddy
Hi,
I have been getting a "batch too large" exception when performing writes from
my client application. My insert size is 5 MB, so I have to split it into 10
insert objects that are written in one go. It saves some of the inserts and
then closes after an indeterminate time. It is a wide table; we have 113
columns. Can anyone kindly explain what is going wrong with my execution?
Appreciate your help.

Regards
Goutham Reddy