Object mapper. InvalidQueryException Key may not be empty

2016-09-12 Thread Alexandr Porunov
Hello,

I am using the DataStax Java Driver and cannot execute the "save" command
with the object mapper.

I have the following table:

CREATE TABLE media_upload.audio_info (
  swift_id blob,
  size bigint,
  audio_id blob,
  last_succeed_segment bigint,
  PRIMARY KEY (swift_id)
);

Here is my mapped entity class in Java:

package com.loader.entity.tmp;

import com.datastax.driver.mapping.annotations.Column;
import com.datastax.driver.mapping.annotations.PartitionKey;
import com.datastax.driver.mapping.annotations.Table;
import com.datastax.driver.mapping.annotations.Transient;

import javax.xml.bind.DatatypeConverter;
import java.nio.ByteBuffer;

@Table(keyspace = "media_upload", name = "audio_info",
       readConsistency = "QUORUM",
       writeConsistency = "QUORUM",
       caseSensitiveKeyspace = false,
       caseSensitiveTable = false)
public class AudioInfo {

    private ByteBuffer swiftId;

    private Long size;

    private ByteBuffer audioId;

    private Long lastSucceedSegment;

    @PartitionKey
    @Column(name = "swift_id")
    public ByteBuffer getSwiftId() {
        return swiftId;
    }

    @Column(name = "size")
    public Long getSize() {
        return size;
    }

    @Column(name = "audio_id")
    public ByteBuffer getAudioId() {
        return audioId;
    }

    @Column(name = "last_succeed_segment")
    public Long getLastSucceedSegment() {
        return lastSucceedSegment;
    }

    public void setSwiftId(ByteBuffer swiftId) {
        this.swiftId = swiftId;
    }

    @Transient
    public String getHexSwiftId() {
        return DatatypeConverter.printHexBinary(swiftId.array());
    }

    @Transient
    public void setSize(Long size) {
        this.size = size;
    }

    @Transient
    public String getHexAudioId() {
        return DatatypeConverter.printHexBinary(audioId.array());
    }

    @Transient
    public void setAudioId(ByteBuffer audioId) {
        this.audioId = audioId;
    }

    @Transient
    public void setLastSucceedSegment(Long lastSucceedSegment) {
        this.lastSucceedSegment = lastSucceedSegment;
    }

    @Transient
    @Override
    public String toString() {
        return "{swift_id='" + getHexSwiftId() +
               "',size='" + getSize() +
               "',audio_id='" + getHexAudioId() +
               "',last_succeed_segment='" + lastSucceedSegment + "'}";
    }
}

Here is what I am doing to save AudioInfo:

AudioInfo audioInfo = new AudioInfo();
audioInfo.setSwiftId(ByteBuffer.allocate(Long.BYTES).putLong(123));
audioInfo.setAudioId(ByteBuffer.allocate(Long.BYTES).putLong(124));
audioInfo.setLastSucceedSegment(0L);
audioInfo.setSize(100L);
mapper.save(audioInfo);

After "mapper.save(audioInfo);" I am getting this error:

com.datastax.driver.core.exceptions.InvalidQueryException: Key may not be empty
    at com.datastax.driver.core.Responses$Error.asException(Responses.java:136)
    at com.datastax.driver.core.DefaultResultSetFuture.onSet(DefaultResultSetFuture.java:179)
    at com.datastax.driver.core.RequestHandler.setFinalResult(RequestHandler.java:174)
    at com.datastax.driver.core.RequestHandler.access$2600(RequestHandler.java:43)
    at com.datastax.driver.core.RequestHandler$SpeculativeExecution.setFinalResult(RequestHandler.java:793)
    at com.datastax.driver.core.RequestHandler$SpeculativeExecution.onSet(RequestHandler.java:627)
    at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:1012)
    at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:935)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:328)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:321)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:328)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:321)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:328)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:321)
    at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
    at io.
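
A likely cause, not confirmed in this thread: ByteBuffer.allocate(...).putLong(...)
leaves the buffer's position at its limit, so when the mapper serializes the
partition key it sees zero remaining bytes, which produces "Key may not be
empty". A minimal sketch of the difference, using plain java.nio without the
driver:

```java
import java.nio.ByteBuffer;

public class BufferPositionPitfall {
    public static void main(String[] args) {
        // putLong advances position to 8; remaining() is 0, i.e. an "empty" key
        ByteBuffer wrong = ByteBuffer.allocate(Long.BYTES).putLong(123L);
        System.out.println(wrong.remaining());

        // flip() resets position to 0 and sets limit to 8, so all bytes are readable
        ByteBuffer right = ByteBuffer.allocate(Long.BYTES).putLong(123L);
        right.flip();
        System.out.println(right.remaining());
    }
}
```

Serializers generally read the bytes between position and limit, so calling
flip() (or wrapping a finished byte[] with ByteBuffer.wrap) before handing the
buffer to the driver avoids the empty key.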

Re: How to define blob column in Java?

2016-09-11 Thread Alexandr Porunov
Hello Andy,

Thank you very much!

Sincerely,
Alexandr

On Sun, Sep 11, 2016 at 9:53 PM, Andrew Tolbert  wrote:

> Hi Alexandr,
>
> I am assuming you are referring to the @Table annotation in the mapping
> module in the Datastax Java Driver for Apache Cassandra (please correct me
> if I am wrong).
>
> You can achieve this with any of these three types using a custom codec
> <http://datastax.github.io/java-driver/manual/object_mapper/custom_codecs/>,
> but it will work out of the box with ByteBuffer.  Here's a quick example:
>
> import com.datastax.driver.core.Cluster;
> import com.datastax.driver.core.Session;
> import com.datastax.driver.core.utils.Bytes;
> import com.datastax.driver.mapping.Mapper;
> import com.datastax.driver.mapping.MappingManager;
> import com.datastax.driver.mapping.annotations.Column;
> import com.datastax.driver.mapping.annotations.PartitionKey;
> import com.datastax.driver.mapping.annotations.Table;
>
> import java.nio.ByteBuffer;
>
> public class MapperBlobExample {
>
>     @Table(keyspace = "ex", name = "blob_ex")
>     static class BlobEx {
>
>         @PartitionKey
>         int k;
>
>         @Column
>         ByteBuffer b;
>
>         int getK() {
>             return k;
>         }
>
>         void setK(int k) {
>             this.k = k;
>         }
>
>         ByteBuffer getB() {
>             return b;
>         }
>
>         void setB(ByteBuffer b) {
>             this.b = b;
>         }
>     }
>
>     public static void main(String[] args) {
>         Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
>         try {
>             Session session = cluster.connect();
>             session.execute("CREATE KEYSPACE IF NOT EXISTS ex WITH replication = "
>                     + "{'class': 'SimpleStrategy', 'replication_factor': 1};");
>             session.execute("CREATE TABLE IF NOT EXISTS ex.blob_ex (k int PRIMARY KEY, b blob);");
>
>             MappingManager manager = new MappingManager(session);
>             Mapper<BlobEx> mapper = manager.mapper(BlobEx.class);
>
>             // insert row
>             BlobEx ex = new BlobEx();
>             ex.setK(0);
>             ex.setB(Bytes.fromHexString("0xffee"));
>             mapper.save(ex);
>
>             // retrieve row
>             BlobEx ex0 = mapper.get(0);
>             System.out.println(Bytes.toHexString(ex0.getB()));
>         } finally {
>             cluster.close();
>         }
>     }
> }
>
> There are a few pitfalls around using ByteBuffer with the driver that you
> should be aware of (this example
> <https://github.com/datastax/java-driver/blob/3.0/driver-examples/src/main/java/com/datastax/driver/examples/datatypes/Blobs.java>
> covers them). The java-driver-user mailing list
> <https://groups.google.com/a/lists.datastax.com/forum/#!forum/java-driver-user>
> can also help.
>
> Thanks!
> Andy
>
> On Sun, Sep 11, 2016 at 1:50 AM Alexandr Porunov <
> alexandr.poru...@gmail.com> wrote:
>
>> Hello,
>>
>> I am using the @Table annotation to define tables in Cassandra. How
>> should I define a blob column in Java: with ByteBuffer, byte[], or String?
>>
>> Sincerely,
>> Alexandr
>>
>
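
One such pitfall, sketched here under the same java.nio assumptions (this
example is not taken from Andy's message): ByteBuffer.array() returns the
entire backing array and ignores position and limit, so hex-printing
array() can expose bytes that are not part of the value.

```java
import java.nio.ByteBuffer;

public class ArrayPitfall {
    public static void main(String[] args) {
        ByteBuffer buf = ByteBuffer.allocate(8);
        buf.putShort((short) 0xFFEE);
        buf.flip(); // readable window is now bytes [0, 2)

        // array() ignores position/limit: all 8 backing bytes come back
        byte[] raw = buf.array();
        System.out.println(raw.length);

        // copy only the readable window instead (duplicate() leaves buf untouched)
        byte[] exact = new byte[buf.remaining()];
        buf.duplicate().get(exact);
        System.out.println(exact.length);
    }
}
```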


How to define blob column in Java?

2016-09-10 Thread Alexandr Porunov
Hello,

I am using the @Table annotation to define tables in Cassandra. How should I
define a blob column in Java: with ByteBuffer, byte[], or String?

Sincerely,
Alexandr


Re: Is the blob storage cost in Cassandra the same as the bigint storage cost for long variables?

2016-09-08 Thread Alexandr Porunov
Hello Romain,

Thank you very much for the explanation!

I have just run a simple test to compare both situations.
I ran it on two equivalent virtual machines.
Machine 1:
CREATE KEYSPACE "test" WITH REPLICATION = { 'class' : 'SimpleStrategy',
'replication_factor' : 1 };

CREATE TABLE test.simple (
  id bigint PRIMARY KEY
);

Machine 2:
CREATE KEYSPACE "test" WITH REPLICATION = { 'class' : 'SimpleStrategy',
'replication_factor' : 1 };

CREATE TABLE test.simple (
  id blob PRIMARY KEY
);

Then I put 13421772 primary keys, with values from 1 to 13421772, into both
machines.

Results:
Machine 1: size of the data folder: 495864 KB
Machine 2: size of the data folder: 495004 KB

So there is almost no difference between them (the blob table even came out
about 1 MB smaller).

I am happy about this because I need to store specially encoded primary keys
of 80 bits each, so I can use a blob as a primary key without hesitation.
Best regards,
Alexandr

On Fri, Sep 9, 2016 at 1:20 AM, Romain Hardouin  wrote:

> Hi,
>
> Disk-wise it's the same, because a bigint is serialized as an 8-byte
> ByteBuffer, and if you store a Long as bytes in a blob column it will
> take 8 bytes too, right?
> The difference is validation: the blob ByteBuffer is stored as is,
> whereas the bigint is validated. So technically the Long is slower,
> but I guess that's not noticeable.
>
> Yes, you can use a blob as a partition key, but I would use bigint, both
> for validation and for clarity.
>
> Best,
>
> Romain
>
>
> On Wednesday, September 7, 2016, at 22:54, Alexandr Porunov <
> alexandr.poru...@gmail.com> wrote:
>
>
> Hello,
>
> I need to store a "Long" Java variable.
> The question is: is the storage cost the same for storing the hex
> representation of a "Long" variable in a blob as for storing the "Long"
> variable in a bigint?
> Are there any performance pros or cons?
> Is it OK to use a blob as a primary key?
>
> Sincerely,
> Alexandr
>
>
>


Overhead of data types in cassandra

2016-09-08 Thread Alexandr Porunov
Hello,

Where can I find information about the overhead of data types in Cassandra?
I am interested in the blob, text, uuid, and timeuuid data types. Does a
blob type store the length of the blob alongside the data? If yes, which
type does it use for the length (int, bigint)?
If I want to store 80 bits, how much disk space will be used? If I want to
store 64 bits, is it better to use bigint?

Sincerely,
Alexandr


Is the blob storage cost in Cassandra the same as the bigint storage cost for long variables?

2016-09-07 Thread Alexandr Porunov
Hello,

I need to store a "Long" Java variable.
The question is: is the storage cost the same for storing the hex
representation of a "Long" variable in a blob as for storing the "Long"
variable in a bigint?
Are there any performance pros or cons?
Is it OK to use a blob as a primary key?

Sincerely,
Alexandr


How to configure cassandra in a multi cluster mode?

2016-08-25 Thread Alexandr Porunov
Hello,

I am a little bit confused about Cassandra's configuration.
There are two parameters which I don't understand:
listen_address
seeds

I have 4 identical nodes:
192.168.0.61 cassandra1
192.168.0.62 cassandra2
192.168.0.63 cassandra3
192.168.0.64 cassandra4

What shall I do to configure these 4 nodes into a single cluster?

Sincerely,
Alexandr
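
A minimal cassandra.yaml sketch for such a setup (the cluster_name and the
choice of two seed nodes are assumptions, not from the thread); each node
keeps the same seeds list but uses its own listen_address:

```yaml
# cassandra.yaml on 192.168.0.61 -- repeat on each node with its own address
cluster_name: 'MyCluster'        # must match on all four nodes (assumed name)
listen_address: 192.168.0.61     # this node's own IP, used for inter-node traffic
seed_provider:
  - class_name: org.apache.cassandra.locator.SimpleSeedProvider
    parameters:
      # seeds are contact points for cluster discovery; a subset of nodes is enough,
      # and they must be the same on every node
      - seeds: "192.168.0.61,192.168.0.62"
```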