I figured out my mistake: the

props.put("serializer.class", "kafka.serializer.DefaultEncoder");

was set to StringEncoder instead of DefaultEncoder, which takes a byte[].
However, the key also needs to be a byte[], and I had to change that too.
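For anyone hitting the same thing, here is a minimal sketch of the resulting producer config. It assumes the Kafka 0.8-era producer properties ("serializer.class" / "key.serializer.class"); plain java.util.Properties, no Kafka classes needed to see the shape of it:

```java
import java.util.Properties;

public class ProducerConfigSketch {
    public static Properties byteArrayProducerProps() {
        Properties props = new Properties();
        // Kafka 0.8-era producer setting: DefaultEncoder passes byte[] through
        // unchanged, whereas StringEncoder expects String payloads.
        props.put("serializer.class", "kafka.serializer.DefaultEncoder");
        // The key encoder falls back to serializer.class if unset, but setting
        // it explicitly makes it clear the key must also be a byte[].
        props.put("key.serializer.class", "kafka.serializer.DefaultEncoder");
        return props;
    }

    public static void main(String[] args) {
        Properties props = byteArrayProducerProps();
        System.out.println(props.getProperty("serializer.class"));
        System.out.println(props.getProperty("key.serializer.class"));
    }
}
```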
From: ele...@msn.com
To: user@storm.apache.org
Subject:
I am trying to troubleshoot an issue with our storm cluster where a worker
process on one of the machines in the cluster does not perform any work.
All the counts (emitted/transferred/executed) for all executors in that
worker are 0, as shown below. The counts stay at 0 even if I restart the
worker and the storm supervisor. Here is the relevant config from storm.yaml:
supervisor.worker.start.timeout.secs: 300
supervisor.worker.timeout.secs: 60
nimbus.task.timeout.secs: 30
storm.zookeeper.session.timeout: 6
storm.zookeeper.connection.timeout: 5
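One thing worth double-checking (an assumption on my part, based on Storm's defaults.yaml): the storm.zookeeper.* timeouts are in milliseconds, not seconds, so values of 6 and 5 would be far too small for a session to survive. For comparison, the 0.9.x defaults look roughly like:

```yaml
# Defaults from Storm's defaults.yaml; note the ZooKeeper timeouts are
# milliseconds while the supervisor/nimbus timeouts are seconds.
supervisor.worker.start.timeout.secs: 120
supervisor.worker.timeout.secs: 30
nimbus.task.timeout.secs: 30
storm.zookeeper.session.timeout: 20000
storm.zookeeper.connection.timeout: 15000
```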
On Thu, Mar 19, 2015 at 11:59 AM, Tousif tousif.pa...@gmail.com
In my experience (note: this was before the pluggable serializer), the
performance gains of going to C++ were more than negated by the overhead of
serialization.
On Mar 19, 2015 5:57 AM, Denis DEBARBIEUX ddebarbi...@norsys.fr wrote:
Dear all,
I am wondering if it is a good idea to switch my
Please correct me if I'm wrong. I think the only way you can use C/C++ is
through JNI.
Thanks,
Supun..
On Thu, Mar 19, 2015 at 5:56 AM, Denis DEBARBIEUX ddebarbi...@norsys.fr
wrote:
Dear all,
I am wondering if it is a good idea to switch my implementation of
bolt/spout from Java to C/C++ (in
More information about your topology would help, but…
I’ll assume you’re using a core API topology (spouts/bolts).
On the Kafka spout side, does the spout parallelism equal the number of Kafka
partitions? (It should.)
On the bolt side, are you using fields groupings at all, and if so, what does
the
.//storm/storm-core/src/clj/backtype/storm/daemon/task.clj
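For context on fields groupings: Storm routes each tuple by hashing the selected field values and mapping the hash onto the target tasks (the logic lives near the task.clj path above). A rough plain-Java sketch of the idea, not Storm's actual code:

```java
import java.util.Arrays;
import java.util.List;

public class FieldsGroupingSketch {
    // Pick a target task by hashing the grouping-field values.
    // This mirrors the idea behind Storm's fields grouping, not its exact code.
    static int chooseTask(List<Object> groupingValues, int numTasks) {
        int hash = Arrays.deepHashCode(groupingValues.toArray());
        // floorMod keeps the result in [0, numTasks) even for negative hashes.
        return Math.floorMod(hash, numTasks);
    }

    public static void main(String[] args) {
        // The same key always lands on the same task.
        int a = chooseTask(Arrays.asList((Object) "user-42"), 8);
        int b = chooseTask(Arrays.asList((Object) "user-42"), 8);
        System.out.println(a == b); // true: routing is deterministic per key
    }
}
```

The flip side is that a hot key pins all of its tuples to one task, which is how a fields grouping can leave some executors idle while others do all the work.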
But the bigger question is: what are you trying to do? A little more context
would likely help us help you.
-Taylor
On Mar 19, 2015, at 7:02 PM, Ravali Kandur kandu...@umn.edu wrote:
Hi,
I am trying to use Storm for my
Hello
I'm reading data from Kafka formatted as a Protobuf object (it comes out as a
byte[]).
This works fine and I can read/decode the data, but when I try to push back to
the queue, declaring the KafkaBolt without any type specifics seems
to require a String object that then gets
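To illustrate why the String detour is dangerous for binary payloads like Protobuf: round-tripping arbitrary bytes through a String is lossy, because invalid UTF-8 sequences get replaced. A plain-Java sketch, with the payload bytes made up for the example:

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class BinaryPayloadSketch {
    public static void main(String[] args) {
        // Hypothetical binary payload (e.g. a serialized Protobuf message).
        byte[] payload = {(byte) 0xC3, 0x28, 0x00, 0x7F};

        // Round-tripping through String corrupts it: 0xC3 0x28 is not valid
        // UTF-8, so the decoder substitutes a replacement character.
        byte[] viaString = new String(payload, StandardCharsets.UTF_8)
                .getBytes(StandardCharsets.UTF_8);
        System.out.println(Arrays.equals(payload, viaString)); // false

        // Keeping the payload as byte[] end to end is lossless.
        byte[] copy = Arrays.copyOf(payload, payload.length);
        System.out.println(Arrays.equals(payload, copy)); // true
    }
}
```

This is why the bolt and the producer serializers both need to be declared for byte[] rather than left at a String default.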
Hello all,
I came across the following issue in my cluster and I would like to share
it with you, in case you have any proposals/solutions:
I am running my Storm (0.9.2) application in a 5 node cluster. Each bolt is
assigned randomly to one of the available workers and during execution,
each
Hi,
I'm currently experimenting with Storm in order to figure out whether it is
the right fit for my project, and I would like to seek other users'
opinions on this, as the tests I'm currently doing are getting costlier and
costlier (I'm now working on setting up a full scale cluster and trying
NTP?
http://en.wikipedia.org/wiki/Network_Time_Protocol
Grant Overby
Software Engineer
Cisco.com (http://www.cisco.com/)
grove...@cisco.com
Mobile: 865 724 4910
Hello Grant and thanks for your reply. I am aware of NTP, but I would not
like to go through the process of setting it up in the cluster. Is there
any tool inside Storm that can help me overcome this issue?
Thank you,
Nick
2015-03-19 10:22 GMT-04:00 Grant Overby (groverby) grove...@cisco.com:
Actually Grant, I just figured out that is not difficult to setup NTP on
those machines. I will go with that for now. Thank you very much
Nikos
2015-03-19 10:26 GMT-04:00 Nick R. Katsipoulakis nick.kat...@gmail.com:
Hello Grant and thanks for your reply. I am aware of NTP, but I would not
Hi, I once got the same error, when the deserialization code was reading the
wrong number of bytes compared to what the serialization code had written.
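That read/write symmetry is easy to break when a field is added on one side only. This is not Storm's kryo path, just a plain-Java illustration of the failure mode using DataStreams:

```java
import java.io.*;

public class SerializationSymmetrySketch {
    // Writer: serializes three fields per record.
    static byte[] write(int id, long ts, String tag) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bos);
        out.writeInt(id);
        out.writeLong(ts);
        out.writeUTF(tag); // e.g. a newly added third field
        return bos.toByteArray();
    }

    // Reader: must consume exactly the same fields in the same order.
    // Forgetting to read the third field here would leave unread bytes in
    // the stream, the kind of desynchronization that surfaces as opaque
    // deserialization errors further down.
    static String read(byte[] data) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(data));
        int id = in.readInt();
        long ts = in.readLong();
        String tag = in.readUTF();
        return id + "/" + ts + "/" + tag;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(read(write(7, 1000L, "x"))); // 7/1000/x
    }
}
```

In the Storm/kryo case the equivalent check is that every field you emit is declared and that any custom serializer registered for the tuple's classes reads exactly what it writes.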
2015-03-19 19:01 GMT+04:00 Luke Rohde rohde.l...@gmail.com:
Hi, I'm seeing this exception from kryo deep inside storm and I haven't
been able to figure it out.
How did you resolve it? This started being a problem after I added a third
field to an output tuple, just a String.
On Thu, Mar 19, 2015 at 4:20 PM Vladimir Protsenko protsenk...@gmail.com
wrote:
Hi, I once got the same error, when the deserialization code was reading the
wrong number of bytes
Hi Luke,
Can you elaborate on the steps necessary to reproduce the problem? There’s not
much to go on here.
-Taylor
On Mar 19, 2015, at 4:40 PM, Luke Rohde rohde.l...@gmail.com wrote:
How did you resolve it? This started being a problem after I added a third
field to an output tuple, just