I'm getting this exception:
2014-06-03 19:59:13 STDIO [ERROR][id:] Jun 03, 2014 7:59:13 PM
org.jboss.netty.channel.DefaultChannelPipeline
WARNING: An exception was thrown by a user handler while handling an
exception event ([id: 0xdcf3be42] EXCEPTION: java.net.ConnectException:
Connection refused)
My definition of stream is a continuous feed of data of a certain type or with a certain purpose (depending on how you want to define your process).
I have a situation where the Domain Object is the same across the whole topology; however, each component works on bits and pieces to construct the final
Can you elaborate on what exactly you mean by join? You can have a bolt defined as part of the topology which loads the other jar in the prepare() method and calls the actual functional methods in the execute() method. This way, you are dynamically loading the other jar into your Storm topo...(which in my
out). This was resulting in a leak.
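The prepare()/execute() split described above can be sketched with plain JDK reflection. This is a minimal sketch, not Storm code: the nested Worker class is a hypothetical stand-in for logic that would normally live in the external jar, and with a real jar you would build a URLClassLoader over the jar's path instead of relying on the system class loader.

```java
import java.lang.reflect.Method;

// Sketch of the dynamic-loading idea: resolve a class and method once
// (as a bolt would in prepare()) and invoke the method per tuple (as in
// execute()). Worker is a hypothetical stand-in for the external logic.
public class DynamicLoadSketch {
    // Stand-in for a class that would normally ship in the other jar.
    public static class Worker {
        public String process(String input) {
            return "processed:" + input;
        }
    }

    public static void main(String[] args) throws Exception {
        // prepare(): load the class and look up the method once.
        Class<?> cls = Class.forName("DynamicLoadSketch$Worker");
        Object instance = cls.getDeclaredConstructor().newInstance();
        Method process = cls.getMethod("process", String.class);

        // execute(): invoke the dynamically resolved method per tuple.
        System.out.println(process.invoke(instance, "tuple-1"));
    }
}
```

For a real external jar, the only change is constructing `new URLClassLoader(new URL[]{jarUrl})` in prepare() and passing it to `Class.forName(name, true, loader)`.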
Thanks,
Prasun
On Mon, May 19, 2014 at 1:12 AM, P Ghosh javadevgh...@gmail.com wrote:
I have a topology that looks like:
*All Bolts emit Fields id, json*
*Spout emits only id*
*All bolts/spouts use the stream name ComponentName_stream while emitting*
*Topology Definition*
*==*
builder.setSpout(citySpout, citySpout, 10);
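The ComponentName_stream convention described above can be illustrated without the Storm API. This is plain Java, not a Storm topology; the component names and field lists are illustrative, mirroring the spout-emits-id / bolts-emit-(id, json) layout from the email.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustration of the "ComponentName_stream" naming convention: each
// component emits on a stream derived from its own name, the spout
// carrying only (id) and the bolts carrying (id, json).
public class StreamNaming {
    // Derive the stream name a component emits on.
    static String streamFor(String componentName) {
        return componentName + "_stream";
    }

    public static void main(String[] args) {
        Map<String, String[]> outputs = new LinkedHashMap<>();
        outputs.put(streamFor("citySpout"), new String[] {"id"});
        outputs.put(streamFor("cityBolt"), new String[] {"id", "json"});

        for (Map.Entry<String, String[]> e : outputs.entrySet()) {
            System.out.println(e.getKey() + " -> " + String.join(",", e.getValue()));
        }
    }
}
```

In actual Storm code the same convention shows up in two places: `declarer.declareStream("cityBolt_stream", new Fields("id", "json"))` in declareOutputFields, and the matching stream id in the emit call.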
Use soft links.
Prasun
Sent from Galaxy Nexus
On May 14, 2014 6:50 AM, Neha Jain neha_sj...@persistent.co.in wrote:
Hello,
I have created a Storm cluster on Amazon EC2 machines. The requirement we have is to save the Storm configuration file to Amazon S3.
There would be a bucket in S3
I have a few topologies running. The spout puts the ID of the object it is emitting into a WIP list in Redis. When the spout's ack or fail method is called, it takes the ID out of the WIP list.
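The WIP-tracking pattern described above can be sketched as follows. This is an assumption-laden stand-in: a ConcurrentHashMap-backed set plays the role of the Redis WIP list (with real Redis you would typically issue SADD on emit and SREM on ack/fail against a set key), and the method names are hypothetical.

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of tracking in-flight tuple IDs. The in-memory set stands in
// for the Redis WIP list from the email; onEmit/onAckOrFail mirror the
// spout's emit and ack()/fail() callbacks.
public class WipTrackerSketch {
    private final Set<String> wip = ConcurrentHashMap.newKeySet();

    // Called when the spout emits a tuple: record the ID as in-progress.
    void onEmit(String id) { wip.add(id); }

    // Called from ack() or fail(): the tuple is no longer in flight.
    void onAckOrFail(String id) { wip.remove(id); }

    public static void main(String[] args) {
        WipTrackerSketch tracker = new WipTrackerSketch();
        tracker.onEmit("obj-1");
        tracker.onEmit("obj-2");
        tracker.onAckOrFail("obj-1");    // obj-1 completed
        System.out.println(tracker.wip); // only obj-2 is still in flight
    }
}
```

Whatever still sits in the WIP set after a crash is exactly the work that was in flight, which is what makes this pattern useful for recovery.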
The environment and application are undergoing a lot of changes, and as a result I'm required to
I added metrics to my Storm implementation by implementing IMetric. It is working, and I can see the metrics log populated with all stats. I have a 3-node (3-worker) cluster and 1 nimbus/zookeeper machine in Development.
On WORKER1's metrics log I can see some metrics with references to WORKER2 and WORKER3. For
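The IMetric contract mentioned above centers on a single method, getValueAndReset(), which Storm calls once per reporting interval. A minimal counting metric with those semantics can be sketched without the storm-core dependency; the interface is redeclared here so the example is self-contained.

```java
// Self-contained sketch of a counting metric. Storm's IMetric interface
// declares a single method, getValueAndReset(); it is redeclared locally
// so this compiles and runs without storm-core on the classpath.
public class CountMetricSketch {
    interface IMetric {
        Object getValueAndReset();
    }

    static class CountMetric implements IMetric {
        private long count = 0;

        void incr() { count++; }

        // Report the value accumulated since the last sample, then reset,
        // which is how Storm buckets metrics per reporting interval.
        public Object getValueAndReset() {
            long value = count;
            count = 0;
            return value;
        }
    }

    public static void main(String[] args) {
        CountMetric metric = new CountMetric();
        metric.incr();
        metric.incr();
        System.out.println(metric.getValueAndReset()); // 2
        System.out.println(metric.getValueAndReset()); // 0 after reset
    }
}
```

In a real bolt this would be registered in prepare() via `context.registerMetric(name, metric, intervalSecs)`, which is also what routes each worker's samples into its own metrics log.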