Thanks for your help, Parth.

When I try to run the topology that writes data to HDFS, it throws a
ClassNotFoundException:
org.apache.hadoop.client.hdfs.HDFSDataOutputStream$SyncFlags
Can anyone tell me which jars are needed to run code that writes data to
HDFS? Please list all the required jars.
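[For readers hitting the same error: the HDFS client output-stream classes live in the hadoop-hdfs artifact, so a missing hadoop-hdfs (or hadoop-client) jar on the worker classpath is the usual cause. A sketch of the Maven dependencies typically needed; the group IDs are from the storm-hdfs README, and the version numbers below are placeholders that must be matched to your cluster:]

```xml
<!-- Versions are placeholders; align them with your Storm and Hadoop install. -->
<dependency>
  <groupId>com.github.ptgoetz</groupId>
  <artifactId>storm-hdfs</artifactId>
  <version>0.1.2</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.4.0</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-hdfs</artifactId>
  <version>2.4.0</version>
</dependency>
```

If you build a fat jar for the topology, these are bundled automatically; otherwise each jar must be placed on the Storm worker classpath.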


On Wed, Jul 16, 2014 at 10:46 AM, Parth Brahmbhatt <
pbrahmbh...@hortonworks.com> wrote:

> You can use
>
> https://github.com/ptgoetz/storm-hdfs
>
> It supports writing to HDFS with both Storm bolts and trident states.
> Thanks
> Parth
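[Editor's note: the storm-hdfs project Parth links above documents wiring up an HdfsBolt roughly as follows. This is a sketch adapted from that project's README; the filesystem URL, output path, delimiter, and rotation size are placeholders, not values from this thread:]

```java
// Sketch based on the storm-hdfs README; URL, path, and sizes are placeholders.
RecordFormat format = new DelimitedRecordFormat()
        .withFieldDelimiter("|");                    // join tuple fields with '|'
SyncPolicy syncPolicy = new CountSyncPolicy(1000);   // sync to HDFS every 1000 tuples
FileRotationPolicy rotationPolicy =
        new FileSizeRotationPolicy(5.0f, Units.MB);  // roll to a new file at 5 MB
FileNameFormat fileNameFormat = new DefaultFileNameFormat()
        .withPath("/storm/");                        // output directory in HDFS
HdfsBolt bolt = new HdfsBolt()
        .withFsUrl("hdfs://namenode:8020")           // your NameNode address
        .withFileNameFormat(fileNameFormat)
        .withRecordFormat(format)
        .withRotationPolicy(rotationPolicy)
        .withSyncPolicy(syncPolicy);
```

The bolt is then attached to the topology like any other bolt, e.g. `builder.setBolt("hdfs-bolt", bolt, 1).shuffleGrouping("my-spout")`.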
>
> On Jul 16, 2014, at 10:41 AM, amjad khan <amjadkhan987...@gmail.com>
> wrote:
>
> Can anyone provide the code for a bolt that writes its data to HDFS?
> Kindly tell me the jars required to run that bolt.
>
>
> On Mon, Jul 14, 2014 at 2:33 PM, Max Evers <mcev...@gmail.com> wrote:
>
>> Can you expand on your use case? What is the query selecting on? Is the
>> column you are querying on indexed? Do you really need to scan the
>> entire 20 GB every 20 ms?
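[Editor's note: one commonly suggested alternative, implied by Max's questions but not spelled out in this thread, is to poll incrementally on an indexed, monotonically increasing column so each query reads only new rows rather than the whole table. A rough JDBC sketch; the `events` table, `id` column, and connection details are hypothetical:]

```java
import java.sql.*;

// Hypothetical incremental poller for a Storm spout's fetch loop.
// Assumes an index on "id" so each poll is a cheap range scan, not a full scan.
public class IncrementalPoller {
    public static void main(String[] args) throws Exception {
        long lastSeenId = 0L;
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:mysql://localhost:3306/mydb", "user", "pass");
             PreparedStatement ps = conn.prepareStatement(
                 "SELECT id, payload FROM events WHERE id > ? ORDER BY id LIMIT 1000")) {
            while (true) {
                ps.setLong(1, lastSeenId);                 // only rows newer than last poll
                try (ResultSet rs = ps.executeQuery()) {
                    while ( {
                        lastSeenId = rs.getLong("id");
                        // in a real spout, emit rs.getString("payload") here
                    }
                }
                Thread.sleep(30);                          // poll interval from the thread
            }
        }
    }
}
```

Other options in the same spirit are change-data-capture (e.g. tailing the MySQL binlog) or pushing rows into a queue (Kafka) that the spout consumes, so the database is never polled at all.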
>>  On Jul 14, 2014 6:39 AM, "amjad khan" <amjadkhan987...@gmail.com> wrote:
>>
>>> I built a Storm topology whose spout fetches data from MySQL with a
>>> select query. The query fires every 30 ms, but because the table is
>>> larger than 20 GB, the query takes more than 10 seconds to execute, so
>>> this approach does not work. I need to know the possible alternatives
>>> for this situation. Kindly reply as soon as possible.
>>>
>>> Thanks,
>>>
>>
>
>
