Thanks for the replies.

Harsh J, "hadoop classpath" was exactly what I needed. Got it working now.
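
In case it helps anyone else hitting the same thing, the invocation ends up looking
something like this (the jar and class names below are just placeholders):

  java -cp converter.jar:$(hadoop classpath) com.example.Converter /path/to/input

Putting the output of "hadoop classpath" on the classpath lets a plain "java" process
pick up the Hadoop client jars and configuration, so it can reach HDFS without going
through "hadoop jar".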

Cheers,

Krishna

On 6 January 2013 11:14, John Hancock <jhancock1...@gmail.com> wrote:

> Krishna,
>
> You should be able to take the command you are using to start the Hadoop
> job (hadoop jar ..) and paste it into a text file. Then make the file
> executable and call it as a shell script from a cron job (crontab -e). To be
> safe, use absolute paths to reference any files in the command.
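>
> For example (the paths and names below are purely illustrative), the script
> could look something like:
>
>   #!/bin/bash
>   /usr/local/hadoop/bin/hadoop jar /home/krishna/converter.jar com.example.Converter /data/input /data/output
>
> and a crontab entry to run it nightly at 2am could be:
>
>   0 2 * * * /home/krishna/run-converter.sh >> /home/krishna/run-converter.log 2>&1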
>
> Or, I suppose what you crazy kids and your object oriented programming
> would do is use Quartz.
>
>
> -John
>
> On Sat, Jan 5, 2013 at 4:33 PM, Chitresh Deshpande <
> chitreshdeshpa...@gmail.com> wrote:
>
>> Hi Krishna,
>>
>> I don't know what you mean by a Hadoop daemon, but if you mean running it
>> when all the other Hadoop daemons (namenode, datanode, etc.) are started,
>> then you can change the start-all script in the conf directory.
>>
>> Thanks and Regards,
>> Chitresh Deshpande
>>
>>
>> On Fri, Jan 4, 2013 at 6:40 AM, Krishna Rao <krishnanj...@gmail.com> wrote:
>>
>>> Hi all,
>>>
>>> I have a java application jar that converts some files and writes
>>> directly into hdfs.
>>>
>>> If I want to run the jar I need to run it using "hadoop jar <application
>>> jar>" so that it can access HDFS (that is, running "java -jar <application
>>> jar>" results in an HDFS error).
>>>
>>> Is it possible to run a jar as a Hadoop daemon?
>>>
>>> Cheers,
>>>
>>> Krishna
>>>
>>
>>
>
