Great. Good luck with that.

Warm Regards,
Tariq
cloudfront.blogspot.com


On Thu, Jul 18, 2013 at 12:43 AM, Bharati Adkar <bharati.ad...@mparallelo.com> wrote:

> Hi Tariq,
>
> No problem.
> It was the hive.jar.path property that was not being set. Figured it out
> and fixed it.
> Got the plan.xml and jobconf.xml; now I will debug Hadoop to get the rest
> of the info.
>
> Thanks,
> Warm regards,
> Bharati
>
> On Jul 17, 2013, at 12:08 PM, Mohammad Tariq <donta...@gmail.com> wrote:
>
> Hello ma'am,
>
> Apologies first of all for responding so late. Stuck with some urgent
> deliverables. Was out of touch for a while.
>
> java.io.IOException: Cannot run program "/Users/bharati/hive-0.11.0/src/testutils/hadoop" (in directory "/Users/bharati/eclipse/tutorial/src"): error=13, Permission denied
>  at java.lang.ProcessBuilder.processException(ProcessBuilder.java:478)
>  at java.lang.ProcessBuilder.start(ProcessBuilder.java:457)
>  at java.lang.Runtime.exec(Runtime.java:593)
>  at java.lang.Runtime.exec(Runtime.java:431)
>  at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:269)
>  at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
>  at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>  at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
>  at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
>  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
>  at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:198)
>  at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:644)
>  at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:1)
>  at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>  at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>  at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>  at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>  at java.lang.Thread.run(Thread.java:680)
> Caused by: java.io.IOException: error=13, Permission denied
>  at java.lang.UNIXProcess.forkAndExec(Native Method)
>  at java.lang.UNIXProcess.<init>(UNIXProcess.java:53)
>  at java.lang.ProcessImpl.start(ProcessImpl.java:91)
>  at java.lang.ProcessBuilder.start(ProcessBuilder.java:452)
>  ... 17 more
>
> Please make sure you have proper permissions set for this path.
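[An `error=13` from `forkAndExec` means the script Hive tries to launch lacks the execute bit. A minimal sketch of the problem and the fix, demonstrated on a throwaway file rather than the real path from the trace, which only exists on Bharati's machine:]

```shell
#!/bin/sh
# error=13 (EACCES) means the execute bit is missing on the script Hive
# launches. Shown here on a temp file; on the real machine the target would
# be /Users/bharati/hive-0.11.0/src/testutils/hadoop from the trace above.
f=$(mktemp)
printf '#!/bin/sh\necho ok\n' > "$f"
chmod 644 "$f"                       # no execute bit, as in the error
"$f" 2>/dev/null || echo "error=13 reproduced"
chmod +x "$f"                        # the fix: add the execute bit
"$f"                                 # prints: ok
rm -f "$f"
```

[The same `chmod +x` on the path named in the trace, run as the user HiveServer runs as, should clear the error.]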
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
>
> On Wed, Jul 17, 2013 at 8:03 PM, Puneet Khatod <puneet.kha...@tavant.com> wrote:
>
>>  Hi,
>>
>> There are many online tutorials and blogs that provide quick, get-set-go
>> information. To start with, you can learn Hadoop. For detailed knowledge
>> you will have to go through the e-books mentioned by Lefty. These books
>> are bulky but cover every bit of Hadoop.
>>
>> I recently came across an Android app called 'Big data Xpert', which has
>> tips and tricks about big data technologies. I think it can be a quick
>> and good reference for beginners as well as experienced developers.
>>
>> For reference:
>>
>> https://play.google.com/store/apps/details?id=com.mobiknights.xpert.bigdata
>>
>> Thanks,
>> Puneet
>>
>>
>> *From:* Lefty Leverenz [mailto:le...@hortonworks.com]
>> *Sent:* Thursday, June 20, 2013 11:05 AM
>> *To:* user@hive.apache.org
>> *Subject:* Re: New to hive.
>>
>> "Programming Hive" and "Hadoop: The Definitive Guide" are available at
>> the O'Reilly website (http://oreilly.com/) and on Amazon.
>>
>> But don't forget the Hive wiki:
>>
>>    - Hive Home -- https://cwiki.apache.org/confluence/display/Hive/Home
>>    - Getting Started -- https://cwiki.apache.org/confluence/display/Hive/GettingStarted
>>    - Hive Tutorial -- https://cwiki.apache.org/confluence/display/Hive/Tutorial
>>
>> – Lefty
>>
>>
>> On Wed, Jun 19, 2013 at 7:02 PM, Mohammad Tariq <donta...@gmail.com>
>> wrote:
>>
>> Hello ma'am,
>>
>>       Hive queries are parsed using ANTLR <http://www.antlr.org/> and
>> are converted into corresponding MR jobs (actually a lot of things
>> happen under the hood). I answered a similar question
>> <http://stackoverflow.com/questions/17090022/hive-query-control-flow/17095756?noredirect=1#comment24873979_17095756>
>> a few days ago on SO; you might find it helpful. But I would suggest
>> going through the original paper
>> <http://cs.brown.edu/courses/cs227/papers/hive.pdf>, which explains all
>> these things in proper detail. I would also recommend the book
>> "Programming Hive". It's really nice.
>>
>> HTH
>>
>> Warm Regards,
>> Tariq
>> cloudfront.blogspot.com
>>
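[To see the compilation Tariq describes concretely, Hive's EXPLAIN statement prints the plan a query compiles to, including the stage graph and the map/reduce operator trees, without running any job. A sketch, assuming a Hive CLI on the PATH; the table name `src` is a placeholder:]

```shell
# Print the plan Hive compiles for a query: the abstract syntax tree, the
# stage dependencies, and the operator trees for the map and reduce sides.
# No MapReduce job is actually executed by EXPLAIN.
hive -e 'EXPLAIN EXTENDED SELECT key, count(*) FROM src GROUP BY key;'
```

[The plan.xml and jobconf.xml files Bharati mentions earlier in the thread are the serialized form of this same plan, handed to the MapReduce tasks.]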
>>
>> On Thu, Jun 20, 2013 at 4:24 AM, Bharati <bharati.ad...@mparallelo.com>
>> wrote:****
>>
>>  Hi Folks,
>>
>> I am new to Hive and need information, tutorials, etc. that you can point
>> me to. I have installed Hive to work with MySQL.
>>
>> I can run queries. Now I would like to understand how the map and reduce
>> classes are created, and how I can look at the data for the map job and
>> map class that the Hive query generates. Also, is there a way to create
>> custom map classes?
>> I would appreciate it if anyone could help me get started.
>>
>> Thanks,
>> Bharati
>>
>> Sent from my iPad
>>
>>
>>
>>
>> Any comments or statements made in this email are not necessarily those
>> of Tavant Technologies.
>> The information transmitted is intended only for the person or entity to
>> which it is addressed and may
>> contain confidential and/or privileged material. If you have received
>> this in error, please contact the
>> sender and delete the material from any computer. All e-mails sent from
>> or to Tavant Technologies
>> may be subject to our monitoring procedures.
>>
>>
>>
>>
>>
>
>
