RE: Mappers spawning Hive queries

2016-04-18 Thread Ryan Harris
…into uncharted territory and you may uncover unexpected edge-cases.

From: Shirish Tatikonda [mailto:shirish.tatiko...@gmail.com]
Sent: Monday, April 18, 2016 3:44 PM
To: user@hive.apache.org
Subject: Re: Mappers spawning Hive queries

I am using Hive 1.2.1 with MR backend. Ryan, I hear you. I totally a…

Re: Mappers spawning Hive queries

2016-04-18 Thread Shirish Tatikonda
…ll fail.

> There may be a valid reason for approaching the problem the way that you are, but I'd encourage you to look at restructuring your approach to the problem to save you more headaches down the road.
>
> *From:* Shirish Tatikonda [mailto:sh…

RE: Mappers spawning Hive queries

2016-04-18 Thread Ryan Harris
Sent: Monday, April 18, 2016 2:00 PM
To: user@hive.apache.org
Subject: Re: Mappers spawning Hive queries

Hi John,

2) The shell script is invoked in the mappers of a Hadoop streaming job.

1) The use case is that I have to process multiple entities in parallel. Each entity is associated with its ow…

Re: Mappers spawning Hive queries

2016-04-18 Thread Mich Talebzadeh
What is the version of Hive and the execution engine (MR, Tez, Spark)?

HTH,
Dr Mich Talebzadeh
LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

Re: Mappers spawning Hive queries

2016-04-18 Thread Shirish Tatikonda
Hi John,

2) The shell script is invoked in the mappers of a Hadoop streaming job.

1) The use case is that I have to process multiple entities in parallel. Each entity is associated with its own data set. The processing involves a few hive queries to do joins and aggregations, which is followed by…
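[Editor's note: the per-entity fan-out described above can be sketched, outside of Hadoop streaming, with a plain thread pool driving one `hive -e` call per entity. The entity names, the query template, and the use of `hive -e` here are illustrative assumptions, not taken from the thread.]

```python
# Sketch: run one Hive query per entity in parallel via subprocesses.
# Entity names, the query template, and the `hive -e` runner are
# illustrative assumptions.
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_query(query, runner=("hive", "-e")):
    """Run a single query with the given command and return its stdout."""
    result = subprocess.run(
        list(runner) + [query],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

def run_per_entity(entities, query_template, runner=("hive", "-e"), workers=4):
    """Submit one templated query per entity to a thread pool.

    Returns a dict mapping each entity to its query output.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = {
            entity: pool.submit(
                run_query, query_template.format(entity=entity), runner
            )
            for entity in entities
        }
        return {entity: fut.result() for entity, fut in futures.items()}
```

A hypothetical call might look like `run_per_entity(["us", "eu"], "SELECT COUNT(*) FROM sales_{entity}")`. Note this still pays the per-query MR job launch cost that the thread discusses; it only sidesteps wrapping the fan-out in a streaming job itself.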

Re: Mappers spawning Hive queries

2016-04-16 Thread Jörn Franke
Just out of curiosity, what is the use case behind this? How do you call the shell script?

> On 16 Apr 2016, at 00:24, Shirish Tatikonda wrote:
>
> Hello,
>
> I am trying to run multiple hive queries in parallel by submitting them through a map-reduce job.
> More specifically, I have a…