Re: Welcome new Hive committer, Zhihai Xu

2017-05-05 Thread Jimmy Xiang
Congrats!!

On Fri, May 5, 2017 at 10:15 AM, Chinna Rao Lalam
 wrote:
> Congratulations Zhihai...
>
> On Fri, May 5, 2017 at 10:22 PM, Xuefu Zhang  wrote:
>>
>> Hi all,
>>
>> I'm very pleased to announce that the Hive PMC has recently voted to offer
>> Zhihai a committership, which he accepted. Please join me in congratulating
>> him on this recognition and thanking him for his contributions to Hive.
>>
>> Regards,
>> Xuefu
>
>
>
>
> --
> Hope It Helps,
> Chinna


Re: [ANNOUNCE] New Hive Committer - Rajesh Balamohan

2016-12-14 Thread Jimmy Xiang
Congrats, Rajesh!!

On Wed, Dec 14, 2016 at 11:32 AM, Sergey Shelukhin
 wrote:
> Congratulations!
>
> From: Chao Sun 
> Reply-To: "user@hive.apache.org" 
> Date: Wednesday, December 14, 2016 at 10:52
> To: "d...@hive.apache.org" 
> Cc: "user@hive.apache.org" , "rbalamo...@apache.org"
> 
> Subject: Re: [ANNOUNCE] New Hive Committer - Rajesh Balamohan
>
> Congrats Rajesh!
>
> On Wed, Dec 14, 2016 at 9:26 AM, Vihang Karajgaonkar 
> wrote:
>>
>> Congrats Rajesh!
>>
>> On Wed, Dec 14, 2016 at 1:54 AM, Jesus Camacho Rodriguez <
>> jcamachorodrig...@hortonworks.com> wrote:
>>
>> > Congrats Rajesh, well deserved! :)
>> >
>> > --
>> > Jesús
>> >
>> >
>> >
>> >
>> > On 12/14/16, 8:41 AM, "Lefty Leverenz"  wrote:
>> >
>> > >Congratulations Rajesh!
>> > >
>> > >-- Lefty
>> > >
>> > >
>> > >On Tue, Dec 13, 2016 at 11:58 PM, Rajesh Balamohan
>> > > > > >
>> > >wrote:
>> > >
>> > >> Thanks a lot for providing this opportunity and to all for their
>> > messages.
>> > >> :)
>> > >>
>> > >> ~Rajesh.B
>> > >>
>> > >> On Wed, Dec 14, 2016 at 11:33 AM, Dharmesh Kakadia
>> > >> > > >
>> > >> wrote:
>> > >>
>> > >> > Congrats Rajesh !
>> > >> >
>> > >> > Thanks,
>> > >> > Dharmesh
>> > >> >
>> > >> > On Tue, Dec 13, 2016 at 7:37 PM, Vikram Dixit K <
>> > vikram.di...@gmail.com>
>> > >> > wrote:
>> > >> >
>> > >> >> Congrats Rajesh! :)
>> > >> >>
>> > >> >> On Tue, Dec 13, 2016 at 9:36 PM, Pengcheng Xiong
>> > >> >> 
>> > >> >> wrote:
>> > >> >>
>> > >> >>> Congrats Rajesh! :)
>> > >> >>>
>> > >> >>> On Tue, Dec 13, 2016 at 6:51 PM, Prasanth Jayachandran <
>> > >> >>> prasan...@apache.org
>> > >> >>> > wrote:
>> > >> >>>
>> > >> >>> > The Apache Hive PMC has voted to make Rajesh Balamohan a
>> > committer on
>> > >> >>> the
>> > >> >>> > Apache Hive Project. Please join me in congratulating Rajesh.
>> > >> >>> >
>> > >> >>> > Congratulations Rajesh!
>> > >> >>> >
>> > >> >>> > Thanks
>> > >> >>> > Prasanth
>> > >> >>>
>> > >> >>
>> > >> >>
>> > >> >>
>> > >> >> --
>> > >> >> Nothing better than when appreciated for hard work.
>> > >> >> -Mark
>> > >> >>
>> > >> >
>> > >> >
>> > >>
>> >
>
>


Re: [ANNOUNCE] New PMC Member : Pengcheng

2016-07-18 Thread Jimmy Xiang
Congrats!!

On Mon, Jul 18, 2016 at 9:55 AM, Vihang Karajgaonkar
 wrote:
> Congratulations!
>
>> On Jul 18, 2016, at 5:28 AM, Peter Vary  wrote:
>>
>> Congratulations Pengcheng!
>>
>>
>>> On Jul 18, 2016, at 6:55 AM, Wei Zheng  wrote:
>>>
>>> Congrats Pengcheng!
>>>
>>> Thanks,
>>>
>>> Wei
>>>
>>>
>>>
>>>
>>>
>>>
>>> On 7/17/16, 16:01, "Xuefu Zhang"  wrote:
>>>
 Congrats, PengCheng!

 On Sun, Jul 17, 2016 at 2:28 PM, Sushanth Sowmyan 
 wrote:

> Welcome aboard Pengcheng! :)
>
> On Jul 17, 2016 12:01, "Lefty Leverenz"  wrote:
>
>> Congratulations Pengcheng!
>>
>> -- Lefty
>>
>> On Sun, Jul 17, 2016 at 1:03 PM, Ashutosh Chauhan 
>> wrote:
>>

 Hello Hive community,

 I'm pleased to announce that Pengcheng Xiong has accepted the Apache
>>> Hive
 PMC's
 invitation, and is now our newest PMC member. Many thanks to Pengcheng
>>> for
 all of his hard work.

 Please join me congratulating Pengcheng!

 Best,
 Ashutosh
 (On behalf of the Apache Hive PMC)

>>>
>>
>>
>
>>
>


Re: [ANNOUNCE] New PMC Member : Jesus

2016-07-18 Thread Jimmy Xiang
Congrats!!

On Mon, Jul 18, 2016 at 9:54 AM, Vihang Karajgaonkar
 wrote:
> Congratulations Jesus!
>
>> On Jul 18, 2016, at 8:30 AM, Sergio Pena  wrote:
>>
>> Congrats Jesus !!!
>>
>> On Mon, Jul 18, 2016 at 7:28 AM, Peter Vary  wrote:
>>
>>> Congratulations Jesus!
>>>
 On Jul 18, 2016, at 6:55 AM, Wei Zheng  wrote:

 Congrats Jesus!

 Thanks,

 Wei







 On 7/17/16, 14:29, "Sushanth Sowmyan"  wrote:

> Good to have you onboard, Jesus! :)
>
> On Jul 17, 2016 12:00, "Lefty Leverenz" 
>>> wrote:
>
>> Congratulations Jesus!
>>
>> -- Lefty
>>
>> On Sun, Jul 17, 2016 at 1:01 PM, Ashutosh Chauhan <
>>> hashut...@apache.org>
>> wrote:
>>
>>> Hello Hive community,
>>>
>>> I'm pleased to announce that Jesus Camacho Rodriguez has accepted the
>>> Apache Hive PMC's
>>> invitation, and is now our newest PMC member. Many thanks to Jesus for
>>> all of
>>> his hard work.
>>>
>>> Please join me congratulating Jesus!
>>>
>>> Best,
>>> Ashutosh
>>> (On behalf of the Apache Hive PMC)
>>>
>>
>>
>>>
>>>
>


Re: [Announce] New Hive Committer - Mohit Sabharwal

2016-07-01 Thread Jimmy Xiang
Congrats!!

On Fri, Jul 1, 2016 at 1:04 PM, Lenni Kuff  wrote:
> Congrats Mohit!
>
> On Fri, Jul 1, 2016 at 3:27 PM, Peter Vary  wrote:
>
>> Congratulations Mohit!
>> 2016. júl. 1. 19:10 ezt írta ("Vihang Karajgaonkar" > >):
>>
>> > Congratulations Mohit!
>> >
>> > > On Jul 1, 2016, at 10:05 AM, Chao Sun  wrote:
>> > >
>> > > Congratulations Mohit! Good job!
>> > >
>> > > Best,
>> > > Chao
>> > >
>> > > On Fri, Jul 1, 2016 at 9:57 AM, Szehon Ho > > > wrote:
>> > > On behalf of the Apache Hive PMC, I'm pleased to announce that Mohit
>> > Sabharwal has been voted a committer on the Apache Hive project.
>> > >
>> > > Please join me in congratulating Mohit !
>> > >
>> > > Thanks,
>> > > Szehon
>> > >
>> >
>> >
>>


Re: [VOTE] Bylaws change to allow some commits without review

2016-04-19 Thread Jimmy Xiang
+1

On Tue, Apr 19, 2016 at 2:58 PM, Alpesh Patel  wrote:
> +1
>
> On Tue, Apr 19, 2016 at 1:29 PM, Lars Francke 
> wrote:
>>
>> Thanks everyone! Vote runs for at least one more day. I'd appreciate it if
>> you could ping/bump your colleagues to chime in here.
>>
>> I'm not entirely sure how many PMC members are active and how many votes
>> we need but I think a few more are probably needed.
>>
>> On Mon, Apr 18, 2016 at 8:02 PM, Thejas Nair 
>> wrote:
>>>
>>> +1
>>>
>>> 
>>> From: Wei Zheng 
>>> Sent: Monday, April 18, 2016 10:51 AM
>>> To: user@hive.apache.org
>>> Subject: Re: [VOTE] Bylaws change to allow some commits without review
>>>
>>> +1
>>>
>>> Thanks,
>>> Wei
>>>
>>> From: Siddharth Seth 
>>> Reply-To: "user@hive.apache.org" 
>>> Date: Monday, April 18, 2016 at 10:29
>>> To: "user@hive.apache.org" 
>>> Subject: Re: [VOTE] Bylaws change to allow some commits without review
>>>
>>> +1
>>>
>>> On Wed, Apr 13, 2016 at 3:58 PM, Lars Francke 
>>> wrote:

 Hi everyone,

 we had a discussion on the dev@ list about allowing some forms of
 contributions to be committed without a review.

 The exact sentence I propose to add is: "Minor issues (e.g. typos, code
 style issues, JavaDoc changes. At committer's discretion) can be committed
 after soliciting feedback/review on the mailing list and not receiving
 feedback within 2 days."

 The proposed bylaws can also be seen here
 

 This vote requires a 2/3 majority of all Active PMC members so I'd love
 to get as many votes as possible. The vote will run for at least six days.

 Thanks,
 Lars
>>>
>>>
>>
>


Re: About hive on spark

2015-12-30 Thread Jimmy Xiang
https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started
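For orientation, the wiki's minimal configuration boils down to a few session settings (a sketch only; the spark.home path and spark.master value below are illustrative assumptions, and the Spark assembly jar must already be on Hive's classpath):

```
-- In the Hive CLI (illustrative values; adjust paths/URLs to your cluster):
set spark.home=/opt/spark;             -- path assumed
set hive.execution.engine=spark;
set spark.master=spark://master:7077;  -- or local for a first smoke test
```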

On Tue, Dec 29, 2015 at 7:45 PM, Todd  wrote:
> Hi,
> I would explore whether hive on spark is stable enough to adopt it in our
> production environment.
> As a starting point, is there some documentation for me to get started?
> Thanks.


Re: Hive version with Spark

2015-11-30 Thread Jimmy Xiang
Hi Sofia,

For Hive 1.2.1, you should not use Spark 1.5. There are some incompatible
interface changes in Spark 1.5.

Have you tried Hive 1.2.1 with Spark 1.3.1? As Udit pointed out, you can
follow the instructions on

https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started

to build the Spark assembly, which should not contain any Hive/Hadoop
related classes.

Thanks,
Jimmy


On Sun, Nov 29, 2015 at 4:03 PM, Xuefu Zhang  wrote:

> Sofia,
>
> What specific problem did you encounter when trying spark.master other
> than local?
>
> Thanks,
> Xuefu
>
> On Sat, Nov 28, 2015 at 1:14 AM, Sofia Panagiotidi <
> sofia.panagiot...@taiger.com> wrote:
>
>> Hi Mich,
>>
>>
>> I never managed to run Hive on Spark with a spark master other than local
>> so I am afraid I don’t have a reply here.
>> But do try some things. Firstly, run hive as
>>
>> hive --hiveconf hive.root.logger=DEBUG,console
>>
>>
>> so that you are able to see what the exact error is.
>>
>> I am afraid I cannot be much of a help as I think I reached the same
>> point (where it would work only when setting spark.master=local) before
>> abandoning.
>>
>> Cheers
>>
>>
>>
>> On 27 Nov 2015, at 01:59, Mich Talebzadeh  wrote:
>>
>> Hi Sophia,
>>
>>
>> There is no Hadoop-2.6. I believe you should use Hadoop-2.4 as shown below
>>
>>
>> mvn -Phadoop-2.4 -Dhadoop.version=2.6.0 -DskipTests clean package
>>
>> Also if you are building it for Hive on Spark engine, you should not
>> include Hadoop.jar files in your build.
>>
>> For example I tried to build spark 1.3 from source code (I read that this
>> version works OK with Hive, having tried unsuccessfully spark 1.5.2).
>>
>> The following command created the tar file
>>
>> ./make-distribution.sh --name "hadoop2-without-hive" --tgz
>> "-Pyarn,hadoop-provided,hadoop-2.4,parquet-provided"
>>
>> spark-1.3.0-bin-hadoop2-without-hive.tar.gz
>>
>>
>> Now I have other issues making Hive to use Spark execution engine
>> (requires Hive 1.1 or above )
>>
>> In hive I do
>>
>> set spark.home=/usr/lib/spark;
>> set hive.execution.engine=spark;
>> set spark.master=spark://127.0.0.1:7077;
>> set spark.eventLog.enabled=true;
>> set spark.eventLog.dir=/usr/lib/spark/logs;
>> set spark.executor.memory=512m;
>> set spark.serializer=org.apache.spark.serializer.KryoSerializer;
>> use asehadoop;
>> select count(1) from t;
>>
>> I get the following
>>
>> OK
>> Time taken: 0.753 seconds
>> Query ID = hduser_20151127003523_e9863e84-9a81-4351-939c-36b3bef36478
>> Total jobs = 1
>> Launching Job 1 out of 1
>> In order to change the average load for a reducer (in bytes):
>>   set hive.exec.reducers.bytes.per.reducer=
>> In order to limit the maximum number of reducers:
>>   set hive.exec.reducers.max=
>> In order to set a constant number of reducers:
>>   set mapreduce.job.reduces=
>> Failed to execute spark task, with exception
>> 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark
>> client.)'
>> FAILED: Execution Error, return code 1 from
>> org.apache.hadoop.hive.ql.exec.spark.SparkTask
>>
>> HTH,
>>
>> Mich
>>
>> NOTE: The information in this email is proprietary and confidential. This
>> message is for the designated recipient only, if you are not the intended
>> recipient, you should destroy it immediately. Any information in this
>> message shall not be understood as given or endorsed by Peridale Technology
>> Ltd, its subsidiaries or their employees, unless expressly so stated. It is
>> the responsibility of the recipient to ensure that this email is virus
>> free, therefore neither Peridale Ltd, its subsidiaries nor their employees
>> accept any responsibility.
>>
>> *From:* Sofia [mailto:sofia.panagiot...@taiger.com
>> ]
>> *Sent:* 18 November 2015 16:50
>> *To:* user@hive.apache.org
>> *Subject:* Hive version with Spark
>>
>> Hello
>>
>> After various failed tries to use my Hive (1.2.1) with my Spark (Spark
>> 1.4.1 built for Hadoop 2.2.0) I decided to try to build again Spark with
>> Hive.
>> I would like to know what is the latest Hive version that can be used to
>> build Spark at this point.
>>
>> When downloading Spark 1.5 source and trying:
>>
>> *mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-1.2.1
>> -Phive-thriftserver  -DskipTests clean package*
>>
>> I get :
>>
>> *The requested profile "hive-1.2.1" could not be activated because it
>> does not exist.*
>>
>> Thank you
>> Sofia
>>
>>
>>
>


Re: [ANNOUNCE] New PMC Member : John Pullokkaran

2015-11-24 Thread Jimmy Xiang
Congrats!!

On Tue, Nov 24, 2015 at 3:04 PM, Szehon Ho  wrote:

> Congratulations!
>
> On Tue, Nov 24, 2015 at 3:02 PM, Xuefu Zhang  wrote:
>
> > Congratulations, John!
> >
> > --Xuefu
> >
> > On Tue, Nov 24, 2015 at 3:01 PM, Prasanth J 
> > wrote:
> >
> >> Congratulations and Welcome John!
> >>
> >> Thanks
> >> Prasanth
> >>
> >> On Nov 24, 2015, at 4:59 PM, Ashutosh Chauhan 
> >> wrote:
> >>
> >> On behalf of the Hive PMC I am delighted to announce John Pullokkaran is
> >> joining Hive PMC.
> >> John is a long time contributor in Hive and is focusing on compiler and
> >> optimizer areas these days.
> >> Please give John a warm welcome to the project!
> >>
> >> Ashutosh
> >>
> >>
> >>
> >
>


Re: Hive On Spark - Using custom SerDe

2015-11-16 Thread Jimmy Xiang
It is better to use the safety valve for the Hive configuration.

On Mon, Nov 16, 2015 at 8:03 AM, Daniel Haviv <
daniel.ha...@veracity-group.com> wrote:

> Hi,
> How should I set it? Just a normal set in Hive, or add it via the safety
> valve to the Hive or Spark configuration?
>
> Thank you.
> Daniel
>
> On Mon, Nov 16, 2015 at 5:46 PM, Jimmy Xiang <jxi...@cloudera.com> wrote:
>
>> Have you added your class to "spark.kryo.classesToRegister"? You also need
>> to make sure your jar is in "hive.aux.jars.path".
>>
>> Thanks,
>> Jimmy
>>
>> On Mon, Nov 16, 2015 at 1:44 AM, Daniel Haviv <
>> daniel.ha...@veracity-group.com> wrote:
>>
>>> Hi,
>>> We have a custom SerDe we would like to use with Hive on Spark but I'm
>>> not sure how to.
>>> The error messages are pretty clear about the fact that it can't find my
>>> SerDE's class:
>>>
>>> Caused by: org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable 
>>> to find class: com.mycompany.hive.WhaleAvroGenericInputFormat
>>>
>>>
>>>
>>>
>>> Thank you.
>>>
>>> Daniel
>>>
>>>
>>
>


Re: Hive On Spark - Using custom SerDe

2015-11-16 Thread Jimmy Xiang
Have you added your class to "spark.kryo.classesToRegister"? You also need to
make sure your jar is in "hive.aux.jars.path".

Thanks,
Jimmy
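Concretely, the two settings above could go into hive-site.xml roughly as shown below (a sketch only; the property names are the ones mentioned above, while the jar path and the reuse of the class name from the Kryo error are illustrative assumptions):

```xml
<!-- hive-site.xml: illustrative values only -->
<property>
  <name>hive.aux.jars.path</name>
  <!-- path to the SerDe jar is assumed -->
  <value>file:///opt/hive/auxlib/whale-serde.jar</value>
</property>
<property>
  <name>spark.kryo.classesToRegister</name>
  <value>com.mycompany.hive.WhaleAvroGenericInputFormat</value>
</property>
```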

On Mon, Nov 16, 2015 at 1:44 AM, Daniel Haviv <
daniel.ha...@veracity-group.com> wrote:

> Hi,
> We have a custom SerDe we would like to use with Hive on Spark but I'm not
> sure how to.
> The error messages are pretty clear about the fact that it can't find my
> SerDE's class:
>
> Caused by: org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to 
> find class: com.mycompany.hive.WhaleAvroGenericInputFormat
>
>
>
>
> Thank you.
>
> Daniel
>
>


Re: HiveServer2 Goes down while parallel Query execution

2015-10-09 Thread Jimmy Xiang
Could it be because HS2 runs out of worker threads?
Are you trying to open multiple sessions on the same connection?

Thanks,
Jimmy
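If thread exhaustion turns out to be the cause, the HiveServer2 worker pool can be widened in hive-site.xml (a sketch; the value is illustrative, and the property name should be checked against your Hive version's documentation):

```xml
<!-- hive-site.xml: raise the HiveServer2 Thrift worker thread cap -->
<property>
  <name>hive.server2.thrift.max.worker.threads</name>
  <value>1000</value>
</property>
```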

On Fri, Oct 9, 2015 at 12:34 PM, Vineet Mishra 
wrote:

> Any idea about this?
>
> Frequent Connectivity issue to HiveServer2
>
>   2015-10-10 00:23:11,070 [main] ERROR (HiveConnection.java:439) - Error
> opening session
>   org.apache.thrift.transport.TTransportException
> at
> org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
> at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
> at
> org.apache.thrift.transport.TSaslTransport.readLength(TSaslTransport.java:355)
> at
> org.apache.thrift.transport.TSaslTransport.readFrame(TSaslTransport.java:432)
> at org.apache.thrift.transport.TSaslTransport.read(TSaslTransport.java:414)
> at
> org.apache.thrift.transport.TSaslClientTransport.read(TSaslClientTransport.java:37)
> at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
> at
> org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378)
> at
> org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:297)
> at
> org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:204)
> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
> at
> org.apache.hive.service.cli.thrift.TCLIService$Client.recv_OpenSession(TCLIService.java:160)
> at
> org.apache.hive.service.cli.thrift.TCLIService$Client.OpenSession(TCLIService.java:147)
> at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:429)
> at org.apache.hive.jdbc.HiveConnection.(HiveConnection.java:192)
> at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
> at java.sql.DriverManager.getConnection(DriverManager.java:571)
> at java.sql.DriverManager.getConnection(DriverManager.java:215)
> at com.sd.dwh.sc.tungsten.custom.HiveRunnable.(HiveRunnable.java:42)
> at com.sd.dwh.sc.tungsten.custom.HiveInvoker.main(HiveInvoker.java:62)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
> 2015-10-10 00:23:11,071 [main] ERROR (HiveRunnable.java:48) - Could not
> establish connection to jdbc:hive2://hadoop-hs2:1/mydb: null
>
> URGENT CALL!
>
> Thanks!
>
> On Fri, Oct 9, 2015 at 2:42 PM, Vineet Mishra 
> wrote:
>
>> This looks out to be the issue
>>
>> https://issues.apache.org/jira/browse/HIVE-2314
>>
>> Any workaround or resolution to the same.
>>
>> Thanks!
>>
>> On Fri, Oct 9, 2015 at 1:24 PM, Vineet Mishra 
>> wrote:
>>
>>> Hi All,
>>>
>>> I am trying to connect to HiveServer2 to query some data in parallel and
>>> landing up into some weird exception, stack trace mentioned below
>>>
>>> java.sql.SQLException: Error while cleaning up the server resources
>>> at org.apache.hive.jdbc.HiveConnection.close(HiveConnection.java:569)
>>> at
>>> com.sd.dwh.sc.tungsten.custom.HiveRunnable.mergeJDBC(HiveRunnable.java:93)
>>> at com.sd.dwh.sc.tungsten.custom.HiveRunnable.run(HiveRunnable.java:55)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:745)
>>> Caused by: org.apache.thrift.transport.TTransportException
>>> at
>>> org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
>>> at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
>>> at
>>> org.apache.thrift.transport.TSaslTransport.readLength(TSaslTransport.java:355)
>>> at
>>> org.apache.thrift.transport.TSaslTransport.readFrame(TSaslTransport.java:432)
>>> at
>>> org.apache.thrift.transport.TSaslTransport.read(TSaslTransport.java:414)
>>> at
>>> org.apache.thrift.transport.TSaslClientTransport.read(TSaslClientTransport.java:37)
>>> at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
>>> at
>>> org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378)
>>> at
>>> org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:297)
>>> at
>>> org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:204)
>>> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
>>> at
>>> org.apache.hive.service.cli.thrift.TCLIService$Client.recv_CloseSession(TCLIService.java:183)
>>> at
>>> org.apache.hive.service.cli.thrift.TCLIService$Client.CloseSession(TCLIService.java:170)
>>> at org.apache.hive.jdbc.HiveConnection.close(HiveConnection.java:567)
>>> ... 5 more
>>>
>>>
>>> Any quick revert will be highly appreciable!
>>>
>>> Thanks!
>>>
>>>
>>
> --
>

Re: Hive compilation error

2015-10-01 Thread Jimmy Xiang
Have you tried "mvn -U clean install -DskipTests -Pdist,hadoop-2"?

On Thu, Oct 1, 2015 at 12:38 AM, Giannis Giannakopoulos <
gg...@cslab.ece.ntua.gr> wrote:

> Hello,
>
> thank you for your prompt response. I tried to compile with those
> options (also run mvn clean first) but I keep getting the same output.
> Any other suggestions?
>
> Thanks again,
> Giannis
>
>
> On 10/01/2015 10:24 AM, Jitendra Yadav wrote:
> > Try with below options
> >
> > mvn clean install -DskipTests -Pdist,hadoop-1  ( on MR1)
> >
> > or
> >
> > mvn clean install -DskipTests -Pdist,hadoop-2 ( on YARN)
> >
> >
> > Thx
> >
> >
> >
> >
> > On Thu, Oct 1, 2015 at 12:14 AM, Giannis Giannakopoulos
> > > wrote:
> >
> > Hi all,
> >
> > I am trying to compile hive from source, but I get the following error:
> > [ERROR] Failed to execute goal
> > org.apache.maven.plugins:maven-compiler-plugin:3.1:compile
> > (default-compile) on project hive-storage-api: Compilation failure:
> > Compilation failure:
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/exec/vector/Vec
> > torizedRowBatch.java:[24,28]
> > package org.apache.hadoop.io  does not
> > exist
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/exec/vector/Vec
> > torizedRowBatch.java:[25,28]
> > package org.apache.hadoop.io  does not
> > exist
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/exec/vector/Vec
> > torizedRowBatch.java:[34,44]
> > cannot find symbol
> > [ERROR] symbol: class Writable
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDeci
> > malWritable.java:[25,34]
> > package org.apache.commons.logging does not exist
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDeci
> > malWritable.java:[26,34]
> > package org.apache.commons.logging does not exist
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDeci
> > malWritable.java:[29,28]
> > package org.apache.hadoop.io  does not
> > exist
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDeci
> > malWritable.java:[30,28]
> > package org.apache.hadoop.io  does not
> > exist
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDeci
> > malWritable.java:[32,45]
> > cannot find symbol
> > [ERROR] symbol: class WritableComparable
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDeci
> > malWritable.java:[34,24]
> > cannot find symbol
> > [ERROR] symbol:   class Log
> > [ERROR] location: class
> > org.apache.hadoop.hive.serde2.io.HiveDecimalWritable
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/io/sarg/SearchA
> > rgumentImpl.java:[30,39]
> > package org.apache.commons.codec.binary does not exist
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/io/sarg/SearchA
> > rgumentImpl.java:[31,34]
> > package org.apache.commons.logging does not exist
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/io/sarg/SearchA
> > rgumentImpl.java:[32,34]
> > package org.apache.commons.logging does not exist
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/io/sarg/SearchA
> > rgumentImpl.java:[38,23]
> > cannot find symbol
> > [ERROR] symbol:   class Log
> > [ERROR] location: class
> > org.apache.hadoop.hive.ql.io.sarg.SearchArgumentImpl
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/exec/vector/Vec
> > torizedRowBatch.java:[108,34]
> > cannot find symbol
> > [ERROR] symbol:   class NullWritable
> > [ERROR] location: class
> > org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/exec/vector/Vec
> > torizedRowBatch.java:[157,3]
> > method does not override or implement a method from a supertype
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/exec/vector/Vec
> > torizedRowBatch.java:[162,3]
> > method does not override or implement a method from a supertype
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDeci
> > malWritable.java:[34,34]
> > cannot find symbol
> > [ERROR] symbol:   variable LogFactory
> > [ERROR] location: class
> > org.apache.hadoop.hive.serde2.io.HiveDecimalWritable
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDeci
> > malWritable.java:[96,3]
> > method does not override or implement a method from a supertype
> > [ERROR]
> > /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDeci
> > malWritable.java:[98,13]
> > cannot find symbol
> > [ERROR] symbol:   variable WritableUtils
> > [ERROR] location: class
> > org.apache.hadoop.hive.serde2.io.HiveDecimalWritable
> > [ERROR]
> > 

Re: [ANNOUNCE] New Hive PMC Chair - Ashutosh Chauhan

2015-09-16 Thread Jimmy Xiang
Congrats!

On Wed, Sep 16, 2015 at 1:24 PM, Alpesh Patel 
wrote:

> Congratulations Ashutosh
>
> On Wed, Sep 16, 2015 at 1:23 PM, Pengcheng Xiong 
> wrote:
>
>> Congratulations Ashutosh!
>>
>> On Wed, Sep 16, 2015 at 1:17 PM, John Pullokkaran <
>> jpullokka...@hortonworks.com> wrote:
>>
>>> Congrats Ashutosh!
>>>
>>> From: Vaibhav Gumashta 
>>> Reply-To: "user@hive.apache.org" 
>>> Date: Wednesday, September 16, 2015 at 1:01 PM
>>> To: "user@hive.apache.org" , "d...@hive.apache.org"
>>> 
>>> Cc: Ashutosh Chauhan 
>>> Subject: Re: [ANNOUNCE] New Hive PMC Chair - Ashutosh Chauhan
>>>
>>> Congrats Ashutosh!
>>>
>>> —Vaibhav
>>>
>>> From: Prasanth Jayachandran 
>>> Reply-To: "user@hive.apache.org" 
>>> Date: Wednesday, September 16, 2015 at 12:50 PM
>>> To: "d...@hive.apache.org" , "user@hive.apache.org"
>>> 
>>> Cc: "d...@hive.apache.org" , Ashutosh Chauhan <
>>> hashut...@apache.org>
>>> Subject: Re: [ANNOUNCE] New Hive PMC Chair - Ashutosh Chauhan
>>>
>>> Congratulations Ashutosh!
>>>
>>>
>>>
>>>
>>>
>>> On Wed, Sep 16, 2015 at 12:48 PM -0700, "Xuefu Zhang" <
>>> xzh...@cloudera.com> wrote:
>>>
>>> Congratulations, Ashutosh!. Well-deserved.
>>>
>>> Thanks to Carl also for the hard work in the past few years!
>>>
>>> --Xuefu
>>>
>>> On Wed, Sep 16, 2015 at 12:39 PM, Carl Steinbach  wrote:
>>>
>>> > I am very happy to announce that Ashutosh Chauhan is taking over as the
>>> > new VP of the Apache Hive project. Ashutosh has been a longtime
>>> contributor
>>> > to Hive and has played a pivotal role in many of the major advances
>>> that
>>> > have been made over the past couple of years. Please join me in
>>> > congratulating Ashutosh on his new role!
>>> >
>>>
>>
>>
>


Re: [ANNOUNCE] New Hive Committer - Lars Francke

2015-09-07 Thread Jimmy Xiang
Congrats!!

On Mon, Sep 7, 2015 at 5:09 PM, Navis Ryu  wrote:

> Congratulation!
>
> 2015-09-08 8:38 GMT+09:00 Lefty Leverenz :
> > Congratulations Lars!
> >
> > -- Lefty
> >
> >
> > On Mon, Sep 7, 2015 at 5:16 AM, Loïc Chanel <
> loic.cha...@telecomnancy.net>
> > wrote:
> >>
> >> Congrats Lars ! :)
> >>
> >> Loïc CHANEL
> >> Engineering student at TELECOM Nancy
> >> Trainee at Worldline - Villeurbanne
> >>
> >> 2015-09-07 10:54 GMT+02:00 Carl Steinbach :
> >>>
> >>> The Apache Hive PMC has voted to make Lars Francke a committer on the
> >>> Apache Hive Project.
> >>>
> >>> Please join me in congratulating Lars!
> >>>
> >>> Thanks.
> >>>
> >>> - Carl
> >>>
> >>
> >
>


Re: hive on spark

2015-08-27 Thread Jimmy Xiang
Have you checked this:
https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started
?

On Wed, Aug 26, 2015 at 11:12 PM, Jeetendra G jeetendr...@housing.com
wrote:

 HI All,

 I am trying to run Hive on Spark, i.e., from the Hive terminal, setting the
 execution engine to Spark.

 I have copied the hive-default.xml to spark conf directory.

 Hive is not able to find the table, giving me the error "table_name not found".
 Can you help me with the exact steps to make Spark the engine for Hive
 queries?


 regards
 jeetendra



Re: [ANNOUNCE] New Hive Committer - Mithun Radhakrishnan

2015-04-14 Thread Jimmy Xiang
Congrats!

On Tue, Apr 14, 2015 at 8:46 PM, Lefty Leverenz leftylever...@gmail.com
wrote:

 Congrats Mithun -- when they gave me the cape, they called it a cloak of
 invisibility.  But the only thing it makes invisible is itself.  Maybe I
 should open a jira

 -- Lefty

 On Tue, Apr 14, 2015 at 9:03 PM, Xu, Cheng A cheng.a...@intel.com wrote:

  Congrats Mithun!



 *From:* Gunther Hagleitner [mailto:ghagleit...@hortonworks.com]
 *Sent:* Wednesday, April 15, 2015 8:10 AM
 *To:* d...@hive.apache.org; Chris Drome; user@hive.apache.org
 *Cc:* mit...@apache.org
 *Subject:* Re: [ANNOUNCE] New Hive Committer - Mithun Radhakrishnan



 Congrats Mithun!



 Thanks,

 Gunther.
  --

 *From:* Chao Sun c...@cloudera.com
 *Sent:* Tuesday, April 14, 2015 3:48 PM
 *To:* d...@hive.apache.org; Chris Drome
 *Cc:* user@hive.apache.org; mit...@apache.org
 *Subject:* Re: [ANNOUNCE] New Hive Committer - Mithun Radhakrishnan



 Congrats Mithun!



 On Tue, Apr 14, 2015 at 3:29 PM, Chris Drome 
 cdr...@yahoo-inc.com.invalid wrote:

 Congratulations Mithun!




  On Tuesday, April 14, 2015 2:57 PM, Carl Steinbach c...@apache.org
 wrote:


  The Apache Hive PMC has voted to make Mithun Radhakrishnan a committer
 on the Apache Hive Project.
 Please join me in congratulating Mithun.
 Thanks.
 - Carl








 --

 Best,

 Chao





Re: [ANNOUNCE] New Hive Committers - Jimmy Xiang, Matt McCline, and Sergio Pena

2015-03-24 Thread Jimmy Xiang
Thanks everyone! Congrats to Sergio and Matt!!

On Mon, Mar 23, 2015 at 11:36 PM, Xu, Cheng A cheng.a...@intel.com wrote:

  Congratulations!!!



 *From:* @Sanjiv Singh [mailto:sanjiv.is...@gmail.com]
 *Sent:* Tuesday, March 24, 2015 12:45 PM
 *To:* user@hive.apache.org
 *Cc:* d...@hive.apache.org; mmccl...@hortonworks.com; jxi...@apache.org;
 sergio.p...@cloudera.com
 *Subject:* Re: [ANNOUNCE] New Hive Committers - Jimmy Xiang, Matt
 McCline, and Sergio Pena



 Congratulations !!!


   Regards
 Sanjiv Singh
 Mob :  +091 9990-447-339



 On Mon, Mar 23, 2015 at 11:38 PM, Carl Steinbach c...@apache.org wrote:

 The Apache Hive PMC has voted to make Jimmy Xiang, Matt McCline, and
 Sergio Pena committers on the Apache Hive Project.



 Please join me in congratulating Jimmy, Matt, and Sergio.



 Thanks.



 - Carl







Re: Hive on Spark

2015-03-16 Thread Jimmy Xiang
One more thing: given the java.lang.NoSuchFieldError:
SPARK_RPC_CLIENT_CONNECT_TIMEOUT, are your jar files consistent?

On Mon, Mar 16, 2015 at 6:47 AM, Xuefu Zhang xzh...@cloudera.com wrote:

 It seems that your remote driver failed to start. I suggest #1: try
 spark.master=local first; #2: check spark.log to find out why the remote
 driver fails.

 --Xuefu

 On Sun, Mar 15, 2015 at 10:17 PM, Amith sha amithsh...@gmail.com wrote:

 Hi,

 I have already added the spark-assembly jar to the Hive lib; here is my Hive
 log:


 2015-03-16 10:40:08,299 INFO  [main]: SessionState
 (SessionState.java:printInfo(852)) - Added

 [/opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar]
 to class path
 2015-03-16 10:40:08,300 INFO  [main]: SessionState
 (SessionState.java:printInfo(852)) - Added resources:

 [/opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar]
 2015-03-16 10:40:36,914 INFO  [main]: log.PerfLogger
 (PerfLogger.java:PerfLogBegin(121)) - PERFLOG method=Driver.run
 from=org.apache.hadoop.hive.ql.Driver
 2015-03-16 10:40:36,915 INFO  [main]: log.PerfLogger
 (PerfLogger.java:PerfLogBegin(121)) - PERFLOG method=TimeToSubmit
 from=org.apache.hadoop.hive.ql.Driver
 2015-03-16 10:40:36,915 INFO  [main]: log.PerfLogger
 (PerfLogger.java:PerfLogBegin(121)) - PERFLOG method=compile
 from=org.apache.hadoop.hive.ql.Driver
 2015-03-16 10:40:36,916 INFO  [main]: log.PerfLogger
 (PerfLogger.java:PerfLogBegin(121)) - PERFLOG method=parse
 from=org.apache.hadoop.hive.ql.Driver
 2015-03-16 10:40:36,916 INFO  [main]: parse.ParseDriver
 (ParseDriver.java:parse(185)) - Parsing command: insert into table
 test values(5,8900)
 2015-03-16 10:40:36,917 INFO  [main]: parse.ParseDriver
 (ParseDriver.java:parse(206)) - Parse Completed
 2015-03-16 10:40:36,925 INFO  [main]: log.PerfLogger
 (PerfLogger.java:PerfLogEnd(148)) - /PERFLOG method=parse
 start=1426482636916 end=1426482636925 duration=9
 from=org.apache.hadoop.hive.ql.Driver
 2015-03-16 10:40:36,929 INFO  [main]: log.PerfLogger
 (PerfLogger.java:PerfLogBegin(121)) - PERFLOG method=semanticAnalyze
 from=org.apache.hadoop.hive.ql.Driver
 2015-03-16 10:40:37,034 INFO  [main]: parse.CalcitePlanner
 (SemanticAnalyzer.java:analyzeInternal(10146)) - Starting Semantic
 Analysis
 2015-03-16 10:40:37,212 INFO  [main]: parse.CalcitePlanner
 (SemanticAnalyzer.java:genResolvedParseTree(10129)) - Completed phase
 1 of Semantic Analysis
 2015-03-16 10:40:37,212 INFO  [main]: parse.CalcitePlanner
 (SemanticAnalyzer.java:getMetaData(1434)) - Get metadata for source
 tables
 2015-03-16 10:40:37,213 INFO  [main]: parse.CalcitePlanner
 (SemanticAnalyzer.java:getMetaData(1582)) - Get metadata for
 subqueries
 2015-03-16 10:40:37,213 INFO  [main]: parse.CalcitePlanner
 (SemanticAnalyzer.java:getMetaData(1606)) - Get metadata for
 destination tables
 2015-03-16 10:40:37,214 INFO  [pool-3-thread-2]:
 metastore.HiveMetaStore (HiveMetaStore.java:logInfo(732)) - 2:
 source:10.10.10.25 get_table : db=test tbl=test
 2015-03-16 10:40:37,214 INFO  [pool-3-thread-2]: HiveMetaStore.audit
 (HiveMetaStore.java:logAuditEvent(358)) - ugi=hadoop2
 ip=10.10.10.25cmd=source:10.10.10.25 get_table : db=test tbl=test
 2015-03-16 10:40:37,316 INFO  [main]: parse.CalcitePlanner
 (SemanticAnalyzer.java:genResolvedParseTree(10133)) - Completed
 getting MetaData in Semantic Analysis
 2015-03-16 10:40:37,318 INFO  [main]: parse.BaseSemanticAnalyzer
 (CalcitePlanner.java:canHandleAstForCbo(349)) - Not invoking CBO
 because the statement has too few joins
 2015-03-16 10:40:37,320 INFO  [main]: common.FileUtils
 (FileUtils.java:mkdir(501)) - Creating directory if it doesn't exist:

 hdfs://nn01:9000/user/hive/warehouse/test.db/test/.hive-staging_hive_2015-03-16_10-40-36_915_4571608652542611567-1
 2015-03-16 10:40:37,429 INFO  [main]: parse.CalcitePlanner
 (SemanticAnalyzer.java:genFileSinkPlan(6474)) - Set stats collection
 dir :
 hdfs://nn01:9000/user/hive/warehouse/test.db/test/.hive-staging_hive_2015-03-16_10-40-36_915_4571608652542611567-1/-ext-10001
 2015-03-16 10:40:37,450 INFO  [main]: ppd.OpProcFactory
 (OpProcFactory.java:process(657)) - Processing for FS(3)
 2015-03-16 10:40:37,455 INFO  [main]: ppd.OpProcFactory
 (OpProcFactory.java:process(657)) - Processing for SEL(2)
 2015-03-16 10:40:37,455 INFO  [main]: ppd.OpProcFactory
 (OpProcFactory.java:process(657)) - Processing for SEL(1)
 2015-03-16 10:40:37,455 INFO  [main]: ppd.OpProcFactory
 (OpProcFactory.java:process(384)) - Processing for TS(0)
 2015-03-16 10:40:37,507 INFO  [main]: log.PerfLogger
 (PerfLogger.java:PerfLogBegin(121)) - PERFLOG
 method=partition-retrieving
 from=org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner
 2015-03-16 10:40:37,510 INFO  [main]: log.PerfLogger
 (PerfLogger.java:PerfLogEnd(148)) - /PERFLOG
 method=partition-retrieving start=1426482637507 end=1426482637510
 duration=3 from=org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner
 2015-03-16 10:40:37,583 INFO  [main]: 

Re: [ANNOUNCE] New Hive PMC Member - Sergey Shelukhin

2015-02-26 Thread Jimmy Xiang
Congrats Sergey!

On Thu, Feb 26, 2015 at 9:22 AM, Nick Dimiduk ndimi...@gmail.com wrote:

 Nice work Sergey!


 On Wednesday, February 25, 2015, Carl Steinbach c...@apache.org wrote:

 I am pleased to announce that Sergey Shelukhin has been elected to the
 Hive Project Management Committee. Please join me in congratulating Sergey!

 Thanks.

 - Carl




Re: [ANNOUNCE] New Hive Committers -- Chao Sun, Chengxiang Li, and Rui Li

2015-02-09 Thread Jimmy Xiang
Congrats!!

On Mon, Feb 9, 2015 at 12:36 PM, Alexander Pivovarov apivova...@gmail.com
wrote:

 Congrats!

 On Mon, Feb 9, 2015 at 12:31 PM, Carl Steinbach c...@apache.org wrote:

 The Apache Hive PMC has voted to make Chao Sun, Chengxiang Li, and Rui Li
 committers on the Apache Hive Project.

 Please join me in congratulating Chao, Chengxiang, and Rui!

 Thanks.

 - Carl





Re: [ANNOUNCE] New Hive PMC Members - Szehon Ho, Vikram Dixit, Jason Dere, Owen O'Malley and Prasanth Jayachandran

2015-01-28 Thread Jimmy Xiang
Congrats!!

On Wed, Jan 28, 2015 at 7:20 PM, Devopam Mittra devo...@gmail.com wrote:
 +1

 Congratulations !!

 warm regards
 Devopam


 On Thu, Jan 29, 2015 at 2:45 AM, Carl Steinbach c...@apache.org wrote:

 I am pleased to announce that Szehon Ho, Vikram Dixit, Jason Dere, Owen
 O'Malley and Prasanth Jayachandran have been elected to the Hive Project
 Management Committee. Please join me in congratulating the these new PMC
 members!

 Thanks.

 - Carl




 --
 Devopam Mittra
 Life and Relations are not binary