Re: DateFunction

2017-01-16 Thread Jitendra Yadav
Ref: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF#LanguageManualUDF-DateFunctions

int month(string date)

Returns the month part of a date or a timestamp string: month("1970-11-01 00:00:00") = 11, month("1970-11-01") = 11.

Does it fit your requirement?

Thanks

On Mon, Jan 16, 2017 at 12:21 PM, Mahender Sarangam <mahender.bigd...@outlook.com> wrote:

> Hi,
>
> Is there any date function that returns the full month name for a given
> timestamp?
>
>


Re: Hive compilation error

2015-10-01 Thread Jitendra Yadav
Try with the options below — the missing org.apache.hadoop.io packages suggest the build was run without a Hadoop profile, so the Hadoop dependencies are not on the classpath:

mvn clean install -DskipTests -Pdist,hadoop-1   (for Hadoop 1 / MR1)

or

mvn clean install -DskipTests -Pdist,hadoop-2   (for Hadoop 2 / YARN)


Thx




On Thu, Oct 1, 2015 at 12:14 AM, Giannis Giannakopoulos <gg...@cslab.ece.ntua.gr> wrote:

>
> Hi all,
>
> I am trying to compile hive from source, but I get the following error:
> [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-storage-api: Compilation failure: Compilation failure:
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizedRowBatch.java:[24,28] package org.apache.hadoop.io does not exist
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizedRowBatch.java:[25,28] package org.apache.hadoop.io does not exist
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizedRowBatch.java:[34,44] cannot find symbol
> [ERROR] symbol: class Writable
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDecimalWritable.java:[25,34] package org.apache.commons.logging does not exist
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDecimalWritable.java:[26,34] package org.apache.commons.logging does not exist
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDecimalWritable.java:[29,28] package org.apache.hadoop.io does not exist
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDecimalWritable.java:[30,28] package org.apache.hadoop.io does not exist
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDecimalWritable.java:[32,45] cannot find symbol
> [ERROR] symbol: class WritableComparable
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDecimalWritable.java:[34,24] cannot find symbol
> [ERROR] symbol:   class Log
> [ERROR] location: class org.apache.hadoop.hive.serde2.io.HiveDecimalWritable
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/io/sarg/SearchArgumentImpl.java:[30,39] package org.apache.commons.codec.binary does not exist
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/io/sarg/SearchArgumentImpl.java:[31,34] package org.apache.commons.logging does not exist
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/io/sarg/SearchArgumentImpl.java:[32,34] package org.apache.commons.logging does not exist
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/io/sarg/SearchArgumentImpl.java:[38,23] cannot find symbol
> [ERROR] symbol:   class Log
> [ERROR] location: class org.apache.hadoop.hive.ql.io.sarg.SearchArgumentImpl
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizedRowBatch.java:[108,34] cannot find symbol
> [ERROR] symbol:   class NullWritable
> [ERROR] location: class org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizedRowBatch.java:[157,3] method does not override or implement a method from a supertype
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizedRowBatch.java:[162,3] method does not override or implement a method from a supertype
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDecimalWritable.java:[34,34] cannot find symbol
> [ERROR] symbol:   variable LogFactory
> [ERROR] location: class org.apache.hadoop.hive.serde2.io.HiveDecimalWritable
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDecimalWritable.java:[96,3] method does not override or implement a method from a supertype
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDecimalWritable.java:[98,13] cannot find symbol
> [ERROR] symbol:   variable WritableUtils
> [ERROR] location: class org.apache.hadoop.hive.serde2.io.HiveDecimalWritable
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDecimalWritable.java:[99,24] cannot find symbol
> [ERROR] symbol:   variable WritableUtils
> [ERROR] location: class org.apache.hadoop.hive.serde2.io.HiveDecimalWritable
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDecimalWritable.java:[106,3] method does not override or implement a method from a supertype
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDecimalWritable.java:[108,5] cannot find symbol
> [ERROR] symbol:   variable WritableUtils
> [ERROR] location: class org.apache.hadoop.hive.serde2.io.HiveDecimalWritable
> [ERROR] /tmp/hive/storage-api/src/java/org/apache/hadoop/hive/serde2/io/HiveDecimalWritable.java:[109,5] cannot

Re: skewjoin problem

2015-05-11 Thread Jitendra Yadav
Maybe one reducer is overloaded because of skewed GROUP BY keys. If you are using GROUP BY, try the property below and check whether the reducer data gets distributed more evenly:

set hive.groupby.skewindata=true;
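For example, a minimal sketch (the table and column names are made up):

set hive.groupby.skewindata=true;
-- With this set, the GROUP BY runs as two jobs: the first spreads the skewed
-- keys randomly across reducers for partial aggregation, and the second
-- merges those partial results into the final aggregates.
SELECT user_id, COUNT(*) FROM clicks GROUP BY user_id;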

Thanks
Jitendra

On Mon, May 11, 2015 at 12:35 PM, r7raul1...@163.com wrote:

 Status: Running (Executing on YARN cluster with App id application_1419300485749_1493279)

 VERTICES      STATUS     TOTAL  COMPLETED  RUNNING  PENDING  FAILED  KILLED
 --------------------------------------------------------------------------
 Map 1 ....... SUCCEEDED    200        200        0        0       0       0
 Map 4 ....... SUCCEEDED      3          3        0        0       0       0
 Map 5 ....... SUCCEEDED    152        152        0        0       0       0
 Reducer 2 ... RUNNING       20         19        1        0       0       0
 Reducer 3     RUNNING       23          0       23        0       0       0
 --------------------------------------------------------------------------
 VERTICES: 03/05  93%  ELAPSED TIME: 791.14 s


 One reducer runs for a long time.

 I tried:

 set hive.exec.reducers.bytes.per.reducer = 40;
 set hive.skewjoin.key = 10;
 set hive.optimize.skewjoin = true;

 but nothing helped; only the number of reducers decreased.


 --
 r7raul1...@163.com



Re: Hive with SSL metastore

2013-10-29 Thread Jitendra Yadav
I haven't tried it, but I think you need to create a truststore/keystore file from your existing certificates:



keytool -import -alias alias -file path-to-certificate-file -keystore truststore-file



And then you need to provide the truststore location in the JDBC connection string:

jdbc:driver://host:port/database;ssl=true;sslTrustStore=path-to-truststore;sslTrustStorePassword=password
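
For instance, a rough sketch with made-up alias, paths, and password; it assumes the client JVM honors the standard JSSE system properties:

# import the server certificate into a new truststore
keytool -import -alias pgserver -file /path/to/server.crt -keystore /path/to/truststore.jks -storepass changeit

# point the Hive JVM at that truststore
export HADOOP_OPTS="$HADOOP_OPTS -Djavax.net.ssl.trustStore=/path/to/truststore.jks -Djavax.net.ssl.trustStorePassword=changeit"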


Hope this will help you.

Thanks
On Tue, Oct 29, 2013 at 1:40 PM, Kanwaljit Singh batman.n...@gmail.com wrote:

 Hi,

 In my deployment I have an SSL-connection-only PostgreSQL server.

 I tried giving the Postgres IP with SSL=true, but Hive is unable to find
 the certificates. I have no clue where Hive or PG's JDBC client driver looks
 for certificates. Any help in this regard would be appreciated!




Re: Hive + mongoDB

2013-09-11 Thread Jitendra Yadav
Hi,

1. You may use the Hadoop-MongoDB connector and write a MapReduce program
to move your data from MongoDB into Hive:

https://github.com/mongodb/mongo-hadoop

2. As an alternative, you can use the Pig/MongoDB combination: load the
data from MongoDB through Pig, and then create a Hive table that points to
the Pig output file on HDFS (see the sketch below):

https://github.com/mongodb/mongo-hadoop/blob/master/pig/README.md
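
For option 2, a minimal sketch of the last step (the schema, delimiter, and HDFS path are made up):

CREATE EXTERNAL TABLE mongo_import (id STRING, payload STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/user/hive/pig-output';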

Regards
Jitendra
On 9/11/13, Jérôme Verdier verdier.jerom...@gmail.com wrote:
 Hi,

 You can use Talend to import data from MongoDB to Hive.

 More information here: http://www.talend.com/products/big-data


 2013/9/11 Sandeep Nemuri nhsande...@gmail.com

 Hi everyone,
 I am trying to import data from MongoDB to Hive. I got some jar files to
 connect Mongo and Hive. Now, how do I import the data from MongoDB to Hive?

 Thanks in advance.

 --
 --Regards
   Sandeep Nemuri




 --
 *Jérôme VERDIER*
 06.72.19.17.31
 verdier.jerom...@gmail.com



Re: error executing the tutorial - https://cwiki.apache.org/confluence/display/Hive/HiveAws+HivingS3nRemotely

2013-09-05 Thread Jitendra Yadav
Hi,

Did you configure the fs.s3n.awsAccessKeyId and fs.s3n.awsSecretAccessKey
properties in your core-site.xml file?
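
For example, in core-site.xml (placeholder values; the property names are the ones quoted in your error message):

<property>
  <name>fs.s3n.awsAccessKeyId</name>
  <value>YOUR_AWS_ACCESS_KEY_ID</value>
</property>
<property>
  <name>fs.s3n.awsSecretAccessKey</name>
  <value>YOUR_AWS_SECRET_ACCESS_KEY</value>
</property>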

Regards
Jitendra
On Thu, Sep 5, 2013 at 3:52 PM, shouvanik.hal...@accenture.com wrote:

 Hi,

 I am executing the example, but hit an initial roadblock while trying out
 the “Setting up tables (DDL Statements)” section.

 When I executed the command “hive> create external table kv (key int,
 values string) location 's3n://data.s3ndemo.hive/kv';”

 I got the error:

 FAILED: Error in metadata: MetaException(message:java.lang.IllegalArgumentException: AWS Access Key ID and Secret Access Key must be specified as the username or password (respectively) of a s3n URL, or by setting the fs.s3n.awsAccessKeyId or fs.s3n.awsSecretAccessKey properties (respectively).)

 FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

 Please help.

 Thanks,
 Shouvanik
