Ref:
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF#LanguageManualUDF-DateFunctions
int month(string date)
Returns the month part of a date or a timestamp string: month("1970-11-01 00:00:00") = 11, month("1970-11-01") = 11.
Does that fit your requirement?
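As a quick sketch of how it is typically used in a query (the `orders` table and `order_date` column here are hypothetical):

```sql
-- month() accepts both date strings and timestamp strings
SELECT month('1970-11-01 00:00:00');   -- 11
SELECT month('1970-11-01');            -- 11

-- e.g. filtering a hypothetical table by month
SELECT * FROM orders WHERE month(order_date) = 11;
```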
Thanks
Try one of the options below:
mvn clean install -DskipTests -Pdist,hadoop-1   (for MR1)
or
mvn clean install -DskipTests -Pdist,hadoop-2   (for YARN / MR2)
Thx
On Thu, Oct 1, 2015 at 12:14 AM, Giannis Giannakopoulos <
gg...@cslab.ece.ntua.gr> wrote:
Maybe your single reducer is overloaded by the group-by keys. If you are using
GROUP BY, try the property below and see whether the reducer data gets distributed.
set hive.groupby.skewindata=true;
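A sketch of how this looks in practice, assuming a hypothetical table `sales` with a skewed `region` key:

```sql
set hive.groupby.skewindata=true;

-- With skewindata=true, Hive plans the aggregation as two MR jobs:
-- the first spreads map output randomly across reducers to break up
-- the skewed keys and pre-aggregates, the second does the final merge.
SELECT region, count(*)
FROM sales
GROUP BY region;
```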
Thanks
Jitendra
On Mon, May 11, 2015 at 12:35 PM, r7raul1...@163.com wrote:
Status: Running
I haven't tried it, but I think you need to create a trust store/key store
file from your existing certificates:
keytool -import -alias <alias> -file <path-to-certificate-file> -keystore <truststorefile>
Then you need to provide the TrustStore location in the JDBC connection
string.
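As a sketch, a HiveServer2 connection string with SSL enabled can look like the following; the host, port, truststore path, and password are all placeholders:

```
jdbc:hive2://myhost:10000/default;ssl=true;sslTrustStore=/path/to/truststore.jks;trustStorePassword=mypassword
```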
Hi,
1. You may use the Hadoop-MongoDB connector and write a MapReduce program
to move your data from MongoDB into Hive.
https://github.com/mongodb/mongo-hadoop
2. As an alternative, you can use the Pig + MongoDB combination: pull
the data from MongoDB through Pig, and then create a table on the output.
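A rough sketch of option 2, assuming the mongo-hadoop Pig jars are on the classpath; the database, collection, and HDFS path below are placeholders:

```
-- register the connector jars (exact names depend on your version)
REGISTER mongo-java-driver.jar;
REGISTER mongo-hadoop-core.jar;
REGISTER mongo-hadoop-pig.jar;

-- load documents from MongoDB via the connector's MongoLoader
docs = LOAD 'mongodb://localhost:27017/mydb.mycollection'
       USING com.mongodb.hadoop.pig.MongoLoader();

-- write tab-delimited output that a Hive table can then be created over
STORE docs INTO '/user/hive/warehouse/mycollection' USING PigStorage('\t');
```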
Hi,
Did you configure the fs.s3n.awsAccessKeyId and fs.s3n.awsSecretAccessKey
properties in your core-site.xml file?
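For reference, a sketch of the core-site.xml entries, with placeholder key values:

```xml
<!-- placeholder values; substitute your own AWS credentials -->
<property>
  <name>fs.s3n.awsAccessKeyId</name>
  <value>YOUR_ACCESS_KEY_ID</value>
</property>
<property>
  <name>fs.s3n.awsSecretAccessKey</name>
  <value>YOUR_SECRET_ACCESS_KEY</value>
</property>
```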
Regards
Jitendra
On Thu, Sep 5, 2013 at 3:52 PM, shouvanik.hal...@accenture.com wrote:
Hi,
I am executing the example, but hit an initial roadblock while trying out