Hi,
I am executing a MapReduce program with HCatalog and a Hive database. Even
though the jars are included, it shows this error:
Exception in thread main java.io.IOException:
com.google.common.util.concurrent.UncheckedExecutionException:
javax.jdo.JDOFatalUserException: Class
Try including hive-site.xml and yarn-site.xml in your Eclipse classpath, and
set the YARN application classpath in yarn-site.xml.
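A minimal sketch of the yarn-site.xml property in question; the exact
classpath entries depend on your Hadoop installation layout, so treat the
values below as placeholders rather than a definitive list:

```xml
<!-- yarn-site.xml: sketch only; adjust paths to your installation -->
<property>
  <name>yarn.application.classpath</name>
  <value>
    $HADOOP_CONF_DIR,
    $HADOOP_COMMON_HOME/share/hadoop/common/*,
    $HADOOP_COMMON_HOME/share/hadoop/common/lib/*,
    $HADOOP_HDFS_HOME/share/hadoop/hdfs/*,
    $HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*,
    $HADOOP_YARN_HOME/share/hadoop/yarn/*,
    $HADOOP_YARN_HOME/share/hadoop/yarn/lib/*
  </value>
</property>
```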
Thanks
Karthik
On Sep 23, 2014, at 2:06, Poorvi Ahirwal poorvi.ahir...@gmail.com wrote:
Sorry, I forgot to mention: I am not using Eclipse.
On Tue, Sep 23, 2014 at 12:14 AM, Karthiksrivasthava
karthiksrivasth...@gmail.com wrote:
Hello, I used
https://github.com/apache/hive/blob/trunk/service/if/TCLIService.thrift to
generate a Go SDK, but it didn't work. The Go SDK can only open a session. I
compared my code with the Python SDK and didn't find any incorrect usage.
Below is my Go code.
socket, err :=
Hi Hive users:
Does anyone know how to convert a LazyLong to a Long in a Hive generic UDF?
Thanks
Dan
Comment out the JDO properties in the mapred and hdfs config files.
On Sep 23, 2014 4:42 PM, Poorvi Ahirwal poorvi.ahir...@gmail.com wrote:
Are you talking about the MapReduce program or some property file?
On Tue, Sep 23, 2014 at 4:14 AM, hadoop hive hadooph...@gmail.com wrote:
Thanks Xuefu. I was hoping I wouldn't have to do that. We create this file by
serializing a Java array, so I was hoping it would be a common case to handle
arrays with square brackets.
Thanks,
Ankita
On Mon, Sep 22, 2014 at 7:55 PM, Xuefu Zhang xzh...@cloudera.com wrote:
Hive doesn't know it
Shushant,
Creating a patched jar that would include the lock functionality you
want is unlikely to work. Wouldn't the following workflow work for you:
1. Writer locks the table explicitly via LOCK TABLE
2. Writer inserts
3. Writer unlocks the table explicitly via UNLOCK TABLE
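The three steps above might look like this in HiveQL (table names are
hypothetical, and this assumes concurrency support is enabled and a lock
manager is configured):

```sql
-- Explicit table-level locking; requires hive.support.concurrency=true
-- and a configured lock manager. Table names are hypothetical.
LOCK TABLE target_table EXCLUSIVE;
INSERT OVERWRITE TABLE target_table SELECT * FROM staging_table;
UNLOCK TABLE target_table;
```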
If you're
What version of Hive are you using?
Did you explicitly create the tables in the metastore via the Oracle
script or depend on DataNucleus to do it for you?
Alan.
Rahul Channe (drah...@googlemail.com) wrote on September 22, 2014 at 10:47:
Hi All,
I am using oracle as hive metastore. I could see
Hi Alan,
I am using version 0.12.0, I did not create tables explicitly.
Temporarily, I modified the hive-metastore jar to change the LONG data type to
CLOB, and it worked.
Not sure if it's a Hive bug.
On Tuesday, September 23, 2014, Alan Gates ga...@hortonworks.com wrote:
Hello,
We are currently experiencing a severe reproducible hiveserver2 crash when
using the RJDBC connector in RStudio (please refer to the description below for
the detailed test case). We have a hard time pinpointing the source of the
problem and we are wondering whether this is a known
Ritesh thanks for your response.
Where do I download and place the jars?
Do you mean on the hive server itself? I believe the files are already
there since I can query the same table via command line.
It feels like the SerDe is not being sent along with the query, or I need
to get the jar sent
Can you share the HiveServer2 heap size and your table size?
On Tue, Sep 23, 2014 at 11:31 PM, Shiang Luong shiang.lu...@openx.com
wrote:
What does your GenericUDF look like?
What version of Hive? Does the query work without the UDF?
On Sep 22, 2014, at 3:28 PM, Dan Fan d...@appnexus.com wrote:
Dear Hive users:
Quick question about converting a Hive LongWritable to a long.
I have a generic UDF called protected_column, which works
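For what it's worth, a common pattern for this kind of conversion (a minimal
sketch, assuming hive-exec is on the classpath; the class name mirrors the
UDF mentioned above but the body is illustrative) is to keep the argument's
ObjectInspector from initialize() and let it unwrap the value in evaluate(),
rather than casting the raw object, which may be a LazyLong, to LongWritable
directly:

```java
import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.LongObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

// Sketch: the ObjectInspector knows the concrete runtime representation
// (LazyLong, LongWritable, Long, ...) and extracts the primitive for us.
public class ProtectedColumnUDF extends GenericUDF {
  private LongObjectInspector longOI;

  @Override
  public ObjectInspector initialize(ObjectInspector[] args)
      throws UDFArgumentException {
    // Keep the inspector Hive hands us for the first argument
    longOI = (LongObjectInspector) args[0];
    return PrimitiveObjectInspectorFactory.javaLongObjectInspector;
  }

  @Override
  public Object evaluate(DeferredObject[] args) throws HiveException {
    Object raw = args[0].get();   // may be a lazy or writable object
    if (raw == null) return null;
    long value = longOI.get(raw); // inspector unwraps it to a primitive
    return value;                 // autoboxed to Long
  }

  @Override
  public String getDisplayString(String[] children) {
    return "protected_column(" + children[0] + ")";
  }
}
```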
You can import the metastore db in oracle directly.
Inside oracle db
@/path/to/oracle-hive-0.12.0.sql
On Sep 23, 2014 9:08 PM, Rahul Channe drah...@googlemail.com wrote:
I am talking about the property file.
On Sep 23, 2014 5:18 PM, Poorvi Ahirwal poorvi.ahir...@gmail.com wrote:
Hi Alan
1. When the writer takes an exclusive lock, Hive won't allow anyone (even the
session which holds the lock) to write to the table. Do I need to pass the
lock handle to the read query, or am I missing something here?
2. Or do you mean to insert using the Hadoop filesystem, not using Hive?
On Tue, Sep 23, 2014 at
Hello,
I am trying to run a query on Hive from a reporting tool, but it fails with
a "File not found" exception. I am using HiveServer1.
From the /tmp/user/hive.log file, I can see that the Map reduce jobs
completed fine, but in the end of the file I see the following error. Any
pointers on what
Hi ,
My MapReduce program takes almost 10 minutes to finish the job after it
reaches map 100% reduce 100%.
Thanks
Karthik
Is it your custom job or one of the MapReduce example jobs?
How many mappers and reducers are running?
Check the ApplicationMaster container logs to see why the job is not finishing.
Thanks &amp; Regards
Rohith Sharma K S
Do you have any type of calculation in Driver?
On Wed, Sep 24, 2014 at 10:19 AM, Rohith Sharma K S
rohithsharm...@huawei.com wrote:
Is there any other method to use instead of
HCatInputFormat.setInput(job, db, table)? This one depends on many jars which
are not present in hadoop-1.2.1 and hive-0.13.1. Or is there any other way of
reading data from a Hive table with a MapReduce program?
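For reference, a minimal driver using that method might look like the sketch
below. This assumes the HCatalog jars (hive-hcatalog-core and its
dependencies) are shipped with the job, e.g. via -libjars; note also that the
package moved to org.apache.hive.hcatalog.mapreduce in Hive 0.13 (older
releases used org.apache.hcatalog.mapreduce), and the db/table names here are
hypothetical:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hive.hcatalog.mapreduce.HCatInputFormat;

// Sketch of a driver that reads a Hive table through HCatalog.
public class HCatReadDriver {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "read-from-hive-table");
    // Database and table names are hypothetical
    HCatInputFormat.setInput(job, "default", "my_table");
    job.setInputFormatClass(HCatInputFormat.class);
    // ... set mapper, output key/value classes, output format, then submit
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```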
Thanks
On Wed, Sep 24, 2014 at 12:14 AM,