You need to use "LOAD DATA LOCAL INPATH".
From: Vineet Mishra [mailto:clearmido...@gmail.com]
Sent: Tuesday, October 20, 2015 6:08 PM
To: user@hive.apache.org; cdh-u...@cloudera.org
Subject: HiveServer2 load data inpath fails
Hi All,
I am trying to run load data inpath to
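As a quick sketch of the difference (table and paths here are hypothetical): LOAD DATA INPATH moves a file that already sits in HDFS, so the user HiveServer2 runs as needs access to that HDFS path, while LOAD DATA LOCAL INPATH copies a file from the client machine's local filesystem:

```shell
# File already in HDFS -- moved into the table's warehouse directory;
# the user running HiveServer2 must be able to read/write this path:
hive -e "LOAD DATA INPATH '/user/hive/staging/sales.csv' INTO TABLE sales"

# File on the local filesystem of the machine running the Hive client:
hive -e "LOAD DATA LOCAL INPATH '/tmp/sales.csv' INTO TABLE sales"
```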
Hi folks,
I have a few questions:
1. How do I format the output from reduce? The default separator is a tab; can we make it a comma?
2. How do I write output into different directories according to the reducer values?
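For question 1, the key/value separator written by TextOutputFormat is configurable; a hedged Hadoop-streaming sketch (the mapper/reducer scripts and paths are hypothetical):

```shell
# Old-API (Hadoop 0.20-era) property name; emits key,value instead of key<TAB>value
hadoop jar $HADOOP_HOME/contrib/streaming/hadoop-streaming-*.jar \
  -D mapred.textoutputformat.separator=, \
  -input /data/in -output /data/out \
  -mapper mapper.py -reducer reducer.py
```

For question 2, the Java API has org.apache.hadoop.mapred.lib.MultipleOutputs, which lets a reducer write records to differently named outputs depending on the values it sees.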
Thanks in advance
Check whether your JobTracker and TaskTrackers are running ...
On Mon, Jan 2, 2012 at 7:23 PM, wd w...@wdicc.com wrote:
Because 'select *' does not run a map reduce job; maybe you should
check whether your hadoop cluster is working.
On Mon, Jan 2, 2012 at 10:37 AM, Aditya Kumar adityakumar...@yahoo.com
--
With Regards
Vikas Srivastava
DWH Analytics Team
Mob:+91 9560885900
One97 | Let's get talking !
to read or write a particular region; sometimes it takes a few seconds to a few minutes. Is there
any way we can avoid or improve this situation?
Hey,
There is no need for Ant and the rest; you can install directly from the tar.gz.
Here is the full documentation:
https://cwiki.apache.org/confluence/display/Hive/GettingStarted#GettingStarted-InstallationandConfiguration
Regards
Vikas Srivastava
On Fri, Nov 11, 2011 at 5:59 AM, Vandana
Hey Aditya!
Column values are case sensitive, so you have to use the exact value:
select * from table where col_name = 'EXACT_VALUE'
regards
Vikas Srivastava
On Tue, Nov 8, 2011 at 5:17 PM, Aditya Singh30
aditya_sing...@infosys.com wrote:
Hi,
I have set up a two-node hadoop
Hey Ashu/Anh,
Is it true that Hive 0.8 supports inserting and appending data into a table?
With Regards
Vikas Srivastava
DWH Analytics Team
M: +91 9560885900
P: + 91 120 4770102
Email: vikas.srivast...@one97.net
W: www.one97world.com
One97 | Let's get talking !
From: Ashutosh Chauhan
Hey,
I am new to Mahout; can you guys give me some idea about Mahout, or point me to any PDF on
it?
From: trang van
Stage-1 map = 100%, reduce = 100%
Ended Job = job_201110111849_0024 with errors
FAILED: Execution Error, return code 2 from
org.apache.hadoop.hive.ql.exec.MapRedTask
Thank you,
Mark
Hi,
Did you change to the new host name in /etc/hosts on all the datanodes and on
the Hive server?
From: Steven Wong
ql.Driver (SessionState.java:printError(365))
- FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask
Please help; I am stuck here.
chen0727
Mobile: 886-937545215
Tel: 886-2-8798-2988 #222
Fax: 886-2-8751-5499
-Original Message-
From: vikas srivastava [mailto:vikas.srivast...@one97.net]
Sent: Tuesday, October 11, 2011 3:29 PM
To: user@hive.apache.org
Subject: problem in hive
Hi All,
I am facing a problem like I
I have done that also; I put the mysql connector in that lib directory.
On Tue, Oct 11, 2011 at 5:39 PM, Ankit Jain ankitjainc...@gmail.com wrote:
Hello Vikas,
I think you have to put the MySQL driver in the lib dir of Hive.
Thanks,
Ankit
On Tue, Oct 11, 2011 at 5:18 PM, Vikas Srivastava
vikas.srivast
Yup!!!
From: Ankit Jain [mailto:ankitjainc...@gmail.com]
Sent: Tuesday, October 11, 2011 6:22 PM
To: user
)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Hey All,
I have some questions:
1. What is the maximum space of a datanode in a Hadoop cluster?
2. What is the best RAID configuration for Hadoop (HDFS)?
3. What is the minimum size of a Hadoop cluster for good performance?
Please help.
Hey Folks,
I have configured an 8 TB datanode in my Hadoop cluster, but the DFS web UI shows
0.25 TB instead of 8 TB.
The rest of the datanodes are 2 TB; this one is 8 TB.
Please suggest why it is showing less capacity. Is it configurable?
data into hadoop.
Please provide your valuable suggestions.
Any help would be appreciated.
)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
Please suggest; any help would be appreciated!
.. this is all configured with 16 GB RAM.
Regards
Vikas Srivastava
On Tue, Sep 13, 2011 at 11:20 PM, Ayon Sinha ayonsi...@yahoo.com wrote:
What you can do for each node:
1. decommission node (or 2 nodes if you want to do this faster). You can do
this with the excludes file.
2. Wait for blocks to be moved off
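The excludes-file step above can be sketched like this (the excludes-file path and hostname are hypothetical; it assumes dfs.hosts.exclude already points at that file in the namenode's hdfs-site.xml):

```shell
# Add the node to the excludes file read by the namenode:
echo "datanode7.example.com" >> /etc/hadoop/conf/dfs.exclude
# Make the namenode re-read the file and start moving blocks off the node:
hadoop dfsadmin -refreshNodes
# Watch the node go to "Decommission in progress", then "Decommissioned":
hadoop dfsadmin -report
```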
Thanks Ayon!
I'll try that and then let you know how it works...
regards
Vikas Srivastava
On Wed, Sep 14, 2011 at 9:33 PM, Ayon Sinha ayonsi...@yahoo.com wrote:
Hi Vikas,
The imbalance does create imbalance in MR but with your configuration it
may not be a big issue. Basically the balancer
Hi,
Can anyone tell me how we can migrate Hadoop data when replacing old hard disks with new,
larger HDDs?
I need to replace the old 300 GB HDDs with 1 TB ones, so how can I do this
efficiently?
The problem is migrating the data from one HDD to the other.
was if the data in path $path can
itself be a script that feeds the data in streaming fashion? Something like
load data using script 'loader.py' into table foo.
On 2011/09/12, at 15:36, Vikas Srivastava wrote:
hive -e load data local inpath '$path' into table $table
partition(date='$date
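There is no "load data using script" in Hive, but as a sketch (script name and paths hypothetical), a generator script can stream straight into HDFS, and the staged file can then be attached to the table:

```shell
# "-" makes fs -put read from stdin:
./loader.py | hadoop fs -put - /tmp/staged_data
hive -e "LOAD DATA INPATH '/tmp/staged_data' INTO TABLE foo"
```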
-- Forwarded message --
From: Vikas Srivastava vikas.srivast...@one97.net
Date: Fri, Aug 26, 2011 at 6:21 PM
Subject: Need help in hive
To: user@hive.apache.org
Hey folks,
I am getting this error while running a simple query,
like desc table.
I am using Hive 0.7 and Hadoop
Need help on this.
-- Forwarded message --
From: Vikas Srivastava vikas.srivast...@one97.net
Date: Thu, Aug 25, 2011 at 6:37 PM
Subject: Re: Problem in hive
To: user@hive.apache.org
Hey Ashutosh!
i have given full permission to hadoop user on new server(A) with user name
)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
2011-08-26 17:41:59,229 ERROR ql.Driver (SessionState.java:printError(351))
- FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask
Hey Ashutosh,
Thanks for the reply. The output of that is:
Failed with exception null
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
regards
Vikas Srivastava
On Thu, Aug 25, 2011 at 4:52 AM, Ashutosh Chauhan hashut...@apache.org wrote:
Vikas,
Looks
Hey Ashutosh!
I have given full permission to the hadoop user on the new server (A), with user name
and password.
It can only read the tables made by this server (A) and desc them;
from the other server (B) we are not able to read the tables created by this
server (A).
regards
Vikas Srivastava
On Thu, Aug 25
-- Forwarded message --
From: Vikas Srivastava vikas.srivast...@one97.net
Date: Tue, Aug 23, 2011 at 7:26 PM
Subject: Problem in hive
To: user@hive.apache.org
Hi team,
I am facing this problem: show tables runs fine, but when I run the query below, it fails.
hive> select * from
sequence of queries you have executed.
When I checked the trunk code, this exception comes when getCols()
returns null. Check whether your metadata is in a good state.
Thanks
Chinna Rao Lalam
-- Forwarded message --
From: *Vikas Srivastava* vikas.srivast...@one97.net
(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
. So I expect Hive to kick
off 3 map tasks, one on each task node. What can make Hive run only one map
task? Do I need to set something to kick off multiple map tasks? In my
config, I didn't change the Hive config.
||0\N0
Actually, the problem is that Hive reads a single '|' as the field separator, due to
which 2 columns get divided into 3.
Does anybody have a solution for that?
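The splitting behaviour can be reproduced from the shell, since awk's single-character field separator behaves like Hive's here; one workaround (a sketch, assuming the two-character '||' delimiter never appears inside a field) is to collapse it before loading:

```shell
# A row using '||' as its delimiter has 3 fields in Hive's eyes, not 2:
echo 'A||B' | awk -F'|' '{print NF}'   # prints 3
# Collapse the two-character delimiter to a single '|' before loading:
echo 'A||B' | sed 's/||/|/g'           # prints A|B
```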
Hey Sid,
Thanks! But I can't just parse the file; I actually have 3 TB of data in that format, so I need to
find another solution. One more thing: it would take too much time to parse it.
regards
Vikas Srivastava
On Fri, Aug 19, 2011 at 6:41 PM, Siddharth Tiwari siddharth.tiw...@live.com
wrote:
You
Hey All,
Please tell me where to enter the datanode IPs in CDH3u2. I installed
all the components on the namenode and datanodes, but I am confused about where to put
the datanode IPs on the namenode so that they get connected.
Hi,
How do I create a read-only user in Hive, and what are the steps to be taken?
regards
vikas Srivastava
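Hive 0.7 added basic authorization, so one hedged sketch (table and user names are hypothetical; assumes hive.security.authorization.enabled=true in hive-site.xml) is to grant only SELECT:

```shell
hive -e "GRANT SELECT ON TABLE some_table TO USER report_user"
```

Note that the HDFS permissions on the warehouse directory also need to restrict writes, since this era's Hive authorization does not enforce anything at the filesystem level.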
Hey,
Is anyone using Google Snappy? I tried it but didn't get it to work.
If anyone is using it, please tell me the procedure.
Hey,
Can anyone tell me how to use or install the patches given in JIRA for Hadoop or
Hive?
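As a toy demonstration of how a JIRA .patch attachment is applied (file names here are made up; for a real fix you would download the attachment from the issue, run patch at the source-tree root, then rebuild with ant):

```shell
# Create a file and a JIRA-style unified diff against it:
printf 'old line\n' > Example.java
cat > fix.patch <<'EOF'
--- Example.java
+++ Example.java
@@ -1 +1 @@
-old line
+new line
EOF
# Apply it; use -p1 instead if the paths in the patch start with a/ and b/:
patch -p0 < fix.patch
cat Example.java   # now contains: new line
```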
Hey,
I just want to use some compression in Hadoop, and I heard LZO is the best
among the compression codecs (after Snappy).
Can anyone who is already using any kind of compression on Hadoop
0.20.2 please share how?
/ayonsinha/
Also check out my blog for answers to commonly asked questions:
http://dailyadvisor.blogspot.com
: Connection refused)'
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.MapRedTask
Actually, I added the namenode into hive-site.xml at fs.default.name, but now I am
facing the above error.
Please advise!
regards
vikas srivastava
On Thu, Jul 21, 2011 at 11:29 AM, Guy Doulberg
the ping from the name-node that is the issue here; you
should run a ping command from each data-node to all the data-nodes and the name-node.
Thanks,
Viral
On Tue, Jul 19, 2011 at 6:50 AM, Edward Capriolo edlinuxg...@gmail.com wrote:
On Tue, Jul 19, 2011 at 9:46 AM, Vikas Srivastava
vikas.srivast...@one97.net wrote
,
expected: hdfs://hadoopnametes:9000
For your fs.default.name config, avoid putting in an IP, and place a
hostname instead.
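In core-site.xml terms, that advice looks like this (hostname taken from the error above):

```xml
<!-- core-site.xml: use a resolvable hostname, not a raw IP -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://hadoopnametes:9000</value>
</property>
```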
On Wed, Jul 20, 2011 at 2:30 PM, Vikas Srivastava
vikas.srivast...@one97.net wrote:
Hi Team,
I am facing a problem; please help me out. It's a testing server
showing
On Tue, Jul 19, 2011 at 6:29 PM, Vikas Srivastava
vikas.srivast...@one97.net wrote:
Hi Team,
We are using 1 namenode with 11 datanodes, each with 16 GB RAM and 1.4 TB HDD.
I am getting this error while running any query; simply put, nothing works
whenever map tasks are used.
And we are using
into that..
Regards
Vikas Srivastava
9560885900
On Tue, Jul 19, 2011 at 7:03 PM, Edward Capriolo edlinuxg...@gmail.com wrote:
It must be a hostname or DNS problem. Use dig and ping to find out what is
wrong.
On Tue, Jul 19, 2011 at 9:05 AM, Vikas Srivastava
vikas.srivast...@one97.net wrote:
On Tue