Fwd: Hive with JDBC

2012-03-16 Thread hadoop hive
-- Forwarded message -- From: hadoop hive hadooph...@gmail.com Date: Fri, Mar 16, 2012 at 2:04 PM Subject: Hive with JDBC To: u...@hive.apache.org Hi folks, I'm facing a problem: when I fire a query through Java code, it returns around half a million records, which make
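A minimal sketch of the kind of client being described, assuming the HiveServer1-era JDBC driver and URL; the host, port, table, and query are placeholders, and the point is to stream the ResultSet rather than buffer half a million rows in memory:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveJdbcSketch {
        public static void main(String[] args) throws Exception {
            // HiveServer1-era driver class and URL; adjust host/port/database to your setup.
            Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
            Connection con = DriverManager.getConnection("jdbc:hive://localhost:10000/default", "", "");
            Statement stmt = con.createStatement();
            ResultSet rs = stmt.executeQuery("SELECT key, value FROM large_table");
            long rows = 0;
            while (rs.next()) {
                rows++;  // process each row as it arrives rather than collecting them all
            }
            System.out.println("Fetched " + rows + " rows");
            con.close();
        }
    }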

help for snappy

2012-02-26 Thread hadoop hive
Hey folks, I'm using Hadoop 0.20.2 + r911707. Please tell me how to install Snappy and how to use it for compression and decompression. Regards, Vikas Srivastava

Re: help for snappy

2012-02-26 Thread hadoop hive
+Installation#SnappyInstallation-UsingSnappyforMapReduceCompression best, Alex -- Alexander Lorenz http://mapredit.blogspot.com On Feb 27, 2012, at 7:16 AM, hadoop hive wrote: Hey folks, I'm using Hadoop 0.20.2 + r911707, please tell me the installation steps and how to use snappy

Re: help for snappy

2012-02-26 Thread hadoop hive
for. For storing snappy compressed files in HDFS you should use Pig or Flume. -- Alexander Lorenz http://mapredit.blogspot.com On Feb 27, 2012, at 7:28 AM, hadoop hive wrote: thanks Alex, I'm using Apache Hadoop; the steps I followed: 1: untar snappy 2: entry in mapred-site

Re: help for snappy

2012-02-26 Thread hadoop hive
the jars in your classpath for. For storing snappy compressed files in HDFS you should use Pig or Flume. -- Alexander Lorenz http://mapredit.blogspot.com On Feb 27, 2012, at 7:28 AM, hadoop hive wrote: thanks Alex, I'm using Apache Hadoop; the steps I followed: 1: untar snappy
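For reference, a minimal sketch of using Snappy for intermediate (map output) compression on a 0.20-era install, assuming libsnappy and a SnappyCodec jar (for example from the hadoop-snappy build this thread describes) are already present on every node; the property names are the old mapred-site.xml ones:

    <!-- mapred-site.xml: compress map output with Snappy; the codec class and
         the native library must be available on every node. -->
    <property>
      <name>mapred.compress.map.output</name>
      <value>true</value>
    </property>
    <property>
      <name>mapred.map.output.compression.codec</name>
      <value>org.apache.hadoop.io.compress.SnappyCodec</value>
    </property>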

Changing the replication factor

2012-02-21 Thread hadoop hive
Hi folks, right now I have a replication factor of 2, but I want to make it three for some tables. How can I do that for specific tables, so that whenever data is loaded into those tables it is automatically replicated to three nodes? Or do I need to change replication for all the tables? and
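A minimal sketch of the per-table approach, assuming the tables live under the usual Hive warehouse path (the path is a placeholder); note that -setrep only changes files that already exist, so data loaded later still follows the client's dfs.replication unless that is also raised at write time:

    # raise replication to 3 for everything already under one table's directory
    hadoop fs -setrep -R -w 3 /user/hive/warehouse/my_table
    # files written later follow dfs.replication, which can be set per client or job,
    # e.g. when copying new data in:
    hadoop fs -D dfs.replication=3 -put localfile /user/hive/warehouse/my_table/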

Re: HELP - Problem in setting up Hadoop - Multi-Node Cluster

2012-02-09 Thread hadoop hive
Did you check SSH to localhost? It should be passwordless SSH between the hosts (public key appended to authorized_keys). On Thu, Feb 9, 2012 at 1:06 AM, Robin Mueller-Bady robin.mueller-b...@oracle.com wrote: Dear Guruprasad, it would be very helpful to provide details from your
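A minimal sketch of that passwordless-SSH setup (the user and hostnames are placeholders):

    ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys   # for localhost itself
    ssh-copy-id hadoop@slave-node                     # repeat for every worker node
    ssh localhost                                     # should log in with no password prompt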

Re: is it possible to specify an empty key-value separator for TextOutputFormat?

2012-02-09 Thread hadoop hive
Hey Luca, you can use conf.set("mapred.textoutputformat.separator", ""); hope it works fine. Regards, Vikas Srivastava On Thu, Feb 9, 2012 at 3:57 PM, Luca Pireddu pire...@crs4.it wrote: Hello list, I'm trying to specify from the command line an empty string as the key-value separator for
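A minimal sketch of both ways of setting it; the property name is the old-API one (newer releases use mapreduce.output.textoutputformat.separator), and the job and jar names are placeholders:

    import org.apache.hadoop.conf.Configuration;

    public class EmptySeparatorSketch {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            // empty string between key and value in TextOutputFormat output
            conf.set("mapred.textoutputformat.separator", "");
            // from the command line, for a ToolRunner-based driver:
            //   hadoop jar myjob.jar MyJob -D mapred.textoutputformat.separator= input output
        }
    }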

Re: How to set up Hive on a single node?

2012-02-09 Thread hadoop hive
Hey Lac, it's showing that you don't have the DBS table in the metastore (Derby or MySQL); you may have to reinstall Hive or rebuild Hive with Ant. Check your metastore (whether DBS exists or not). Thanks and regards, Vikas Srivastava On Fri, Feb 10, 2012 at 8:33 AM, Lac Trung
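A minimal sketch of that check for a MySQL-backed metastore (the database name and user are placeholders; DBS is the metastore table that tracks Hive databases):

    mysql -u hive -p -e "SHOW TABLES LIKE 'DBS';" metastore
    # an empty result means the metastore schema was never created,
    # so Hive cannot find its DBS table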

Why does it take so long for all the datanodes to show up live in the JobTracker UI

2012-02-07 Thread hadoop hive
Hi folks, I added a node to the cluster and restarted the cluster, but it is taking a long time for all the servers to show up live in the JobTracker UI; it is only showing the newly added server in the cluster. Is there any specific reason for this? Thanks, Vikas Srivastava

Re: Why does it take so long for all the datanodes to show up live in the JobTracker UI

2012-02-07 Thread hadoop hive
is also being added by their IPs. Regards, Vikas Srivastava On Wed, Feb 8, 2012 at 11:28 AM, Harsh J ha...@cloudera.com wrote: Hi, can you provide your tasktracker startup log as a pastebin.com link? Also your JT log grepped for "Adding a new node"? On Wed, Feb 8, 2012 at 11:13 AM, hadoop hive hadooph
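For reference, a sketch of the check Harsh suggests; the log path is a placeholder for wherever your install writes the JobTracker log:

    grep "Adding a new node" /var/log/hadoop/hadoop-*-jobtracker-*.log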

Problem in reduce phase

2012-02-03 Thread hadoop hive
Hey folks, I'm getting this error while running MapReduce, and it comes up in the reduce phase: 2012-02-03 16:41:19,780 WARN org.apache.hadoop.mapred.ReduceTask: attempt_201201271626_5282_r_00_0 copy failed: attempt_201201271626_5282_m_07_2 from hadoopdata3 2012-02-03 16:41:19,954 WARN

Problem in reduce phase (critical)

2012-02-03 Thread hadoop hive
On Fri, Feb 3, 2012 at 4:56 PM, hadoop hive hadooph...@gmail.com wrote: hey folks, I'm getting this error while running MapReduce, and it comes up in the reduce phase: 2012-02-03 16:41:19,780 WARN org.apache.hadoop.mapred.ReduceTask: attempt_201201271626_5282_r_00_0 copy failed

Problem when starting datanode

2012-02-02 Thread hadoop hive
Hey folks, I'm getting an error when starting my datanode. Does anyone have an idea what this error is about? 2012-02-03 11:57:02,947 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration( 10.0.3.31:50010, storageID=DS-1677953808-10.0.3.31-50010-1318330317888, infoPort=50075,

Re: Reduce copy at 0.00 MB/s

2012-02-01 Thread hadoop hive
Hey, can anyone help me with this? I have increased the reduce slowstart to 0.25 but it still hangs after the copy. Tell me what else I can change to make it work fine. Regards, Vikas Srivastava On Wed, Jan 25, 2012 at 7:45 PM, praveenesh kumar praveen...@gmail.com wrote: Yeah, I am doing
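For reference, a minimal sketch of where that slowstart value lives (mapred-site.xml, or passed per job with -D); 0.25 means reducers begin their copy phase once 25% of the maps have finished:

    <property>
      <name>mapred.reduce.slowstart.completed.maps</name>
      <value>0.25</value>
    </property>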

Re: ClassNotFound just started with custom mapper

2012-01-30 Thread hadoop hive
Hey Hema, I'm not sure, but the problem may be with your HDFS name hdfs://vm-acd2-4c51:54310/ ; change your hostname and it will run fine, especially removing the '-' from the hostname. Regards, Vikas Srivastava On Tue, Jan 31, 2012 at 4:07 AM, Subramanian, Hema hema.subraman...@citi.com wrote: I am facing

Re: jobtracker url (Critical)

2012-01-29 Thread hadoop hive
then a couple of days. On Friday, January 27, 2012, hadoop hive hadooph...@gmail.com wrote: Hey Harsh, but after some time they become available one by one in the JobTracker URL; any idea why they add up so slowly? Regards, Vikas On Fri, Jan 27, 2012 at 5:05 PM, Harsh J ha...@cloudera.com

Re: NoSuchElementException while Reduce step

2012-01-27 Thread hadoop hive
Hey, there must be some problem with the key or value; the reducer didn't find the expected value. On Fri, Jan 27, 2012 at 1:23 AM, Rajesh Sai T tsairaj...@gmail.com wrote: Hi, I'm new to Hadoop. I'm trying to write my custom data types for Writable types, so that the Map class will produce my
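A minimal sketch of a custom Writable, since that is what the question describes (the field names are illustrative); a common cause of a NoSuchElementException in the reducer is readFields() not reading back exactly what write() wrote, field for field and in the same order:

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;
    import org.apache.hadoop.io.Writable;

    public class PairWritable implements Writable {
        private long id;
        private String name;

        @Override
        public void write(DataOutput out) throws IOException {
            out.writeLong(id);
            out.writeUTF(name);
        }

        @Override
        public void readFields(DataInput in) throws IOException {
            // must mirror write() exactly
            id = in.readLong();
            name = in.readUTF();
        }
    }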

jobtracker url (Critical)

2012-01-27 Thread hadoop hive
Hey folks, I'm facing a problem with the JobTracker URL. I added a node to the cluster, and after some time I restarted the cluster; then I found that my JobTracker is showing the recently added node under nodes, but the rest of the nodes are not available, not even under blacklist. Does anyone have any

Re: jobtracker url (Critical)

2012-01-27 Thread hadoop hive
no communication errors in their logs? Did you perhaps bring up a firewall accidentally, that was not present before? On Fri, Jan 27, 2012 at 4:47 PM, hadoop hive hadooph...@gmail.com wrote: Hey folks, I'm facing a problem with the JobTracker URL; I added a node to the cluster

Re: Reduce copy at 0.00 MB/s

2012-01-25 Thread hadoop hive
I faced the same issue, but after some time, when I balanced the cluster, the jobs started running fine. On Wed, Jan 25, 2012 at 3:34 PM, praveenesh kumar praveen...@gmail.com wrote: Hey, can anyone explain to me what the reduce copy phase in the reducer section is? The (K, List(V)) is passed to the

Re: Reduce copy at 0.00 MB/s

2012-01-25 Thread hadoop hive
This problem arose after adding a node, so I started the balancer to rebalance it. On Wed, Jan 25, 2012 at 4:38 PM, praveenesh kumar praveen...@gmail.com wrote: @hadoophive Can you explain more what you mean by balancing the cluster? Thanks, Praveenesh On Wed, Jan 25, 2012 at 4:29 PM, hadoop hive
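A minimal sketch of running the HDFS balancer after adding a node on a 0.20-era install (the threshold, in percent of disk-usage deviation, is optional):

    hadoop balancer -threshold 10
    # or, using the stock scripts:
    start-balancer.sh -threshold 10
    stop-balancer.sh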

Re: JobTracker webUI stopped showing suddenly

2012-01-11 Thread hadoop hive
Your JobTracker is not running. On Wed, Jan 11, 2012 at 7:08 PM, praveenesh kumar praveen...@gmail.com wrote: The JobTracker webUI suddenly stopped showing. It was working fine before. What could be the issue? Can anyone guide me on how to recover my webUI? Thanks, Praveenesh
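A minimal sketch of confirming that and bringing the daemon back on a 0.20-era install (port 50030 is the default JobTracker web port):

    jps | grep JobTracker                 # no output means the daemon is down
    hadoop-daemon.sh start jobtracker     # then check the JobTracker log if it exits again
    # reload http://<jobtracker-host>:50030/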