Re: comments in hive ql

2010-02-16 Thread Amr Awadallah
--

On 2/16/2010 9:03 PM, prasenjit mukherjee wrote:
> How do I add a commented line in a Hive QL file? -- or # or //
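Amr's answer above is the `--` marker: like in SQL, a line beginning with `--` is treated as a comment, while `#` and `//` are not recognized. A minimal sketch of a .hql script (the table name is made up for illustration):

```sql
-- This whole line is a comment and is ignored by Hive.
SELECT COUNT(1) FROM page_views;
```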

comments in hive ql

2010-02-16 Thread prasenjit mukherjee
How do I add a commented line in a Hive QL file? Is it --, #, or //?

Re: [VOTE] release hive 0.5.0

2010-02-16 Thread Carl Steinbach
+1

On Mon, Feb 15, 2010 at 1:41 PM, Zheng Shao wrote:
> Hive branch 0.5 was created 5 weeks ago:
> https://svn.apache.org/viewvc/hadoop/hive/branches/branch-0.5/
>
> It has also been running as the production version of Hive at Facebook
> for 2 weeks.
>
> We'd like to start making release cand

Re: Help with Compressed Storage

2010-02-16 Thread Adam O'Donnell
Adding these to my hive-site.xml file worked fine: hive.exec.compress.output=true ("Compress output") and mapred.output.compression.type=BLOCK ("Block compression").

On Tue, Feb 16, 2010 at 1:43 PM, Brent Miller wrote:
> Hello, I've seen issues si
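The preview above appears to be two flattened hive-site.xml properties. Reconstructed as a sketch, assuming standard Hadoop configuration syntax (the `<description>` text is taken from the preview):

```xml
<property>
  <name>hive.exec.compress.output</name>
  <value>true</value>
  <description>Compress output</description>
</property>
<property>
  <name>mapred.output.compression.type</name>
  <value>BLOCK</value>
  <description>Block compression</description>
</property>
```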

Re: Help with Compressed Storage

2010-02-16 Thread Brent Miller
Thank you for the responses, and I'm terribly sorry if I'm missing something obvious here, but after going through Google searches a second time and reviewing your feedback, I'm still having issues with compressed storage not working correctly. The commands that I've been entering into the

Re: Help with Compressed Storage

2010-02-16 Thread Yongqiang He
Like Zheng said, try set hive.exec.compress.output=true; Setting hive.exec.compress.intermediate=true is not recommended because of the CPU cost. Also, in some cases, set hive.merge.mapfiles=false; will help get better compression.

On 2/16/10 2:04 PM, "Zheng Shao" wrote:
> Try google "Hiv
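Putting Zheng's and Yongqiang's advice together, a per-session sketch might look like this (the table names are placeholders; hive.exec.compress.intermediate is deliberately left at its default, per the CPU-cost warning above):

```sql
set hive.exec.compress.output=true;
-- optionally trade away file merging for better compression:
set hive.merge.mapfiles=false;
INSERT OVERWRITE TABLE dest SELECT * FROM src;
```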

Re: Help with Compressed Storage

2010-02-16 Thread Zheng Shao
Try googling "Hive compression". See http://svn.apache.org/viewvc/hadoop/hive/trunk/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java?p2=/hadoop/hive/trunk/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java&p1=/hadoop/hive/trunk/common/src/java/org/apache/hadoop/hive/conf/HiveConf.ja

Help with Compressed Storage

2010-02-16 Thread Brent Miller
Hello, I've seen issues similar to this one come up once or twice before, but I haven't ever seen a solution to the problem that I'm having. I was following the Compressed Storage page on the Hive Wiki http://wiki.apache.org/hadoop/CompressedStorage and realized that the sequence files that are cre

Re: Working UDF for GeoIP lookup?

2010-02-16 Thread Edward Capriolo
On Tue, Feb 16, 2010 at 3:23 PM, Edward Capriolo wrote:
> On Tue, Feb 16, 2010 at 2:54 PM, Eric Arenas wrote:
>> Hi Ed,
>>
>> I created a similar UDF some time ago, and if I am not mistaken you have to
>> assume that your file is going to be in the same directory, as in:
>>
>> path_of_dat_file =

Re: Working UDF for GeoIP lookup?

2010-02-16 Thread Edward Capriolo
On Tue, Feb 16, 2010 at 2:54 PM, Eric Arenas wrote:
> Hi Ed,
>
> I created a similar UDF some time ago, and if I am not mistaken you have to
> assume that your file is going to be in the same directory, as in:
>
> path_of_dat_file = "./name_of_file";
>
> And it worked for me,
>
> let me know if t

Re: Fields delimited by unicode chars

2010-02-16 Thread Eric Arenas
Hi Prasen,

This should work (I use Ctrl-A as the delimiter):

hive> create table test_ea (col1 string) row format delimited fields terminated by '\001' stored as textfile;

In your case, '\002' should work.

regards,
Eric Arenas

Re: Working UDF for GeoIP lookup?

2010-02-16 Thread Eric Arenas
Hi Ed,

I created a similar UDF some time ago, and if I am not mistaken you have to assume that your file is going to be in the same directory, as in:

path_of_dat_file = "./name_of_file";

And it worked for me. Let me know if this solves your issue, and if not, I will look into my old code and
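Eric's point is that a file shipped to the cluster with `add file` lands in each task's working directory, so the UDF should open it by a bare relative path rather than an absolute one. A sketch of the Hive session side, assuming a GeoIP-style UDF (the UDF class, function name, and .dat filename are hypothetical):

```sql
add file /local/path/GeoIP.dat;
create temporary function geoip as 'com.example.hive.udf.GeoIPLookup';
-- inside the UDF, the distributed file is opened as "./GeoIP.dat"
select geoip(ip) from weblogs;
```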

Re: Working UDF for GeoIP lookup?

2010-02-16 Thread Edward Capriolo
On Mon, Feb 15, 2010 at 12:02 PM, Edward Capriolo wrote:
> On Mon, Feb 15, 2010 at 11:27 AM, Adam J. O'Donnell wrote:
>> Edward:
>>
>> I don't have access to the individual data nodes, so I can't install the
>> pure perl module. I tried distributing it via the add file command, but that
>> is man

Fields delimited by unicode chars

2010-02-16 Thread prasenjit mukherjee
I have a Pig statement which works fine:

raw_data = LOAD '$input_path' USING PigStorage('\u0002') AS ...

And I am trying to use the corresponding Hive QL:

CREATE EXTERNAL TABLE tx_log(..) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\u0002' STORED AS TEXTFILE LOCATION '/ip/data/tx_lo
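If Hive rejects Pig's Java-style '\u0002' escape, Eric's answer elsewhere in this thread suggests the octal form '\002' instead. An adapted sketch (the column list is elided as in the original, and the LOCATION path is a hypothetical placeholder since the original is truncated):

```sql
CREATE EXTERNAL TABLE tx_log (..)  -- column list as in the original
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\002'
STORED AS TEXTFILE
LOCATION '/path/to/data';  -- placeholder; use your actual path
```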