Great! So how far away is a release candidate or nightly build now?
Saurabh.
On Wed, Aug 5, 2009 at 10:46 AM, Zheng Shao wrote:
> Namit just committed Todd's patch of HIVE-487, so hive trunk is
> already compatible with hadoop 0.20 now.
>
> Zheng
>
Namit just committed Todd's patch of HIVE-487, so hive trunk is
already compatible with hadoop 0.20 now.
Zheng
On Tue, Aug 4, 2009 at 10:01 PM, Saurabh Nanda wrote:
> Another line of reasoning -- if Hive-trunk is not compiling with Hadoop
> 0.20, then people with Hadoop 0.20 are anyways living with their bugs.
We have added some examples and general guide on how to write a new
SerDe and UDF/UDAF.
Please see
http://www.slideshare.net/ragho/hive-user-meeting-august-2009-facebook
Pages 59-87
--
Yours,
Zheng
Another line of reasoning -- if Hive-trunk is not compiling with Hadoop
0.20, then people with Hadoop 0.20 are anyways living with their bugs. That
should not be a reason to not put out a new build. With a new build, at
least people who're on newer versions of Hadoop get their Hive bug fixes &
features.
> [btw from my experience support for handling dates in hive is extremely
> weak, and code becomes heavily convoluted to do it properly]
>
That's the first thing I noticed -- the lack of a native date format. However,
if you manage the zero-padding while importing the data, storing dates as
strings works.
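A minimal sketch (plain Python, illustrative names) of why the zero-padding matters: zero-padded 'YYYY-MM-DD' strings sort the same way the underlying dates do, while unpadded strings do not.

```python
def to_iso(year, month, day):
    """Format a date as a zero-padded 'YYYY-MM-DD' string."""
    return "%04d-%02d-%02d" % (year, month, day)

# Zero-padded strings compare correctly as plain strings ...
padded = [to_iso(2009, 8, 4), to_iso(2009, 12, 1), to_iso(2009, 9, 30)]
assert sorted(padded) == ["2009-08-04", "2009-09-30", "2009-12-01"]

# ... but unpadded strings do not: lexically "2009-12-1" < "2009-9-30",
# because '1' < '9' at the first differing character.
unpadded = ["2009-12-1", "2009-9-30"]
assert sorted(unpadded) == ["2009-12-1", "2009-9-30"]
```

So as long as the import path pads months and days, string comparison and ORDER BY behave as if the column were a real date type.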
> The major thing on that is we have to build releases for every hadoop
> major/minor and possibly one off the trunk. I was thinking of doing
> something similar on my site since accomplishing this is possible with
> hudson. Does anyone think adding this to hadoop hive is something we
> should do?
Hadoop Fans, it looks like most of you prefer to have this on Thursday
(November 5th), so that's what we'll plan for.
Anyone is welcome to come to this meetup, even if you don't attend
ApacheCon. We'd love to hear more about various sub-projects and cool
Hadoop applications, tips, tricks, etc.
I would love to see nightly/periodic builds published somewhere, especially
if it's going to be some time before Hive 0.4 is released.
It seems like people new to Hive get the "check out and build from the
trunk" or "apply this patch" answer often on this list. Having nightly
builds would make life easier.
We don't have such a udf right now, but it is extremely easy to write one.
Take a look at this (and search for UDF):
http://www.slideshare.net/ragho/hive-user-meeting-august-2009-facebook
Let us know if you have any questions.
Zheng
On Tue, Aug 4, 2009 at 7:21 AM, Andraz Tori wrote:
> Hi Ashish
Everything inside the quotes will be executed using a shell, similar to that of
Hadoop Streaming. Unless Hadoop Streaming changes, Hive would not. And it is
highly unlikely that Hadoop Streaming changes, since a lot of people depend
on it.
Prasad
On Tue, Aug 4, 2009 at 1:20 PM, Saurabh Nanda wrote:
> Is there any possibility of having a nightly build off the trunk,
> before hive 0.4 is officially released?
>
> On 8/4/09, Edward Capriolo wrote:
>> On Tue, Aug 4, 2009 at 10:43 AM, Bill Graham wrote:
>>> +1
>>>
>>> I agree. I do not know the answer to that one.
Prasad, my query was more about the concept, not the syntax. I hope
this will still work when Hive 0.4 is released. I'm basing a lot on
it.
On 8/4/09, Prasad Chakka wrote:
> Quotes should work.
>
> Prasad
Is there any possibility of having a nightly build off the trunk,
before hive 0.4 is officially released?
On 8/4/09, Edward Capriolo wrote:
> On Tue, Aug 4, 2009 at 10:43 AM, Bill Graham wrote:
>> +1
>>
>> I agree. I do not know the answer to that one. Can anyone comment on the
>> future Hive release schedule?
Quotes should work.
Prasad
From: Saurabh Nanda
Date: Tue, 4 Aug 2009 05:43:34 -0700
Subject: Passing parameters to custom map/reduce scripts
Can I pass parameters to custom map/reduce scripts like this:
from try_hits
select transform ip_address, aid, uid
using 'parse_logs.rb MY_PARAM_HERE' as ip_address, aid, uid
On Tue, Aug 4, 2009 at 10:43 AM, Bill Graham wrote:
> +1
>
> I agree. I do not know the answer to that one. Can anyone comment on the
> future Hive release schedule?
>
>
> On Tue, Aug 4, 2009 at 7:39 AM, Saurabh Nanda
> wrote:
>>
>> I was dreading this response! Are there any plans to push out a new Hive
>> build with the latest features & bug fixes?
+1
I agree. I do not know the answer to that one. Can anyone comment on the
future Hive release schedule?
On Tue, Aug 4, 2009 at 7:39 AM, Saurabh Nanda wrote:
> I was dreading this response! Are there any plans to push out a new Hive
> build with the latest features & bug fixes? Building from trunk is not
> everyone's cup of tea, you know :-)
I was dreading this response! Are there any plans to push out a new Hive
build with the latest features & bug fixes? Building from trunk is not
everyone's cup of tea, you know :-)
Any nightly builds that I can pick up?
Saurabh.
On Tue, Aug 4, 2009 at 8:05 PM, Bill Graham wrote:
> This bug has been fixed on the trunk.
This bug has been fixed on the trunk. Check out the hive trunk and build the
JDBC driver and you should be fine.
On Tue, Aug 4, 2009 at 12:47 AM, Saurabh Nanda wrote:
> Here's what I'm trying:
>
> ResultSet rs = st.executeQuery("show tables");
> while (rs.next()) {
>     System.out.println(rs.getString(1));
> }
Hi Ashish, thanks for the help!
1) regarding the group-by-week, it was entirely my fault... I miswrote
the function... to get grouping by week one needs to have
FIELD: date_add('2007-12-31', (datediff(date_field, '2007-12-31') / 7) * 7)
GROUP BY: datediff(date_field, '2007-12-31') / 7
This then works
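The week-bucketing arithmetic above can be sketched in plain Python (datetime only; the anchor date '2007-12-31' is taken from the query) to check what the grouping does:

```python
from datetime import date, timedelta

ANCHOR = date(2007, 12, 31)  # a Monday, so buckets align on week boundaries

def week_bucket(d):
    """Map a date to the first day of its 7-day bucket, mirroring
    date_add(anchor, (datediff(d, anchor) / 7) * 7) with the
    GROUP BY key datediff(d, anchor) / 7 (integer weeks)."""
    weeks = (d - ANCHOR).days // 7
    return ANCHOR + timedelta(days=weeks * 7)

# All days of one week collapse to the same bucket start date.
assert week_bucket(date(2008, 1, 7)) == date(2008, 1, 7)
assert week_bucket(date(2008, 1, 13)) == date(2008, 1, 7)
# The next Monday opens a new bucket.
assert week_bucket(date(2008, 1, 14)) == date(2008, 1, 14)
```

This assumes the division truncates to an integer week count, which is what makes every date in a 7-day span share one GROUP BY key.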
> Does Hive currently support column-based storage?
Yes, Hive supports column-based storage via RCFile. You can create a table
using a command like:
create table tbl (cols) stored as RCFile
to try Hive's columnar storage.
> For example, can we use HBase tables as Hive table inputs for HQL?
Right now, H
Hi to all Hivers out there!
Does Hive currently support column-based storage?
For example, can we use HBase tables as Hive table inputs for HQL?
Thanks in advance,
Haggai
Can I pass parameters to custom map/reduce scripts like this:
from try_hits
select transform ip_address, aid, uid
using 'parse_logs.rb MY_PARAM_HERE' as ip_address, aid, uid
I've tried it, and it seems to work, but I just want to be sure whether this
is guaranteed to work under all circumstances.
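A rough Python equivalent of such a transform script (parse_logs.rb itself is not shown in the thread, so the field handling here is an assumption): Hive pipes rows to the script as tab-separated lines on stdin, the extra argument in the quoted USING clause arrives as an ordinary command-line parameter, and the script writes tab-separated rows back to stdout.

```python
import sys

def transform(line, param):
    """Process one tab-separated input row; 'param' is the extra
    argument passed inside the quoted USING clause."""
    ip_address, aid, uid = line.rstrip("\n").split("\t")
    # Illustrative use of the parameter: tag the uid column with it.
    return "\t".join([ip_address, aid, uid + ":" + param])

if __name__ == "__main__":
    my_param = sys.argv[1] if len(sys.argv) > 1 else ""
    for row in sys.stdin:
        print(transform(row, my_param))
```

Since the quoted string is handed to a shell, anything a shell command line can express (arguments, environment variables) should work the same way it does for Hadoop Streaming.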
Here's what I'm trying:
ResultSet rs = st.executeQuery("show tables");
while (rs.next()) {
    System.out.println(rs.getString(1));
}
The while loop never terminates; after going through the list of tables, it
keeps printing the last table name over & over again. Am I doing something
wrong?