2010/3/11 tom kersnick :
> Tried this:
>
> package com.eharmony.pipeline.analytics.udf;
>
> import org.apache.hadoop.hive.ql.exec.UDF;
> import java.util.Collections;
>
>
> public final class test extends UDF {
> public double evaluate("Double[]" values) {
No " in your code and also try return
Tried this:
package com.eharmony.pipeline.analytics.udf;
import org.apache.hadoop.hive.ql.exec.UDF;
import java.util.Collections;
public final class test extends UDF {
public double evaluate("Double[]" values) {
final Integer len = values.length;
final Integer half = len / 2;
Try "Double[]". Primitive arrays (like double[], int[]) are not
supported yet, because that needs special handling for each of the
primitive type.
Zheng
On Wed, Mar 10, 2010 at 4:55 PM, tom kersnick wrote:
> Gents,
>
> Any ideas why this happens? I'm using Hive 0.5.0 with Hadoop 0.20.2.
>
> This is
Gents,
Any ideas why this happens? I'm using Hive 0.5.0 with Hadoop 0.20.2.
This is a super simple UDF.
I'm just taking the length of the values and then dividing by pi. It keeps
popping up with this error:
FAILED: Unknown exception: [D cannot be cast to [Ljava.lang.Object;
Here is my approach
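Putting the thread above together: [D is the JVM name for a primitive double[], and the error shows Hive trying to cast it to Object[], which is exactly the primitive-array limitation Zheng describes. A minimal sketch of the UDF rewritten along those lines, with a boxed Double[] parameter and no quotation marks (the package and class names are the poster's; the body is only a guess at "length of the values divided by pi"):

package com.eharmony.pipeline.analytics.udf;

import org.apache.hadoop.hive.ql.exec.UDF;

public final class test extends UDF {
  // Boxed Double[] rather than primitive double[]: the error above shows
  // Hive trying to cast the argument to Object[], and [D (double[]) cannot
  // be cast to [Ljava.lang.Object;
  public double evaluate(final Double[] values) {
    if (values == null || values.length == 0) {
      return 0.0;
    }
    // "taking the length of the values and then dividing by pi"
    return values.length / Math.PI;
  }
}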
The parse_url UDF works in general, but the common use case is querying
Apache logs, which do not include the protocol or host portions, so you need
to include a concat() call.
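For example, assuming a hypothetical apache_logs table whose request_uri column holds only the path and query string, a placeholder scheme and host can be prepended before parsing (the table and column names here are made up for illustration):

-- prepend a dummy scheme/host so parse_url has a full URL to work with
SELECT parse_url(concat('http://example.com', request_uri), 'QUERY', 'id')
FROM apache_logs
LIMIT 10;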
Also, the docs on parse_url are wrong about the query parameter parsing
feature. The describe statement above shows the actu
Thanks Carl,
I was consistently using IP addresses throughout, but swapping to host names
made it all work fine.
Cheers,
Tim
On Wed, Mar 10, 2010 at 6:10 PM, Carl Steinbach wrote:
> Hi Tim,
>
> You need to consistently use the same IP address in all of your
> configuration files. See http:/
Hi Tim,
You need to consistently use the same IP address in all of your
configuration files. See
http://issues.apache.org/jira/browse/HADOOP-5191 for more information.
Thanks.
Carl
On Wed, Mar 10, 2010 at 5:05 AM, Tim Robertson wrote:
> Hi all,
>
> I am running through the first basics on a fre
Hi all,
I am running through the first basics on a fresh SVN build of Hive on top of
0.20.1 and seeing:
Wrong FS: hdfs://192.168.76.254:54310/tmp/hive-root/hive_2010-03-10_13-57-21_380_7559079400609274856,
expected: hdfs://master.gbif.clu:54310
All my Hadoop config has IP addresses. Do they ne
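The fix Carl points to (and Tim confirms above) comes down to using one and the same authority, ideally the host name, for the default filesystem in every configuration file. A sketch of the relevant core-site.xml entry for Hadoop 0.20, with the host and port taken from the error message:

<property>
  <name>fs.default.name</name>
  <!-- use this exact URI everywhere instead of hdfs://192.168.76.254:54310 -->
  <value>hdfs://master.gbif.clu:54310</value>
</property>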
What is the format of your data?
TBinaryProtocol does not work with TextFile format, as you can imagine.
On 3/10/10, Anty wrote:
> Hi all,
>
> I have encountered a problem; any suggestions will be appreciated!
> My Hive version is 0.3.0.
> I created a table in the CLI:
> CREATE TABLE table2 (boo int,bar str
Hi all,
I have encountered a problem; any suggestions will be appreciated!
My Hive version is 0.3.0.
I created a table in the CLI:
CREATE TABLE table2 (boo int,bar string)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDe'
WITH SERDEPROPERTIES (
'serialization.format'=org.apache.hadoop.
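As the reply above says, TBinaryProtocol writes binary records, so it cannot be combined with a plain TEXTFILE table. A hedged sketch of the two usual ways around this; the completed DDL below is only an illustration, not a reconstruction of the original statement:

-- Option 1: keep a text table, but use the control-separated text protocol
CREATE TABLE table2 (boo int, bar string)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDe'
WITH SERDEPROPERTIES (
  'serialization.format' = 'org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol'
)
STORED AS TEXTFILE;

-- Option 2: if TBinaryProtocol is really needed, store the table as a
-- SEQUENCEFILE (a binary container) rather than as a TEXTFILE.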