[jira] Updated: (HIVE-1311) bug in use of hadoop supports splittable

2010-04-15 Thread Namit Jain (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-1311?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Namit Jain updated HIVE-1311:
-

Attachment: hive.1311.1.patch

 bug in use of hadoop supports splittable
 

 Key: HIVE-1311
 URL: https://issues.apache.org/jira/browse/HIVE-1311
 Project: Hadoop Hive
  Issue Type: Bug
  Components: Query Processor
Reporter: Namit Jain
Assignee: Namit Jain
 Fix For: 0.6.0

 Attachments: hive.1311.1.patch


 CombineHiveInputFormat: getSplits()
  if (this.mrwork != null && this.mrwork.getHadoopSupportsSplittable()) 
 should check if hadoop supports splittable is false
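
 To make the reported fix concrete, here is a minimal, self-contained sketch of the corrected guard, based only on the description above (it is not the attached hive.1311.1.patch; MapredWorkStub is a hypothetical stand-in for the real mrwork object):

 {code}
 public class SplittableCheckSketch {
   // Hypothetical stand-in for the real mrwork object, just to make the sketch runnable.
   static class MapredWorkStub {
     private final boolean hadoopSupportsSplittable;
     MapredWorkStub(boolean b) { this.hadoopSupportsSplittable = b; }
     boolean getHadoopSupportsSplittable() { return hadoopSupportsSplittable; }
   }

   public static void main(String[] args) {
     MapredWorkStub mrwork = new MapredWorkStub(false);
     // Per the description, the special handling in getSplits() should run when hadoop
     // does NOT support splittable files, hence the negated flag.
     if (mrwork != null && !mrwork.getHadoopSupportsSplittable()) {
       System.out.println("fall back to the non-splitting code path");
     }
   }
 }
 {code}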





[jira] Created: (HIVE-1311) bug in use of hadoop supports splittable

2010-04-15 Thread Namit Jain (JIRA)
bug in use of hadoop supports splittable


 Key: HIVE-1311
 URL: https://issues.apache.org/jira/browse/HIVE-1311
 Project: Hadoop Hive
  Issue Type: Bug
  Components: Query Processor
Reporter: Namit Jain
Assignee: Namit Jain
 Attachments: hive.1311.1.patch


CombineHiveInputFormat: getSplits()
 if (this.mrwork != null && this.mrwork.getHadoopSupportsSplittable()) 


should check if hadoop supports splittable is false





[jira] Updated: (HIVE-1311) bug in use of hadoop supports splittable

2010-04-15 Thread Namit Jain (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-1311?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Namit Jain updated HIVE-1311:
-

   Status: Patch Available  (was: Open)
Fix Version/s: 0.6.0

 bug in use of hadoop supports splittable
 

 Key: HIVE-1311
 URL: https://issues.apache.org/jira/browse/HIVE-1311
 Project: Hadoop Hive
  Issue Type: Bug
  Components: Query Processor
Reporter: Namit Jain
Assignee: Namit Jain
 Fix For: 0.6.0

 Attachments: hive.1311.1.patch


 CombineHiveInputFormat: getSplits()
  if (this.mrwork != null && this.mrwork.getHadoopSupportsSplittable()) 
 should check if hadoop supports splittable is false





[jira] Commented: (HIVE-1311) bug in use of hadoop supports splittable

2010-04-15 Thread Ning Zhang (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-1311?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12857203#action_12857203
 ] 

Ning Zhang commented on HIVE-1311:
--

+1 will commit if tests pass.

 bug in use of hadoop supports splittable
 

 Key: HIVE-1311
 URL: https://issues.apache.org/jira/browse/HIVE-1311
 Project: Hadoop Hive
  Issue Type: Bug
  Components: Query Processor
Reporter: Namit Jain
Assignee: Namit Jain
 Fix For: 0.6.0

 Attachments: hive.1311.1.patch


 CombineHiveInputFormat: getSplits()
  if (this.mrwork != null && this.mrwork.getHadoopSupportsSplittable()) 
 should check if hadoop supports splittable is false





[jira] Updated: (HIVE-1311) bug in use of hadoop supports splittable

2010-04-15 Thread Zheng Shao (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-1311?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Zheng Shao updated HIVE-1311:
-

  Status: Resolved  (was: Patch Available)
Hadoop Flags: [Reviewed]
Release Note: HIVE-1311. Bug in use of parameter hadoop supports 
splittable. (Namit Jain via zshao)
  Resolution: Fixed

Committed. Thanks Namit!
(Sorry I didn't see Ning's comment before committing)

 bug in use of hadoop supports splittable
 

 Key: HIVE-1311
 URL: https://issues.apache.org/jira/browse/HIVE-1311
 Project: Hadoop Hive
  Issue Type: Bug
  Components: Query Processor
Reporter: Namit Jain
Assignee: Namit Jain
 Fix For: 0.6.0

 Attachments: hive.1311.1.patch


 CombineHiveInputFormat: getSplits()
  if (this.mrwork != null && this.mrwork.getHadoopSupportsSplittable()) 
 should check if hadoop supports splittable is false





[jira] Created: (HIVE-1312) hive trunk does compile with hadoop 0.17 any more

2010-04-15 Thread Zheng Shao (JIRA)
hive trunk does compile with hadoop 0.17 any more
-

 Key: HIVE-1312
 URL: https://issues.apache.org/jira/browse/HIVE-1312
 Project: Hadoop Hive
  Issue Type: Bug
Affects Versions: 0.6.0
Reporter: Zheng Shao
Assignee: John Sichi


This is caused by HIVE-1295.

{code}
compile:
 [echo] Compiling: hive
[javac] Compiling 527 source files to 
/hadoop_hive_trunk/.ptest_0/build/ql/classes
[javac] 
/hadoop_hive_trunk/.ptest_0/ql/src/java/org/apache/hadoop/hive/ql/io/HiveNullValueSequenceFileOu\
tputFormat.java:69: cannot find symbol
[javac] symbol  : method getBytes()
[javac] location: class org.apache.hadoop.io.BytesWritable
[javac]   keyWritable.set(bw.getBytes(), 0, bw.getLength());
[javac] ^
[javac] 
/hadoop_hive_trunk/.ptest_0/ql/src/java/org/apache/hadoop/hive/ql/io/HiveNullValueSequenceFileOu\
tputFormat.java:69: cannot find symbol
[javac] symbol  : method getLength()
[javac] location: class org.apache.hadoop.io.BytesWritable
[javac]   keyWritable.set(bw.getBytes(), 0, bw.getLength());
[javac]   ^
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
[javac] Note: Some input files use unchecked or unsafe operations.
[javac] Note: Recompile with -Xlint:unchecked for details.
[javac] 2 errors
{code}
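
For context, BytesWritable in Hadoop 0.17 only exposes the older get()/getSize() accessors; getBytes()/getLength() are the methods the log above reports as missing. The sketch below shows one source-compatible way to write the call (an illustration under that assumption, not necessarily what the HIVE-1312 patch does):

{code}
import org.apache.hadoop.io.BytesWritable;

public class BytesWritableCompat {
  public static void main(String[] args) {
    BytesWritable bw = new BytesWritable(new byte[] { 1, 2, 3 });
    BytesWritable keyWritable = new BytesWritable();

    // Compiles against 0.17 as well as later releases (where these methods are deprecated):
    keyWritable.set(bw.get(), 0, bw.getSize());

    // The equivalent call that fails to compile against 0.17, per the log above:
    // keyWritable.set(bw.getBytes(), 0, bw.getLength());

    System.out.println("copied " + keyWritable.getSize() + " bytes");
  }
}
{code}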






Build failed in Hudson: Hive-trunk-h0.17 #412

2010-04-15 Thread Apache Hudson Server
See http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/412/changes

Changes:

[zshao] HIVE-1311. Bug in use of parameter hadoop supports splittable. (Namit 
Jain via zshao)

[namit] HIVE-1002. multi-partition inserts
(Ning Zhang via namit)

[namit] HIVE-1295. facilitate HBase bulk loads from Hive
(John Sichi via namit)

--
[...truncated 392 lines...]
init:

compile:
 [echo] Compiling: anttasks

jar:

init:

install-hadoopcore:

install-hadoopcore-default:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
[ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:retrieve] :: loading settings :: file = 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#common;work...@minerva.apache.org
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.17.2.1 in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 1030ms :: artifacts dl 12ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   1   |   0   |   0   |   0   ||   1   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#common
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 1 already retrieved (0kB/3ms)

install-hadoopcore-internal:

setup:

compile:
 [echo] Compiling: common
[javac] Compiling 5 source files to 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/common/classes

jar:
 [echo] Jar: common
  [jar] Building jar: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/common/hive-common-0.6.0.jar

create-dirs:
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/serde
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/serde/classes
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/serde/test
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/serde/test/src
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/serde/test/classes

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks

jar:

init:

dynamic-serde:

compile:
 [echo] Compiling: hive
[javac] Compiling 224 source files to 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/serde/classes
[javac] Note: Some input files use unchecked or unsafe operations.
[javac] Note: Recompile with -Xlint:unchecked for details.

jar:
 [echo] Jar: serde
  [jar] Building jar: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/serde/hive-serde-0.6.0.jar

create-dirs:
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/metastore
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/metastore/classes
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/metastore/test
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/metastore/test/src
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/metastore/test/classes

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks

jar:

init:

model-compile:
[javac] Compiling 8 source files to 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/metastore/classes
 [copy] Copying 1 file to 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/metastore/classes

core-compile:
 [echo] Compiling: 
[javac] Compiling 36 source files to 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/metastore/classes
[javac] Note: Some input files use unchecked or unsafe operations.
[javac] Note: Recompile with -Xlint:unchecked for details.

model-enhance:
[datanucleusenhancer] log4j:WARN No appenders could be found for 

Build failed in Hudson: Hive-trunk-h0.18 #415

2010-04-15 Thread Apache Hudson Server
See http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/415/changes

Changes:

[zshao] HIVE-1311. Bug in use of parameter hadoop supports splittable. (Namit 
Jain via zshao)

[namit] HIVE-1002. multi-partition inserts
(Ning Zhang via namit)

[namit] HIVE-1295. facilitate HBase bulk loads from Hive
(John Sichi via namit)

--
[...truncated 3646 lines...]

init:

compile:
 [echo] Compiling: anttasks

jar:

init:

install-hadoopcore:

install-hadoopcore-default:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
[ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:retrieve] :: loading settings :: file = 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#common;work...@minerva.apache.org
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.18.3 in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 1035ms :: artifacts dl 1ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   1   |   0   |   0   |   0   ||   1   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#common
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 1 already retrieved (0kB/3ms)

install-hadoopcore-internal:

setup:

compile:
 [echo] Compiling: common
[javac] Compiling 5 source files to 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/common/classes

jar:
 [echo] Jar: common
  [jar] Building jar: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/common/hive-common-0.6.0.jar

create-dirs:
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/serde
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/serde/classes
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/serde/test
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/serde/test/src
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/serde/test/classes

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks

jar:

init:

dynamic-serde:

compile:
 [echo] Compiling: hive
[javac] Compiling 224 source files to 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/serde/classes
[javac] Note: Some input files use unchecked or unsafe operations.
[javac] Note: Recompile with -Xlint:unchecked for details.

jar:
 [echo] Jar: serde
  [jar] Building jar: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/serde/hive-serde-0.6.0.jar

create-dirs:
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore/classes
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore/test
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore/test/src
[mkdir] Created dir: 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore/test/classes

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks

jar:

init:

model-compile:
[javac] Compiling 8 source files to 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore/classes
 [copy] Copying 1 file to 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore/classes

core-compile:
 [echo] Compiling: 
[javac] Compiling 36 source files to 
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore/classes
[javac] Note: Some input files use unchecked or unsafe operations.
[javac] Note: Recompile with -Xlint:unchecked for details.

model-enhance:
[datanucleusenhancer] log4j:WARN No appenders could be found for 

[jira] Commented: (HIVE-1312) hive trunk does compile with hadoop 0.17 any more

2010-04-15 Thread John Sichi (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-1312?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12857365#action_12857365
 ] 

John Sichi commented on HIVE-1312:
--

Whoops, I'll take a look.


 hive trunk does compile with hadoop 0.17 any more
 -

 Key: HIVE-1312
 URL: https://issues.apache.org/jira/browse/HIVE-1312
 Project: Hadoop Hive
  Issue Type: Bug
Affects Versions: 0.6.0
Reporter: Zheng Shao
Assignee: John Sichi

 This is caused by HIVE-1295.
 {code}
 compile:
  [echo] Compiling: hive
 [javac] Compiling 527 source files to 
 /hadoop_hive_trunk/.ptest_0/build/ql/classes
 [javac] 
 /hadoop_hive_trunk/.ptest_0/ql/src/java/org/apache/hadoop/hive/ql/io/HiveNullValueSequenceFileOu\
 tputFormat.java:69: cannot find symbol
 [javac] symbol  : method getBytes()
 [javac] location: class org.apache.hadoop.io.BytesWritable
 [javac]   keyWritable.set(bw.getBytes(), 0, bw.getLength());
 [javac] ^
 [javac] 
 /hadoop_hive_trunk/.ptest_0/ql/src/java/org/apache/hadoop/hive/ql/io/HiveNullValueSequenceFileOu\
 tputFormat.java:69: cannot find symbol
 [javac] symbol  : method getLength()
 [javac] location: class org.apache.hadoop.io.BytesWritable
 [javac]   keyWritable.set(bw.getBytes(), 0, bw.getLength());
 [javac]   ^
 [javac] Note: Some input files use or override a deprecated API.
 [javac] Note: Recompile with -Xlint:deprecation for details.
 [javac] Note: Some input files use unchecked or unsafe operations.
 [javac] Note: Recompile with -Xlint:unchecked for details.
 [javac] 2 errors
 {code}





[jira] Updated: (HIVE-1312) hive trunk does not compile with hadoop 0.17 any more

2010-04-15 Thread John Sichi (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-1312?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

John Sichi updated HIVE-1312:
-

Summary: hive trunk does not compile with hadoop 0.17 any more  (was: hive 
trunk does compile with hadoop 0.17 any more)

 hive trunk does not compile with hadoop 0.17 any more
 -

 Key: HIVE-1312
 URL: https://issues.apache.org/jira/browse/HIVE-1312
 Project: Hadoop Hive
  Issue Type: Bug
Affects Versions: 0.6.0
Reporter: Zheng Shao
Assignee: John Sichi

 This is caused by HIVE-1295.
 {code}
 compile:
  [echo] Compiling: hive
 [javac] Compiling 527 source files to 
 /hadoop_hive_trunk/.ptest_0/build/ql/classes
 [javac] 
 /hadoop_hive_trunk/.ptest_0/ql/src/java/org/apache/hadoop/hive/ql/io/HiveNullValueSequenceFileOu\
 tputFormat.java:69: cannot find symbol
 [javac] symbol  : method getBytes()
 [javac] location: class org.apache.hadoop.io.BytesWritable
 [javac]   keyWritable.set(bw.getBytes(), 0, bw.getLength());
 [javac] ^
 [javac] 
 /hadoop_hive_trunk/.ptest_0/ql/src/java/org/apache/hadoop/hive/ql/io/HiveNullValueSequenceFileOu\
 tputFormat.java:69: cannot find symbol
 [javac] symbol  : method getLength()
 [javac] location: class org.apache.hadoop.io.BytesWritable
 [javac]   keyWritable.set(bw.getBytes(), 0, bw.getLength());
 [javac]   ^
 [javac] Note: Some input files use or override a deprecated API.
 [javac] Note: Recompile with -Xlint:deprecation for details.
 [javac] Note: Some input files use unchecked or unsafe operations.
 [javac] Note: Recompile with -Xlint:unchecked for details.
 [javac] 2 errors
 {code}





[jira] Updated: (HIVE-1312) hive trunk does not compile with hadoop 0.17 any more

2010-04-15 Thread John Sichi (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-1312?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

John Sichi updated HIVE-1312:
-

   Status: Patch Available  (was: Open)
Fix Version/s: 0.6.0

Here is the fix.


 hive trunk does not compile with hadoop 0.17 any more
 -

 Key: HIVE-1312
 URL: https://issues.apache.org/jira/browse/HIVE-1312
 Project: Hadoop Hive
  Issue Type: Bug
Affects Versions: 0.6.0
Reporter: Zheng Shao
Assignee: John Sichi
 Fix For: 0.6.0

 Attachments: HIVE-1312.1.patch


 This is caused by HIVE-1295.
 {code}
 compile:
  [echo] Compiling: hive
 [javac] Compiling 527 source files to 
 /hadoop_hive_trunk/.ptest_0/build/ql/classes
 [javac] 
 /hadoop_hive_trunk/.ptest_0/ql/src/java/org/apache/hadoop/hive/ql/io/HiveNullValueSequenceFileOu\
 tputFormat.java:69: cannot find symbol
 [javac] symbol  : method getBytes()
 [javac] location: class org.apache.hadoop.io.BytesWritable
 [javac]   keyWritable.set(bw.getBytes(), 0, bw.getLength());
 [javac] ^
 [javac] 
 /hadoop_hive_trunk/.ptest_0/ql/src/java/org/apache/hadoop/hive/ql/io/HiveNullValueSequenceFileOu\
 tputFormat.java:69: cannot find symbol
 [javac] symbol  : method getLength()
 [javac] location: class org.apache.hadoop.io.BytesWritable
 [javac]   keyWritable.set(bw.getBytes(), 0, bw.getLength());
 [javac]   ^
 [javac] Note: Some input files use or override a deprecated API.
 [javac] Note: Recompile with -Xlint:deprecation for details.
 [javac] Note: Some input files use unchecked or unsafe operations.
 [javac] Note: Recompile with -Xlint:unchecked for details.
 [javac] 2 errors
 {code}





[jira] Updated: (HIVE-1312) hive trunk does not compile with hadoop 0.17 any more

2010-04-15 Thread John Sichi (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-1312?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

John Sichi updated HIVE-1312:
-

Attachment: HIVE-1312.1.patch

 hive trunk does not compile with hadoop 0.17 any more
 -

 Key: HIVE-1312
 URL: https://issues.apache.org/jira/browse/HIVE-1312
 Project: Hadoop Hive
  Issue Type: Bug
Affects Versions: 0.6.0
Reporter: Zheng Shao
Assignee: John Sichi
 Fix For: 0.6.0

 Attachments: HIVE-1312.1.patch


 This is caused by HIVE-1295.
 {code}
 compile:
  [echo] Compiling: hive
 [javac] Compiling 527 source files to 
 /hadoop_hive_trunk/.ptest_0/build/ql/classes
 [javac] 
 /hadoop_hive_trunk/.ptest_0/ql/src/java/org/apache/hadoop/hive/ql/io/HiveNullValueSequenceFileOu\
 tputFormat.java:69: cannot find symbol
 [javac] symbol  : method getBytes()
 [javac] location: class org.apache.hadoop.io.BytesWritable
 [javac]   keyWritable.set(bw.getBytes(), 0, bw.getLength());
 [javac] ^
 [javac] 
 /hadoop_hive_trunk/.ptest_0/ql/src/java/org/apache/hadoop/hive/ql/io/HiveNullValueSequenceFileOu\
 tputFormat.java:69: cannot find symbol
 [javac] symbol  : method getLength()
 [javac] location: class org.apache.hadoop.io.BytesWritable
 [javac]   keyWritable.set(bw.getBytes(), 0, bw.getLength());
 [javac]   ^
 [javac] Note: Some input files use or override a deprecated API.
 [javac] Note: Recompile with -Xlint:deprecation for details.
 [javac] Note: Some input files use unchecked or unsafe operations.
 [javac] Note: Recompile with -Xlint:unchecked for details.
 [javac] 2 errors
 {code}





[jira] Commented: (HIVE-1312) hive trunk does not compile with hadoop 0.17 any more

2010-04-15 Thread Ning Zhang (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-1312?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12857528#action_12857528
 ] 

Ning Zhang commented on HIVE-1312:
--

cool. will test and commit.

 hive trunk does not compile with hadoop 0.17 any more
 -

 Key: HIVE-1312
 URL: https://issues.apache.org/jira/browse/HIVE-1312
 Project: Hadoop Hive
  Issue Type: Bug
Affects Versions: 0.6.0
Reporter: Zheng Shao
Assignee: John Sichi
 Fix For: 0.6.0

 Attachments: HIVE-1312.1.patch


 This is caused by HIVE-1295.
 {code}
 compile:
  [echo] Compiling: hive
 [javac] Compiling 527 source files to 
 /hadoop_hive_trunk/.ptest_0/build/ql/classes
 [javac] 
 /hadoop_hive_trunk/.ptest_0/ql/src/java/org/apache/hadoop/hive/ql/io/HiveNullValueSequenceFileOu\
 tputFormat.java:69: cannot find symbol
 [javac] symbol  : method getBytes()
 [javac] location: class org.apache.hadoop.io.BytesWritable
 [javac]   keyWritable.set(bw.getBytes(), 0, bw.getLength());
 [javac] ^
 [javac] 
 /hadoop_hive_trunk/.ptest_0/ql/src/java/org/apache/hadoop/hive/ql/io/HiveNullValueSequenceFileOu\
 tputFormat.java:69: cannot find symbol
 [javac] symbol  : method getLength()
 [javac] location: class org.apache.hadoop.io.BytesWritable
 [javac]   keyWritable.set(bw.getBytes(), 0, bw.getLength());
 [javac]   ^
 [javac] Note: Some input files use or override a deprecated API.
 [javac] Note: Recompile with -Xlint:deprecation for details.
 [javac] Note: Some input files use unchecked or unsafe operations.
 [javac] Note: Recompile with -Xlint:unchecked for details.
 [javac] 2 errors
 {code}





[jira] Commented: (HIVE-1002) multi-partition inserts

2010-04-15 Thread Ning Zhang (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-1002?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12857530#action_12857530
 ] 

Ning Zhang commented on HIVE-1002:
--

Wiki has been updated for this feature's syntax and semantics:

http://wiki.apache.org/hadoop/Hive/Tutorial#Dynamic-partition_Insert



 multi-partition inserts
 ---

 Key: HIVE-1002
 URL: https://issues.apache.org/jira/browse/HIVE-1002
 Project: Hadoop Hive
  Issue Type: New Feature
Affects Versions: 0.6.0
Reporter: Zheng Shao
Assignee: Ning Zhang
 Fix For: 0.6.0

 Attachments: HIVE-1002.1.patch, HIVE-1002.2.patch, HIVE-1002.4.patch, 
 HIVE-1002.patch


 We should allow queries like this into a partitioned table:
 {code}
 CREATE TABLE x (a STRING, b STRING, c STRING)
 PARTITIONED BY (ds STRING, ts STRING);
 INSERT OVERWRITE TABLE x PARTITION (ds = '2009-12-12')
 SELECT a, b, c, ts FROM xxx;
 {code}
 Basically, allowing users to overwrite multiple partitions at a time.
 The partition values specified in PARTITION part (if any) should be a prefix 
 of the partition keys.
 The rest of the partition keys go to the end of the SELECT expression list.
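
As a concrete illustration of that prefix rule (a hypothetical example that reuses the table names from the description and follows the syntax proposed there; the syntax finally documented on the wiki linked above may differ):

{code}
-- Static prefix covers ds only; the remaining partition key ts is taken from
-- the last SELECT expression (the example from the description):
INSERT OVERWRITE TABLE x PARTITION (ds = '2009-12-12')
SELECT a, b, c, ts FROM xxx;

-- No static prefix at all ("if any"): both ds and ts come from the trailing
-- SELECT expressions, in partition-key order.
INSERT OVERWRITE TABLE x
SELECT a, b, c, ds, ts FROM xxx;
{code}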





[jira] Commented: (HIVE-1308) boolean = boolean throws NPE

2010-04-15 Thread Namit Jain (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-1308?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12857534#action_12857534
 ] 

Namit Jain commented on HIVE-1308:
--

will take a look

 boolean = boolean throws NPE
 

 Key: HIVE-1308
 URL: https://issues.apache.org/jira/browse/HIVE-1308
 Project: Hadoop Hive
  Issue Type: Bug
Affects Versions: 0.6.0
Reporter: Paul Yang
Assignee: Paul Yang
 Attachments: HIVE-1308.1.patch, HIVE-1308.2.patch


 Workaround is to just use boolean or NOT boolean
 {code}
 hive> select true=true from src;
 FAILED: Hive Internal Error: java.lang.NullPointerException(null)
 java.lang.NullPointerException
 at 
 org.apache.hadoop.hive.ql.udf.generic.GenericUDFUtils$ConversionHelper.init(GenericUDFUtils.java:212)
 at 
 org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.initialize(GenericUDFBridge.java:138)
 at 
 org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc.newInstance(ExprNodeGenericFuncDesc.java:153)
 at 
 org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getXpathOrFuncExprNodeDesc(TypeCheckProcFactory.java:587)
 at 
 org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.process(TypeCheckProcFactory.java:708)
 at 
 org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:89)
 at 
 org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:88)
 at 
 org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:128)
 at 
 org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:102)
 at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:6136)
 at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1831)
 at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1663)
 at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:4911)
 at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:5421)
 at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:5952)
 at 
 org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:126)
 at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:304)
 at org.apache.hadoop.hive.ql.Driver.run(Driver.java:377)
 at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:138)
 at 
 org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:197)
 at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:303)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at 
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
 at 
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
 at java.lang.reflect.Method.invoke(Method.java:597)
 at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
 {code}
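
Reading the workaround sentence above concretely (a hedged illustration, not text from the issue itself): rewrite the equality on booleans as the boolean expression itself, or its negation.

{code}
-- Fails with the NullPointerException shown above:
-- select true = true from src;

-- Workaround: use the boolean expression directly, or negate it with NOT,
-- instead of comparing two boolean values with '=':
select true from src;
select NOT false from src;
{code}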





[jira] Updated: (HIVE-1312) hive trunk does not compile with hadoop 0.17 any more

2010-04-15 Thread John Sichi (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-1312?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

John Sichi updated HIVE-1312:
-

Attachment: HIVE-1312.2.patch

Here's an updated patch which also backs out the jsp-2.1 change to the 
classpath in ql/build.xml, since that is breaking 0.17 also.


 hive trunk does not compile with hadoop 0.17 any more
 -

 Key: HIVE-1312
 URL: https://issues.apache.org/jira/browse/HIVE-1312
 Project: Hadoop Hive
  Issue Type: Bug
Affects Versions: 0.6.0
Reporter: Zheng Shao
Assignee: John Sichi
 Fix For: 0.6.0

 Attachments: HIVE-1312.1.patch, HIVE-1312.2.patch


 This is caused by HIVE-1295.
 {code}
 compile:
  [echo] Compiling: hive
 [javac] Compiling 527 source files to 
 /hadoop_hive_trunk/.ptest_0/build/ql/classes
 [javac] 
 /hadoop_hive_trunk/.ptest_0/ql/src/java/org/apache/hadoop/hive/ql/io/HiveNullValueSequenceFileOu\
 tputFormat.java:69: cannot find symbol
 [javac] symbol  : method getBytes()
 [javac] location: class org.apache.hadoop.io.BytesWritable
 [javac]   keyWritable.set(bw.getBytes(), 0, bw.getLength());
 [javac] ^
 [javac] 
 /hadoop_hive_trunk/.ptest_0/ql/src/java/org/apache/hadoop/hive/ql/io/HiveNullValueSequenceFileOu\
 tputFormat.java:69: cannot find symbol
 [javac] symbol  : method getLength()
 [javac] location: class org.apache.hadoop.io.BytesWritable
 [javac]   keyWritable.set(bw.getBytes(), 0, bw.getLength());
 [javac]   ^
 [javac] Note: Some input files use or override a deprecated API.
 [javac] Note: Recompile with -Xlint:deprecation for details.
 [javac] Note: Some input files use unchecked or unsafe operations.
 [javac] Note: Recompile with -Xlint:unchecked for details.
 [javac] 2 errors
 {code}





[jira] Updated: (HIVE-1308) boolean = boolean throws NPE

2010-04-15 Thread Namit Jain (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-1308?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Namit Jain updated HIVE-1308:
-

  Status: Resolved  (was: Patch Available)
Hadoop Flags: [Reviewed]
  Resolution: Fixed

Committed. Thanks Paul

 boolean = boolean throws NPE
 

 Key: HIVE-1308
 URL: https://issues.apache.org/jira/browse/HIVE-1308
 Project: Hadoop Hive
  Issue Type: Bug
Affects Versions: 0.6.0
Reporter: Paul Yang
Assignee: Paul Yang
 Attachments: HIVE-1308.1.patch, HIVE-1308.2.patch


 Workaround is to just use boolean or NOT boolean
 {code}
 hive> select true=true from src;
 FAILED: Hive Internal Error: java.lang.NullPointerException(null)
 java.lang.NullPointerException
 at 
 org.apache.hadoop.hive.ql.udf.generic.GenericUDFUtils$ConversionHelper.init(GenericUDFUtils.java:212)
 at 
 org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.initialize(GenericUDFBridge.java:138)
 at 
 org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc.newInstance(ExprNodeGenericFuncDesc.java:153)
 at 
 org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getXpathOrFuncExprNodeDesc(TypeCheckProcFactory.java:587)
 at 
 org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.process(TypeCheckProcFactory.java:708)
 at 
 org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:89)
 at 
 org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:88)
 at 
 org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:128)
 at 
 org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:102)
 at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:6136)
 at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1831)
 at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1663)
 at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:4911)
 at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:5421)
 at 
 org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:5952)
 at 
 org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:126)
 at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:304)
 at org.apache.hadoop.hive.ql.Driver.run(Driver.java:377)
 at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:138)
 at 
 org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:197)
 at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:303)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at 
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
 at 
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
 at java.lang.reflect.Method.invoke(Method.java:597)
 at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
 {code}





[jira] Updated: (HIVE-1312) hive trunk does not compile with hadoop 0.17 any more

2010-04-15 Thread Ning Zhang (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-1312?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ning Zhang updated HIVE-1312:
-

Status: Resolved  (was: Patch Available)
Resolution: Fixed

Committed. Thanks John!

 hive trunk does not compile with hadoop 0.17 any more
 -

 Key: HIVE-1312
 URL: https://issues.apache.org/jira/browse/HIVE-1312
 Project: Hadoop Hive
  Issue Type: Bug
Affects Versions: 0.6.0
Reporter: Zheng Shao
Assignee: John Sichi
 Fix For: 0.6.0

 Attachments: HIVE-1312.1.patch, HIVE-1312.2.patch


 This is caused by HIVE-1295.
 {code}
 compile:
  [echo] Compiling: hive
 [javac] Compiling 527 source files to 
 /hadoop_hive_trunk/.ptest_0/build/ql/classes
 [javac] 
 /hadoop_hive_trunk/.ptest_0/ql/src/java/org/apache/hadoop/hive/ql/io/HiveNullValueSequenceFileOu\
 tputFormat.java:69: cannot find symbol
 [javac] symbol  : method getBytes()
 [javac] location: class org.apache.hadoop.io.BytesWritable
 [javac]   keyWritable.set(bw.getBytes(), 0, bw.getLength());
 [javac] ^
 [javac] 
 /hadoop_hive_trunk/.ptest_0/ql/src/java/org/apache/hadoop/hive/ql/io/HiveNullValueSequenceFileOu\
 tputFormat.java:69: cannot find symbol
 [javac] symbol  : method getLength()
 [javac] location: class org.apache.hadoop.io.BytesWritable
 [javac]   keyWritable.set(bw.getBytes(), 0, bw.getLength());
 [javac]   ^
 [javac] Note: Some input files use or override a deprecated API.
 [javac] Note: Recompile with -Xlint:deprecation for details.
 [javac] Note: Some input files use unchecked or unsafe operations.
 [javac] Note: Recompile with -Xlint:unchecked for details.
 [javac] 2 errors
 {code}





[jira] Updated: (HIVE-1297) error message in Hive.checkPaths dumps Java array address instead of path string

2010-04-15 Thread Ning Zhang (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-1297?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ning Zhang updated HIVE-1297:
-

Status: Resolved  (was: Patch Available)
Resolution: Fixed

This has already been fixed in another patch. 

 error message in Hive.checkPaths dumps Java array address instead of path 
 string
 

 Key: HIVE-1297
 URL: https://issues.apache.org/jira/browse/HIVE-1297
 Project: Hadoop Hive
  Issue Type: Improvement
  Components: Query Processor
Affects Versions: 0.5.0
Reporter: John Sichi
Assignee: John Sichi
 Fix For: 0.6.0

 Attachments: HIVE-1297.1.patch


 {code}
 if (item.isDir()) {
 -  throw new HiveException("checkPaths: " + src.toString()
 -      + " has nested directory" + item.toString());
 +  throw new HiveException("checkPaths: " + src.getPath()
 +      + " has nested directory " + item.getPath());
 }
 {code}
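
For readers unfamiliar with the underlying symptom, here is a tiny, self-contained illustration (plain Java, unrelated to Hive's own classes) of why concatenating an array into a message prints an identity hash rather than a path:

{code}
import java.util.Arrays;

public class ArrayToStringDemo {
  public static void main(String[] args) {
    String[] paths = { "/user/hive/warehouse/t/part-00000" };
    // Implicit toString() on an array prints its type and identity hash,
    // e.g. "[Ljava.lang.String;@1b6d3586", which is useless in an error message.
    System.out.println("implicit toString: " + paths);
    // Printing the element (or Arrays.toString) shows the actual path.
    System.out.println("element:           " + paths[0]);
    System.out.println("Arrays.toString:   " + Arrays.toString(paths));
  }
}
{code}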
