Repository: spark
Updated Branches:
  refs/heads/master 14c54f187 -> 60ab80f50


[SPARK-4272] [SQL] Add more unwrapper functions for primitive type in TableReader

Currently, the data "unwrap" path only supports a handful of primitive types, not 
all of them. The missing types do not cause exceptions, but they fall back to the 
generic unwrap path, which costs some table-scan performance for types such as 
binary, date, timestamp, and decimal.
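
Below is a minimal, self-contained sketch of the dispatch pattern this patch 
extends. The real code matches on Hive ObjectInspector subclasses inside 
HadoopTableReader; the tiny Inspector hierarchy and genericUnwrap below are 
hypothetical stand-ins, not Spark or Hive APIs. The point is that the match runs 
once per column and yields a closure that is applied to every row, so any type 
without its own case pays for the generic fallback on every single value of the 
scan.

object UnwrapperSketch {
  sealed trait Inspector
  case object IntInspector extends Inspector
  case object BinaryInspector extends Inspector

  // Stand-in for the generic unwrap fallback (the slower, per-value path).
  def genericUnwrap(value: Any, oi: Inspector): Any = value

  // Chosen once per column, before the row loop starts.
  def unwrapperFor(oi: Inspector): (Any, Array[Any], Int) => Unit = oi match {
    case IntInspector =>
      (value, row, ordinal) => row(ordinal) = value.asInstanceOf[Int]      // specialized setter
    case other =>
      (value, row, ordinal) => row(ordinal) = genericUnwrap(value, other)  // generic fallback
  }

  def main(args: Array[String]): Unit = {
    val row = new Array[Any](2)
    unwrapperFor(IntInspector)(42, row, 0)                    // fast path
    unwrapperFor(BinaryInspector)(Array[Byte](1, 2), row, 1)  // fallback path
    println(row.toSeq)                                        // prints the populated row
  }
}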

Author: Cheng Hao <hao.ch...@intel.com>

Closes #3136 from chenghao-intel/table_reader and squashes the following commits:

fffb729 [Cheng Hao] fix bug for retrieving the timestamp object
e9c97a4 [Cheng Hao] Add more unwrapper functions for primitive type in TableReader


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/60ab80f5
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/60ab80f5
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/60ab80f5

Branch: refs/heads/master
Commit: 60ab80f501b8384ddf48a9ac0ba0c2b9eb548b28
Parents: 14c54f1
Author: Cheng Hao <hao.ch...@intel.com>
Authored: Fri Nov 7 12:15:53 2014 -0800
Committer: Michael Armbrust <mich...@databricks.com>
Committed: Fri Nov 7 12:15:53 2014 -0800

----------------------------------------------------------------------
 .../org/apache/spark/sql/hive/HiveInspectors.scala   |  4 ----
 .../org/apache/spark/sql/hive/TableReader.scala      | 15 +++++++++++++++
 2 files changed, 15 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/60ab80f5/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveInspectors.scala
----------------------------------------------------------------------
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveInspectors.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveInspectors.scala
index 58815da..bdc7e1d 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveInspectors.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveInspectors.scala
@@ -119,10 +119,6 @@ private[hive] trait HiveInspectors {
    * Wraps with Hive types based on object inspector.
    * TODO: Consolidate all hive OI/data interface code.
    */
-  /**
-   * Wraps with Hive types based on object inspector.
-   * TODO: Consolidate all hive OI/data interface code.
-   */
   protected def wrapperFor(oi: ObjectInspector): Any => Any = oi match {
     case _: JavaHiveVarcharObjectInspector =>
      (o: Any) => new HiveVarchar(o.asInstanceOf[String], o.asInstanceOf[String].size)

http://git-wip-us.apache.org/repos/asf/spark/blob/60ab80f5/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala
----------------------------------------------------------------------
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala
index e49f095..f60bc37 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala
@@ -290,6 +290,21 @@ private[hive] object HadoopTableReader extends HiveInspectors {
          (value: Any, row: MutableRow, ordinal: Int) => row.setFloat(ordinal, oi.get(value))
         case oi: DoubleObjectInspector =>
          (value: Any, row: MutableRow, ordinal: Int) => row.setDouble(ordinal, oi.get(value))
+        case oi: HiveVarcharObjectInspector =>
+          (value: Any, row: MutableRow, ordinal: Int) =>
+            row.setString(ordinal, oi.getPrimitiveJavaObject(value).getValue)
+        case oi: HiveDecimalObjectInspector =>
+          (value: Any, row: MutableRow, ordinal: Int) =>
+            row.update(ordinal, HiveShim.toCatalystDecimal(oi, value))
+        case oi: TimestampObjectInspector =>
+          (value: Any, row: MutableRow, ordinal: Int) =>
+            row.update(ordinal, oi.getPrimitiveJavaObject(value).clone())
+        case oi: DateObjectInspector =>
+          (value: Any, row: MutableRow, ordinal: Int) =>
+            row.update(ordinal, oi.getPrimitiveJavaObject(value))
+        case oi: BinaryObjectInspector =>
+          (value: Any, row: MutableRow, ordinal: Int) =>
+            row.update(ordinal, oi.getPrimitiveJavaObject(value))
         case oi =>
          (value: Any, row: MutableRow, ordinal: Int) => row(ordinal) = unwrap(value, oi)
       }
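
A note on the fffb729 fix: the Timestamp case above copies the value with clone() 
before storing it, apparently because a Hive object inspector can hand back a 
single mutable object that is reused for every row, so keeping the bare reference 
would let later rows overwrite earlier ones. The short sketch below is 
illustrative only; nextValue is a hypothetical stand-in for that reuse.

import java.sql.Timestamp

object TimestampAliasingSketch {
  // Stands in for the single buffer an inspector might reuse across rows.
  private val reused = new Timestamp(0L)

  def nextValue(millis: Long): Timestamp = { reused.setTime(millis); reused }

  def main(args: Array[String]): Unit = {
    val aliased = Seq(1000L, 2000L).map(nextValue)                  // both elements reference `reused`
    val copied  = Seq(1000L, 2000L).map(m => nextValue(m).clone())  // independent copies, as in the patch
    println(aliased.map(_.getTime))                          // List(2000, 2000): earlier value overwritten
    println(copied.map(_.asInstanceOf[Timestamp].getTime))   // List(1000, 2000): values preserved
  }
}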

