GitHub user nongli opened a pull request:

    https://github.com/apache/spark/pull/10961

    [SPARK-13043][SQL] Implement remaining catalyst types in ColumnarBatch.

    This includes float, boolean, short, decimal, and calendar interval.
    
    Decimal is mapped to a long or a byte array depending on its size, and a
    calendar interval is mapped to a struct of an int and a long.
    
    The only remaining type is map. The schema mapping is straightforward, but
    we might want to revisit how we handle it in the rest of the execution
    engine.
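    The two mappings described above can be sketched roughly as follows. This is
    a hedged illustration, not Spark's actual ColumnVector API: the threshold
    name MAX_LONG_DIGITS, the encode helper, and the IntervalStruct class are
    all hypothetical, assuming that a decimal narrow enough to fit its unscaled
    value in a long uses the compact long encoding, that wider decimals fall
    back to the unscaled BigInteger's byte array, and that the "struct of int
    and long" for a calendar interval holds months and microseconds.

    ```java
    import java.math.BigDecimal;
    import java.math.BigInteger;

    // Hypothetical sketch of the column encodings described in the PR;
    // names and thresholds are illustrative, not Spark's real API.
    public class DecimalColumnSketch {
        // Assumed cutoff: decimals up to this precision fit in a signed long.
        static final int MAX_LONG_DIGITS = 18;

        // True if a decimal of the given precision can be stored as a long.
        static boolean fitsInLong(int precision) {
            return precision <= MAX_LONG_DIGITS;
        }

        // Encode a decimal either as its unscaled long value (compact path)
        // or as the unscaled value's big-endian byte array (wide path).
        static Object encode(BigDecimal d, int precision) {
            BigInteger unscaled = d.unscaledValue();
            if (fitsInLong(precision)) {
                return unscaled.longValueExact();
            }
            return unscaled.toByteArray();
        }

        // Calendar interval modeled as a struct of an int and a long,
        // assumed here to be (months, microseconds).
        static final class IntervalStruct {
            final int months;
            final long microseconds;
            IntervalStruct(int months, long microseconds) {
                this.months = months;
                this.microseconds = microseconds;
            }
        }

        public static void main(String[] args) {
            // Narrow decimal: stored as the unscaled long 12345.
            System.out.println(encode(new BigDecimal("123.45"), 5));   // 12345

            // Wide decimal (22 digits): stored as a byte array that
            // round-trips through BigInteger.
            byte[] wide = (byte[]) encode(new BigDecimal("12345678901234567890.12"), 22);
            System.out.println(new BigInteger(wide));  // 1234567890123456789012

            IntervalStruct iv = new IntervalStruct(14, 3_600_000_000L);
            System.out.println(iv.months + " months, " + iv.microseconds + " us");
        }
    }
    ```

    The design choice the sketch illustrates is the usual one for columnar
    decimal storage: a fixed-width long column is far cheaper to scan than a
    variable-length byte-array column, so the compact path is taken whenever
    the declared precision allows it.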

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/nongli/spark spark-13043

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/10961.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #10961
    
----
commit 24ca13c7f8b2ac5fbc4a9600539bb02d22b56a91
Author: Nong Li <n...@databricks.com>
Date:   2016-01-27T06:22:48Z

    [SPARK-13043][SQL] Implement remaining catalyst types in ColumnarBatch.
    
    This includes: float, boolean, short, decimal and calendar interval.
    
    Decimal is mapped to long or byte array depending on the size and calendar
    interval is mapped to a struct of int and long.
    
    The only remaining type is map. The schema mapping is straightforward but
    we might want to revisit how we deal with this in the rest of the execution
    engine.

----


