Thanks, Mich, for replying.

I found the cause.

Version: Hive v0.14

Cause: Vectorization was enabled
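
For anyone who hits this later: given the cause above, the usual workaround is to disable vectorized execution for the session before running the failing insert. A minimal sketch — `hive.vectorized.execution.enabled` is the standard Hive vectorization switch, but whether this exact toggle was the fix applied here is an assumption:

```sql
-- Workaround sketch (assumption): turn off vectorized execution
-- for this session before the INSERT that was crashing.
set hive.vectorized.execution.enabled=false;

insert into table dummy
select cast("2016-06-08 00:00:00" as String) from t2 limit 5;

-- Optionally re-enable vectorization for subsequent queries.
set hive.vectorized.execution.enabled=true;
```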

Regards,
Dhaval Modi
dhavalmod...@gmail.com

On 18 April 2016 at 22:13, Mich Talebzadeh <mich.talebza...@gmail.com>
wrote:

> this should work and should not crash
>
>
> hive> insert into dummy select cast("2016-06-08 00:00:00" as String) from
> t2 limit 5;
> Query ID = hduser_20160418175620_4cebc5e9-10a0-422c-92ff-43f059b4c3a6
> Total jobs = 1
> Launching Job 1 out of 1
> In order to change the average load for a reducer (in bytes):
>   set hive.exec.reducers.bytes.per.reducer=<number>
> In order to limit the maximum number of reducers:
>   set hive.exec.reducers.max=<number>
> In order to set a constant number of reducers:
>   set mapreduce.job.reduces=<number>
> Starting Spark Job = 5271f92e-8ff6-4b5f-bb2c-980109d26342
>
> Query Hive on Spark job[2] stages:
> 3
> 4
>
> Status: Running (Hive on Spark job[2])
> Job Progress Format
> CurrentTime StageId_StageAttemptId:
> SucceededTasksCount(+RunningTasksCount-FailedTasksCount)/TotalTasksCount
> [StageCost]
> 2016-04-18 17:58:27,183 Stage-3_0: 1/1 Finished Stage-4_0: 0(+1)/1
> 2016-04-18 17:58:28,190 Stage-3_0: 1/1 Finished Stage-4_0: 1/1 Finished
> Status: Finished successfully in 2.26 seconds
> Loading data to table default.dummy
> OK
> Time taken: 2.586 seconds
>
>
> Dr Mich Talebzadeh
>
>
>
> LinkedIn:
> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
>
> http://talebzadehmich.wordpress.com
>
>
>
> On 18 April 2016 at 16:55, Dhaval Modi <dhavalmod...@gmail.com> wrote:
>
>> Hi All,
>>
>> I am trying to cast a timestamp to a String and insert it into a table
>> that has a String column.
>> But it is failing with
>> =====================================================================
>> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Error
>> evaluating '2016-06-08 00:00:00'
>>         at
>> org.apache.hadoop.hive.ql.exec.vector.VectorSelectOperator.processOp(VectorSelectOperator.java:126)
>> =====================================================================
>>
>> Steps to reproduce:
>> Create table:
>> create table dummy (col1 String) stored as avro;
>>
>> Create table t2:
>> create table t2 (col1 timestamp) stored as orc;
>>
>>
>> Insert Query:
>> insert into table dummy select cast("2016-06-08 00:00:00" as String)
>> from t2 limit 5;
>>
>>
>> Let me know in case I am missing something.
>>
>>
>> Regards,
>> Dhaval Modi
>> dhavalmod...@gmail.com
>>
>
>
