Thank you everyone for your help! Owen, we're on an old version of Hive
(1.1.0-cdh5.9.2).

On Thu, Jul 18, 2019 at 9:38 AM Owen O'Malley <[email protected]>
wrote:

> ORC files expect UTF-8, which is a superset of ASCII, in string, char,
> and varchar columns. The only place I know of that will cause trouble if
> you put non-UTF-8 data in strings is the statistics: the API for getting
> the min/max will convert to Java strings.
>
> But back to your original point: schema evolution should easily handle
> the varchar-to-string case. Which version of Hive are you using?
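>
> For reference, on a Hive release whose ORC reader supports that schema
> evolution, the direct metadata change is expected to just work with no
> data rewrite. A minimal sketch, reusing the placeholder table and column
> names from the original question:
>
>   -- direct type change; with ORC schema evolution for varchar -> string
>   -- this should not raise the ClassCastException on read
>   ALTER TABLE table CHANGE col col STRING;
>   SELECT col FROM table LIMIT 10;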
>
> .. Owen
>
> On Thu, Jul 18, 2019 at 8:27 AM Devopam Mittra <[email protected]> wrote:
>
>> Perhaps the table has data in it that is beyond ASCII.
>> An easier way is to add an additional column, populate it with the data,
>> and then drop the older one after validating the records in the
>> STRING-type column.
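>>
>> A rough HiveQL sketch of that idea, assuming a simple non-partitioned ORC
>> table; the table and column names are placeholders. Since Hive has no
>> direct DROP COLUMN, one safe variant is to rewrite the data into a new
>> table with the STRING column, validate it, and then swap the tables:
>>
>>   -- rewrite the varchar column as STRING into a new table
>>   CREATE TABLE my_table_new STORED AS ORC AS
>>   SELECT CAST(col AS STRING) AS col   -- plus any other columns
>>   FROM my_table;
>>
>>   -- after validating my_table_new, swap it in
>>   ALTER TABLE my_table RENAME TO my_table_backup;
>>   ALTER TABLE my_table_new RENAME TO my_table;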
>>
>> Regards
>> Dev
>>
>> On Thu, Jul 18, 2019, 4:44 AM William Shen <[email protected]>
>> wrote:
>>
>>> Hi all,
>>>
>>> I assumed that converting a column's type from varchar to string should
>>> be compatible; however, after running ALTER TABLE table CHANGE col col
>>> STRING, I encounter the following error when querying the column from
>>> Hive:
>>>
>>> Failed with exception
>>> java.io.IOException:org.apache.hadoop.hive.ql.metadata.HiveException:
>>> java.lang.ClassCastException:
>>> org.apache.hadoop.hive.serde2.io.HiveVarcharWritable cannot be cast to
>>> org.apache.hadoop.io.Text
>>>
>>> Anyone encountered this before, or know how to work around this?
>>>
>>> Thank you!
>>>
>>> - Will
>>>
>>
