Hi everyone,
I get an error when calling the Thrift API to fetch column info; the Hive
version is 3.1.1.
I created the table using the following SQL:
CREATE EXTERNAL TABLE `my_table`(a string, b bigint) ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe';
When I call the thrift API
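As a sanity check outside the Thrift path, the same column metadata can be inspected directly in HiveQL (a minimal sketch, assuming the `my_table` definition above):

```sql
-- Verify that the Metastore returns column info for the JsonSerDe table;
-- if this works but the Thrift GetColumns call fails, the problem is
-- likely in the client/API layer rather than the table definition.
DESCRIBE FORMATTED my_table;
```

If DESCRIBE succeeds here, it narrows the error down to the Thrift client side.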
Hi Furcy,
Thanks.
Apologies for being late on this. You are absolutely correct. I tried and
BQ can read compressed ORC files.
Still referring to my original thread, BQ's handling of Doubles and Dates is
problematic. I tend to create these types of fields as String and do the ETL
in BQ by
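For what it's worth, one common way to do that String-to-typed conversion in BigQuery Standard SQL is with SAFE_CAST / SAFE.PARSE_DATE, which return NULL instead of failing on malformed rows (a sketch only; the dataset, table, and column names here are hypothetical):

```sql
-- Convert String staging columns to typed columns in BQ,
-- tolerating bad values by mapping them to NULL.
SELECT
  SAFE_CAST(amount_str AS FLOAT64)              AS amount,      -- was a Double in Hive
  SAFE.PARSE_DATE('%Y-%m-%d', event_date_str)   AS event_date   -- was a Date in Hive
FROM my_dataset.staging_table;
```

The SAFE_ variants are useful precisely because ORC-to-BQ loads of Double/Date fields can surface values the stricter CAST would reject.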
Can you please create a Jira ticket, and put code sample, not screenshot
there.
Thanks,
Dmitry
On Tue, Jan 29, 2019 at 9:08 AM 陈贵龙 wrote:
> Hi,
> How are you? I have a question about how to use HPL/SQL on a Hive partitioned
> table.
> When I use HPL/SQL on a Hive partitioned table, why do I get the key word
Hi,
After tweaking the configs, I found that the
"hive.vectorized.execution.enabled" and "hive.auto.convert.join" configs
are the culprits.
I think vectorization on the map column data type is not supported in my
current Hive version. The map join also has problems with the map data
type.
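The workaround described above can be applied per session before running the affected query (these are the two standard Hive properties named in the thread; disabling them trades performance for correctness on map-typed columns):

```sql
-- Disable vectorized execution and automatic map-join conversion
-- for this session, as a workaround for map-type column issues.
SET hive.vectorized.execution.enabled=false;
SET hive.auto.convert.join=false;
```

Setting them at session scope keeps the rest of the warehouse running with the defaults.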
So,