I'm running into this error while doing a dynamic partition insert. Here's
how I created the table:
CREATE TABLE `part_table`(
  `c1` bigint,
  `c2` bigint,
  `c3` bigint)
PARTITIONED BY (`p1` string, `p2` string)
STORED AS PARQUET;
Here is the insert statement:
SET
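The SET line above is cut off in the archive. For context, a fully dynamic partition insert into a table like this usually requires session settings along these lines (a sketch with an assumed source table name, not the poster's actual statement):

```sql
-- Typical settings needed before a fully dynamic partition insert
-- (assumed here, since the original SET line is truncated):
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

-- Partition columns must come last in the SELECT, in PARTITIONED BY order.
-- `src_table` is a hypothetical source table.
INSERT OVERWRITE TABLE part_table PARTITION (p1, p2)
SELECT c1, c2, c3, p1, p2
FROM src_table;
```

A common cause of errors at this point is leaving `hive.exec.dynamic.partition.mode` at its default `strict`, which refuses an insert where every partition column is dynamic.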
Hive's optimizer can prune partitions when the filter references the
partition columns directly, but if you bury the partition columns in a
derived field like in Query 2 it is unable to spot that and so it does a full
table scan. I think (but don't know for sure) that this will be fairly typical
of all SQL engines. Your best bet is to use direct conditions like in Query 1.
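To make the contrast concrete, here is a sketch (the derived-field expression is hypothetical, since Query 2 is truncated in the thread) of a predicate the partition pruner can use versus one it cannot:

```sql
-- Query 1 style: direct conditions on the partition columns.
-- Hive can prune the scan down to the year=2015, month=5 partitions.
SELECT * FROM Sales
WHERE year = 2015 AND month = 5 AND day BETWEEN 1 AND 7;

-- Query 2 style (hypothetical): the partition columns are buried inside
-- a derived expression, so the pruner cannot match them against the
-- partition list and every partition is scanned.
SELECT * FROM Sales
WHERE concat(year, '-', month) = '2015-5';

-- EXPLAIN DEPENDENCY lists the partitions a query will actually read,
-- which is an easy way to check whether pruning happened.
EXPLAIN DEPENDENCY
SELECT * FROM Sales
WHERE concat(year, '-', month) = '2015-5';
```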
Hi,
I have a question on the Hive Optimizer. I have a table with partition
columns, e.g., Sales partitioned by year, month, day. Assume that I have two
years' worth of data in this table. I'm running two queries on this table.
Query 1: Select * from Sales where year=2015 and month = 5 and day between
1 and 7
Query 2: Select * from Sales where
On Thu, May 14, 2015 at 1:48 PM, Appan Thirumaligai
appanhiv...@gmail.com wrote:
Hi,
I have a question on the Hive Optimizer. I have a table with partition
columns, e.g., Sales partitioned by year, month, day. Assume that I have two
years' worth of data in this table. I'm running two queries on this table.
Query 1: Select * from Sales where year=2015 and month = 5 and day
between 1 and 7
Query 2: Select * from Sales where
I have a table partitioned on column ss_sold_date_sk, which has a
null-value partition as well.
When I run analyze ... compute statistics it fails with the attached exception.
Is there a way to avoid or bypass this exception, and what would be the
impact on query performance if the stats are not collected?
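For reference, Hive stores rows whose partition column is NULL in the special partition named by `hive.exec.default.partition.name` (default `__HIVE_DEFAULT_PARTITION__`). One common workaround is to compute stats one explicit partition at a time so the null partition can be skipped. This is a sketch: the table name (from TPC-DS, where ss_sold_date_sk comes from) and the partition value are assumptions, not from the thread:

```sql
-- List partitions; NULL values appear as __HIVE_DEFAULT_PARTITION__.
SHOW PARTITIONS store_sales;

-- Analyze a single explicit partition (example value; substitute your
-- own table and partition values), skipping the null partition that
-- triggers the exception:
ANALYZE TABLE store_sales PARTITION (ss_sold_date_sk = 2452642)
COMPUTE STATISTICS;
```

As for the impact: without column and table stats, the cost-based optimizer falls back to default estimates, so decisions like join ordering and map-join conversion can be noticeably worse for the unanalyzed partitions.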