[
https://issues.apache.org/jira/browse/HIVE-7472?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sushanth Sowmyan updated HIVE-7472:
-----------------------------------
Description:
Cloning HIVE-5550 because HIVE-5550 fixed org.apache.hcatalog.* but not
org.apache.hive.hcatalog.*, and that package needs the same change. With the
0.14 pruning of org.apache.hcatalog.*, we would otherwise miss this patch
altogether.
====
When a table is created using the HCatalog API without specifying the file
format, it defaults to:
{code}
fileFormat=TextFile, inputformat=org.apache.hadoop.mapred.TextInputFormat,
outputformat=org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat
{code}
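For reference, creating such a table might look like the sketch below. The
HCatClient/HCatCreateTableDesc usage is recalled from the 0.13-era
org.apache.hive.hcatalog.api package, so treat the exact builder calls, table
name and schema as assumptions rather than a verified reproduction:
{code}
import java.util.Arrays;
import org.apache.hadoop.conf.Configuration;
import org.apache.hive.hcatalog.api.HCatClient;
import org.apache.hive.hcatalog.api.HCatCreateTableDesc;
import org.apache.hive.hcatalog.data.schema.HCatFieldSchema;

public class CreateDefaultFormatTable {
  public static void main(String[] args) throws Exception {
    HCatClient client = HCatClient.create(new Configuration());
    HCatCreateTableDesc desc = HCatCreateTableDesc
        .create("default", "hcat_default_fmt", Arrays.asList(
            new HCatFieldSchema("id", HCatFieldSchema.Type.INT, null),
            new HCatFieldSchema("name", HCatFieldSchema.Type.STRING, null)))
        .build(); // no fileFormat(...) call, so the TextFile defaults above apply
    client.createTable(desc); // stored with IgnoreKeyTextOutputFormat
    client.close();
  }
}
{code}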
But when Hive fetches the table from the metastore, it substitutes
org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat for the recorded
output format, and the comparison between the source and target tables fails.
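The substitution is easy to observe by reading the table back through Hive's
ql metadata layer; a minimal sketch (assuming the table above exists and that
the Hive.get/getTable calls behave as in the 0.13-era API) is:
{code}
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.ql.metadata.Hive;
import org.apache.hadoop.hive.ql.metadata.Table;

public class ShowOutputFormat {
  public static void main(String[] args) throws Exception {
    Hive db = Hive.get(new HiveConf());
    Table t = db.getTable("default", "hcat_default_fmt");
    // Prints org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat even
    // though the table was created with IgnoreKeyTextOutputFormat.
    System.out.println(t.getOutputFormatClass().getName());
  }
}
{code}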
The check in org.apache.hadoop.hive.ql.parse.ImportSemanticAnalyzer#checkTable
compares the class names as strings and therefore fails:
{code}
// check IF/OF/Serde
String existingifc = table.getInputFormatClass().getName();
String importedifc = tableDesc.getInputFormat();
String existingofc = table.getOutputFormatClass().getName();
String importedofc = tableDesc.getOutputFormat();
if ((!existingifc.equals(importedifc))
    || (!existingofc.equals(importedofc))) {
  throw new SemanticException(
      ErrorMsg.INCOMPATIBLE_SCHEMA
          .getMsg(" Table inputformat/outputformats do not match"));
}
{code}
This only affects tables with the text and sequence file formats, not RC or
ORC.
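One way to make the check tolerant of this substitution is to normalize both
class names before comparing them. The helper below is a hypothetical sketch
of that idea (normalizeOutputFormat is an invented name, and this is not
necessarily how HIVE-5550 resolved it); Hive's own substitution logic lives in
org.apache.hadoop.hive.ql.io.HiveFileFormatUtils:
{code}
// Hypothetical normalization applied before the string comparison in checkTable().
// Maps the legacy output format name that HCatalog records to the class Hive
// substitutes when it loads the table from the metastore.
private static String normalizeOutputFormat(String outputFormatClassName) {
  if ("org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat".equals(outputFormatClassName)) {
    return "org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat";
  }
  return outputFormatClassName;
}

// Usage inside checkTable():
// String existingofc = normalizeOutputFormat(table.getOutputFormatClass().getName());
// String importedofc = normalizeOutputFormat(tableDesc.getOutputFormat());
{code}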
> CLONE - Import fails for tables created with default text, sequence and orc
> file formats using HCatalog API
> -----------------------------------------------------------------------------------------------------------
>
> Key: HIVE-7472
> URL: https://issues.apache.org/jira/browse/HIVE-7472
> Project: Hive
> Issue Type: Bug
> Components: HCatalog
> Affects Versions: 0.14.0, 0.13.1
> Reporter: Sushanth Sowmyan
> Assignee: Sushanth Sowmyan
>