[ 
https://issues.apache.org/jira/browse/KYLIN-3695?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16708395#comment-16708395
 ] 

XiaoXiang Yu edited comment on KYLIN-3695 at 12/4/18 8:46 AM:
--------------------------------------------------------------

[~gauravbrills] I debugged this in my test environment.

I found that when a cube is built with a decimal data type whose precision is left empty, the 
front end (JavaScript) substitutes 'undefined' for the null value and sends it to the 
back end for cube metadata validation. Clearly, decimal(19,undefined) does 
not match the valid type regex pattern, so the back end returns the error message to the front end, 
as in my message above.

Source code: org.apache.kylin.metadata.datatype.DataType 
([https://github.com/apache/kylin/blob/master/core-metadata/src/main/java/org/apache/kylin/metadata/datatype/DataType.java])

I think there is no need to fix this, because the root cause is clear.

The measure definition sent by the front end was:

{code:java}
{
    "name":"SUM_INTEREST_SCORE2",
    "function":{
        "expression":"SUM",
        "returntype":"decimal(19,undefined)",
        "parameter":{
            "type":"column",
            "value":"USERACTION.INTEREST_SCORE2"
        }
    }
}
{code}
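
For illustration, here is a minimal, self-contained Java sketch of this kind of type-string validation. The pattern below is only an approximation covering decimal types; it is not the exact regex used in DataType.java:

{code:java}
import java.util.regex.Pattern;

public class ReturnTypeCheck {
    // Illustrative pattern only, not the exact regex from DataType.java:
    // it accepts "decimal", "decimal(p)" or "decimal(p,s)" with numeric arguments.
    private static final Pattern DECIMAL_TYPE =
            Pattern.compile("decimal(\\(\\d+(,\\d+)?\\))?");

    public static void main(String[] args) {
        System.out.println(DECIMAL_TYPE.matcher("decimal(19,4)").matches());         // true
        System.out.println(DECIMAL_TYPE.matcher("decimal(19,undefined)").matches()); // false -> rejected
    }
}
{code}

Running it prints true for decimal(19,4) and false for decimal(19,undefined), which is why the back end rejects the measure definition above.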



> Error while creating hive table through Kylin build cube with mysql imported 
> tables
> -----------------------------------------------------------------------------------
>
>                 Key: KYLIN-3695
>                 URL: https://issues.apache.org/jira/browse/KYLIN-3695
>             Project: Kylin
>          Issue Type: Bug
>          Components: Integration
>    Affects Versions: v2.5.0
>            Reporter: Gaurav Rawat
>            Assignee: XiaoXiang Yu
>            Priority: Minor
>         Attachments: image-2018-12-04-15-53-49-983.png
>
>
> Hi, I am trying to build a cube with Kylin. The data gets sourced fine from 
> Sqoop, but the next step, creating Hive tables, fails. Looking at the 
> command being fired, the failure seems odd, as the create statement looks good to me.
> I think the issue is with the DOUBLE types, as when I remove them the create 
> statement works fine. Can someone please help?
> I am using the stack on AWS EMR: Kylin 2.5, Hive 2.3.0.
> The error logs and commands are below; the table is a MySQL table which 
> had columns of DOUBLE type.
> Command
> {code:java}
> hive -e "USE default; DROP TABLE IF EXISTS 
> kylin_intermediate_fm_inv_holdings_8a1c33df_d12b_3609_13ee_39e169169368; 
> CREATE EXTERNAL TABLE IF NOT EXISTS 
> kylin_intermediate_fm_inv_holdings_8a1c33df_d12b_3609_13ee_39e169169368 ( 
> HOLDINGS_STOCK_INVESTOR_ID string ,STOCK_INVESTORS_CHANNEL string 
> ,STOCK_STOCK_ID string ,STOCK_DOMICILE string ,STOCK_STOCK_NM string 
> ,STOCK_APPROACH string ,STOCK_STOCK_TYP string ,INVESTOR_ID string 
> ,INVESTOR_NM string ,INVESTOR_DOMICILE_CNTRY string ,CLIENT_NM string 
> ,INVESTOR_HOLDINGS_GROSS_ASSETS_USD double(22) 
> ,INVESTOR_HOLDINGS_NET_ASSETS_USD double(22) ) ROW FORMAT DELIMITED FIELDS 
> TERMINATED BY '|' STORED AS TEXTFILE LOCATION 
> 's3://wfg1tst-models/kylin/kylin_metadata/kylin-4ae3b18b-831b-da66-eb8c-7318245c4448/kylin_intermediate_fm_inv_holdings_8a1c33df_d12b_3609_13ee_39e169169368';
>  ALTER TABLE 
> kylin_intermediate_fm_inv_holdings_8a1c33df_d12b_3609_13ee_39e169169368 SET 
> TBLPROPERTIES('auto.purge'='true'); " --hiveconf hive.merge.mapredfiles=false 
> --hiveconf hive.auto.convert.join=true --hiveconf dfs.replication=2 
> --hiveconf hive.exec.compress.output=true --hiveconf 
> hive.auto.convert.join.noconditionaltask=true --hiveconf 
> mapreduce.job.split.metainfo.maxsize=-1 --hiveconf hive.merge.mapfiles=false 
> --hiveconf hive.auto.convert.join.noconditionaltask.size=100000000 --hiveconf 
> hive.stats.autogather=true{code}
> Error is as below
> {code:java}
> OK Time taken: 1.315 seconds OK Time taken: 0.09 seconds 
> MismatchedTokenException(334!=347) at 
> org.antlr.runtime.BaseRecognizer.recoverFromMismatchedToken(BaseRecognizer.java:617)
>  at org.antlr.runtime.BaseRecognizer.match(BaseRecognizer.java:115) at 
> org.apache.hadoop.hive.ql.parse.HiveParser.createTableStatement(HiveParser.java:6179)
>  at 
> org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:3808) 
> at 
> org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:2382)
>  at 
> org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1333) at 
> org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:204) at 
> org.apache.hadoop.hive.ql.parse.ParseUtils.parse(ParseUtils.java:77) at 
> org.apache.hadoop.hive.ql.parse.ParseUtils.parse(ParseUtils.java:70) at 
> org.apache.hadoop.hive.ql.Driver.compile(Driver.java:468) at 
> org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1316) at 
> org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1456) at 
> org.apache.hadoop.hive.ql.Driver.run(Driver.java:1236) at 
> org.apache.hadoop.hive.ql.Driver.run(Driver.java:1226) at 
> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233) at 
> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184) at 
> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403) at 
> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336) at 
> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:787) at 
> org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) at 
> org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686) at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498) at 
> org.apache.hadoop.util.RunJar.run(RunJar.java:221) at 
> org.apache.hadoop.util.RunJar.main(RunJar.java:136) FAILED: ParseException 
> line 15:42 mismatched input '(' expecting ) near 'double' in create table 
> statement{code}
>  
> More details here: 
> [https://stackoverflow.com/questions/53377623/error-while-creating-hive-table-through-kylin-build-cube]. 
> The same got solved when I used the DECIMAL type with precision.
>  
> Also observed that DECIMAL(10) does not work unless it also has a scale; 
> while building the model for the same, it gives an error because it tries to 
> use the scale as undefined, i.e. DECIMAL(10,undefined).
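
For reference, a hypothetical Java sketch of how a type-mapping step could avoid emitting either double(22) or decimal(10,undefined) into the generated DDL; the helper name and the fallback precision/scale below are assumptions, not Kylin's actual implementation:

{code:java}
// Hypothetical sketch (not Kylin's actual code): map a source column type to a
// Hive type so that "double(22)" and "decimal(10,undefined)" are never emitted.
public class HiveTypeMapping {

    static String toHiveType(String sqlType, Integer precision, Integer scale) {
        String t = sqlType.toLowerCase();
        if (t.equals("double")) {
            return "double";                             // Hive's DOUBLE takes no precision
        }
        if (t.equals("decimal")) {
            int p = precision != null ? precision : 19;  // assumed fallback defaults
            int s = scale != null ? scale : 4;           // instead of "undefined"
            return "decimal(" + p + "," + s + ")";
        }
        return t;
    }

    public static void main(String[] args) {
        System.out.println(toHiveType("DOUBLE", 22, null));   // double
        System.out.println(toHiveType("DECIMAL", 10, null));  // decimal(10,4)
    }
}
{code}

In Hive DDL, DOUBLE takes no precision argument (hence the ParseException at the '(' after 'double'), while decimal(p,s) with numeric precision and scale parses fine.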



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
