Does Hive 2.3.3 really work well with HDP 3? Can you try running the HiveQL
that Kylin executed outside of Kylin? If it works there, then something is
probably wrong in Kylin.
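
For example, a minimal check could look like this (the .sql path is only an
illustrative placeholder; the actual statements are whatever Kylin logged for
the failing step):

  hive -e "USE default; SELECT 1;"     # sanity-check the Hive CLI itself
  hive -f /tmp/kylin_step.sql          # then the exact HiveQL Kylin generated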

liuzhixin <liuz...@163.com> wrote on Mon, Oct 15, 2018 at 1:47 PM:

> Thank you for the answer!
>
> I don't get to choose the Hive version myself.
>
> And Hive 2.3.3 does work well with HDP 3.
>
> Perhaps you can test Kylin with Hive 2.3.3.
>
> Maybe it's some other error. Thanks!
>
> Best wishes!
>
>
> On Oct 15, 2018, at 1:24 PM, ShaoFeng Shi <shaofeng...@apache.org> wrote:
>
> Hi zhixin,
>
> I think the problem is how to run Hive 2 with HDP 3; it has no relation to
> Kylin.
>
> Usually, we don't encourage users to customize component versions in a
> release, because that may bring dependency conflicts.
>
> I suggest you use the original Hive version that ships with HDP 3.
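>
> As a quick check of which Hive is actually being picked up on the cluster
> (standard Hive CLI commands, nothing Kylin-specific):
>
>   which hive
>   hive --version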
>
> liuzhixin <liuz...@163.com> wrote on Mon, Oct 15, 2018 at 11:25 AM:
>
>> Hi ShaoFeng Shi,
>>
>> Yes, the error is from Hive 2.3.3,
>>
>> and Kylin needs Hive 3.1.0.
>>
>> So how can this be solved?
>>
>> Best wishes!
>>
>> > On Oct 15, 2018, at 11:10 AM, ShaoFeng Shi <shaofeng...@apache.org> wrote:
>> >
>> > Hi Zhixin,
>> >
>> > The error log is thrown from Hive, not from Kylin, I think. Please verify
>> > that your Hive is properly installed; you can manually run that hive
>> > command:
>> >
>> > hive -e "use default; xxx"
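>> >
>> > If the protobuf NoSuchMethodError also shows up there, one hedged way to
>> > see which protobuf jars Hive actually loads (paths assume a standard
>> > Hive/Hadoop layout) is:
>> >
>> >   ls $HIVE_HOME/lib | grep -i protobuf
>> >   hadoop classpath | tr ':' '\n' | grep -i protobuf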
>> >
>> > Lijun Cao <641507...@qq.com> wrote on Mon, Oct 15, 2018 at 11:01 AM:
>> >
>> >> Hi liuzhixin:
>> >>
>> >> As I remember, the Hive version in HDP 3 is 3.1.0.
>> >>
>> >> You can update Hive to 3.1.0 and then try again.
>> >>
>> >> And according to my previous test, the binary package
>> >> apache-kylin-2.5.0-bin-hadoop3.tar.gz works properly on HDP 3. You can
>> >> get it from the official site.
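>> >>
>> >> For example (the URL below is assumed to follow the usual Apache archive
>> >> layout, so please verify it against the official download page):
>> >>
>> >>   wget https://archive.apache.org/dist/kylin/apache-kylin-2.5.0/apache-kylin-2.5.0-bin-hadoop3.tar.gz
>> >>   tar -zxvf apache-kylin-2.5.0-bin-hadoop3.tar.gz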
>> >>
>> >> Best Regards
>> >>
>> >> Lijun Cao
>> >>
>> >>> On Oct 15, 2018, at 10:22, liuzhixin <liuz...@163.com> wrote:
>> >>>
>> >>> Hi Lijun Cao,
>> >>> #
>> >>> The platform is Ambari HDP 3.0, Hive is 2.3.3, and the HBase version is 2.0.
>> >>>
>> >>> I have compiled the source code with Hive 2.3.3,
>> >>>
>> >>> but the module atopcalcite depends on protobuf 3.1.0,
>> >>>
>> >>> while the other modules depend on protobuf 2.5.0.
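>> >>>
>> >>> One hedged way to see exactly which modules pull in which protobuf
>> >>> version (assuming a standard Maven build of the Kylin source tree) is:
>> >>>
>> >>>   mvn dependency:tree -Dincludes=com.google.protobuf:protobuf-java
>> >>>
>> >>> That lists, per module, the artifact that brings in protobuf 2.5.0 vs
>> >>> 3.1.0, which is usually the first step before adding an exclusion or a
>> >>> relocation/shading rule.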
>> >>>
>> >>>
>> >>>> On Oct 15, 2018, at 8:40 AM, Lijun Cao <641507...@qq.com> wrote:
>> >>>>
>> >>>> Hi liuzhixin:
>> >>>>
>> >>>> Which platform did you use?
>> >>>>
>> >>>> CDH 6.0.x or HDP 3.0?
>> >>>>
>> >>>> Best Regards
>> >>>>
>> >>>> Lijun Cao
>> >>>>
>> >>>>> On Oct 12, 2018, at 21:14, liuzhixin <liuz...@163.com> wrote:
>> >>>>>
>> >>>>> Logging initialized using configuration in file:/data/hadoop-enviorment/apache-hive-2.3.3/conf/hive-log4j2.properties Async: true
>> >>>>> OK
>> >>>>> Time taken: 4.512 seconds
>> >>>>> OK
>> >>>>> Time taken: 1.511 seconds
>> >>>>> OK
>> >>>>> Time taken: 0.272 seconds
>> >>>>> OK
>> >>>>> Time taken: 0.185 seconds
>> >>>>> Exception in thread "main" java.lang.NoSuchMethodError: com.google.protobuf.Descriptors$Descriptor.getOneofs()Ljava/util/List;
>> >>>>>    at com.google.protobuf.GeneratedMessageV3$FieldAccessorTable.<init>(GeneratedMessageV3.java:1704)
>> >>>>>    at org.apache.calcite.avatica.proto.Common.<clinit>(Common.java:18927)
>> >>>>>    at org.apache.calcite.avatica.proto.Common$ConnectionProperties.getDescriptor(Common.java:1264)
>> >>>>>    at org.apache.calcite.avatica.ConnectionPropertiesImpl.<clinit>(ConnectionPropertiesImpl.java:38)
>> >>>>>    at org.apache.calcite.avatica.MetaImpl.<init>(MetaImpl.java:72)
>> >>>>>    at org.apache.calcite.jdbc.CalciteMetaImpl.<init>(CalciteMetaImpl.java:88)
>> >>>>>    at org.apache.calcite.jdbc.Driver.createMeta(Driver.java:169)
>> >>>>>    at org.apache.calcite.avatica.AvaticaConnection.<init>(AvaticaConnection.java:121)
>> >>>>>    at org.apache.calcite.jdbc.CalciteConnectionImpl.<init>(CalciteConnectionImpl.java:113)
>> >>>>>    at org.apache.calcite.jdbc.CalciteJdbc41Factory$CalciteJdbc41Connection.<init>(CalciteJdbc41Factory.java:114)
>> >>>>>    at org.apache.calcite.jdbc.CalciteJdbc41Factory.newConnection(CalciteJdbc41Factory.java:59)
>> >>>>>    at org.apache.calcite.jdbc.CalciteJdbc41Factory.newConnection(CalciteJdbc41Factory.java:44)
>> >>>>>    at org.apache.calcite.jdbc.CalciteFactory.newConnection(CalciteFactory.java:53)
>> >>>>>    at org.apache.calcite.avatica.UnregisteredDriver.connect(UnregisteredDriver.java:138)
>> >>>>>    at java.sql.DriverManager.getConnection(DriverManager.java:664)
>> >>>>>    at java.sql.DriverManager.getConnection(DriverManager.java:208)
>> >>>>>    at org.apache.calcite.tools.Frameworks.withPrepare(Frameworks.java:145)
>> >>>>>    at org.apache.calcite.tools.Frameworks.withPlanner(Frameworks.java:106)
>> >>>>>    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.logicalPlan(CalcitePlanner.java:1069)
>> >>>>>    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.getOptimizedAST(CalcitePlanner.java:1085)
>> >>>>>    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:364)
>> >>>>>    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:11138)
>> >>>>>    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:286)
>> >>>>>    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:258)
>> >>>>>    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:512)
>> >>>>>    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
>> >>>>>    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
>> >>>>>    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
>> >>>>>    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
>> >>>>>    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
>> >>>>>    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
>> >>>>>    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
>> >>>>>    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
>> >>>>>    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:787)
>> >>>>>    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
>> >>>>>    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
>> >>>>>    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >>>>>    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> >>>>>    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> >>>>>    at java.lang.reflect.Method.invoke(Method.java:498)
>> >>>>>    at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
>> >>>>>    at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
>> >>>>> The command is:
>> >>>>> hive -e "USE default;
>> >>>>
>> >>>>
>> >>>
>> >>
>> >>
>> >
>> > --
>> > Best regards,
>> >
>> > Shaofeng Shi 史少锋
>>
>>
>>
>
> --
> Best regards,
>
> Shaofeng Shi 史少锋
>
>
>

-- 
Best regards,

Shaofeng Shi 史少锋
