If these two imports are the only imports that you added, then you did not follow Hequn's advice or the link that I sent you.

You need to add the underscore imports to let Scala do its magic.
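For a Scala Table API program on a DataStream, the relevant wildcard imports look roughly like this (a minimal sketch; the exact set depends on your program):

import org.apache.flink.streaming.api.scala._   // Scala DataStream API and its implicits
import org.apache.flink.table.api.scala._       // Table API implicits, e.g. Symbol -> Expression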

Timo


On 01.08.18 at 10:28, Mich Talebzadeh wrote:
Hi Timo,

These are my two flink table related imports

import org.apache.flink.table.api.Table
import org.apache.flink.table.api.TableEnvironment

And these are my dependencies building with SBT

libraryDependencies += "org.apache.hadoop" % "hadoop-core" % "1.2.1"
libraryDependencies += "org.apache.hbase" % "hbase" % "1.2.6"
libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.2.6"
libraryDependencies += "org.apache.hbase" % "hbase-common" % "1.2.6"
libraryDependencies += "org.apache.hbase" % "hbase-server" % "1.2.6"
libraryDependencies += "org.apache.flink" %% "flink-connector-kafka-0.11" % "1.5.0" libraryDependencies += "org.apache.flink" %% "flink-connector-kafka-base" % "1.5.0"
libraryDependencies += "org.apache.flink" %% "flink-scala" % "1.5.0"
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "0.11.0.0"
libraryDependencies += "org.apache.flink" %% "flink-streaming-java" % "1.5.0" % "provided" *libraryDependencies += "org.apache.flink" %% "flink-table" % "1.5.0" % "provided"
*libraryDependencies += "org.apache.kafka" %% "kafka" % "0.11.0.0"
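(For reference, the minimal Table-API-related entries would be roughly the following, a sketch assuming Scala 2.11 and Flink 1.5.0, with flink-streaming-scala added for the Scala DataStream API:)

libraryDependencies += "org.apache.flink" %% "flink-scala" % "1.5.0"
libraryDependencies += "org.apache.flink" %% "flink-streaming-scala" % "1.5.0"
libraryDependencies += "org.apache.flink" %% "flink-table" % "1.5.0"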

There appears to be a conflict somewhere that causes this error:

[info] Compiling 1 Scala source to /home/hduser/dba/bin/flink/md_streaming/target/scala-2.11/classes...
[error] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:152: overloaded method value fromDataStream with alternatives:
[error]   [T](dataStream: org.apache.flink.streaming.api.datastream.DataStream[T], fields: String)org.apache.flink.table.api.Table <and>
[error]   [T](dataStream: org.apache.flink.streaming.api.datastream.DataStream[T])org.apache.flink.table.api.Table
[error]  cannot be applied to (org.apache.flink.streaming.api.datastream.DataStreamSource[String], Symbol, Symbol, Symbol, Symbol)
[error]   val table1: Table = tableEnv.fromDataStream(dataStream, 'key, 'ticker, 'timeissued, 'price)
[error]                                ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed

Thanks


Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any loss, damage or destruction of data or any other property which may arise from relying on this email's technical content is explicitly disclaimed. The author will in no case be liable for any monetary damages arising from such loss, damage or destruction.



On Wed, 1 Aug 2018 at 09:17, Timo Walther <twal...@apache.org> wrote:

    Hi Mich,

    I would check your imports again [1]. This is a pure compiler issue
    that is unrelated to your actual data stream. Also check your
    project dependencies.

    Regards,
    Timo

    [1] https://ci.apache.org/projects/flink/flink-docs-master/dev/table/common.html#implicit-conversion-for-scala

    On 01.08.18 at 09:30, Mich Talebzadeh wrote:

    Hi both,

    I added the import as Hequn suggested.

    My stream is very simple and consists of 4 values separated by
    "," as below

    05521df6-4ccf-4b2f-b874-eb27d461b305,IBM,2018-07-30T19:51:50,190.48

    So this is what I have been trying to do

    Code

    val dataStream = streamExecEnv
      .addSource(new FlinkKafkaConsumer011[String](topicsValue, new SimpleStringSchema(), properties))
    //
    //
    val tableEnv = TableEnvironment.getTableEnvironment(streamExecEnv)
    val table1: Table = tableEnv.fromDataStream(dataStream, 'key, 'ticker, 'timeissued, 'price)

    Note those four columns in the table1 definition.
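    (For illustration only: a sketch of how this could compile and make sense with four field names, assuming streamExecEnv is the Scala StreamExecutionEnvironment, the wildcard imports are in scope, and each CSV line is first split into a 4-tuple; names like topicsValue and properties are taken from the code above.)

    import org.apache.flink.api.common.serialization.SimpleStringSchema
    import org.apache.flink.streaming.api.scala._
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011
    import org.apache.flink.table.api.{Table, TableEnvironment}
    import org.apache.flink.table.api.scala._

    // raw CSV lines from Kafka
    val rawStream: DataStream[String] = streamExecEnv
      .addSource(new FlinkKafkaConsumer011[String](topicsValue, new SimpleStringSchema(), properties))

    // split each line into (key, ticker, timeissued, price) so four field names apply
    val parsed = rawStream.map { line =>
      val a = line.split(",")
      (a(0), a(1), a(2), a(3).toDouble)
    }

    val tableEnv = TableEnvironment.getTableEnvironment(streamExecEnv)
    val table1: Table = tableEnv.fromDataStream(parsed, 'key, 'ticker, 'timeissued, 'price)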

    And this is the error being thrown

    [info] Compiling 1 Scala source to /home/hduser/dba/bin/flink/md_streaming/target/scala-2.11/classes...
    [error] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:152: overloaded method value fromDataStream with alternatives:
    [error]   [T](dataStream: org.apache.flink.streaming.api.datastream.DataStream[T], fields: String)org.apache.flink.table.api.Table <and>
    [error]   [T](dataStream: org.apache.flink.streaming.api.datastream.DataStream[T])org.apache.flink.table.api.Table
    [error]  cannot be applied to (org.apache.flink.streaming.api.datastream.DataStreamSource[String], Symbol, Symbol, Symbol, Symbol)
    [error]   val table1: Table = tableEnv.fromDataStream(dataStream, 'key, 'ticker, 'timeissued, 'price)
    [error]                                ^
    [error] one error found
    [error] (compile:compileIncremental) Compilation failed

    I suspect dataStream may not be compatible with this operation?

    Regards,

    Dr Mich Talebzadeh

    LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

    http://talebzadehmich.wordpress.com





    On Wed, 1 Aug 2018 at 04:51, Hequn Cheng <chenghe...@gmail.com> wrote:

        Hi, Mich

        You can try adding "import
        org.apache.flink.table.api.scala._", so that the Symbol can
        be recognized as an Expression.
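
        Roughly, with those implicits in scope a Scala Symbol can stand in for a field Expression, e.g. (illustration only):

        import org.apache.flink.table.api.scala._
        import org.apache.flink.table.expressions.Expression

        val field: Expression = 'price   // resolved via the imported implicit conversion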

        Best, Hequn

        On Wed, Aug 1, 2018 at 6:16 AM, Mich Talebzadeh <mich.talebza...@gmail.com> wrote:

            Hi,

            I am following this example

            https://ci.apache.org/projects/flink/flink-docs-release-1.5/dev/table/common.html#integration-with-datastream-and-dataset-api

            This is my dataStream which is built on a Kafka topic

            //
            // Create a Kafka consumer
            //
            val dataStream = streamExecEnv
              .addSource(new FlinkKafkaConsumer011[String](topicsValue, new SimpleStringSchema(), properties))
            //
            //
            val tableEnv = TableEnvironment.getTableEnvironment(streamExecEnv)
            val table1: Table = tableEnv.fromDataStream(dataStream, 'key, 'ticker, 'timeissued, 'price)

            While compiling it throws this error

            [error] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:169: overloaded method value fromDataStream with alternatives:
            [error]   [T](dataStream: org.apache.flink.streaming.api.datastream.DataStream[T], fields: String)org.apache.flink.table.api.Table <and>
            [error]   [T](dataStream: org.apache.flink.streaming.api.datastream.DataStream[T])org.apache.flink.table.api.Table
            [error]  cannot be applied to (org.apache.flink.streaming.api.datastream.DataStreamSource[String], Symbol, Symbol, Symbol, Symbol)
            [error]   val table1: Table = tableEnv.fromDataStream(dataStream, 'key, 'ticker, 'timeissued, 'price)
            [error]                                ^
            [error] one error found
            [error] (compile:compileIncremental) Compilation failed

            The topic is very simple: it carries comma-separated prices. I
            tried mapFunction and flatMap but neither worked!

            Thanks,


            Dr Mich Talebzadeh

            LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

            http://talebzadehmich.wordpress.com





