The exception went away after downloading flink-connector-kafka-base_2.11-1.4.1.jar
into the lib folder.
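
For reference, a minimal sketch of what should then work inside the Scala shell (`bin/start-scala-shell.sh`, which provides `senv`). This assumes both connector jars are in the lib folder; the broker/ZooKeeper addresses, group id, and topic name are placeholders:

```scala
// Assumes flink-connector-kafka-0.8_2.11-1.4.1.jar AND
// flink-connector-kafka-base_2.11-1.4.1.jar are in Flink's lib folder,
// and that this runs inside the Flink Scala shell (which defines `senv`).
import java.util.Properties
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08
import org.apache.flink.streaming.util.serialization.SimpleStringSchema

val properties = new Properties()
properties.setProperty("bootstrap.servers", "localhost:9092")  // placeholder broker
properties.setProperty("zookeeper.connect", "localhost:2181")  // 0.8 consumer needs ZK
properties.setProperty("group.id", "test-group")               // placeholder group id

// Read the topic as plain strings and print each record.
val stream = senv.addSource(
  new FlinkKafkaConsumer08[String]("join_test", new SimpleStringSchema(), properties))
stream.print()
senv.execute("kafka-print")
```

Since the shell imports the Scala API implicits, `addSource` resolves once the connector classes (including `FlinkKafkaConsumerBase` from the base jar) are on the classpath.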

On Sat, Feb 24, 2018 at 6:36 PM, kant kodali <kanth...@gmail.com> wrote:

> Hi,
>
> I couldn't get Flink and Kafka working together. It looks like every
> example I tried from the website fails with the following exception:
>
> Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.
> connectors.kafka.FlinkKafkaConsumerBase
>
>
> *or, when I run something like the following from the website,*
>
>
>  val stream = senv.addSource(new FlinkKafkaConsumer08[String]("join_test",
> new SimpleStringSchema(), properties)).print()
>
> *I get the following exception*
>
> <console>:73: error: overloaded method value addSource with alternatives:
>
>   [T](function: org.apache.flink.streaming.api.functions.source.
> SourceFunction.SourceContext[T] => Unit)(implicit evidence$10:
> org.apache.flink.api.common.typeinfo.TypeInformation[T])
> org.apache.flink.streaming.api.scala.DataStream[T] <and>
>
>   [T](function: 
> org.apache.flink.streaming.api.functions.source.SourceFunction[T])(implicit
> evidence$9: org.apache.flink.api.common.typeinfo.TypeInformation[T])
> org.apache.flink.streaming.api.scala.DataStream[T]
>
>  cannot be applied to (org.apache.flink.streaming.connectors.kafka.
> FlinkKafkaConsumer08[String])
>
>        val stream = senv.addSource(new 
> FlinkKafkaConsumer08[String]("join_test",
> new SimpleStringSchema(), properties)).print()
>
> Can anyone share a simple example of how to get a Kafka stream as a Table
> using the Scala shell? No need for any fancy schema; it just needs to print the
> value. I am using the latest version of Flink, 1.4.1, and my lib folder
> contains flink-connector-kafka-0.8_2.11-1.4.1.jar.
>
> I wanted to use Kafka 0.9, but that didn't work, so I thought I'd just
> get something working first and downgraded to 0.8. But the 0.8 examples on the
> website also don't seem to work in the Scala shell.
>
> Thanks!!
