Thanks Ted for the info.

Dr Mich Talebzadeh
LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com


On 22 April 2016 at 18:38, Ted Yu <yuzhih...@gmail.com> wrote:

> Marcelo:
> From yesterday's thread, Mich revealed that he was looking at:
>
> https://github.com/agsachin/spark/blob/CEP/external/kafka/src/test/scala/org/apache/spark/streaming/kafka/DirectKafkaStreamSuite.scala
>
> which references SparkFunSuite.
>
> In an earlier thread, Mich was asking about CEP.
>
> Just to give you some background.
>
> On Fri, Apr 22, 2016 at 10:31 AM, Marcelo Vanzin <van...@cloudera.com> wrote:
>
>> Sorry, I've been looking at this thread and the related ones, and one
>> thing I still don't understand is: why are you trying to use internal
>> Spark classes like Logging and SparkFunSuite in your code?
>>
>> Unless you're writing code that lives inside Spark, you really
>> shouldn't be trying to reference them. The first reason being that they're
>> "private[spark]", so even if they're available, the compiler won't let you.
>>
>> On Fri, Apr 22, 2016 at 12:21 AM, Mich Talebzadeh
>> <mich.talebza...@gmail.com> wrote:
>> >
>> > Hi,
>> >
>> > Does anyone know which jar file has import org.apache.spark.internal.Logging?
>> >
>> > I tried spark-core_2.10-1.5.1.jar
>> >
>> > but it does not seem to work
>> >
>> > scala> import org.apache.spark.internal.Logging
>> >
>> > <console>:57: error: object internal is not a member of package org.apache.spark
>> >        import org.apache.spark.internal.Logging
>> >
>> > Thanks
>> >
>> > Dr Mich Talebzadeh
>> >
>> > LinkedIn
>> > https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>> >
>> > http://talebzadehmich.wordpress.com
>>
>> --
>> Marcelo
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
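
Worth noting for the archive: org.apache.spark.internal.Logging only exists from Spark 2.0 onwards (in the 1.x line the trait lives at org.apache.spark.Logging), which is why the import fails against spark-core_2.10-1.5.1.jar, and in either case it is private[spark], as Marcelo points out. For application code, a minimal sketch of a user-side alternative is below, assuming SLF4J is on the classpath (it is pulled in transitively by spark-core); the MyLogging trait and KafkaStreamJob object are hypothetical names used only for illustration.

// Sketch only: a user-defined logging mix-in instead of Spark's
// private[spark] Logging trait. Assumes SLF4J is available.
import org.slf4j.{Logger, LoggerFactory}

trait MyLogging {
  // Logger named after the concrete class that mixes this trait in.
  @transient lazy val log: Logger =
    LoggerFactory.getLogger(getClass.getName.stripSuffix("$"))
}

object KafkaStreamJob extends MyLogging {
  def main(args: Array[String]): Unit = {
    log.info("Starting job with args: {}", args.mkString(" "))
  }
}

Mixing in a trait like this gives the usual log.info / log.warn calls without tying the build to internal classes that can move or disappear between Spark releases.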