Can you remove the dependency on the Databricks spark-csv data source? Spark has had a built-in CSV source since version 2.0, so the external package is no longer needed — and having both on the classpath is exactly what causes the "Multiple sources found for csv" error.
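For an sbt build, removing it would look roughly like the sketch below (artifact names are real, but the version numbers are illustrative assumptions, not taken from your project):

```scala
// build.sbt -- illustrative sketch; version numbers are assumptions

// Remove the external spark-csv package, which registers a second "csv"
// source and conflicts with the one built into Spark 2.x:
// libraryDependencies += "com.databricks" %% "spark-csv" % "1.5.0"

// Keep only the core Spark artifacts. Marking them "provided" keeps them
// out of the thin jar, since the cluster supplies them at runtime:
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"            % "2.3.1" % "provided",
  "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.3.1" % "provided" // only if you read from Kafka
)
```

After rebuilding the thin jar, the plain `.csv("server_path")` call should resolve unambiguously with no `.format(...)` needed.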
> On 31. Aug 2018, at 05:52, Srabasti Banerjee <srabast...@ymail.com.INVALID> wrote:
>
> Hi,
>
> I am trying to run the code below to read a file as a DataFrame onto a stream (for Spark Streaming), developed via the Eclipse IDE, defining schemas appropriately, by running a thin jar on the server, and am getting the error below. I tried suggestions found by researching similar "spark.read.option.schema.csv" errors on the internet, with no success.
>
> I am thinking this could be a bug, as the changes might not have been made for the readStream option? Has anybody encountered a similar issue with Spark Streaming?
>
> Looking forward to hearing your response(s)!
>
> Thanks
> Srabasti Banerjee
>
> Error:
> Exception in thread "main" java.lang.RuntimeException: Multiple sources found for csv (com.databricks.spark.csv.DefaultSource15, org.apache.spark.sql.execution.datasources.csv.CSVFileFormat), please specify the fully qualified class name.
>
> Code:
> val csvdf = spark.readStream.option("sep", ",").schema(userSchema).csv("server_path") // does not resolve error
> val csvdf = spark.readStream.option("sep", ",").schema(userSchema).format("com.databricks.spark.csv").csv("server_path") // does not resolve error
> val csvdf = spark.readStream.option("sep", ",").schema(userSchema).csv("server_path") // does not resolve error
> val csvdf = spark.readStream.option("sep", ",").schema(userSchema).format("org.apache.spark.sql.execution.datasources.csv").csv("server_path") // does not resolve error
> val csvdf = spark.readStream.option("sep", ",").schema(userSchema).format("org.apache.spark.sql.execution.datasources.csv.CSVFileFormat").csv("server_path") // does not resolve error
> val csvdf = spark.readStream.option("sep", ",").schema(userSchema).format("com.databricks.spark.csv.DefaultSource15").csv("server_path") // does not resolve error