I am unable to register SolrCloud as a data source in Spark 2.1.0.
Following the documentation at
https://github.com/lucidworks/spark-solr#import-jar-file-via-spark-shell, I
used version 3.0.0-beta3.
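For context, this is how the shaded jar was attached at launch, a minimal sketch assuming the jar sits in the current directory (adjust the path to wherever you downloaded it):

```shell
# Launch spark-shell with the shaded spark-solr jar on the classpath.
# --jars ships the jar to the driver and executors; the path below is
# an assumption about the download location.
spark-shell --jars spark-solr-3.0.0-beta3-shaded.jar
```

An alternative is `--packages com.lucidworks.spark:spark-solr:3.0.0-beta3`, which pulls the artifact and its dependencies from Maven Central instead of a local file.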

The jar shows up on the system path as:
spark://172.31.208.1:55730/jars/spark-solr-3.0.0-beta3-shaded.jar (Added By User)

OS: Win10
Hadoop: 2.7 (x64 winutils)
Spark: 2.1.0
spark-solr: 3.0.0-beta3

The same was tried with Spark 2.2.0 and spark-solr 3.1.0.

ERROR:
scala> val df = spark.read.format("solr").options(Map("collection" -> "cdr1", "zkhost" -> "localhost:9983")).load
java.lang.ClassNotFoundException: Failed to find data source: solr. Please find packages at http://spark.apache.org/third-party-projects.html
  at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:569)
  at org.apache.spark.sql.execution.datasources.DataSource.providingClass$lzycompute(DataSource.scala:86)
  at org.apache.spark.sql.execution.datasources.DataSource.providingClass(DataSource.scala:86)
  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:325)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
  ... 48 elided
Caused by: java.lang.ClassNotFoundException: solr.DefaultSource
  at scala.reflect.internal.util.AbstractFileClassLoader.findClass(AbstractFileClassLoader.scala:62)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
  at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25$$anonfun$apply$13.apply(DataSource.scala:554)
  at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25$$anonfun$apply$13.apply(DataSource.scala:554)
  at scala.util.Try$.apply(Try.scala:192)
  at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25.apply(DataSource.scala:554)
  at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25.apply(DataSource.scala:554)
  at scala.util.Try.orElse(Try.scala:84)
  at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:554)



-- 
I.R