What's the value of spark.version?

Do you know which version of Spark the mongodb connector 0.10.3 was built
against?

You can use the following command to find out:
mvn dependency:tree
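
For example, to limit the output to the Spark artifacts on your classpath
(the -Dincludes filter is a standard option of the dependency:tree goal):

mvn dependency:tree -Dincludes=org.apache.spark

Any mismatch between the Spark version your pom declares and the one
spark-mongodb_2.11 pulls in transitively should show up there.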

Maybe the Spark version you use is different from the one the mongodb
connector was built against.
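
If so, the usual fix is to pin spark.version in your pom's <properties>
to the release the connector was built against. A minimal sketch, assuming
the connector targets Spark 1.5.x (check the connector's own pom to
confirm; the exact version below is a guess, not something I verified):

<properties>
    <spark.version>1.5.2</spark.version>
</properties>

You can also print the version your job actually runs with via
sc.version() on the JavaSparkContext.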

On Fri, Jun 10, 2016 at 2:50 AM, Asfandyar Ashraf Malik <
asfand...@kreditech.com> wrote:

> Hi,
> I did not notice that I had added it twice.
> I removed the duplicate and reran my program, but it still gives the same error:
>
> java.lang.NoSuchMethodError:
> org.apache.spark.sql.catalyst.ScalaReflection$.typeOfObject()Lscala/PartialFunction;
>
>
> Cheers
>
>
>
> 2016-06-10 11:47 GMT+02:00 Alonso Isidoro Roman <alons...@gmail.com>:
>
>> Why is the spark-mongodb_2.11 dependency written twice in pom.xml?
>>
>> Alonso Isidoro Roman
>> https://about.me/alonso.isidoro.roman
>>
>> 2016-06-10 11:39 GMT+02:00 Asfandyar Ashraf Malik <
>> asfand...@kreditech.com>:
>>
>>> Hi,
>>> I am using the Stratio library to get MongoDB working with Spark, but I
>>> get the following error:
>>>
>>> java.lang.NoSuchMethodError:
>>> org.apache.spark.sql.catalyst.ScalaReflection
>>>
>>> This is my code.
>>>
>>> ---------------------------------------------------------------------------------------
>>> public static void main(String[] args) {
>>>
>>>     JavaSparkContext sc = new JavaSparkContext("local[*]", "test spark-mongodb java");
>>>     SQLContext sqlContext = new SQLContext(sc);
>>>
>>>     // Connection options for the Stratio MongoDB data source
>>>     Map<String, String> options = new HashMap<String, String>();
>>>     options.put("host", "xyz.mongolab.com:59107");
>>>     options.put("database", "heroku_app3525385");
>>>     options.put("collection", "datalog");
>>>     options.put("credentials", "*****,****,****");
>>>
>>>     DataFrame df = sqlContext.read().format("com.stratio.datasource.mongodb").options(options).load();
>>>     df.registerTempTable("datalog");
>>>     df.show();
>>> }
>>>
>>> ---------------------------------------------------------------------------------------
>>> My pom file is as follows:
>>>
>>> <dependencies>
>>>     <dependency>
>>>         <groupId>org.apache.spark</groupId>
>>>         <artifactId>spark-core_2.11</artifactId>
>>>         <version>${spark.version}</version>
>>>     </dependency>
>>>     <dependency>
>>>         <groupId>org.apache.spark</groupId>
>>>         <artifactId>spark-catalyst_2.11</artifactId>
>>>         <version>${spark.version}</version>
>>>     </dependency>
>>>     <dependency>
>>>         <groupId>org.apache.spark</groupId>
>>>         <artifactId>spark-sql_2.11</artifactId>
>>>         <version>${spark.version}</version>
>>>     </dependency>
>>>     <dependency>
>>>         <groupId>com.stratio.datasource</groupId>
>>>         <artifactId>spark-mongodb_2.11</artifactId>
>>>         <version>0.10.3</version>
>>>     </dependency>
>>>     <dependency>
>>>         <groupId>com.stratio.datasource</groupId>
>>>         <artifactId>spark-mongodb_2.11</artifactId>
>>>         <version>0.10.3</version>
>>>         <type>jar</type>
>>>     </dependency>
>>> </dependencies>
>>>
>>>
>>> Regards
>>>
>>
>>
>
