Re: Spark Getting data from MongoDB in JAVA

2016-06-13 Thread Asfandyar Ashraf Malik
Yes, it was a dependency issue: I was using incompatible versions.
The command *mvn dependency:tree -Dverbose* helped me fix this.
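For anyone finding this thread later: the fix amounts to making every Spark artifact in the pom resolve to the single version the connector was built against, typically via a shared property. A minimal sketch (the 1.5.2 value is only an assumed example for illustration, not a verified requirement of spark-mongodb_2.11 0.10.3):

```xml
<!-- Sketch only: keep all Spark artifacts on one version via a shared
     property. 1.5.2 is an assumed example value, not a verified
     requirement of spark-mongodb_2.11 0.10.3. -->
<properties>
    <spark.version>1.5.2</spark.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
```

Running *mvn dependency:tree -Dverbose* afterwards confirms no Spark artifact is omitted for a version conflict.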

Cheers



Asfandyar Ashraf Malik


Mobile: +49 15751174449

Email: asfand...@kreditech.com



Kreditech Holding SSL GmbH

Ludwig-Erhard-Straße 1, 20459 Hamburg, Germany

2016-06-12 18:36 GMT+02:00 Ted Yu <yuzhih...@gmail.com>:

> What's the value of spark.version ?
>
> Do you know which version of Spark mongodb connector 0.10.3 was built
> against ?
>
> You can use the following command to find out:
> mvn dependency:tree
>
> Maybe the Spark version you use is different from what mongodb connector
> was built against.
>
> On Fri, Jun 10, 2016 at 2:50 AM, Asfandyar Ashraf Malik <
> asfand...@kreditech.com> wrote:
>
>> Hi,
>> I did not notice that I put it twice.
>> I changed that and ran my program but it still gives the same error:
>>
>> java.lang.NoSuchMethodError:
>> org.apache.spark.sql.catalyst.ScalaReflection$.typeOfObject()Lscala/PartialFunction;
>>
>>
>> Cheers
>>
>>
>>
>> 2016-06-10 11:47 GMT+02:00 Alonso Isidoro Roman <alons...@gmail.com>:
>>
>>> Why is the *spark-mongodb_2.11* dependency written twice in pom.xml?
>>>
>>> Alonso Isidoro Roman
>>> about.me/alonso.isidoro.roman
>>>
>>> 2016-06-10 11:39 GMT+02:00 Asfandyar Ashraf Malik <
>>> asfand...@kreditech.com>:
>>>
>>>> Hi,
>>>> I am using the Stratio library to get MongoDB working with Spark, but I
>>>> get the following error:
>>>>
>>>> java.lang.NoSuchMethodError:
>>>> org.apache.spark.sql.catalyst.ScalaReflection
>>>>
>>>> This is my code.
>>>>
>>>> ---
>>>> public static void main(String[] args) {
>>>>
>>>>     JavaSparkContext sc = new JavaSparkContext("local[*]", "test spark-mongodb java");
>>>>     SQLContext sqlContext = new SQLContext(sc);
>>>>
>>>>     Map<String, String> options = new HashMap<>();
>>>>     options.put("host", "xyz.mongolab.com:59107");
>>>>     options.put("database", "heroku_app3525385");
>>>>     options.put("collection", "datalog");
>>>>     options.put("credentials", "*,,");
>>>>
>>>>     DataFrame df = sqlContext.read().format("com.stratio.datasource.mongodb").options(options).load();
>>>>     df.registerTempTable("datalog");
>>>>     df.show();
>>>> }
>>>> ---
>>>> My pom file is as follows:
>>>>
>>>>  **
>>>> **
>>>> *org.apache.spark*
>>>> *spark-core_2.11*
>>>> *${spark.version}*
>>>> **
>>>> **
>>>> *org.apache.spark*
>>>> *spark-catalyst_2.11 *
>>>> *${spark.version}*
>>>> **
>>>> **
>>>> *org.apache.spark*
>>>> *spark-sql_2.11*
>>>> *${spark.version}*
>>>> * *
>>>> **
>>>> *com.stratio.datasource*
>>>> *spark-mongodb_2.11*
>>>> *0.10.3*
>>>> **
>>>> **
>>>> *com.stratio.datasource*
>>>> *spark-mongodb_2.11*
>>>> *0.10.3*
>>>> *jar*
>>>> **
>>>> **
>>>>
>>>>
>>>> Regards
>>>>
>>>
>>>
>>
>


