Re: Spark Getting data from MongoDB in JAVA

2016-06-13 Thread Asfandyar Ashraf Malik
Yes, it was a dependency issue: I was using incompatible versions.
The command mvn dependency:tree -Dverbose helped me fix this.
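For anyone hitting the same thing, a sketch of how that command can be used to spot the conflict (these are standard maven-dependency-plugin options; run them from the project's pom.xml directory):

```shell
# Print the full resolved dependency tree; -Dverbose also shows the
# versions Maven dropped, marked "omitted for conflict with ...".
mvn dependency:tree -Dverbose

# Narrow the output to the artifacts in question (pattern matches groupId):
mvn dependency:tree -Dverbose -Dincludes=org.apache.spark
```

The "omitted for conflict" lines are usually where an incompatible transitive version is hiding.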

Cheers



Asfandyar Ashraf Malik


Mobile: +49 15751174449

Email: asfand...@kreditech.com



Kreditech Holding SSL GmbH

Ludwig-Erhard-Straße 1, 20459 Hamburg, Germany



Re: Spark Getting data from MongoDB in JAVA

2016-06-12 Thread Ted Yu
What's the value of spark.version?

Do you know which Spark version the mongodb connector 0.10.3 was built
against?

You can use the following command to find out:
mvn dependency:tree

Maybe the Spark version you use is different from the one the mongodb
connector was built against.
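One way to rule this out is to pin spark.version explicitly in the POM's properties and make it match the connector's target. The value below is only a placeholder to illustrate; check the version the connector declares in its own pom.xml for the real one:

```xml
<properties>
    <!-- Placeholder: set this to the Spark version that
         spark-mongodb_2.11 0.10.3 declares in its own pom.xml -->
    <spark.version>1.5.2</spark.version>
</properties>
```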



Re: Spark Getting data from MongoDB in JAVA

2016-06-12 Thread vaquar khan
Hi Asfandyar,

NoSuchMethodError in Java means your code was compiled against one version of
a class but is executing against a different one: the method existed at
compile time but is missing from the class actually loaded at runtime.

Please make sure the version you compiled against and the dependency versions
you added resolve to the same binaries at runtime.
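A quick way to see which binary actually wins at runtime is to ask the JVM where it loaded the class from. This is a generic sketch: it probes its own class, and in the real project you would substitute the class named in the stack trace (e.g. via Class.forName("org.apache.spark.sql.catalyst.ScalaReflection$")):

```java
import java.security.CodeSource;

// Prints the jar (or directory) a class was loaded from, which tells you
// which version of a library is really on the runtime classpath.
public class WhichJar {

    static String locate(Class<?> cls) {
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        // Classes loaded by the bootstrap loader (the JDK itself) have no CodeSource
        return src == null ? "(bootstrap/JDK class)" : src.getLocation().toString();
    }

    public static void main(String[] args) {
        // In the real project, replace WhichJar.class with the class from the error
        System.out.println(locate(WhichJar.class));
    }
}
```

If the printed path points at a different Spark jar than the one you compiled against, that is the mismatch.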

regards,
vaquar khan



-- 
Regards,
Vaquar Khan
+91 830-851-1500


Re: Spark Getting data from MongoDB in JAVA

2016-06-10 Thread Asfandyar Ashraf Malik
Hi,
I did not notice that I put it twice.
I changed that and ran my program but it still gives the same error:

java.lang.NoSuchMethodError:
org.apache.spark.sql.catalyst.ScalaReflection$.typeOfObject()Lscala/PartialFunction;


Cheers





Re: Spark Getting data from MongoDB in JAVA

2016-06-10 Thread Alonso Isidoro Roman
Why is the spark-mongodb_2.11 dependency written twice in pom.xml?

Alonso Isidoro Roman
about.me/alonso.isidoro.roman




Spark Getting data from MongoDB in JAVA

2016-06-10 Thread Asfandyar Ashraf Malik
Hi,
I am using the Stratio library to get MongoDB to work with Spark, but I get
the following error:

java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.ScalaReflection

This is my code.
---
public static void main(String[] args) {

    JavaSparkContext sc = new JavaSparkContext("local[*]", "test spark-mongodb java");
    SQLContext sqlContext = new SQLContext(sc);

    Map<String, String> options = new HashMap<>();
    options.put("host", "xyz.mongolab.com:59107");
    options.put("database", "heroku_app3525385");
    options.put("collection", "datalog");
    options.put("credentials", ",,");  // value elided in the original post

    DataFrame df = sqlContext.read().format("com.stratio.datasource.mongodb")
            .options(options).load();
    df.registerTempTable("datalog");
    df.show();

}
---
My pom file is as follows:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-catalyst_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>com.stratio.datasource</groupId>
        <artifactId>spark-mongodb_2.11</artifactId>
        <version>0.10.3</version>
    </dependency>
    <dependency>
        <groupId>com.stratio.datasource</groupId>
        <artifactId>spark-mongodb_2.11</artifactId>
        <version>0.10.3</version>
        <type>jar</type>
    </dependency>
</dependencies>


Regards