Re: Ignite with Spark Integration

2019-12-12 Thread Andrei Aleksandrov

Hi,
In Spark, you can use the following options:

 * spark.driver.extraJavaOptions
 * spark.executor.extraJavaOptions

You can pass your Ignite JVM options there, e.g. -DIGNITE_QUIET=false.
Generally, clients will be started on executors during data loading or
data reading, but you can also start them on the driver side using
Ignition.start().
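For example, a spark-shell invocation might look like the sketch below. Note that Spark itself rejects heap settings (-Xmx) inside spark.executor.extraJavaOptions, so the heap sizes go through the memory settings instead; the 4g values are placeholders to adjust for your environment:

```shell
# Sketch: raise the driver/executor heap via Spark's memory settings and
# pass an Ignite system property to the JVMs that host the client nodes.
spark-shell \
  --conf spark.driver.memory=4g \
  --conf spark.executor.memory=4g \
  --conf "spark.driver.extraJavaOptions=-DIGNITE_QUIET=false" \
  --conf "spark.executor.extraJavaOptions=-DIGNITE_QUIET=false"
```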


BR,
Andrei

On 12/12/2019 12:17 PM, datta wrote:

I stopped the client node on the machine where the Spark worker node was
running and started the Spark shell. It started an Ignite client node from
within.

I have only one problem: how to specify Ignite JVM options from Spark. It
is taking the default -Xms and -Xmx arguments, which are too low.



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Ignite with Spark Integration

2019-12-12 Thread datta
I stopped the client node on the machine where the Spark worker node was
running and started the Spark shell. It started an Ignite client node from
within.

I have only one problem: how to specify Ignite JVM options from Spark. It
is taking the default -Xms and -Xmx arguments, which are too low.





Re: Ignite with Spark Integration

2019-12-09 Thread sri hari kali charan Tummala
I integrated Spark and Ignite using the thrift client. Will that do?

https://github.com/kali786516/ApacheIgnitePoc/blob/master/src/main/scala/com/ignite/examples/spark/SparkClientConnectionTest.scala#L73



On Monday, December 9, 2019, Denis Magda  wrote:

> Hi, just ensure that "clientMode" is set in the IgniteConfiguration that you
> pass to Spark's IgniteContext object. The Spark worker will spin up a client
> node automatically for you, and that node will reach out to the server
> (assuming you properly configured the Ignite discovery SPI in the same
> IgniteConfiguration).
>
> -
> Denis
>
>
> On Sat, Dec 7, 2019 at 10:46 AM datta  wrote:
>
>> Hi,
>>
>> Then what I have currently implemented is hopefully not embedded mode,
>> is it?
>>
>> Also, I wanted to know if I should install client nodes on the Spark worker
>> nodes if Spark is going to start a client node itself?
>>
>>
>>
>>
>

-- 
Thanks & Regards
Sri Tummala


Re: Ignite with Spark Integration

2019-12-09 Thread Denis Magda
Hi, just ensure that "clientMode" is set in the IgniteConfiguration that you
pass to Spark's IgniteContext object. The Spark worker will spin up a client
node automatically for you, and that node will reach out to the server
(assuming you properly configured the Ignite discovery SPI in the same
IgniteConfiguration).
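A minimal sketch of that setup (assuming the ignite-spark module is on the classpath; the discovery address and app name are placeholders):

```scala
import org.apache.ignite.configuration.IgniteConfiguration
import org.apache.ignite.spark.IgniteContext
import org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi
import org.apache.ignite.spi.discovery.tcp.ipfinder.vm.TcpDiscoveryVmIpFinder
import org.apache.spark.sql.SparkSession

import scala.collection.JavaConverters._

object IgniteContextSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("ignite-example").getOrCreate()

    // Configuration closure: it runs on each worker, so every Spark executor
    // starts its own *client* node that discovers the standalone servers.
    val igniteCfg = () => {
      val ipFinder = new TcpDiscoveryVmIpFinder()
      // Placeholder address of the standalone Ignite server(s).
      ipFinder.setAddresses(List("ignite-server-host:47500..47509").asJava)
      new IgniteConfiguration()
        .setClientMode(true)
        .setDiscoverySpi(new TcpDiscoverySpi().setIpFinder(ipFinder))
    }

    val ic = new IgniteContext(spark.sparkContext, igniteCfg)
    // ... use ic.fromCache(...) etc., then:
    ic.close(false) // false = leave the server nodes running
  }
}
```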

-
Denis


On Sat, Dec 7, 2019 at 10:46 AM datta  wrote:

> Hi,
>
> Then what I have currently implemented is hopefully not embedded mode
> is it?
>
> Also I wanted to know if should install client nodes on spark worker nodes
> if spark is going to start a client node itself ?
>
>
>
>


Re: Ignite with Spark Integration

2019-12-06 Thread datta
Hi,

Then what I have currently implemented is hopefully not embedded mode,
is it?

Also, I wanted to know if I should install client nodes on the Spark worker
nodes if Spark is going to start a client node itself?





Re: Ignite with Spark Integration

2019-12-06 Thread Stephen Darlington
That’s just how the Spark integration works! 

I suppose you could use Spark’s JDBC support to access Ignite, but you’d
lose some of the flexibility.
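For reference, a sketch of that JDBC route (assuming the Ignite thin JDBC driver jar is on Spark's classpath; the host and table name are placeholders):

```scala
import org.apache.spark.sql.SparkSession

object IgniteJdbcSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("ignite-jdbc").getOrCreate()

    // Read an Ignite SQL table through the thin JDBC driver instead of the
    // native integration; no Ignite client node starts on the Spark side.
    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:ignite:thin://ignite-server-host:10800") // placeholder host
      .option("driver", "org.apache.ignite.IgniteJdbcThinDriver")
      .option("dbtable", "PERSON") // placeholder table
      .load()

    df.show()
  }
}
```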

Regards,
Stephen

> On 6 Dec 2019, at 17:04, datta  wrote:
> 
> Hi,
> 
> I have installed Ignite on 2 machines:
> 
> 1 as server and 1 as client.
> 
> On the client machine I have installed Spark and copied the required
> Ignite jars into SPARK_HOME's jars folder.
> 
> The problem is that when I start Spark and try to read a table in Ignite
> using the Spark DataFrame API, it starts another client node from Spark.
> 
> How do I prevent this from happening?
> 
> 
> Ignite version: 2.5.0
> Spark version: 2.2.0
> 
> FYI, I am using the Hortonworks Ambari platform. As I could not upgrade
> Spark to 2.3.0 (it would break other things), I had to use Ignite 2.5.0,
> which supports Spark 2.2.0.
> 
> 
> 




Ignite with Spark Integration

2019-12-06 Thread datta
Hi,

I have installed Ignite on 2 machines:

1 as server and 1 as client.

On the client machine I have installed Spark and copied the required
Ignite jars into SPARK_HOME's jars folder.

The problem is that when I start Spark and try to read a table in Ignite
using the Spark DataFrame API, it starts another client node from Spark.

How do I prevent this from happening?


Ignite version: 2.5.0
Spark version: 2.2.0

FYI, I am using the Hortonworks Ambari platform. As I could not upgrade
Spark to 2.3.0 (it would break other things), I had to use Ignite 2.5.0,
which supports Spark 2.2.0.


