Re: [Spark] spark client for Hadoop 2.x

2022-04-06 Thread Morven Huang
> We use Hadoop 2.7.7 in our infrastructure currently. > 1) Does Spark have a plan to publish the Spark client dependencies for Hadoop > 2.x? > 2) Are the new Spark clients capable of connecting to the Hadoop 2.x cluster? > (According to a simple test, Spark client 3.2.1 had

[Spark] spark client for Hadoop 2.x

2022-04-06 Thread Amin Borjian
From Spark version 3.1.0 onwards, the clients provided for Spark are built with Hadoop 3 and placed in the Maven repository. Unfortunately, we use Hadoop 2.7.7 in our infrastructure currently. 1) Does Spark have a plan to publish the Spark client dependencies for Hadoop 2.x? 2) Are
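Since no official Hadoop 2.x artifacts exist for these releases, the usual fallback is building Spark yourself. A minimal, hypothetical sketch, assuming your Spark release still ships a hadoop-2.7 build profile (check the build documentation for your exact version):

```shell
# Hypothetical workaround, not an official artifact: build Spark against
# Hadoop 2.7.x yourself. The profile name assumes your Spark release still
# ships a hadoop-2.7 profile; verify against its build docs.
./build/mvn -Phadoop-2.7 -Dhadoop.version=2.7.7 -DskipTests clean package
```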

Re: K8s-Spark client mode : Executor image not able to download application jar from driver

2019-04-28 Thread Nikhil Chinnapa
Thanks for explaining in such detail and pointing to the source code. Yes, it's helpful and cleared up a lot of confusion.

Re: K8s-Spark client mode : Executor image not able to download application jar from driver

2019-04-28 Thread Stavros Kontopoulos
Yes, here is why the initial effort didn't work, explained a bit better. As I mentioned earlier, SparkContext will add your jars/files (declared with the related conf properties) to the FileServer. If it is a jar local to the container's filesystem (it has the scheme local:), it will just be resolved to: file +

Re: K8s-Spark client mode : Executor image not able to download application jar from driver

2019-04-27 Thread Nikhil Chinnapa
Hi Stavros, Thanks a lot for pointing me in the right direction. I got stuck in a release, so I didn't get time earlier. The mistake was “LINUX_APP_RESOURCE”: I was using “local” when it should have been “file”. I got there thanks to your email. What I understood: Driver image : $SPARK_HOME/bin
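The local: versus file: distinction that tripped Nikhil up can be modeled outside Spark. A hypothetical Python sketch (the function name and return labels are mine, not Spark's API) of how the resource URI's scheme decides who supplies the jar, as described in this thread:

```python
from urllib.parse import urlparse

# Toy model of the scheme-based decision: "local:" means the jar is already
# on the container's filesystem; "file:" means it lives on the driver's
# filesystem and is served to executors by the driver's file server.
def resource_strategy(uri: str) -> str:
    scheme = urlparse(uri).scheme or "file"   # bare paths behave like file:
    if scheme == "local":
        return "use-container-path"   # executor image must already contain it
    if scheme == "file":
        return "fetch-from-driver"    # driver's file server ships it
    return "fetch-remote"             # e.g. http://, hdfs://

print(resource_strategy("local:///opt/spark/app.jar"))  # use-container-path
print(resource_strategy("file:///home/me/app.jar"))     # fetch-from-driver
```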

Re: K8s-Spark client mode : Executor image not able to download application jar from driver

2019-04-19 Thread Stavros Kontopoulos
Hi Nikhil, The application jar by default is added to spark.jars, so it is fetched by executors when tasks are launched (behind the scenes, SparkContext

K8s-Spark client mode : Executor image not able to download application jar from driver

2019-04-16 Thread Nikhil Chinnapa
Environment: Spark 2.4.0, Kubernetes 1.14. Query: Does the application jar need to be part of both the Driver and Executor images? Invocation point (from Java code): sparkLaunch = new SparkLauncher() .setMaster(LINUX_MASTER)
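For comparison with the SparkLauncher call above, a hedged sketch of the equivalent spark-submit command for Spark 2.4 client mode on Kubernetes (the API-server URL, image name, class, and jar path are all placeholders). Note the file:// scheme on the app jar, which lets the driver's file server hand the jar to executors:

```shell
# Hedged sketch only; substitute your own API server, image, class, and jar.
spark-submit \
  --master k8s://https://kube-apiserver:6443 \
  --deploy-mode client \
  --conf spark.kubernetes.container.image=myrepo/spark:2.4.0 \
  --class com.example.Main \
  file:///opt/app/app.jar
```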

Re: Interest in adding ability to request GPU's to the spark client?

2018-07-23 Thread Susan X. Huynh
> On Wed, 16 May 2018 at 2:58, Daniel Galvez () wrote: >> Hi all, >> Is anyone here interested in adding the ability to request GPUs to >> Spark's client (i.e., spark-submit)? As of now, Yarn 3.0's resource manager >> se

Re: Interest in adding ability to request GPU's to the spark client?

2018-07-12 Thread Mich Talebzadeh
> Spark's client (i.e., spark-submit)? As of now, Yarn 3.0's resource manager > server has the ability to schedule GPUs as resources via cgroups, but the > Spark client lacks an ability to request these. > > The ability to guarantee GPU resources would be practically useful for

Re: Interest in adding ability to request GPU's to the spark client?

2018-07-12 Thread Maximiliano Felice
> Spark's client (i.e., spark-submit)? As of now, Yarn 3.0's resource manager > server has the ability to schedule GPUs as resources via cgroups, but the Spark > client lacks an ability to request these. > > The ability to guarantee GPU resources would be practically useful for my > organization. Ri

Interest in adding ability to request GPU's to the spark client?

2018-05-15 Thread Daniel Galvez
Hi all, Is anyone here interested in adding the ability to request GPUs to Spark's client (i.e., spark-submit)? As of now, Yarn 3.0's resource manager server has the ability to schedule GPUs as resources via cgroups, but the Spark client lacks an ability to request these. The ability to guarantee
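For readers arriving at this thread later: generic resource scheduling, including GPUs, did land in Spark 3.x. A hedged sketch of a GPU request (property names as documented for Spark 3.x; the discovery script path is a placeholder you must supply, and YARN itself must also be configured for GPU resources):

```shell
# Sketch under the assumptions above; verify property names against the
# configuration docs for your Spark 3.x release.
spark-submit \
  --master yarn \
  --conf spark.executor.resource.gpu.amount=1 \
  --conf spark.task.resource.gpu.amount=1 \
  --conf spark.executor.resource.gpu.discoveryScript=/path/to/getGpus.sh \
  app.jar
```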

Can I have two different receivers for my Spark client program?

2016-11-30 Thread kant kodali
Hi All, I am wondering if it makes sense to have two receivers inside my Spark client program? The use case is as follows. 1) We have to support a feed from Kafka, so this will be direct receiver #1. We need to perform batch inserts from the Kafka feed into Cassandra. 2) A gRPC receiver where we
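The two-receiver shape kant describes can be sketched without Spark at all. A toy Python stand-in (all names and records are invented) where a Kafka-like feed and a gRPC-like feed fill one buffer that a single driver loop drains for a batch insert:

```python
import queue
import threading

# Toy model only: two independent receivers feeding one buffer.
buffer = queue.Queue()

def kafka_receiver():
    for msg in ["k1", "k2"]:              # pretend Kafka records
        buffer.put(("kafka", msg))

def grpc_receiver():
    for msg in ["g1"]:                    # pretend gRPC messages
        buffer.put(("grpc", msg))

threads = [threading.Thread(target=kafka_receiver),
           threading.Thread(target=grpc_receiver)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# "Batch insert" step: drain everything received so far.
batch = []
while not buffer.empty():
    batch.append(buffer.get())
print(len(batch))  # 3
```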

Re: How to set up a Spark Client node?

2015-06-15 Thread ayan guha
workers in the UI, which typically starts on Master URL:8080. Once you do that, you follow Akhil's instruction above to get a sqlContext, set the master property properly, and run your app. HTH On Mon, Jun 15, 2015 at 7:02 PM, Akhil Das ak...@sigmoidanalytics.com wrote: I'm assuming by spark-client you

Re: How to set up a Spark Client node?

2015-06-15 Thread Akhil Das
I'm assuming by spark-client you mean the Spark driver program. In that case you can pick any machine (say Node 7), create your driver program on it, and use spark-submit to submit it to the cluster; or you can create the SparkContext within your driver program (specifying all the properties
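Akhil's suggestion, as a hedged spark-submit sketch (the host, class name, and jar path are placeholders, not values from this thread):

```shell
# Run from any node that can reach the master, e.g. Node 7.
spark-submit \
  --class com.example.Main \
  --master spark://master-host:7077 \
  --deploy-mode client \
  /path/to/app.jar
```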

How to set up a Spark Client node?

2015-06-13 Thread MrAsanjar .
instructions on how to set up a Spark client in cluster mode? I am not sure if I am doing it right. Thanks in advance

Re: Spark Client

2015-06-03 Thread Akhil Das
Actually, we are looking to launch a Spark job from a long-running workflow manager, which invokes the Spark client via SparkSubmit. Unfortunately, the client upon successful completion of the application exits with System.exit(0), or System.exit(NON_ZERO) when there is a failure. Question

Spark Client

2015-06-03 Thread pavan kumar Kolamuri
Hi guys, I am new to Spark. I am using SparkSubmit to submit Spark jobs, but for my use case I don't want it to exit with System.exit. Is there any other Spark client that is API-friendly, other than SparkSubmit, and doesn't exit with System.exit? Please correct me if I am missing

Re: Spark Client

2015-06-03 Thread pavan kumar Kolamuri
Hi Akhil, sorry, I may not be conveying the question properly. Actually, we are looking to launch a Spark job from a long-running workflow manager, which invokes the Spark client via SparkSubmit. Unfortunately, the client upon successful completion of the application exits with System.exit(0

Re: Spark Client

2015-06-03 Thread Akhil Das
it to exit with System.exit. Is there any other Spark client that is API-friendly, other than SparkSubmit, and doesn't exit with System.exit? Please correct me if I am missing something. Thanks in advance -- Regards Pavan Kumar Kolamuri

Re: Spark Client

2015-06-03 Thread Richard Marscher
to launch a Spark job from a long-running workflow manager, which invokes the Spark client via SparkSubmit. Unfortunately, the client upon successful completion of the application exits with System.exit(0), or System.exit(NON_ZERO) when there is a failure. The question is, is there an alternate API, though

Re: Spark Client

2015-06-03 Thread pavan kumar Kolamuri
are looking to launch a Spark job from a long-running workflow manager, which invokes the Spark client via SparkSubmit. Unfortunately, the client upon successful completion of the application exits with System.exit(0), or System.exit(NON_ZERO) when there is a failure. The question is, is there an alternate API
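The usual workaround for this thread's problem is to keep SparkSubmit out of the workflow manager's own process, so its System.exit terminates only a child. A toy Python sketch of the child-process idea, with echo standing in for spark-submit (the real command line is a placeholder in the comment):

```python
import subprocess
import sys

# "echo" stands in for spark-submit here; in real use cmd would be
# something like ["spark-submit", "--class", "com.example.Main", "app.jar"].
cmd = ["echo", "pretend-spark-submit"]
result = subprocess.run(cmd, capture_output=True, text=True)
if result.returncode == 0:
    print("job finished ok")            # parent process keeps running
else:
    print(f"job failed with exit code {result.returncode}", file=sys.stderr)
```

On the JVM, the SparkLauncher API (used in the Kubernetes thread earlier in this list) gives the same isolation with a typed builder instead of a raw subprocess.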

Re: How to send user variables from Spark client to custom InputFormat or RecordReader ?

2015-02-22 Thread Tom Vacek
I have written a custom InputFormat and RecordReader for Spark, and I need to use user variables from the Spark client program. I added them in SparkConf: val sparkConf = new SparkConf().setAppName(args(0)).set("developer", "MyName") *and in the InputFormat class* protected boolean isSplitable

How to send user variables from Spark client to custom InputFormat or RecordReader ?

2015-02-22 Thread hnahak
Hi, I have written a custom InputFormat and RecordReader for Spark, and I need to use user variables from the Spark client program. I added them in SparkConf: val sparkConf = new SparkConf().setAppName(args(0)).set("developer", "MyName") *and in the InputFormat class* protected boolean isSplitable

Re: How to send user variables from Spark client to custom InputFormat or RecordReader ?

2015-02-22 Thread haihar nahak
variables from the Spark client program. I added them in SparkConf: val sparkConf = new SparkConf().setAppName(args(0)).set("developer", "MyName") *and in the InputFormat class* protected boolean isSplitable(JobContext context, Path filename) { System.out.println

Re: How to send user variables from Spark client to custom InputFormat or RecordReader ?

2015-02-22 Thread hnahak
Instead of setting it in SparkConf, set it with sc.hadoopConfiguration.set(key, value) and extract the same key from the JobContext. --Harihar
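Harihar's point can be modeled with two plain dictionaries standing in for SparkConf and sc.hadoopConfiguration (a toy sketch, not Spark or Hadoop API): SparkConf settings never reach a custom InputFormat, which only sees the Hadoop JobContext, while values placed on the Hadoop configuration travel with the job.

```python
# Toy model only; the dicts stand in for SparkConf and hadoopConfiguration.
spark_conf = {"developer": "MyName"}   # visible to Spark code only
hadoop_conf = {}                       # travels to the InputFormat

# Driver side: copy the user variable onto the Hadoop configuration.
hadoop_conf["developer"] = spark_conf["developer"]

# InputFormat side: only the JobContext (here: hadoop_conf) is available.
def is_splitable(job_context: dict) -> bool:
    print(job_context.get("developer"))  # prints MyName
    return True

is_splitable(hadoop_conf)
```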

Hadoop 2.X Spark Client Jar 0.9.0 problem

2014-04-04 Thread Erik Freed
Hi All, I am not sure if this is a 0.9.0 problem to be fixed in 0.9.1, so perhaps already being addressed, but I am having a devil of a time with a Spark 0.9.0 client jar for Hadoop 2.X. If I go to the site and download: - Download binaries for Hadoop 2 (HDP2, CDH5): find an Apache mirror

Re: Hadoop 2.X Spark Client Jar 0.9.0 problem

2014-04-04 Thread Rahul Singhal
user@spark.apache.org Date: Friday 4 April 2014 7:58 PM To: user@spark.apache.org Subject: Hadoop 2.X Spark Client Jar 0.9.0 problem Hi All, I am not sure if this is a 0.9.0 problem to be fixed in 0.9.1 so

Re: Hadoop 2.X Spark Client Jar 0.9.0 problem

2014-04-04 Thread Amit Tewari
I believe you need to set the following: SPARK_HADOOP_VERSION=2.2.0 (or whatever your version is) and SPARK_YARN=true, then type sbt/sbt assembly. If you are using Maven to compile: mvn -Pyarn -Dhadoop.version=2.2.0 -Dyarn.version=2.2.0 -DskipTests clean package Hope this helps -A On Fri, Apr 4, 2014
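Amit's steps above, collected as a copy-pasteable sketch (2.2.0 is his example version; substitute your own Hadoop version):

```shell
# sbt build against a chosen Hadoop version, with YARN support:
export SPARK_HADOOP_VERSION=2.2.0
export SPARK_YARN=true
sbt/sbt assembly

# or the equivalent Maven build:
mvn -Pyarn -Dhadoop.version=2.2.0 -Dyarn.version=2.2.0 -DskipTests clean package
```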

Re: Hadoop 2.X Spark Client Jar 0.9.0 problem

2014-04-04 Thread Erik Freed
...@codecision.com Reply-To: user@spark.apache.org Date: Friday 4 April 2014 7:58 PM To: user@spark.apache.org Subject: Hadoop 2.X Spark Client Jar 0.9.0 problem Hi All, I am not sure if this is a 0.9.0 problem to be fixed in 0.9.1 so perhaps already