This is an interesting one.
I have never tried to add --files like this:
spark-submit --master yarn --deploy-mode client --files
/etc/hive/conf/hive-site.xml,/etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
Rather, under $SPARK_HOME/conf, I create soft links to the needed XML files.
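That soft-link setup can be sketched in a few lines of Python (a hypothetical helper; the `link_hadoop_confs` name and the behavior of skipping existing entries are my assumptions, not part of the original suggestion):

```python
import os

def link_hadoop_confs(spark_conf_dir, xml_files):
    """Soft-link Hadoop/Hive XML files into $SPARK_HOME/conf so every
    spark-submit picks them up without passing --files each time."""
    for src in xml_files:
        dst = os.path.join(spark_conf_dir, os.path.basename(src))
        # Skip anything already present (real or linked) in the conf dir.
        if not os.path.islink(dst) and not os.path.exists(dst):
            os.symlink(src, dst)
    return sorted(os.listdir(spark_conf_dir))
```

Pointing it at /etc/hive/conf/hive-site.xml and the Hadoop core/hdfs XMLs mirrors the manual `ln -s` step described above.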
Thanks everyone. I was able to resolve this.
Here is what I did: just passed the conf file using the --files option.
My mistake was reading the JSON conf file before creating the Spark session;
reading it after creating the Spark session fixed it. Thanks once again for your
valuable suggestions.
If code running on the executors needs some local file, like a config file,
then it does have to be passed this way. That much is normal.
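The resolution above hinges on ordering: in YARN cluster mode a file passed via --files is staged into the container's working directory, and real PySpark code would resolve it with `SparkFiles.get("conf.json")` after the session exists. A plain-stdlib sketch of that resolution step (the `read_shipped_conf` helper is an illustrative stand-in, not Spark API):

```python
import json
import os

def read_shipped_conf(name, base_dir=None):
    """Resolve a bare file name against the (container) working directory
    and parse it as JSON -- approximating what SparkFiles.get enables for
    files shipped with --files."""
    base = base_dir if base_dir is not None else os.getcwd()
    with open(os.path.join(base, name)) as f:
        return json.load(f)
```

The key point from the thread is that the config is read only once the SparkSession has been created, so the staged copy of conf.json is the one being opened.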
On Sat, May 15, 2021 at 1:41 AM Gourav Sengupta
wrote:
> Hi,
>
> once again lets start with the requirement. Why are you trying to pass xml
> and json files to
executors.
>
> On Fri, May 14, 2021 at 5:01 PM Longjiang.Yang <longjiang.y...@target.com> wrote:
>
>> Could you check whether this file is accessible in executors? (is it
>> in HDFS or in the client local FS)
>> /appl/common/ftp/conf.json
>>
>> *From: *KhajaAsmath Mohammed
>> *Date: *Friday, May 14, 2021 at 4:50 PM
>> *To: *"user @spark"
>> *Subject: *[EXTERNAL] Urgent Help - Py Spark submit error
>>
>> /appl/common/ftp/conf.json
Hi,
I am having a weird situation where the below command works when the
deploy mode is client and fails if it is cluster.
spark-submit --master yarn --deploy-mode client --files
/etc/hive/conf/hive-site.xml,/etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
--driver-memory 70g
windows
Best Regards,
Vamshi T
From: Raymond Xie
Sent: Sunday, June 17, 2018 5:07 AM
To: user; Hui Xie
Subject: spark-submit Error: Cannot load main class from JAR file
Hello, I am doing the practice in windows now.
I have the jar file generated under:
C:\RXIE\Learning\Scala\spark2practice\target\scala-2.11\spark2practice_2.11-0.1.jar
The package name is Retail_db and the object is GetRevenuePerOrder.
The spark-submit command is:
spark-submit
Hi,
I am on EMR 4.7 with Spark 1.6.1
I am trying to read from s3n buckets in spark
Option 1 :
If I set up
hadoopConf.set("fs.s3.impl", "org.apache.hadoop.fs.s3.S3FileSystem")
hadoopConf.set("fs.s3.awsSecretAccessKey", sys.env("AWS_SECRET_ACCESS_KEY"))
hadoopConf.set("fs.s3.awsAccessKeyId",
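The Option 1 settings above can be collected into one helper before applying them to the Hadoop configuration. This is a sketch: `s3_hadoop_conf` is a made-up name, and since the original message is cut off mid-line, the value for the access-key entry is an assumption that it follows the same environment-variable pattern as the secret key:

```python
import os

def s3_hadoop_conf():
    """Build the fs.s3 settings from the thread as a dict; the
    awsAccessKeyId value is assumed (the original snippet is truncated)."""
    return {
        "fs.s3.impl": "org.apache.hadoop.fs.s3.S3FileSystem",
        "fs.s3.awsSecretAccessKey": os.environ["AWS_SECRET_ACCESS_KEY"],
        "fs.s3.awsAccessKeyId": os.environ["AWS_ACCESS_KEY_ID"],
    }
```

These would then be applied with something like `for k, v in s3_hadoop_conf().items(): hadoopConf.set(k, v)`. Note that on current Hadoop versions the s3a connector (`fs.s3a.*` keys) is the supported route rather than the legacy fs.s3 scheme shown in the thread.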
Hi Guru,
I am executing this on a DataStax Enterprise Spark node, and the ~/.dserc file
exists and contains the Cassandra credentials, but I am still getting the error.
Below is the command:
dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld
--jars
Hi Satish,
Can you add more error or log info to the email?
Guru Medasani
gdm...@gmail.com
On Jul 31, 2015, at 1:06 AM, satish chandra j jsatishchan...@gmail.com
wrote:
Hi,
I have submitted a Spark job with the options jars, class, and master as local, but I am
getting an error as below
dse
Thanks Satish. I only see the INFO messages and don’t see any error messages in
the output you pasted.
Can you paste the log with the error messages?
Guru Medasani
gdm...@gmail.com
On Aug 3, 2015, at 11:12 PM, satish chandra j jsatishchan...@gmail.com
wrote:
Hi Guru,
I am executing
Hi,
I have submitted a Spark job with the options jars, class, and master as *local*, but
I am getting an error as below
*dse spark-submit spark error exception in thread main java.io.ioexception:
Invalid Request Exception(Why you have not logged in)*
*Note: submitting from a DataStax Spark node*
please let me