On Tue, Sep 13, 2022, 21:23 Sachit Murarka <connectsac...@gmail.com> wrote:

> Hi Vibhor,
>
> Thanks for your response!
>
> There are some properties, such as shuffle partitions, that can still be
> set after the Spark session is created without changing the flag
> "spark.sql.legacy.setCommandRejectsSparkCoreConfs". Any idea why that is?
>
> Kind Regards,
> Sachit Murarka
>
>
> On Tue, Sep 13, 2022 at 7:14 PM Vibhor Gupta <vibhor.gu...@walmart.com>
> wrote:
>
>> Hi Sachit,
>>
>> Check the migration guide.
>>
>> https://spark.apache.org/docs/latest/sql-migration-guide.html#:~:text=Spark%202.4%20and%20below%3A%20the,legacy.setCommandRejectsSparkCoreConfs%20to%20false.
>> Also I think it is better to set this property before starting the
>> SparkContext.
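For instance, the same property can be supplied at submit time or in the cluster defaults, so it is in place before the SparkContext exists (a sketch; the application file name is a placeholder):

```
# at submit time:
spark-submit --conf spark.network.timeout=1200s your_app.py

# or persistently, in conf/spark-defaults.conf:
spark.network.timeout    1200s
```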
>>
>> Regards,
>> Vibhor
>>
>> ------------------------------
>> *From:* Sachit Murarka <connectsac...@gmail.com>
>> *Sent:* Tuesday, September 13, 2022 5:14 PM
>> *To:* spark users <user@spark.apache.org>
>> *Subject:* EXT: Network time out property is not getting set in Spark
>>
>>
>> Hello Everyone,
>>
>> I am trying to set the network timeout property. It used to work in
>> Spark 2.x, but in Spark 3 it fails with the error below.
>>
>> Could you please suggest whether this is due to a bug in Spark 3 or
>> whether a different property is needed? As per the official Spark docs,
>> this property is unchanged.
>>
>> spark.conf.set("spark.network.timeout", "1200s")
>>
>> org.apache.spark.sql.AnalysisException: Cannot modify the value of a
>> Spark config: spark.network.timeout
>>
>>   at
>> org.apache.spark.sql.errors.QueryCompilationErrors$.cannotModifyValueOfSparkConfigError(QueryCompilationErrors.scala:2322)
>>
>>   at
>> org.apache.spark.sql.RuntimeConfig.requireNonStaticConf(RuntimeConfig.scala:157)
>>
>>   at org.apache.spark.sql.RuntimeConfig.set(RuntimeConfig.scala:41)
>>
>>
>> Kind Regards,
>> Sachit Murarka
>>
>
