[ https://issues.apache.org/jira/browse/SPARK-30467?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17089517#comment-17089517 ]

Prashant Sharma commented on SPARK-30467:
-----------------------------------------

As already mentioned here, this is not a Spark issue; it is an issue with the 
JVM in use. I was able to run several FIPS-compliant configurations 
successfully; see [Blog 
link|https://github.com/ScrapCodes/FIPS-compliance/blob/master/blogs/spark-meets-fips.md]. 
The sketch below shows the kind of settings involved.
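
For reference, a minimal sketch of the configuration under discussion, in 
spark-defaults.conf form. The SHA-256 key-factory value is an assumption, one 
commonly tried on FIPS stacks; whether the JVM accepts it depends entirely on 
the JCE provider in use:

    # spark-defaults.conf (sketch; exact values depend on your JVM/provider)
    spark.network.crypto.enabled              true
    # Spark's default is PBKDF2WithHmacSHA1. PBKDF2WithHmacSHA256 is a
    # standard JCE algorithm name and a common substitution on FIPS-enabled
    # stacks (assumption; not confirmed by this issue).
    spark.network.crypto.keyFactoryAlgorithm  PBKDF2WithHmacSHA256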

Based on this, I am closing this issue as cannot-reproduce. Feel free to 
reopen if you can provide more detail and a way to reproduce.
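
If anyone hits this again, a quick way to gather that detail is to ask the 
JVM which SecretKeyFactory algorithms its providers actually expose. A 
minimal, self-contained Scala sketch (class name and output text are 
illustrative, not from this issue):

    import java.security.Security
    import javax.crypto.SecretKeyFactory

    object FipsKeyFactoryCheck {
      def main(args: Array[String]): Unit = {
        // List every SecretKeyFactory algorithm the installed JCE providers expose.
        for (provider <- Security.getProviders) {
          val services = provider.getServices.iterator()
          while (services.hasNext) {
            val service = services.next()
            if (service.getType == "SecretKeyFactory")
              println(s"${provider.getName}: ${service.getAlgorithm}")
          }
        }

        // Probe Spark's default key factory algorithm directly.
        try {
          SecretKeyFactory.getInstance("PBKDF2WithHmacSHA1")
          println("PBKDF2WithHmacSHA1 is available in this JVM")
        } catch {
          case _: java.security.NoSuchAlgorithmException =>
            println("PBKDF2WithHmacSHA1 is NOT available in this JVM")
        }
      }
    }

Running this under the FIPS-enabled JVM shows whether PBKDF2WithHmacSHA1 
(Spark's default) is registered at all, which separates a Spark configuration 
problem from a missing JCE algorithm.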

> On Federal Information Processing Standard (FIPS) enabled cluster, Spark 
> Workers are not able to connect to Remote Master.
> --------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-30467
>                 URL: https://issues.apache.org/jira/browse/SPARK-30467
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>    Affects Versions: 2.3.3, 2.3.4, 2.4.4
>            Reporter: SHOBHIT SHUKLA
>            Priority: Major
>              Labels: security
>
> On _*Federal Information Processing Standard*_ (FIPS) enabled clusters, if we 
> configure *spark.network.crypto.enabled true*, Spark Workers are not able to 
> create a Spark Context because communication between the Spark Worker and the 
> Spark Master fails.
> The default algorithm ( *_spark.network.crypto.keyFactoryAlgorithm_* ) is 
> *_PBKDF2WithHmacSHA1_*, which is not a FIPS-approved cryptographic algorithm. 
> We tried many values from the list of FIPS-approved cryptographic algorithms, 
> but none of them worked either.
> *Error logs:*
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
> setLogLevel(newLevel).
> *fips.c(145): OpenSSL internal error, assertion failed: FATAL FIPS SELFTEST 
> FAILURE*
> JVMDUMP039I Processing dump event "abort", detail "" at 2020/01/09 06:41:50 - 
> please wait.
> JVMDUMP032I JVM requested System dump using 
> '<SPARK_HOME>/bin/core.20200109.064150.283.0001.dmp' in response to an event
> JVMDUMP030W Cannot write dump to file 
> <SPARK_HOME>/bin/core.20200109.064150.283.0001.dmp: Permission denied
> JVMDUMP012E Error in System dump: The core file created by child process with 
> pid = 375 was not found. Expected to find core file with name 
> "/var/cores/core-netty-rpc-conne-sig11-user1000320999-group0-pid375-time*"
> JVMDUMP030W Cannot write dump to file 
> <SPARK_HOME>/bin/javacore.20200109.064150.283.0002.txt: Permission denied
> JVMDUMP032I JVM requested Java dump using 
> '/tmp/javacore.20200109.064150.283.0002.txt' in response to an event
> JVMDUMP010I Java dump written to /tmp/javacore.20200109.064150.283.0002.txt
> JVMDUMP032I JVM requested Snap dump using 
> '<SPARK_HOME>/bin/Snap.20200109.064150.283.0003.trc' in response to an event
> JVMDUMP030W Cannot write dump to file 
> <SPARK_HOME>/bin/Snap.20200109.064150.283.0003.trc: Permission denied
> JVMDUMP010I Snap dump written to /tmp/Snap.20200109.064150.283.0003.trc
> JVMDUMP030W Cannot write dump to file 
> <SPARK_HOME>/bin/jitdump.20200109.064150.283.0004.dmp: Permission denied
> JVMDUMP007I JVM Requesting JIT dump using 
> '/tmp/jitdump.20200109.064150.283.0004.dmp'
> JVMDUMP010I JIT dump written to /tmp/jitdump.20200109.064150.283.0004.dmp
> JVMDUMP013I Processed dump event "abort", detail "".


