Are there installation instructions for Spark 3.4.1?
I defined SPARK_HOME as described here:
https://spark.apache.org/docs/latest/api/python/getting_started/install.html
ls $SPARK_HOME/python/lib
py4j-0.10.9.7-src.zip PY4J_LICENSE.txt pyspark.zip
but I am still getting a class-not-found error.
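A common cause of class-not-found failures with a manually unpacked Spark is that PYTHONPATH does not include the pyspark and py4j zips from $SPARK_HOME/python/lib. A minimal sketch, assuming Spark 3.4.1 was extracted to /opt/spark (hypothetical path; adjust to your install):

```shell
# Hypothetical install location for Spark 3.4.1; adjust as needed.
export SPARK_HOME=/opt/spark
# PySpark needs both the python dir and the py4j source zip on PYTHONPATH.
export PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.9.7-src.zip:$PYTHONPATH"
# Verify the zips are where PySpark expects them.
ls "$SPARK_HOME/python/lib"
echo "$PYTHONPATH"
```

Alternatively, `pip install pyspark==3.4.1` into a virtualenv avoids the PYTHONPATH setup entirely, since pip places the packages on the Python path for you.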
Hi Mich,
It's not specific to ORC; it looks like a bug in the Hadoop Common project.
I have raised a bug and am happy to contribute a fix against Hadoop 3.3.0. Do
you know anyone who could help me set the Assignee?
https://issues.apache.org/jira/browse/HADOOP-18856
With Best Regards,
Dipayan Dev