Hi all,

The examples listed here, https://spark.apache.org/examples.html, refer to the Spark context as "spark", but when I run the Spark Shell it exposes the SparkContext as "sc". Am I missing something? (A rough sketch of what I mean is below my signature.)

Thanks!
RJ

--
em [email protected]
c 954.496.2314
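For reference, a minimal sketch of how I understand the two names, assuming a recent spark-shell (Spark 2.0+) where both bindings are pre-created at startup; the builder call and the "Example" app name below are just placeholders for how the examples' "spark" would be set up in a standalone application:

    // In spark-shell (Spark 2.0+), both names are created at startup:
    sc                   // org.apache.spark.SparkContext
    spark                // org.apache.spark.sql.SparkSession
    spark.sparkContext   // the same SparkContext, reachable from the session

    // In a standalone app, a "spark" like the one in the examples is built explicitly
    // (the appName "Example" is just a placeholder):
    import org.apache.spark.sql.SparkSession
    val session = SparkSession.builder().appName("Example").getOrCreate()
    val ctx = session.sparkContext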