Hi Team,

    I am new to Spark and am writing my first program. I have a sample
program working with the Spark master set to local. To execute Spark over a
local YARN cluster, what should the value of the spark.master property be?
Can I also point it at a remote YARN cluster? I would like to run this as a
plain Java application rather than submitting it with spark-submit.

My main objective is to create a service that can execute Spark SQL queries
over a YARN cluster.
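
For example, a single request to the service would boil down to something
like the sketch below. The file path, view name, and query are just
placeholders I made up, not a real schema:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SqlQueryService {
        // Registers a JSON file as a temporary view and runs an ad-hoc SQL
        // query against it. Path, view name, and query are placeholders.
        public static Dataset<Row> runQuery(SparkSession spark, String query) {
                spark.read().json("/spark/sparkSql/employee.json")
                        .createOrReplaceTempView("employee");
                return spark.sql(query);
        }
}

A caller would then do something like
runQuery(spark, "SELECT name FROM employee").show();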


Thanks in advance.

Regards,
Atul
 

The code snippet is below (currently running with a local master):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class TestSpark {
        public static void main(String[] args) {
                // Works today with a local master; this is the value I want
                // to change so the job runs on the YARN cluster instead.
                SparkSession spark = SparkSession.builder()
                        .appName("Java Spark Sql Example")
                        .config("spark.master", "local")
                        .getOrCreate();

                // Read a JSON file into a DataFrame, cache it, and print it.
                Dataset<Row> df = spark.read().json("/spark/sparkSql/employee.json");
                System.out.println("Data----");
                df.cache();
                df.show();

                // Earlier attempt with the RDD API; I understand "yarn-client"
                // as a master value is deprecated in Spark 2.x.
                // JavaSparkContext sc = new JavaSparkContext(new
                //         SparkConf().setAppName("SparkJoins").setMaster("yarn-client"));

                spark.stop();
        }
}
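
For the YARN part, what I was planning to try is below. This is only my
guess: I am assuming the master value in Spark 2.x is simply "yarn" (with
"yarn-client"/"yarn-cluster" deprecated in favour of
spark.submit.deployMode), that HADOOP_CONF_DIR or YARN_CONF_DIR in this
JVM's environment must point at the remote cluster's configuration files,
and that spark.yarn.jars has to point at Spark jars staged on HDFS; the
HDFS path below is made up.

import org.apache.spark.sql.SparkSession;

public class TestSparkOnYarn {
        public static void main(String[] args) {
                // Assumption: HADOOP_CONF_DIR / YARN_CONF_DIR are set in this
                // JVM's environment and point at the remote cluster's
                // core-site.xml / yarn-site.xml.
                SparkSession spark = SparkSession.builder()
                        .appName("Java Spark Sql Example")
                        // My guess: "yarn" instead of "local".
                        .master("yarn")
                        // Driver runs inside this JVM (client mode).
                        .config("spark.submit.deployMode", "client")
                        // Guess: Spark jars staged on HDFS so the YARN
                        // containers can fetch them; path is a placeholder.
                        .config("spark.yarn.jars", "hdfs:///spark/jars/*.jar")
                        .getOrCreate();

                spark.read().json("/spark/sparkSql/employee.json").show();
                spark.stop();
        }
}

Is this the right direction, or is spark-submit effectively required to
launch on YARN?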



