Hi Team,

    I am new to Spark and writing my first program. I have written a sample
program with the Spark master set to local. To execute Spark over a local
YARN cluster, what should the value of the spark.master property be? Can I
point it to a remote YARN cluster? I would like to execute this as a plain
Java application rather than submitting it with spark-submit. I have also
sketched the yarn variant I was imagining, after the snippet below.

The main objective is to create a service that can execute Spark SQL
queries over a YARN cluster.


Thanks in advance.

Regards,
Atul


Code snippet below:
----------------

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class TestSpark {

    public static void main(String[] args) {
        // Build a session with a local master; this is the property I am
        // asking about -- what should it be set to for YARN?
        SparkSession spark = SparkSession.builder()
                .appName("Java Spark Sql Example")
                .config("spark.master", "local")
                .getOrCreate();

        // Read a sample JSON file into a DataFrame
        Dataset<Row> df = spark.read().json("/spark/sparkSql/employee.json");

        System.out.println("Data----");
        df.cache();
        df.show();
    }
}
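
For reference, below is the variant I was imagining for yarn -- I am not
sure the values are right, which is part of my question. I am assuming the
HADOOP_CONF_DIR (or YARN_CONF_DIR) environment variable has to point at the
cluster's Hadoop config files so the session can locate the ResourceManager
(spark.master = "yarn" itself carries no address), and that deploy mode has
to be client when the driver runs inside my own service JVM.
----------------

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class TestSparkYarn {

    public static void main(String[] args) {
        // Assumption: HADOOP_CONF_DIR / YARN_CONF_DIR must point at the
        // cluster's core-site.xml / yarn-site.xml before this JVM starts.
        SparkSession spark = SparkSession.builder()
                .appName("Java Spark Sql Service")
                .master("yarn")                              // is "yarn" the right value here?
                .config("spark.submit.deployMode", "client") // keep the driver in this JVM
                .getOrCreate();

        // The end goal: serve Spark SQL queries over the YARN cluster
        Dataset<Row> df = spark.read().json("/spark/sparkSql/employee.json");
        df.createOrReplaceTempView("employee");
        spark.sql("SELECT * FROM employee").show();
    }
}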
