Hi,
we want to execute Spark code without submitting an application.jar, like this code:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SparkTest {
    public static void main(String[] args) throws Exception {
        // Build a local SparkSession; in the IDE this runs without spark-submit
        SparkSession spark = SparkSession
                .builder()
                .master("local[*]")
                .appName("spark test")
                .getOrCreate();

        // Read the CSV without an explicit schema and inspect it
        Dataset<Row> testData = spark.read().csv(".\\src\\main\\java\\Resources\\no_schema_iris.scv");
        testData.printSchema();
        testData.show();
    }
}

The above code works well in the IDE; we do not need to generate a jar file and submit it. But if we replace master("local[*]") with master("yarn"), it does not work. Is there a way to use a cluster SparkSession the same way as a local SparkSession? We need to dynamically execute Spark code in a web server according to the incoming request, e.g. a filter request will call dataset.filter(), so there is no application.jar to submit.
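To make this concrete, here is a rough sketch of what we have in mind inside the web server (the class, the handleFilterRequest method, and its column/value parameters are placeholders we made up for illustration):

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.functions;

public class DynamicSparkService {
    // Hypothetical handler: the column name and value would come from the HTTP request
    public static Dataset<Row> handleFilterRequest(SparkSession spark,
                                                   String csvPath,
                                                   String column,
                                                   String value) {
        Dataset<Row> data = spark.read().csv(csvPath);
        // Build the filter dynamically from the request parameters
        return data.filter(functions.col(column).equalTo(value));
    }
}

This works when the SparkSession was created with master("local[*]"), but we would like the same pattern to run against the YARN cluster.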
