Hello,

I think you can try the below; the reason is that only yarn-client mode is 
supported for your scenario.

master("yarn-client")
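To make that concrete, here is a minimal sketch of the revised main, assuming HADOOP_CONF_DIR / YARN_CONF_DIR is visible to the JVM and the input file has been moved to a path the cluster can read (the hdfs:/// path below is illustrative):

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class YarnClientTest {
    public static void main(String[] args) {
        // "yarn-client" keeps the driver in this JVM (your web server)
        // while executors run on the YARN cluster.
        SparkSession spark = SparkSession
                .builder()
                .master("yarn-client")
                .appName("spark test")
                .getOrCreate();

        // Read from a cluster-visible path (e.g. HDFS), not a local
        // project path, since executors run on other machines.
        Dataset<Row> testData = spark.read().csv("hdfs:///data/no_schema_iris.csv");
        testData.printSchema();
        testData.show();
        spark.stop();
    }
}
```

Note that on Spark 2.x the "yarn-client" master string is deprecated; master("yarn") together with spark.submit.deployMode=client is the equivalent setting.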



Thanks very much.
Keith
From: 张万新 <kevinzwx1...@gmail.com>
Sent: Thursday, November 1, 2018 11:36 PM
To: 崔苗(数据与人工智能产品开发部) <0049003...@znv.com>
Cc: user <u...@spark.incubator.apache.org>
Subject: Re: how to use cluster sparkSession like localSession

I think you should investigate apache zeppelin and livy
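For reference, Livy exposes Spark sessions over REST, so a web server can create one interactive session and then submit statements per incoming request without any application.jar. A minimal sketch using only the JDK, assuming a Livy server at localhost:8998 and a hypothetical session id 0; the endpoints (POST /sessions, POST /sessions/{id}/statements) are Livy's documented REST API:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class LivySketch {
    // Hypothetical Livy endpoint; adjust to your deployment.
    static final String LIVY = "http://localhost:8998";

    // POST a JSON body and return the HTTP status code.
    static int post(String path, String json) throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(LIVY + path).openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream os = conn.getOutputStream()) {
            os.write(json.getBytes(StandardCharsets.UTF_8));
        }
        return conn.getResponseCode();
    }

    public static void main(String[] args) throws Exception {
        // 1) Create an interactive Spark session backed by the cluster.
        post("/sessions", "{\"kind\": \"spark\"}");
        // 2) Submit code to it per request (session id 0 for illustration);
        //    a filter request could send dataset.filter(...) here instead.
        post("/sessions/0/statements",
             "{\"code\": \"spark.read.csv(\\\"/data/no_schema_iris.csv\\\").show()\"}");
    }
}
```

In practice you would poll GET /sessions/{id} until the session is idle before submitting statements, and parse the statement result JSON.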
崔苗(数据与人工智能产品开发部) <0049003...@znv.com<mailto:0049003...@znv.com>> wrote on 
Friday, November 2, 2018, 11:01:

Hi,
we want to execute Spark code without submitting an application.jar, like this code:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession
                .builder()
                .master("local[*]")
                .appName("spark test")
                .getOrCreate();

        Dataset<Row> testData = spark.read().csv(".\\src\\main\\java\\Resources\\no_schema_iris.scv");
        testData.printSchema();
        testData.show();
}

The above code works well in IDEA; we do not need to generate a jar file and 
submit it. But if we replace master("local[*]") with master("yarn"), it 
doesn't work. So is there a way to use a cluster SparkSession like a local 
SparkSession? We need to dynamically execute Spark code in a web server 
according to the different requests; for example, a filter request will call 
dataset.filter(), so there is no application.jar to submit.
