Re: Submitting Spark Job thru REST API?

2020-09-02 Thread Amit Joshi
Hi, there are other options, like Apache Livy, which lets you submit the job using a REST API. Another option is using AWS Data Pipeline to configure your job as an EMR activity. To activate the pipeline, you need the console or a program. Regards, Amit On Thursday, September 3, 2020, Eric Beabes wrote: > Under …

Re: Submitting Spark Job thru REST API?

2020-09-02 Thread Breno Arosa
Maybe there are other ways, but I think the most common path is using Apache Livy (https://livy.apache.org/). On 02/09/2020 17:58, Eric Beabes wrote: Under Spark 2.4 is it possible to submit a Spark job thru REST API - just like the Flink job? Here's the use case: We need to submit a Spark Job …
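Livy's documented batch API is a plain HTTP POST to `/batches` with a JSON body naming the application file and main class. A minimal stdlib-only sketch of such a submission follows; the Livy endpoint, jar path, and class name are assumptions, not values from the thread:

```python
import json
from urllib import request

# Hypothetical Livy endpoint -- adjust for your cluster.
LIVY_URL = "http://livy-server:8998"

def build_batch_payload(file, class_name, args=None, conf=None):
    """Build the JSON body for Livy's POST /batches endpoint."""
    payload = {"file": file, "className": class_name}
    if args:
        payload["args"] = args
    if conf:
        payload["conf"] = conf
    return payload

def submit_batch(payload):
    """POST the batch to Livy; returns the parsed response (batch id, state)."""
    req = request.Request(
        f"{LIVY_URL}/batches",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())

# Example usage (not executed here; needs a reachable Livy server):
# resp = submit_batch(build_batch_payload(
#     file="s3://my-bucket/jobs/my-spark-job.jar",  # assumption: jar on S3
#     class_name="com.example.MySparkJob",
#     conf={"spark.executor.memory": "4g"},
# ))
```

Because the submission is just HTTP, it can be issued from a Docker container with no access to the EMR master node, which fits the security constraint described in the original question.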

Submitting Spark Job thru REST API?

2020-09-02 Thread Eric Beabes
Under Spark 2.4, is it possible to submit a Spark job thru a REST API - just like a Flink job? Here's the use case: we need to submit a Spark job to the EMR cluster, but our security team is not allowing us to submit a job from the Master node or thru the UI. They want us to create a "Docker Container" …

Re: Adding isolation level when reading from DB2 with spark.read

2020-09-02 Thread Jörg Strebel
Hello! You can set the DB2 JDBC driver options in the JDBC connection string: https://www.ibm.com/support/knowledgecenter/en/SSEPGG_10.1.0/com.ibm.db2.luw.apdv.java.doc/src/tpc/imjcc_rjvdsprp.html The DB2 JDBC driver has an option called "defaultIsolationLevel": https://www.ibm.com/support/knowled…
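The DB2 JDBC driver lets you append properties after the database name in the URL, so the `defaultIsolationLevel` suggestion above can be carried on the URL that Spark is given. A small sketch, with hypothetical host/table names; the level value 1 corresponds to `java.sql.Connection.TRANSACTION_READ_UNCOMMITTED`, i.e. WITH UR:

```python
def db2_url_with_isolation(host, port, database, level=1):
    """Build a DB2 JDBC URL carrying driver properties.

    The DB2 JDBC driver accepts properties appended after the database
    name, each terminated by a semicolon:
        jdbc:db2://host:port/db:prop=value;prop2=value2;
    defaultIsolationLevel takes the java.sql.Connection constants;
    1 == TRANSACTION_READ_UNCOMMITTED (WITH UR).
    """
    return (f"jdbc:db2://{host}:{port}/{database}:"
            f"defaultIsolationLevel={level};")

# Hedged sketch of the read itself (names are assumptions, needs a
# running SparkSession and the DB2 driver on the classpath):
# df = (spark.read.format("jdbc")
#       .option("url", db2_url_with_isolation("db2host", 50000, "MYDB"))
#       .option("dbtable", "MYSCHEMA.MYTABLE")
#       .option("user", "...").option("password", "...")
#       .load())
```

Pushing the isolation level into the URL keeps the pyspark code unchanged, which is convenient when the connection string is already externalized in configuration.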

RE: Adding isolation level when reading from DB2 with spark.read

2020-09-02 Thread Filipa Sousa
Hello Luca, Thank you for your fast response! It is an interesting suggestion, but unfortunately an isolation level change statement like SET ISOLATION is not permitted while connected to a DB2 database (https://www.ibm.com/support/knowledgecenter/th/SSEPGG_9.7.0/com.ibm.db2.luw.admin.cmd.doc/doc…

RE: Adding isolation level when reading from DB2 with spark.read

2020-09-02 Thread Luca Canali
Hi Filipa, Spark's JDBC data source has the option to add a "sessionInitStatement", documented in https://spark.apache.org/docs/latest/sql-data-sources-jdbc.html and https://issues.apache.org/jira/browse/SPARK-21519. I guess you could use that option to "inject" a SET ISOLATION statement, altho…
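Per SPARK-21519, `sessionInitStatement` runs once on each JDBC connection before any data is read, which is exactly the hook suggested above. A sketch of the option map, with hypothetical connection details; whether DB2 accepts a given SET statement over the JDBC session (e.g. the CURRENT ISOLATION special register shown here) is an assumption to verify against your database:

```python
def jdbc_read_options(url, table, init_sql):
    """Option map for Spark's JDBC data source.

    sessionInitStatement is executed once per opened JDBC connection,
    before reading begins (see SPARK-21519), so it can set session
    state such as an isolation special register.
    """
    return {
        "url": url,
        "dbtable": table,
        "sessionInitStatement": init_sql,
    }

# Candidate init statement for DB2 (assumption -- verify it is accepted
# on your JDBC session; the CLP-only SET ISOLATION command is not):
opts = jdbc_read_options(
    "jdbc:db2://db2host:50000/MYDB",   # hypothetical URL
    "MYSCHEMA.MYTABLE",                # hypothetical table
    "SET CURRENT ISOLATION = UR",
)

# df = spark.read.format("jdbc").options(**opts).load()  # sketch
```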

Adding isolation level when reading from DB2 with spark.read

2020-09-02 Thread Filipa Sousa
Hello, We are trying to read from an IBM DB2 database using a PySpark job. We have a requirement to add an isolation level - Read Uncommitted (WITH UR) - to the JDBC queries when reading DB2 data. We found the "isolationLevel" parameter in the Spark documentation, but apparently it seems l…
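A likely source of the confusion hinted at above: per the Spark JDBC data source documentation, the `isolationLevel` option applies only to the *writer* path (the transaction wrapping each partition's inserts), not to reads. A small sketch making that scoping explicit; the URL and table names are hypothetical:

```python
# Levels Spark's JDBC *writer* accepts for its per-partition transactions.
WRITE_ISOLATION_LEVELS = {
    "NONE", "READ_UNCOMMITTED", "READ_COMMITTED",
    "REPEATABLE_READ", "SERIALIZABLE",
}

def writer_options(url, table, isolation="READ_UNCOMMITTED"):
    """Option map for a JDBC write; rejects unknown isolation levels.

    Note: this option shapes write transactions only -- for reads,
    session-level mechanisms (driver URL properties or
    sessionInitStatement) are the candidates instead.
    """
    if isolation not in WRITE_ISOLATION_LEVELS:
        raise ValueError(f"unknown isolationLevel: {isolation}")
    return {"url": url, "dbtable": table, "isolationLevel": isolation}

# (df.write.format("jdbc").options(**writer_options(
#      "jdbc:db2://db2host:50000/MYDB", "MYSCHEMA.MYTABLE")).save())  # sketch
```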