We've been able to use the iPOPO dependency injection framework in our PySpark system and to deploy .egg PySpark apps that resolve and wire up all the components during an initial bootstrap sequence (a kernel architecture, similar in spirit to Spring), then invoke those components across Spark. Just replying for info, since it's not identical to your request but in the same spirit.

Darren
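For context, a rough sketch of that kind of bootstrap, assuming iPOPO's Pelix framework and a local SparkSession; the module, factory, and service-specification names below are illustrative, not Darren's actual code.

# transform_component.py -- hypothetical iPOPO component; names are illustrative.
from pelix.ipopo.decorators import ComponentFactory, Provides, Instantiate
from pyspark.sql import functions as F

@ComponentFactory("uppercase-transform-factory")
@Provides("demo.transform")           # service specification looked up at runtime
@Instantiate("uppercase-transform")   # created eagerly when the bundle starts
class UppercaseTransform(object):
    def apply(self, df):
        # Pure DataFrame-in, DataFrame-out logic, invoked from the Spark driver.
        return df.withColumn("value", F.upper("value"))

# bootstrap.py -- starts Pelix/iPOPO, wires components, then runs a Spark job.
import pelix.framework
from pyspark.sql import SparkSession

framework = pelix.framework.create_framework(("pelix.ipopo.core",))
framework.start()
context = framework.get_bundle_context()
# Installing the bundle registers the factory and instantiates the component.
context.install_bundle("transform_component").start()

# Look up the wired service by its specification and hand it Spark data.
ref = context.get_service_reference("demo.transform")
transform = context.get_service(ref)

spark = SparkSession.builder.appName("ipopo-demo").getOrCreate()
df = spark.createDataFrame([("hello",), ("world",)], ["value"])
transform.apply(df).show()

framework.stop()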
-------- Original message --------
From: Chetan Khatri <chetan.opensou...@gmail.com>
Date: 1/4/17 6:34 AM (GMT-05:00)
To: Lars Albertsson <la...@mapflat.com>
Cc: user <user@spark.apache.org>, Spark Dev List <d...@spark.apache.org>
Subject: Re: Dependency Injection and Microservice development with Spark

Lars, thank you. I want to use DI to configure (wire) all the properties for the architectural approach below:

Oracle -> Kafka Batch (event queuing) -> Spark jobs (incremental load from HBase -> Hive with transformation) -> Spark transformation -> PostgreSQL

Thanks.

On Thu, Dec 29, 2016 at 3:25 AM, Lars Albertsson <la...@mapflat.com> wrote:

Do you really need dependency injection?

DI is often used for testing purposes. Data processing jobs are easy to test without DI, however, due to their functional and synchronous nature. Hence, DI is often unnecessary for testing data processing jobs, whether they are batch or streaming jobs.

Or do you want to use DI for other reasons?

Lars Albertsson
Data engineering consultant
www.mapflat.com
https://twitter.com/lalleal
+46 70 7687109
Calendar: https://goo.gl/6FBtlS, https://freebusy.io/la...@mapflat.com

On Fri, Dec 23, 2016 at 11:56 AM, Chetan Khatri <chetan.opensou...@gmail.com> wrote:
> Hello Community,
>
> The approach I currently use for Spark job development is Scala + SBT with an
> uber jar and a yml properties file to pass configuration parameters. But if I
> wanted to use dependency injection and microservice-style development, like
> the Spring Boot features, in Scala, what would be the standard approach?
>
> Thanks,
>
> Chetan
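To illustrate Lars's point above about testing without DI: because a transformation is a pure DataFrame-to-DataFrame function, it can be unit-tested against a local SparkSession with no container or wiring involved. A minimal sketch; the function and column names are made up for illustration, not taken from the thread.

# test_transform.py -- illustrative only; assumes pyspark is available locally.
from pyspark.sql import SparkSession, functions as F

def incremental_load_transform(df):
    # Pure transformation: keep rows not yet loaded and normalise the name column.
    return (df.filter(~F.col("loaded"))
              .withColumn("name", F.trim(F.lower(F.col("name")))))

def test_incremental_load_transform():
    spark = SparkSession.builder.master("local[1]").appName("test").getOrCreate()
    source = spark.createDataFrame([(" Alice ", False), ("Bob", True)],
                                   ["name", "loaded"])
    result = incremental_load_transform(source).collect()
    assert [row["name"] for row in result] == ["alice"]
    spark.stop()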