Thanks for all the suggestions. Very helpful.
On 17 January 2017 at 22:04, Lars Albertsson wrote:
> My advice, short version:
> * Start by testing one job per test.
> * Use Scalatest or a standard framework.
> * Generate input datasets with Spark routines, write to local file.
> * Run job with local master.
> * Read output with Spark routines, validate only the fields you care
> about for the test case at hand.
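The steps above can be sketched as a single ScalaTest test. This is a minimal, hypothetical example: the job name `WordCountJob`, its `run` method, and the JSON output layout are assumptions for illustration, not code from this thread.

```scala
import java.nio.file.Files
import org.apache.spark.sql.SparkSession
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical job under test: reads text lines, writes (word, count) rows as JSON.
object WordCountJob {
  def run(spark: SparkSession, in: String, out: String): Unit = {
    import spark.implicits._
    spark.read.textFile(in)
      .flatMap(_.split("\\s+"))
      .groupByKey(identity).count()
      .toDF("word", "count")
      .write.json(out)
  }
}

class WordCountJobSpec extends AnyFunSuite {
  test("counts words in the input") {
    // Run with a local master so the test needs no cluster.
    val spark = SparkSession.builder()
      .master("local[2]").appName("WordCountJobSpec").getOrCreate()
    import spark.implicits._
    val dir = Files.createTempDirectory("wc-test").toString

    // Generate the input dataset with Spark routines; write to a local file.
    Seq("spark test spark").toDS().write.text(s"$dir/in")

    // One job per test.
    WordCountJob.run(spark, s"$dir/in", s"$dir/out")

    // Read output back with Spark routines and validate only the fields
    // this test case cares about.
    val counts = spark.read.json(s"$dir/out").collect()
      .map(r => r.getAs[String]("word") -> r.getAs[Long]("count")).toMap
    assert(counts("spark") === 2L)
    spark.stop()
  }
}
```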
You should check out Holden’s excellent spark-testing-base package:
https://github.com/holdenk/spark-testing-base
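For example, spark-testing-base's `SharedSparkContext` trait mixes a SparkContext (`sc`) into a suite and reuses it across tests, which avoids the cost of starting Spark per test. A small sketch, assuming ScalaTest 3.x (`AnyFunSuite`; older versions use `FunSuite`):

```scala
import com.holdenkarau.spark.testing.SharedSparkContext
import org.scalatest.funsuite.AnyFunSuite

class WordCountSpec extends AnyFunSuite with SharedSparkContext {
  test("word count sums occurrences") {
    // `sc` is provided by SharedSparkContext, backed by a local master.
    val words = sc.parallelize(Seq("spark", "test", "spark"))
    val counts = words.map((_, 1)).reduceByKey(_ + _).collectAsMap()
    assert(counts("spark") === 2)
  }
}
```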
From: A Shaikh <shaikh.af...@gmail.com>
Date: Sunday, January 15, 2017 at 1:14 PM
To: User <user@spark.apache.org>
Subject: TDD in Spark
What's the most popular testing approach for a Spark app? I am looking
for something along the lines of TDD.