Hi all, I need to use spark-csv in my Spark instance, and I want to avoid launching spark-shell by passing the package name every time. I seem to remember that I need to amend a file in the conf/ directory to include something like spark.packages com.databricks:spark-csv_2.11:1.4.0, but I cannot find any docs telling me which config file I have to modify. Can anyone assist?

kr
marco
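For reference, this is roughly the entry I have in mind. I believe the file is conf/spark-defaults.conf and the property is spark.jars.packages, but please correct me if I'm misremembering:

```
# conf/spark-defaults.conf (sketch -- assuming spark.jars.packages is the right property name)
# Comma-separated list of Maven coordinates to resolve and add to the classpath
spark.jars.packages  com.databricks:spark-csv_2.11:1.4.0
```

With that in place, launching spark-shell without any --packages flag should pull the dependency automatically, if I understand the mechanism correctly.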