Re: Custom delimiter file load

2016-12-31 Thread Nicholas Hakobian
See the documentation for the options accepted by the csv function: http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.DataFrameReader@csv(paths:String*):org.apache.spark.sql.DataFrame. The options can be passed via the option/options methods on the DataFrameReader class.
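For illustration, a minimal sketch of the approach described above: in the Scala API the delimiter is supplied through option() on the DataFrameReader rather than as a named parameter. The file path "data.psv" and the local SparkSession setup are assumptions for the example, not from the thread.

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical local session for the example; in a real job the
// SparkSession would typically already exist as `spark`.
val spark = SparkSession.builder()
  .appName("CustomDelimiterExample")
  .master("local[*]")
  .getOrCreate()

// "sep" is the CSV option key for the delimiter ("delimiter" is an alias).
val df = spark.read
  .option("sep", "|")
  .option("header", "true") // assume the first line holds column names
  .csv("data.psv")          // hypothetical pipe-delimited input file

df.show()
```

Equivalently, several options can be passed at once with `.options(Map("sep" -> "|", "header" -> "true"))`.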

Custom delimiter file load

2016-12-31 Thread A Shaikh
In PySpark 2, loading a file with any delimiter into a DataFrame is pretty straightforward: spark.read.csv(file, schema=, sep='|'). Is there something similar in Spark 2 in Scala, e.g. spark.read.csv(path, sep='|')?