Saving Structured Streaming DF to Hive Partitioned table

2017-02-26 Thread nimrodo
Hi, I want to load a stream of CSV files into a partitioned Hive table called myTable. I tried using Spark 2 Structured Streaming to do that: val spark = SparkSession.builder.appName("TrueCallLoade").enableHiveSupport().config("hive.exec.dynamic.partition.mode",
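The snippet above is cut off, so the following is only a sketch of one common way to do this. Spark 2.x Structured Streaming has no direct Hive sink; the usual workaround (Spark 2.4+) is `foreachBatch`, which hands each micro-batch to the ordinary batch writer. The schema, input path, and `load_date` partition column below are illustrative assumptions, not taken from the original post:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.types._

object CsvToHive {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .appName("TrueCallLoade")
      .enableHiveSupport()
      .config("hive.exec.dynamic.partition.mode", "nonstrict")
      .getOrCreate()

    // Streaming file sources require an explicit schema.
    val schema = new StructType()
      .add("id", StringType)
      .add("value", DoubleType)
      .add("load_date", StringType)   // hypothetical partition column

    val csvStream = spark.readStream
      .schema(schema)
      .csv("/data/incoming")          // hypothetical input directory

    // foreachBatch bridges the streaming query to the batch DataFrameWriter,
    // which does know how to write partitioned Hive tables.
    val query = csvStream.writeStream
      .foreachBatch { (batch: DataFrame, batchId: Long) =>
        batch.write
          .mode("append")
          .format("parquet")
          .partitionBy("load_date")
          .saveAsTable("myTable")
      }
      .start()

    query.awaitTermination()
  }
}
```

On Spark versions before 2.4 (as in the original 2017 thread), the equivalent pattern is a plain DStream with `foreachRDD`, or writing partitioned parquet directly to the table's location and running `MSCK REPAIR TABLE`.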

CSV DStream to Hive

2017-02-21 Thread nimrodo
Hi all, I have a DStream that contains very long comma-separated values. I want to convert this DStream to a DataFrame. I thought of using split on the RDD and then toDF, but I can't get it to work. Can anyone help me here? Nimrod
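A minimal sketch of the split-then-toDF approach. A DStream can't be converted to a DataFrame as a whole; each micro-batch RDD is converted inside `foreachRDD`. The column names and the three-field assumption are illustrative:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.streaming.dstream.DStream

// Turn each micro-batch of comma-separated lines into a DataFrame.
def writeBatches(lines: DStream[String], spark: SparkSession): Unit = {
  import spark.implicits._   // enables rdd.toDF on RDDs of tuples
  lines.foreachRDD { rdd =>
    val df = rdd.map { line =>
      // split(",", -1) keeps trailing empty fields instead of dropping them
      val fields = line.split(",", -1)
      (fields(0), fields(1), fields(2))   // assumes at least three fields per line
    }.toDF("col1", "col2", "col3")
    df.write.mode("append").saveAsTable("myTable")   // or any other sink
  }
}
```

Note that `toDF` needs the `spark.implicits._` import in scope, which is a frequent reason this "can't get it to work" inside streaming code.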

NoSuchMethodException: org.apache.hadoop.hive.ql.metadata.Hive.loadDynamicPartitions writing to Hive

2017-02-14 Thread nimrodo
Hi, I'm trying to write a DataFrame to a Hive partitioned table. This works fine from spark-shell; however, when I use spark-submit I get the following exception: Exception in thread "main" java.lang.NoSuchMethodException:
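The full stack trace is cut off above, so this is only a guess at the usual cause: spark-submit putting a different hive-exec jar on the classpath than spark-shell does, so Spark's Hive shim looks up `loadDynamicPartitions` with a signature the loaded class doesn't have. Verbose class loading shows which jar actually supplied the `Hive` class (`com.example.MyLoader` and `myloader.jar` are placeholders):

```shell
# Print every class-load event on the driver, then look for the jar
# that provided org.apache.hadoop.hive.ql.metadata.Hive.
spark-submit \
  --conf spark.driver.extraJavaOptions=-verbose:class \
  --class com.example.MyLoader \
  myloader.jar 2>&1 | grep 'hive-exec'
```

If two Hive versions show up, pinning `spark.sql.hive.metastore.version` and `spark.sql.hive.metastore.jars`, or removing the extra hive-exec jar from `--jars`/`extraClassPath`, is the usual fix.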

Problem saving Hive table with Overwrite mode

2016-07-13 Thread nimrodo
Hi, I'm trying to write a partitioned parquet table and save it as a Hive table at a specific path. The code I'm using is in Java (column and table names differ slightly from my real code) and is executed by Airflow, which calls spark-submit:
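The Java code itself is cut off above. For reference, a Scala sketch of the usual way to overwrite a partitioned Hive table stored at an explicit (external) location, using the `path` option on the writer; all table, column, and path names here are placeholders:

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

val spark = SparkSession.builder
  .appName("SaveHiveTable")
  .enableHiveSupport()
  .getOrCreate()

val df = spark.read.parquet("/data/staging")    // hypothetical input

df.write
  .mode(SaveMode.Overwrite)                     // replace existing table data
  .partitionBy("load_date")                     // hypothetical partition column
  .format("parquet")
  .option("path", "/warehouse/custom/my_table") // external table location
  .saveAsTable("my_db.my_table")
```

Setting `path` makes Spark register the table as external at that location rather than under the default warehouse directory, which is the common sticking point when Overwrite mode and a custom path are combined.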