Currently, Spark persists data source tables into the Hive metastore in a Spark SQL specific format. This is not a bug; the message you are seeing is only a warning.
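For reference, a minimal sketch of the kind of statement that produces this warning, assuming the CarbonData jars are on the classpath and Hive support is enabled (the schema here is illustrative, and a real CarbonData setup typically goes through its own session and table options, which are omitted):

import org.apache.spark.sql.SparkSession

// Plain SparkSession with Hive support; only meant to show where the
// warning comes from, not a full CarbonData configuration.
val spark = SparkSession.builder()
  .appName("carbon-table-demo")
  .enableHiveSupport()
  .getOrCreate()

// Creating a table with a non-Hive data source provider. Spark cannot map
// org.apache.spark.sql.CarbonSource to a Hive SerDe, so it stores the table
// metadata in its own Spark SQL specific format and logs the warning quoted
// in the original mail below.
spark.sql(
  """CREATE TABLE default.carbon_table30 (id INT, name STRING)
    |USING org.apache.spark.sql.CarbonSource""".stripMargin)

// WARN: Couldn't find corresponding Hive SerDe for data source provider
// org.apache.spark.sql.CarbonSource. Persisting data source table
// `default`.`carbon_table30` into Hive metastore in Spark SQL specific
// format, which is NOT compatible with Hive.

The table is still readable from Spark SQL; the warning only means Hive itself cannot read it through its own SerDe mechanism.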
------------------ Original ------------------
From: <1141982...@qq.com>
Date: Mon, Mar 27, 2017 04:47 PM
To: "dev" <dev@carbondata.incubator.apache.org>
Subject: data not input hive

spark 2.1.0, hive 1.2.1

Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source table `default`.`carbon_table30` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.