Re: How to direct insert values into SparkSQL tables?

2014-08-14 Thread chutium
Oh, right, I meant within SQLContext alone: a SchemaRDD built from a text
file with a case class.
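For context, a minimal sketch of that approach against the Spark 1.0-era API (the file path, field layout, and class name here are made up for illustration):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// The case class defines the schema that Spark SQL infers for the SchemaRDD.
case class Person(name: String, age: Int)

object SchemaRddFromText {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("schema-rdd-example"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.createSchemaRDD  // implicit RDD[Person] -> SchemaRDD

    // Hypothetical CSV-like file with lines such as "Alice,30".
    val people = sc.textFile("people.txt")
      .map(_.split(","))
      .map(p => Person(p(0), p(1).trim.toInt))

    // Register as a temporary table so it can be queried with SQL.
    people.registerAsTable("people")
    sqlContext.sql("SELECT name FROM people WHERE age >= 18").collect()
  }
}
```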



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-direct-insert-vaules-into-SparkSQL-tables-tp11851p12100.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: How to direct insert values into SparkSQL tables?

2014-08-13 Thread Michael Armbrust
I do not believe this is true.  If you are using a HiveContext you should
be able to register an RDD as a temporary table and then use INSERT INTO
(https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DML#LanguageManualDML-InsertingdataintoHiveTablesfromqueries)
to add data to a Hive table of any format (including text files).  Please let
us know if you encounter issues.
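A sketch of what this looks like with the Spark 1.0-era HiveContext API (the source file, the temp-table name, and the target Hive table `people` are all hypothetical):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

case class Person(name: String, age: Int)

object InsertIntoHive {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("insert-into-hive"))
    val hiveContext = new HiveContext(sc)
    import hiveContext.createSchemaRDD  // implicit RDD[Person] -> SchemaRDD

    // Build an RDD with a case-class schema and register it as a temp table.
    val newPeople = sc.textFile("new_people.txt")
      .map(_.split(","))
      .map(p => Person(p(0), p(1).trim.toInt))
    newPeople.registerAsTable("new_people")

    // INSERT INTO appends the temp table's rows to an existing Hive table,
    // regardless of its storage format (text, Parquet, ...).
    hiveContext.hql("INSERT INTO TABLE people SELECT name, age FROM new_people")
  }
}
```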


On Mon, Aug 11, 2014 at 3:13 AM, chutium teng@gmail.com wrote:

 No, Spark SQL cannot insert into or update text files yet; it can only
 insert into Parquet files.

 But something like

 people.union(new_people).registerAsTable("people")

 could be an idea.






Re: How to direct insert values into SparkSQL tables?

2014-08-11 Thread chutium
No, Spark SQL cannot insert into or update text files yet; it can only
insert into Parquet files.

But something like

people.union(new_people).registerAsTable("people")

could be an idea.
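Spelled out, the union workaround looks roughly like this (a sketch against the Spark 1.0-era API; the sample data and table name are illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

case class Person(name: String, age: Int)

object UnionWorkaround {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("union-workaround"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.createSchemaRDD  // implicit RDD[Person] -> SchemaRDD

    val people = sc.parallelize(Seq(Person("Alice", 30)))
    val newPeople = sc.parallelize(Seq(Person("Bob", 25)))

    // Re-register the combined RDD under the same table name; subsequent
    // SQL queries against "people" see both the old and the new rows.
    people.union(newPeople).registerAsTable("people")
    sqlContext.sql("SELECT name FROM people").collect()
  }
}
```

Nothing is written back to the underlying text file; the "insert" only exists in the registered temporary table for the lifetime of the context.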



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-direct-insert-vaules-into-SparkSQL-tables-tp11851p11882.html