Github user CK50 commented on the pull request:

    https://github.com/apache/spark/pull/10003#issuecomment-160178110

@rxin I wish I could run dev/lint-scala, but even after hours of struggling I cannot get build/sbt running. I have downloaded sbt-launcher.jar manually, but now sbt is stuck trying to download extra files from a variety of invalid URLs, such as:

    https://jcenter.bintray.com/org/scala-sbt/sbt/0.13.7/sbt-0.13.7.pom
    https://jcenter.bintray.com/org/scala-sbt/sbt/0.13.7/sbt-0.13.7.jar

Any ideas appreciated. The Spark 1.6 build page did not help. Is there any other page I should follow?

The old generated statement is

    INSERT INTO mytable VALUES (?, ?, ..., ?)

whereas the new statement is

    INSERT INTO mytable (col1, col2, ..., colN) VALUES (?, ?, ..., ?)

So the old syntax relies on column positions, whereas the new one relies on column names. Column names are taken from the DataFrame, so the DataFrame column names must match the target table's column names. At least for Oracle, adding the column names is fine.

The benefit of this change is that you can write out DataFrames that have fewer columns than the target table, which I think is not possible today. The downside is that DataFrame column names *must* match. For best backwards compatibility I only wanted to provide column names when they are really needed, as in the case of Cassandra. WDYT?

On 26.11.2015 20:11, Reynold Xin wrote:
>
> @CK50 <https://github.com/CK50> this needs some style cleanup - you
> can run dev/lint-scala to check styles locally.
>
> Can you please also attach the generated query before / after this
> change? IIUC, maybe it's ok to have this for all dialects.
>
> Reply to this email directly or view it on GitHub
> <https://github.com/apache/spark/pull/10003#issuecomment-159978454>.
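For illustration, the change described above can be sketched as a small statement builder. This is a hypothetical, self-contained sketch (names like `insertStatement` are mine, not necessarily the actual Spark internals): the old form emits only positional `?` placeholders, while the new form also lists the DataFrame's column names.

```scala
// Sketch of the old vs. new generated INSERT statements.
// Column names would come from the DataFrame schema in practice.
object InsertStatementSketch {
  // Old behavior: positional placeholders only.
  def oldStatement(table: String, numColumns: Int): String = {
    val placeholders = Seq.fill(numColumns)("?").mkString(", ")
    s"INSERT INTO $table VALUES ($placeholders)"
  }

  // New behavior: explicit column list, so the DataFrame may have
  // fewer columns than the target table, but names must match.
  def newStatement(table: String, columns: Seq[String]): String = {
    val placeholders = Seq.fill(columns.length)("?").mkString(", ")
    s"INSERT INTO $table (${columns.mkString(", ")}) VALUES ($placeholders)"
  }
}
```

For example, `newStatement("mytable", Seq("col1", "col2"))` yields `INSERT INTO mytable (col1, col2) VALUES (?, ?)`, which binds by name rather than by position.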