Github user CK50 commented on the pull request:

    https://github.com/apache/spark/pull/10003#issuecomment-160193463

Do I take this as "SBT works fine for me on master"? Yes, build/sbt fails for me when doing downloads:

Getting org.scala-sbt sbt 0.13.7 ...

:: problems summary ::
:::: WARNINGS
        module not found: org.scala-sbt#sbt;0.13.7
        ==== local: tried
          /home/ckurz/.ivy2/local/org.scala-sbt/sbt/0.13.7/ivys/ivy.xml
          -- artifact org.scala-sbt#sbt;0.13.7!sbt.jar:
          /home/ckurz/.ivy2/local/org.scala-sbt/sbt/0.13.7/jars/sbt.jar
        ==== jcenter: tried
          https://jcenter.bintray.com/org/scala-sbt/sbt/0.13.7/sbt-0.13.7.pom
          -- artifact org.scala-sbt#sbt;0.13.7!sbt.jar:
          https://jcenter.bintray.com/org/scala-sbt/sbt/0.13.7/sbt-0.13.7.jar
        ==== typesafe-ivy-releases: tried
          https://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/sbt/0.13.7/ivys/ivy.xml
        ==== Maven Central: tried
          https://repo1.maven.org/maven2/org/scala-sbt/sbt/0.13.7/sbt-0.13.7.pom
          -- artifact org.scala-sbt#sbt;0.13.7!sbt.jar:
          https://repo1.maven.org/maven2/org/scala-sbt/sbt/0.13.7/sbt-0.13.7.jar
::::::::::::::::::::::::::::::::::::::::::::::
::          UNRESOLVED DEPENDENCIES         ::
::::::::::::::::::::::::::::::::::::::::::::::
:: org.scala-sbt#sbt;0.13.7: not found
::::::::::::::::::::::::::::::::::::::::::::::
:::: ERRORS
        Server access Error: Connection timed out url=https://jcenter.bintray.com/org/scala-sbt/sbt/0.13.7/sbt-0.13.7.pom
        Server access Error: Connection timed out url=https://jcenter.bintray.com/org/scala-sbt/sbt/0.13.7/sbt-0.13.7.jar
        Server access Error: Connection timed out url=https://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/sbt/0.13.7/ivys/ivy.xml
        Server access Error: Connection timed out url=https://repo1.maven.org/maven2/org/scala-sbt/sbt/0.13.7/sbt-0.13.7.pom
        Server access Error: Connection timed out url=https://repo1.maven.org/maven2/org/scala-sbt/sbt/0.13.7/sbt-0.13.7.jar

I am behind a proxy and have tried configuring Linux env vars and JAVA_OPTS/SBT_OPTS, but this did not help.
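For reference, this is the kind of configuration I tried; the JVM proxy system properties are passed to sbt's launcher via SBT_OPTS (the host and port below are placeholders, not my actual proxy):

```shell
# Placeholder proxy host/port -- substitute your own values.
PROXY_HOST=proxy.example.com
PROXY_PORT=8080

# build/sbt picks these -D flags up via SBT_OPTS; the same properties
# can also go in JAVA_OPTS for wrappers that read that variable instead.
export SBT_OPTS="-Dhttp.proxyHost=$PROXY_HOST -Dhttp.proxyPort=$PROXY_PORT \
-Dhttps.proxyHost=$PROXY_HOST -Dhttps.proxyPort=$PROXY_PORT"

echo "$SBT_OPTS"
# then run: build/sbt compile
```

In my environment this still times out, which makes me suspect the problem is elsewhere (see the 404s below).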
What strikes me is that when I try accessing these URLs, e.g. https://jcenter.bintray.com/org/scala-sbt/sbt/0.13.7/sbt-0.13.7.pom, from within a browser, I also get a 404. And looking into https://jcenter.bintray.com/org/scala-sbt, there is no sub-folder called sbt. No idea what is going wrong here :-(

----------------------

Back to the PR: I will upload the suggested style changes.

In general I agree that column names should be given by default. I just have a few concerns regarding making this the new default behaviour:

1) Does this work on all supported DBs? Does Spark have automated tests with appropriate environments for this?
2) Is it okay to assume that DataFrame column names always match the target column names? (Spark currently does not rely on this. Instead it relies on column positions, which is usually not good practice.)
3) Is such a change okay with Spark's rules for backwards compatibility?

Just some concerns; I am fine with making column names the new default behaviour. But as I said, I do not have the background to judge this. Please let me know how to proceed.

On 27.11.2015 19:49, Sean Owen wrote:
>
> I'm not sure what the issue is with SBT (do you get download errors or
> something?) but you can also see the results of the style checker in
> the build output:
> https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/2119/consoleFull
> Search for "[error]"
>
> I tend to think it's better practice to specify column names
> explicitly. Is there a downside? I don't recall a database that
> /wouldn't/ let you specify them, right?
>
> Reply to this email directly or view it on GitHub
> <https://github.com/apache/spark/pull/10003#issuecomment-160189345>.
>
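To make concern 2 concrete, here is a sketch of the two statement shapes under discussion (table and column names are made up for illustration, this is not the PR's actual code):

```shell
table=people            # illustrative target table name
columns="name age"      # illustrative DataFrame column names

# Current behaviour: columns are matched purely by position.
printf 'INSERT INTO %s VALUES (?, ?)\n' "$table"
# -> INSERT INTO people VALUES (?, ?)

# Proposed default: columns are matched by name, so the DataFrame's
# column names must exist in the target table.
col_list=$(printf '%s' "$columns" | sed 's/ /, /g')
printf 'INSERT INTO %s (%s) VALUES (?, ?)\n' "$table" "$col_list"
# -> INSERT INTO people (name, age) VALUES (?, ?)
```

The second form fails if the DataFrame column names do not match the target table's column names, which is exactly the assumption I am asking about.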