[ https://issues.apache.org/jira/browse/SPARK-13040?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sebastián Ramírez updated SPARK-13040:
--------------------------------------
    Comment: was deleted

(was: I submitted a PR with the parameters that ended up working for me: https://github.com/apache/spark/pull/10948)

> JDBC using SPARK_CLASSPATH is deprecated but is the only way documented
> -----------------------------------------------------------------------
>
>                 Key: SPARK-13040
>                 URL: https://issues.apache.org/jira/browse/SPARK-13040
>             Project: Spark
>          Issue Type: Documentation
>          Components: Documentation, Examples
>    Affects Versions: 1.6.0
>            Reporter: Sebastián Ramírez
>            Priority: Minor
>
> The documentation says that to use a JDBC driver it must be set with the
> environment variable SPARK_CLASSPATH, as in:
> {code}
> SPARK_CLASSPATH=postgresql-9.3-1102-jdbc41.jar bin/spark-shell
> {code}
> http://spark.apache.org/docs/latest/sql-programming-guide.html#jdbc-to-other-databases
> But when run like that, the output says that using that environment variable
> is deprecated:
> {code}
> SPARK_CLASSPATH was detected (set to '/home/senseta/postgresql-9.4.1207.jre7.jar').
> This is deprecated in Spark 1.0+.
> Please instead use:
>  - ./spark-submit with --driver-class-path to augment the driver classpath
>  - spark.executor.extraClassPath to augment the executor classpath
> 16/01/27 13:36:57 WARN spark.SparkConf: Setting 'spark.executor.extraClassPath' to '/home/senseta/postgresql-9.4.1207.jre7.jar' as a work-around.
> 16/01/27 13:36:57 WARN spark.SparkConf: Setting 'spark.driver.extraClassPath' to '/home/senseta/postgresql-9.4.1207.jre7.jar' as a work-around.
> {code}
> It would be good to have an example with the current official syntax (I'm
> actually not sure of which would be the correct parameters), for Scala and
> Python (in case they differ).
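For reference, the non-deprecated syntax hinted at by the warning above can be sketched as follows. This is a sketch, not text from the issue: the jar filename is illustrative (taken from the warning output), while `--driver-class-path`, `--jars`, and `--conf` are documented `spark-shell`/`spark-submit` options.

```shell
# Put a JDBC driver on the classpath without SPARK_CLASSPATH
# (jar filename is illustrative):
#   --driver-class-path  adds the jar to the driver's classpath
#   --jars               ships the jar to the executors
bin/spark-shell --driver-class-path postgresql-9.4.1207.jre7.jar \
                --jars postgresql-9.4.1207.jre7.jar

# Equivalent via the configuration properties named in the warning:
bin/spark-shell \
  --conf spark.driver.extraClassPath=postgresql-9.4.1207.jre7.jar \
  --conf spark.executor.extraClassPath=postgresql-9.4.1207.jre7.jar
```

Since the classpath is set at launch time rather than in user code, the same flags should apply whether the session is Scala or Python.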
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org