Hi Chanh,

I found a workaround that works for me:
http://stackoverflow.com/questions/29552799/spark-unable-to-find-jdbc-driver/40114125#40114125
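
The short version, as I understand it (a hedged sketch under my own assumptions, not a verbatim copy of the answer): with the dispatcher, the driver is launched on an arbitrary Mesos agent, so absolute paths passed through spark.driver.extraClassPath and -Dconfig.file only resolve if those files also exist on that agent. Letting spark-submit distribute both files and referring to them by sandbox-relative names looks roughly like this:

# Hedged sketch, untested: ship prod.conf and the postgres jar with the job
# instead of pointing at absolute paths that may exist only on the
# submitting host. --files places prod.conf in the working directory of the
# driver and executors, so -Dconfig.file can use the bare file name.
/build/analytics/spark-1.6.1-bin-hadoop2.6/bin/spark-submit \
  --master mesos://10.199.0.19:7077 \
  --deploy-mode cluster \
  --supervise \
  --class com.ants.util.kafka.PersistenceData \
  --files /build/analytics/kafkajobs/prod.conf \
  --jars /build/analytics/kafkajobs/spark-streaming-kafka_2.10-1.6.2.jar,/build/analytics/spark-1.6.1-bin-hadoop2.6/lib/postgresql-9.3-1102.jdbc41.jar \
  --conf 'spark.driver.extraJavaOptions=-Dconfig.file=prod.conf' \
  --conf 'spark.executor.extraJavaOptions=-Dconfig.file=prod.conf' \
  --executor-memory 5G \
  --driver-memory 2G \
  --total-executor-cores 4 \
  /build/analytics/kafkajobs/kafkajobs-prod.jar

I can't promise the 1.6 dispatcher honors --files fully; if it doesn't, placing the files at the same absolute paths on every agent should achieve the same effect.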

Regards,
Daniel

On Thu, Oct 6, 2016 at 6:26, Chanh Le (<giaosu...@gmail.com>) wrote:

> Hi everyone,
> I use the same config in both modes, and I want to be able to change the
> config on every run, so I created a config file and run my application with it.
> My problem is:
> it works with this config, without using the Mesos Cluster Dispatcher.
>
> /build/analytics/spark-1.6.1-bin-hadoop2.6/bin/spark-submit \
> --files /build/analytics/kafkajobs/prod.conf \
> --conf 'spark.executor.extraJavaOptions=-Dconfig.fuction.conf' \
> --conf 'spark.driver.extraJavaOptions=-Dconfig.file=/build/analytics/kafkajobs/prod.conf' \
> --conf 'spark.driver.extraClassPath=/build/analytics/spark-1.6.1-bin-hadoop2.6/lib/postgresql-9.3-1102.jdbc41.jar' \
> --conf 'spark.executor.extraClassPath=/build/analytics/spark-1.6.1-bin-hadoop2.6/lib/postgresql-9.3-1102.jdbc41.jar' \
> --class com.ants.util.kafka.PersistenceData \
> --master mesos://10.199.0.19:5050 \
> --executor-memory 5G \
> --driver-memory 2G \
> --total-executor-cores 4 \
> --jars /build/analytics/kafkajobs/spark-streaming-kafka_2.10-1.6.2.jar \
> /build/analytics/kafkajobs/kafkajobs-prod.jar
>
>
> And it didn't work with these:
>
> /build/analytics/spark-1.6.1-bin-hadoop2.6/bin/spark-submit \
> --files /build/analytics/kafkajobs/prod.conf \
> --conf 'spark.executor.extraJavaOptions=-Dconfig.fuction.conf' \
> --conf 'spark.driver.extraJavaOptions=-Dconfig.file=/build/analytics/kafkajobs/prod.conf' \
> --conf 'spark.driver.extraClassPath=/build/analytics/spark-1.6.1-bin-hadoop2.6/lib/postgresql-9.3-1102.jdbc41.jar' \
> --conf 'spark.executor.extraClassPath=/build/analytics/spark-1.6.1-bin-hadoop2.6/lib/postgresql-9.3-1102.jdbc41.jar' \
> --class com.ants.util.kafka.PersistenceData \
> --master mesos://10.199.0.19:7077 \
> --deploy-mode cluster \
> --supervise \
> --executor-memory 5G \
> --driver-memory 2G \
> --total-executor-cores 4 \
> --jars /build/analytics/kafkajobs/spark-streaming-kafka_2.10-1.6.2.jar \
> /build/analytics/kafkajobs/kafkajobs-prod.jar
>
> It threw this error:
>
> Exception in thread "main" java.sql.SQLException: No suitable driver found for jdbc:postgresql://psqlhost:5432/kafkajobs
>
> which means my --conf options didn't take effect, and the config I put in
> /build/analytics/kafkajobs/prod.conf wasn't loaded. It only loaded what I put
> in application.conf (the default config).
>
> How can I make the Mesos Cluster Dispatcher (MCD) load my config?
>
> Regards,
> Chanh
>
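
On the last question above (getting the Mesos Cluster Dispatcher to pick up the config): if shipping the jar with --jars as sketched earlier still leaves the driver without it, a fallback often cited for Spark 1.x (my suggestion, not something from the linked answer) is to put the JDBC jar on the base classpath of every Mesos agent via conf/spark-env.sh:

# Fallback sketch: add the postgres driver to the base classpath on every
# agent. SPARK_CLASSPATH is deprecated in Spark 1.x but still read; this
# assumes the same install layout on each agent as on the submitting host.
export SPARK_CLASSPATH="$SPARK_CLASSPATH:/build/analytics/spark-1.6.1-bin-hadoop2.6/lib/postgresql-9.3-1102.jdbc41.jar"

Note that Spark 1.x refuses to start if SPARK_CLASSPATH and spark.*.extraClassPath are both set, so use one mechanism or the other.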
--
Daniel Carroza Santana
Vía de las Dos Castillas, 33, Ática 4, 3ª Planta.
28224 Pozuelo de Alarcón. Madrid.
Tel: +34 91 828 64 73 // @stratiobd <https://twitter.com/StratioBD>
