Unable to Connect to Apache Phoenix From Spark

2018-05-05 Thread SparkUser6
Simple Java program to connect to a Phoenix DB:

SparkConf sparkConf = new SparkConf();
sparkConf.setAppName("Using-spark-phoenix-df");
sparkConf.setMaster("local[*]");
JavaSparkContext sc = new JavaSparkContext(sparkConf);
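The preview cuts off before any Phoenix call is made. A minimal sketch of how such a program typically continues, using the phoenix-spark connector's DataFrame source (shown with the SparkSession API; the table name MY_TABLE and ZooKeeper quorum zkhost:2181 are placeholders, and a phoenix-spark connector jar plus a reachable HBase/Phoenix cluster are assumed):

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class PhoenixRead {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("Using-spark-phoenix-df")
                .master("local[*]")
                .getOrCreate();

        // Read a Phoenix table as a DataFrame via the phoenix-spark data source.
        Dataset<Row> df = spark.read()
                .format("org.apache.phoenix.spark")
                .option("table", "MY_TABLE")     // placeholder table name
                .option("zkUrl", "zkhost:2181")  // placeholder ZooKeeper quorum
                .load();
        df.show();
        spark.stop();
    }
}
```

Connection failures with this pattern usually come down to the zkUrl not matching the HBase ZooKeeper quorum, or the Phoenix client jar missing from the driver/executor classpath.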

Re: Driver aborts on Mesos when unable to connect to one of external shuffle services

2018-04-16 Thread igor.berman
Hi Szuromi, We manage the external shuffle service via Marathon rather than manually. Sometimes, though, e.g. when adding a new node to the cluster, there is a delay between Mesos scheduling tasks on a slave and Marathon scheduling the external shuffle service task on that node. -- Sent from:

Re: Driver aborts on Mesos when unable to connect to one of external shuffle services

2018-04-12 Thread Szuromi Tamás
Hi Igor, Have you started the external shuffle service manually? Cheers 2018-04-12 10:48 GMT+02:00 igor.berman <igor.ber...@gmail.com>: > Hi, > any input regarding is it expected: > Driver starts and unable to connect to external shuffle service on one of > the no

Driver aborts on Mesos when unable to connect to one of external shuffle services

2018-04-12 Thread igor.berman
Hi, any input on whether this is expected: the driver starts and is unable to connect to the external shuffle service on one of the nodes (no matter what the reason is). This makes the framework go to Inactive mode in the Mesos UI. However, it seems the driver doesn't exit and continues to execute tasks (or tries
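For context, dynamic allocation on Mesos requires the external shuffle service to be up on an agent before executors land there. A configuration sketch (the property names are standard Spark settings, the port is Spark's default; adjust to your deployment):

```
# spark-defaults.conf
spark.shuffle.service.enabled    true
spark.dynamicAllocation.enabled  true
spark.shuffle.service.port       7337

# Started on every Mesos agent (script shipped with Spark):
#   $SPARK_HOME/sbin/start-mesos-shuffle-service.sh
```

Running the service under Marathon, as described in this thread, automates that per-agent startup, at the cost of a possible scheduling gap on freshly added nodes.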

Re: unable to connect to cluster 2.2.0

2017-12-06 Thread Imran Rajjad
From: Imran Rajjad <raj...@gmail.com> Date: Wednesday, December 6, 2017 at 2:45 AM To: "user @spark" <user@spark.apache.org> Subject: unable to connect to cluster 2.2.0 Hi,

Re: unable to connect to cluster 2.2.0

2017-12-06 Thread Qiao, Richard
Are you building your app with Spark 2.2 or 2.1? Best Regards, Richard From: Imran Rajjad <raj...@gmail.com> Date: Wednesday, December 6, 2017 at 2:45 AM To: "user @spark" <user@spark.apache.org> Subject: unable to connect to cluster 2.2.0 Hi, Recent

unable to connect to cluster 2.2.0

2017-12-05 Thread Imran Rajjad
Hi, I recently upgraded from 2.1.1 to 2.2.0 and my Streaming job seems to have broken. The submitted application is unable to connect to the cluster, even though everything is running. Below is my stack trace. Spark Master: spark://192.168.10.207:7077 Job Arguments: -appName orange_watch -directory /u01/watch/stream
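A common cause of this symptom after an upgrade is an application jar still compiled against the old Spark line. A quick way to check both sides (the build.sbt line is illustrative, not taken from the thread):

```
# Version the cluster is running:
$SPARK_HOME/bin/spark-submit --version

# The application should be built against the same release line, e.g. in build.sbt:
#   libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.2.0" % "provided"
```

The follow-up in this thread asks exactly this question: whether the app was built against 2.2 or 2.1.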

Spark : Unable to connect to Oracle

2016-02-10 Thread Divya Gehlot
Hi, I am a newbie to Spark, using Spark version 1.5.2. I am trying to connect to an Oracle DB using the Spark API and getting errors. Steps I followed: Step 1 - I placed ojdbc6.jar in /usr/hdp/2.3.4.0-3485/spark/lib/ojdbc6.jar Step 2 - Registered the jar file

Re: Spark : Unable to connect to Oracle

2016-02-10 Thread Jorge Machado
Hi Divya, You need to install the Oracle JDBC driver on the cluster, into the lib folder. > On 10/02/2016, at 09:37, Divya Gehlot wrote: > > oracle.jdbc.driver.OracleDrive

Re: Spark : Unable to connect to Oracle

2016-02-10 Thread Rishi Mishra
AFAIK sc.addJar() will add the jars to the executor's classpath. The datasource resolution (createRelation) happens on the driver side, and the driver classpath should contain ojdbc6.jar. You can use the "spark.driver.extraClassPath" config parameter to set it. On Wed, Feb 10, 2016 at 3:08 PM, Jorge
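Putting that suggestion together, a sketch of the corresponding spark-submit invocation (the jar path is the one given earlier in the thread; your-app.jar is a placeholder):

```
spark-submit \
  --conf spark.driver.extraClassPath=/usr/hdp/2.3.4.0-3485/spark/lib/ojdbc6.jar \
  --conf spark.executor.extraClassPath=/usr/hdp/2.3.4.0-3485/spark/lib/ojdbc6.jar \
  your-app.jar
```

Setting the driver-side classpath is the key part here, since JDBC datasource resolution happens in the driver before any executor is involved.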

PySpark in Pycharm- unable to connect to remote server

2015-08-05 Thread Ashish Dutt
Use Case: I want to use my laptop (running Win 7 Professional) to connect to the CentOS 6.4 master server using PyCharm. Objective: To write the code in PyCharm on the laptop, then send the job to the server, which will do the processing and should then return the result back to the laptop or to

Unable to connect

2015-03-13 Thread Mohit Anchlia
I am running Spark Streaming standalone in EC2 and I am trying to run the wordcount example from my desktop. The program is unable to connect to the master; in the logs I see the following, which seems to be a hostname issue. 15/03/13 17:37:44 ERROR EndpointWriter: dropping message [class
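EndpointWriter "dropping message" errors in standalone mode usually mean the master URL, or the driver's advertised address, does not exactly match what the other side resolves. A sketch of the two settings typically involved (all hostnames below are placeholders, not values from this thread; SPARK_MASTER_IP is the Spark 1.x-era variable):

```
# conf/spark-env.sh on the EC2 master: bind to the name remote drivers will use
SPARK_MASTER_IP=ec2-xx-xx-xx-xx.compute.amazonaws.com

# On the remote desktop: advertise an address the cluster can reach back
spark-submit \
  --master spark://ec2-xx-xx-xx-xx.compute.amazonaws.com:7077 \
  --conf spark.driver.host=<desktop-public-ip> \
  your-app.jar
```

The master URL must match the name shown at the top of the master's web UI character for character; a mismatch (IP vs. hostname) causes exactly this kind of dropped-message failure.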

Re: Unable to connect

2015-03-13 Thread Tathagata Das
Standalone daemons. TD On Fri, Mar 13, 2015 at 2:41 PM, Mohit Anchlia mohitanch...@gmail.com wrote: I am running spark streaming standalone in ec2 and I am trying to run wordcount example from my desktop. The program is unable to connect to the master, in the logs I see, which seems

Re: Unable to connect to Spark thrift JDBC server with pluggable authentication

2014-10-18 Thread Cheng Lian
Hi Jenny, how did you configure the classpath and start the Thrift server (YARN client/YARN cluster/standalone/...)? On 10/18/14 4:14 AM, Jenny Zhao wrote: Hi, if Spark thrift JDBC server is started with non-secure mode, it is working fine. with a secured mode in case of pluggable

Unable to connect to Spark thrift JDBC server with pluggable authentication

2014-10-17 Thread Jenny Zhao
Hi, if the Spark Thrift JDBC server is started in non-secure mode, it works fine. In secured mode with pluggable authentication, I placed the authentication class configuration in conf/hive-site.xml: <property> <name>hive.server2.authentication</name> <value>CUSTOM</value> </property>
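For reference, CUSTOM authentication also requires naming the provider class (an implementation of Hive's PasswdAuthenticationProvider) in hive-site.xml and putting it on the server's classpath. A sketch of the pair of properties; the class name below is a placeholder:

```xml
<property>
  <name>hive.server2.authentication</name>
  <value>CUSTOM</value>
</property>
<property>
  <name>hive.server2.custom.authentication.class</name>
  <value>com.example.MyPasswdAuthProvider</value> <!-- placeholder class name -->
</property>
```

If the class is configured but not on the Thrift server's classpath at startup, secure-mode connections will fail while non-secure mode continues to work, matching the symptom described here.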

Re: Python + Spark unable to connect to S3 bucket .... Invalid hostname in URI

2014-08-15 Thread Miroslaw
into this problem, I hope this will help them out. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Python-Spark-unable-to-connect-to-S3-bucket-Invalid-hostname-in-URI-tp12076p12169.html Sent from the Apache Spark User List mailing list archive at Nabble.com

Re: Python + Spark unable to connect to S3 bucket .... Invalid hostname in URI

2014-08-14 Thread Miroslaw
.nabble.com wrote: Using s3n:// worked for me.
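For readers hitting the same "Invalid hostname in URI" error: the fix reported in this thread was the URI scheme, and note also that bucket names containing underscores fail Java's URI hostname validation with the same message. Illustrative paths (bucket names are placeholders):

```
s3n://my-bucket/path/to/data    # s3n:// scheme, as reported working in this thread
s3n://my_bucket/path/to/data    # underscore in the bucket name triggers "Invalid hostname in URI"
```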

Re: Spark sql unable to connect to db2 hive metastore

2014-06-17 Thread Michael Armbrust
First, a clarification: Spark SQL does not talk to HiveServer2, as that JDBC interface is for retrieving results from queries executed using Hive. Instead, Spark SQL executes queries itself by directly accessing your data using Spark. Spark SQL's Hive module can use JDBC to connect
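Spark SQL's Hive module reads the metastore connection from hive-site.xml. For a DB2-backed metastore, the relevant properties look roughly like this (the JDBC URL, database name, and user are placeholders; the driver class is the standard IBM db2jcc one):

```xml
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:db2://db2host:50000/metastore</value> <!-- placeholder host/port/db -->
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.ibm.db2.jcc.DB2Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hiveuser</value> <!-- placeholder -->
</property>
```

The DB2 JDBC jars named by ConnectionDriverName must then be visible on the driver's classpath, which is the problem worked through in the rest of this thread.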

Re: Spark sql unable to connect to db2 hive metastore

2014-06-17 Thread Jenny Zhao
Thanks Michael! As I run it using spark-shell, I added both jars through the bin/spark-shell --jars option. I noticed that if I don't pass these jars, it complains it couldn't find the driver; if I pass them through --jars, it complains there is no suitable driver. Regards. On Tue, Jun 17,

Re: Spark sql unable to connect to db2 hive metastore

2014-06-17 Thread Jenny Zhao
Finally got it to work: I mimicked how Spark adds the datanucleus jars in compute-classpath.sh and added the db2jcc*.jar to the classpath, and it works now. Thanks! On Tue, Jun 17, 2014 at 10:50 AM, Jenny Zhao linlin200...@gmail.com wrote: Thanks Michael! as I run it using spark-shell, so I added
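An alternative to editing compute-classpath.sh is to pass the metastore JDBC jars on the driver classpath directly at launch. A sketch (the jar paths are placeholders; db2jcc_license_cu.jar is DB2's usual companion license jar):

```
spark-shell \
  --driver-class-path /path/to/db2jcc.jar:/path/to/db2jcc_license_cu.jar \
  --jars /path/to/db2jcc.jar,/path/to/db2jcc_license_cu.jar
```

This addresses the "no suitable driver" symptom from earlier in the thread: --jars alone ships the jars to executors, but metastore resolution happens in the driver, so the driver classpath needs them too.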

Spark sql unable to connect to db2 hive metastore

2014-06-16 Thread Jenny Zhao
Hi, my Hive configuration uses DB2 as its metastore database. I have built Spark with the extra step sbt/sbt assembly/assembly to include the dependency jars, and copied HIVE_HOME/conf/hive-site.xml under spark/conf. When I ran hql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)") I got