Re: unable to access spark @ spark://debian:7077

2015-03-16 Thread Ralph Bergmann
I can access the management web page at port 8080 from my Mac, and it tells me
that the master and one slave are running and that they can be reached at port 7077.

But a port scan shows that port 8080 is open while port 7077 is not. I ran the
port scanner on the same machine where Spark is running.


Ralph


On 16.03.15 at 13:51, Sean Owen wrote:
 Are you sure the master / slaves started?
 Do you have network connectivity between the two?
 Do you perhaps have multiple network interfaces?
 Does 'debian' resolve correctly, as you expect, to the right host/interface?





Re: unable to access spark @ spark://debian:7077

2015-03-16 Thread Sean Owen
Are you sure the master / slaves started?
Do you have network connectivity between the two?
Do you perhaps have multiple network interfaces?
Does 'debian' resolve correctly, as you expect, to the right host/interface?
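To check the last two points, something like this on the Linux box might help (just a sketch; exact tools vary by distro):

    # what does the hostname 'debian' resolve to locally?
    getent hosts debian
    # is anything actually listening on 7077, and on which interface?
    ss -tlnp | grep 7077    # or: netstat -tlnp | grep 7077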

On Mon, Mar 16, 2015 at 8:14 AM, Ralph Bergmann ra...@dasralph.de wrote:
 Hi,


 I am taking my first steps with Spark, but I have trouble accessing Spark
 running on my Linux server from my Mac.

 I start Spark with sbin/start-all.sh.

 When I open the web UI at port 8080, it shows that everything is running and
 that Spark can be reached at port 7077, but connecting to that port doesn't work.

 I scanned the Linux machine with nmap and port 7077 isn't open.


Re: unable to access spark @ spark://debian:7077

2015-03-16 Thread Ralph Bergmann
Okay, I think I found the mistake.

The Eclipse Maven plugin suggested version 1.2.1 of the spark-core library,
but I am running Spark 1.3.0.

After fixing that, I can access the Spark server.
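
In other words, the spark-core dependency on the client has to match the version running on the server. Roughly like this in the pom.xml (a sketch, assuming the Scala 2.10 build that the Spark 1.3.0 download uses):

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <!-- must match the Spark version installed on the server -->
      <version>1.3.0</version>
    </dependency>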


Ralph


On 16.03.15 at 14:39, Ralph Bergmann wrote:
 I can access the management web page at port 8080 from my Mac, and it tells me
 that the master and one slave are running and that they can be reached at port 7077.
 
 But a port scan shows that port 8080 is open while port 7077 is not. I ran the
 port scanner on the same machine where Spark is running.
 
 
 Ralph




unable to access spark @ spark://debian:7077

2015-03-16 Thread Ralph Bergmann
Hi,


I am taking my first steps with Spark, but I have trouble accessing Spark
running on my Linux server from my Mac.

I start Spark with sbin/start-all.sh.

When I open the web UI at port 8080, it shows that everything is running and
that Spark can be reached at port 7077, but connecting to that port doesn't work.

I scanned the Linux machine with nmap and port 7077 isn't open.
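
For context, the driver on the Mac is just a minimal test program pointing at that URL, roughly like the following sketch (illustrative only; the real application shouldn't matter for the connection problem):

    import org.apache.spark.{SparkConf, SparkContext}

    object ConnectTest {
      def main(args: Array[String]): Unit = {
        // The master URL must match what the web UI on port 8080 reports.
        val conf = new SparkConf()
          .setAppName("ConnectTest")
          .setMaster("spark://debian:7077")
        val sc = new SparkContext(conf)

        // Trivial job, just to confirm the cluster connection works.
        println(sc.parallelize(1 to 100).sum())

        sc.stop()
      }
    }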

On my Mac side I get this error message:

Using Spark's default log4j profile:
org/apache/spark/log4j-defaults.properties
15/03/16 09:11:41 INFO SparkContext: Running Spark version 1.3.0
2015-03-16 09:11:41.782 java[1004:46676] Unable to load realm info from
SCDynamicStore
15/03/16 09:11:41 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
15/03/16 09:11:42 INFO SecurityManager: Changing view acls to: dasralph
15/03/16 09:11:42 INFO SecurityManager: Changing modify acls to: dasralph
15/03/16 09:11:42 INFO SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users with view permissions: Set(dasralph);
users with modify permissions: Set(dasralph)
15/03/16 09:11:43 INFO Slf4jLogger: Slf4jLogger started
15/03/16 09:11:43 INFO Remoting: Starting remoting
15/03/16 09:11:43 INFO Remoting: Remoting started; listening on
addresses :[akka.tcp://sparkDriver@imac_wlan.lan:52886]
15/03/16 09:11:43 INFO Utils: Successfully started service 'sparkDriver'
on port 52886.
15/03/16 09:11:43 INFO SparkEnv: Registering MapOutputTracker
15/03/16 09:11:43 INFO SparkEnv: Registering BlockManagerMaster
15/03/16 09:11:43 INFO DiskBlockManager: Created local directory at
/var/folders/h3/r2qtlmbn1cd6ctj_rcyyq_24gn/T/spark-9cce9d78-a0e6-4fb5-8cf6-00d91c764927/blockmgr-bd444818-a50a-4ea0-9cf6-3b2545f32238
15/03/16 09:11:43 INFO MemoryStore: MemoryStore started with capacity
1966.1 MB
15/03/16 09:11:43 INFO HttpFileServer: HTTP File server directory is
/var/folders/h3/r2qtlmbn1cd6ctj_rcyyq_24gn/T/spark-dd67dc02-c0b7-4167-b8d5-29f057cfb253/httpd-8534edfe-46b8-49ea-9273-3e8e47947332
15/03/16 09:11:43 INFO HttpServer: Starting HTTP Server
15/03/16 09:11:44 INFO Server: jetty-8.y.z-SNAPSHOT
15/03/16 09:11:44 INFO AbstractConnector: Started
SocketConnector@0.0.0.0:52913
15/03/16 09:11:44 INFO Utils: Successfully started service 'HTTP file
server' on port 52913.
15/03/16 09:11:44 INFO SparkEnv: Registering OutputCommitCoordinator
15/03/16 09:11:44 INFO Server: jetty-8.y.z-SNAPSHOT
15/03/16 09:11:44 INFO AbstractConnector: Started
SelectChannelConnector@0.0.0.0:4040
15/03/16 09:11:44 INFO Utils: Successfully started service 'SparkUI' on
port 4040.
15/03/16 09:11:44 INFO SparkUI: Started SparkUI at http://imac_wlan.lan:4040
15/03/16 09:11:44 INFO AppClient$ClientActor: Connecting to master
akka.tcp://sparkMaster@debian:7077/user/Master...
15/03/16 09:11:45 WARN ReliableDeliverySupervisor: Association with
remote system [akka.tcp://sparkMaster@debian:7077] has failed, address
is now gated for [5000] ms. Reason is: [Disassociated].
15/03/16 09:12:04 INFO AppClient$ClientActor: Connecting to master
akka.tcp://sparkMaster@debian:7077/user/Master...
15/03/16 09:12:04 WARN ReliableDeliverySupervisor: Association with
remote system [akka.tcp://sparkMaster@debian:7077] has failed, address
is now gated for [5000] ms. Reason is: [Disassociated].
15/03/16 09:12:24 INFO AppClient$ClientActor: Connecting to master
akka.tcp://sparkMaster@debian:7077/user/Master...
15/03/16 09:12:24 WARN ReliableDeliverySupervisor: Association with
remote system [akka.tcp://sparkMaster@debian:7077] has failed, address
is now gated for [5000] ms. Reason is: [Disassociated].
15/03/16 09:12:44 ERROR SparkDeploySchedulerBackend: Application has
been killed. Reason: All masters are unresponsive! Giving up.
15/03/16 09:12:44 ERROR TaskSchedulerImpl: Exiting due to error from
cluster scheduler: All masters are unresponsive! Giving up.
15/03/16 09:12:44 WARN SparkDeploySchedulerBackend: Application ID is
not initialized yet.
15/03/16 09:12:45 INFO NettyBlockTransferService: Server created on 53666
15/03/16 09:12:45 INFO BlockManagerMaster: Trying to register BlockManager
15/03/16 09:12:45 INFO BlockManagerMasterActor: Registering block
manager imac_wlan.lan:53666 with 1966.1 MB RAM, BlockManagerId(driver,
imac_wlan.lan, 53666)
15/03/16 09:12:45 INFO BlockManagerMaster: Registered BlockManager
15/03/16 09:12:45 ERROR MetricsSystem: Sink class
org.apache.spark.metrics.sink.MetricsServlet cannot be instantialized


What's going wrong?

Ralph

-- 

Ralph Bergmann


www  http://www.dasralph.de | http://www.the4thFloor.eu
mail ra...@dasralph.de
skype    dasralph

facebook https://www.facebook.com/dasralph
google+  https://plus.google.com/+RalphBergmann
xing https://www.xing.com/profile/Ralph_Bergmann3
linkedin https://www.linkedin.com/in/ralphbergmann
gulp https://www.gulp.de/Profil/RalphBergmann.html
github   https://github.com/the4thfloor


pgp key id   

Re: unable to access spark @ spark://debian:7077

2015-03-16 Thread Akhil Das
Try setting SPARK_MASTER_IP, and use the Spark URL (spark://yourlinuxhost:7077)
exactly as displayed in the top-left corner of the Spark UI (running on port
8080). Also, when you connect from your Mac, make sure your network/firewall
isn't blocking any ports between the two machines.
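
For example, something along these lines in conf/spark-env.sh on the Linux box (just a sketch; the address below is a placeholder for whatever the Mac can actually reach), then restart the master and worker with sbin/stop-all.sh and sbin/start-all.sh:

    # conf/spark-env.sh on the master (placeholder values)
    export SPARK_MASTER_IP=192.168.1.10   # an IP/hostname reachable from the Mac
    export SPARK_MASTER_PORT=7077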

Thanks
Best Regards

On Mon, Mar 16, 2015 at 1:44 PM, Ralph Bergmann ra...@dasralph.de wrote:

 Hi,


 I am taking my first steps with Spark, but I have trouble accessing Spark
 running on my Linux server from my Mac.

 I start Spark with sbin/start-all.sh.

 When I open the web UI at port 8080, it shows that everything is running and
 that Spark can be reached at port 7077, but connecting to that port doesn't work.

 I scanned the Linux machine with nmap and port 7077 isn't open.
