The feature was added in Spark 3.0. By the way, you may want to check the EOL
dates for Apache Spark releases at https://endoflife.date/apache-spark. 2.x is
already EOL.
On Fri, Nov 24, 2023 at 11:13 PM mallesh j
wrote:
> Hi Team,
>
> I am trying to test the performance of a spark streaming
As outlined at https://issues.apache.org/jira/browse/SPARK-38693 and
https://stackoverflow.com/q/71667296/7954504, we are attempting to integrate
Keycloak (https://www.keycloak.org/docs/latest/securing_apps/#_servlet_filter_adapter)
Single Sign-On with the Spark Web UI.
However, Spark
When I kill an application on the Web UI (which I submit in
standalone-client mode), it appears to be killed; but when I use the 'jps'
command I can still see the application running in the background. This is
my demo code to reproduce the problem.
And,
If I kill the application on web ui when
Thanks Manu for your response.
I already checked the logs and didn't see anything that could help me
understand the issue.
Even stranger, I have a small CI cluster which runs on a single
NameNode, and there I do see the Spark2 job in the UI. I'm still not sure
whether it may be related to the NameNode HA, I
Hi Fawze,
Sorry but I'm not familiar with CM. Maybe you can look into the logs (or
turn on DEBUG log).
On Thu, Aug 16, 2018 at 3:05 PM Fawze Abujaber wrote:
> Hi Manu,
>
> I'm using cloudera manager with single user mode and every process is
> running with cloudera-scm user, the cloudera-scm
Hi Manu,
I'm using Cloudera Manager with single user mode and every process is
running with the cloudera-scm user; cloudera-scm is a superuser, and this
is why I was confused about how it worked in Spark 1.6 but not in Spark 2.3.
On Thu, Aug 16, 2018 at 5:34 AM Manu Zhang wrote:
> If you are able to
If you are able to log onto the node where UI has been launched, then try
`ps -aux | grep HistoryServer` and the first column of output should be the
user.
On Wed, Aug 15, 2018 at 10:26 PM Fawze Abujaber wrote:
> Thanks Manu, Do you know how i can see which user the UI is running,
> because i'm
Thanks Manu. Do you know how I can see which user the UI is running as?
I'm using Cloudera Manager, and I created a user for Cloudera Manager
called spark, but this didn't solve my issue, and here I'm trying to find
out the user for the Spark history UI.
On Wed, Aug 15, 2018 at 5:11 PM
Hi Fawze,
A) The file permission is currently hard coded to 770 (
https://github.com/apache/spark/blob/branch-2.3/core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala#L287
).
B) I think adding all users (including the UI user) to the group (e.g. a
spark group) will do.
On Wed, Aug 15, 2018 at
Hi Manu,
Thanks for your response.
Yes, I see, but it's still interesting to know how I can see these
applications from the Spark history UI.
How can I know which user I'm logged in as when navigating the Spark
history UI?
The Spark process is running with cloudera-scm and the events written
Hi Fawze,
In Spark 2.3, HistoryServer will check for file permissions when reading
event logs written by your applications. (Please check
https://issues.apache.org/jira/browse/SPARK-20172). With file permissions
of 770, HistoryServer is not permitted to read the event log. That's why
you were
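The permission effect described above can be demonstrated outside Spark. A minimal Python sketch of the POSIX mode bits involved (the file here is just an illustrative stand-in for an event log, not anything Spark-specific):

```python
import os
import stat
import tempfile

# Create a stand-in for an event log file and give it mode 770,
# i.e. rwx for owner and group only, as described in the thread.
fd, path = tempfile.mkstemp()
os.close(fd)
os.chmod(path, 0o770)

mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))                  # 0o770
print(bool(mode & stat.S_IROTH))  # False: "other" users cannot read it
os.remove(path)
```

A HistoryServer process running as a user that is neither the file owner nor in the file's group falls into the "other" class, so with mode 770 it cannot read the log, which matches the behavior reported above.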
Hello Community,
I'm using Spark 2.3 and Spark 1.6.0 in my cluster with Cloudera
distribution 5.13.0.
Both are configured to run on YARN, but I'm unable to see completed
applications in the Spark2 history server, while in Spark 1.6.0 I can.
1) I checked the HDFS permissions for both folders and both
ype JKS
spark.ui.https.enabled true
Hopefully, I didn’t miss anything
Thanks,
Assaf
From: Saisai Shao [mailto:sai.sai.s...@gmail.com]
Sent: Monday, August 21, 2017 5:28 PM
To: Anshuman Kumar <anshuman27ku...@gmail.com>
Cc: spark users <user@spark.apache.org>
Subject: Re: Spark
on a server that I access from a remote
> computer. I need to setup SSL encryption for the Spark web UI, but
> following some threads online I’m still not able to set it up.
>
> Can someone help me with the SS
Hello,
I have recently installed Sparks 2.2.0, and trying to use it for some big data
processing. Spark is installed on a server that I access from a remote
computer. I need to setup SSL encryption for the Spark web UI, but following
some threads online I’m still not able to set it up.
Can
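For reference, SSL for Spark's UIs is configured through the spark.ssl.* namespace. A minimal sketch for spark-defaults.conf (key names follow Spark's security documentation; the paths and passwords are placeholders, and exact keys can vary by version):

```properties
spark.ssl.enabled           true
spark.ssl.keyStore          /path/to/keystore.jks
spark.ssl.keyStorePassword  changeit
spark.ssl.keyPassword       changeit
spark.ssl.keyStoreType      JKS
```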
Severity: Low
Vendor: The Apache Software Foundation
Versions Affected:
Versions of Apache Spark before 2.2.0
Description:
It is possible for an attacker to take advantage of a user's trust in the
server to trick them into visiting a link that points to a shared Spark
cluster and submits data
6.1]
at java.lang.Thread.run(Thread.java:744) [na:1.7.0_51]
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Exception-when-accessing-Spark-Web-UI-in-yarn-client-mode-tp28762.html
Sent from the Apache Spark User List mailing list archive at Nabble.
Try selecting a particular Job instead of looking at the summary page for
all Jobs.
On Sat, Jan 28, 2017 at 4:25 PM, Md. Rezaul Karim <
rezaul.ka...@insight-centre.org> wrote:
> Hi Jacek,
>
> I tried accessing Spark web UI on both Firefox and Google Chrome browsers
> with ad
Hi Jacek,
I tried accessing Spark web UI on both Firefox and Google Chrome browsers
with ad blocker enabled. I do see other options like User, Total Uptime,
Scheduling Mode, Active Jobs, Completed Jobs and Event Timeline.
However, I don't see an option for DAG visualization.
Please note
Hi,
Wonder if you have any adblocker enabled in your browser? Is this the only
version giving you this behavior? All Spark jobs have no visualization?
Jacek
On 28 Jan 2017 7:03 p.m., "Md. Rezaul Karim" <
rezaul.ka...@insight-centre.org> wrote:
Hi All,
I am running a Spark job on my local
Hi All,
I am running a Spark job on my local machine written in Scala with Spark
2.1.0. However, I am not seeing any option of "DAG Visualization" at
http://localhost:4040/jobs/
Suggestions, please.
Regards,
Md. Rezaul Karim, BSc, MSc
PhD Researcher,
To configure your Spark master web UI port, you can set the env variable
SPARK_MASTER_WEBUI_PORT.
You can run netstat -nao | grep 4040 to check whether 4040 is in use.
———
I am not sure why the Spark web UI keeps changing its port every time I restart a
cluster. How can I make it always run on one port? I did make
> > I get two UI'S?
>> >
>> > On Mon, Jan 23, 2017 at 12:07 PM, Marcelo Vanzin <van...@cloudera.com>
>> > wrote:
>> >>
>> >> No. Each app has its own UI which runs (starting on) port 4040.
>> >>
>> >> On M
No. Each app has its own UI which runs (starting on) port 4040.
On Mon, Jan 23, 2017 at 12:05 PM, kant kodali <kanth...@gmail.com> wrote:
> I am using standalone mode so wouldn't be 8080 for my app web ui as well?
> There is nothing running on 4040 in my cluster.
>
> http://spa
I am using standalone mode so wouldn't be 8080 for my app web ui as well?
There is nothing running on 4040 in my cluster.
http://spark.apache.org/docs/latest/security.html#standalone-mode-only
On Mon, Jan 23, 2017 at 11:51 AM, Marcelo Vanzin <van...@cloudera.com>
wrote:
> That's t
That's the Master, whose default port is 8080 (not 4040). The default
port for the app's UI is 4040.
On Mon, Jan 23, 2017 at 11:47 AM, kant kodali <kanth...@gmail.com> wrote:
> I am not sure why Spark web UI keeps changing its port every time I restart
> a cluster? how can I make i
I am not sure why the Spark web UI keeps changing its port every time I restart
a cluster. How can I make it always run on one port? I did make sure there
is no process running on 4040 (Spark's default web UI port), however it still
starts at 8080. Any ideas?
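As the replies in this thread note, 8080 is the standalone master's UI and 4040 is the per-application UI. A sketch of how to pin each one down (the variable and property names are per Spark's standalone-mode and configuration docs; the port values are just the defaults, shown as examples):

```shell
# conf/spark-env.sh: fix the standalone master's web UI port
export SPARK_MASTER_WEBUI_PORT=8080

# The per-application UI is controlled by a Spark property instead,
# e.g. in conf/spark-defaults.conf:
#   spark.ui.port  4040
```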
MasterWebUI: Bound MasterWebUI to 0.0.0.0
Hi,
A possible workaround...Use SparkListener and save the results to a custom
sink.
After all web UI is a mere bag of SparkListeners + excellent
visualizations.
Jacek
On 3 Jan 2017 4:14 p.m., "Joseph Naegele" <jnaeg...@grierforensics.com>
wrote:
Hi all,
Is there any way to
Hi all,
Is there any way to observe Storage history in Spark, i.e. which RDDs were
cached and where, etc. after an application completes? It appears the Storage
tab in the History Server UI is useless.
Thanks
---
Joe Naegele
Grier Forensics
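One post-hoc option: Spark writes its event logs as JSON lines, one listener event per line with an "Event" field (e.g. SparkListenerJobEnd), and filtering those records can recover history the UI does not surface. A minimal sketch; the sample records below are simplified, hypothetical stand-ins for real event-log lines:

```python
import json

# Simplified stand-ins for lines from a Spark event log file
# (real records carry many more fields).
sample_log = [
    '{"Event": "SparkListenerJobStart", "Job ID": 0}',
    '{"Event": "SparkListenerBlockManagerAdded", "Maximum Memory": 1024}',
    '{"Event": "SparkListenerJobEnd", "Job ID": 0}',
]

def events_of(lines, name):
    """Parse JSON-lines records and keep those matching one event type."""
    records = [json.loads(line) for line in lines]
    return [r for r in records if r["Event"] == name]

print(len(events_of(sample_log, "SparkListenerJobEnd")))  # 1
```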
should be the Application Tracker link.
>
>
> Regards,
> Natu
>
> On Tue, Sep 13, 2016 at 9:37 AM, Divya Gehlot <divya.htco...@gmail.com>
> wrote:
>
>> Hi ,
>> Thank you all..
>> Hurray ...I am able to view the hadoop web UI now @ 8088 . even Spark
>>
Hi,
Thank you all.
Hurray, I am able to view the Hadoop web UI now at 8088, and even the Spark
History Server web UI at 18080.
But I am unable to figure out the Spark web UI port.
I tried 4044 and 4040 and am getting the error below:
This site can’t be reached
How can I find out the Spark port?
Would really appreciate the help.
> On Tue, Sep 13, 2016 at 9:28 AM, Divya Gehlot <divya.htco...@gmail.com>
> wrote:
>
>> Hi,
>> I am on EMR 4.7 wi
Hi,
I am on EMR 4.7 with Spark 1.6.1 and Hadoop 2.7.2.
When I try to view any of the cluster web UIs, either Hadoop or Spark, I
get the error below:
"
This site can’t be reached
"
Has anybody using EMR been able to view the web UI?
Could you please share the steps.
Would really appreciate the help.
<giaosu...@gmail.com> wrote:
> >>
> >> You’re running in StandAlone Mode?
> >> Usually inside active task it will show the address of current job.
> >> or you can check in master node by using netstat -apn | grep 4040
> >>
> >>
>
the public DNS name
* the private ip in our example is 172.31.23.201
From: Jacek Laskowski <ja...@japila.pl>
Date: Tuesday, July 26, 2016 at 6:38 AM
To: Jestin Ma <jestinwith.a...@gmail.com>
Cc: Chanh Le <giaosu...@gmail.com>, "user @spark" <user@spark.apache.org&g
Hi,
Go to 8080 and under Running Applications click the Application ID.
You're on the page with Application Detail UI just before Executor
Summary table. Use it to access the web UI.
Regards,
Jacek Laskowski
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly
Hello, when running spark jobs, I can access the master UI (port 8080 one)
no problem. However, I'm confused as to how to access the web UI to see
jobs/tasks/stages/etc.
I can access the master UI at http://:8080. But port 4040
gives me a -connection cannot be reached-.
Is the web UI http
I created a cluster using the spark-1.6.1-bin-hadoop2.6/ec2/spark-ec2 script.
The output shows Ganglia started, however I am not able to access
http://ec2-54-215-230-73.us-west-1.compute.amazonaws.com:5080/ganglia. I
have tried using the private IP from within my data center.
I do not see anything listening
Hi,
This is for Spark on YARN - a 1-node cluster with Spark 2.0.0-SNAPSHOT
(today build)
I can understand that when a stage fails a new executor entry shows up
in web UI under Executors tab (that corresponds to a stage attempt). I
understand that this is to keep the stdout and stderr logs for future
reference.
Hi,
I'd like to have the other optional columns in Aggregated Metrics by
Executor table per stage in web UI. I can easily have Shuffle Read
Size / Records and Shuffle Write Size / Records columns.
scala> sc.parallelize(0 to 9).map((_,1)).groupBy(_._1).count
I can't seem to figure out what Sp
Hi all,
I have a Spark application running to which I submit jobs continuously.
These jobs use different instances of sqlContext, so the application web UI
starts to fill up more and more with these instances.
Is there any way to prevent this? I don't want to see created SQL contexts
in the web UI
om> wrote:
> Just a quick question,
>
> When using textFileStream, I did not see any events via web UI.
> Actually, I am uploading files to s3 every 5 seconds,
> And the mini-batch duration is 30 seconds.
> On web ui,:
>
> *Input Rate*
> Avg: 0.00 events/sec
>
&
Just a quick question,
When using textFileStream, I did not see any events via web UI.
Actually, I am uploading files to s3 every 5 seconds,
And the mini-batch duration is 30 seconds.
On the web UI:
*Input Rate*
Avg: 0.00 events/sec
But the schedule time and processing time are correct
You may want to check out https://github.com/hammerlab/spree
On Tue, 15 Mar 2016 at 10:43 charles li <charles.up...@gmail.com> wrote:
> every time I can only get the latest info by refreshing the page, that's a
> little boring.
>
> so is there any way to make the WEB
Every time I can only get the latest info by refreshing the page, which is a
little boring.
So is there any way to make the web UI auto-refresh?
great thanks
--
*--*
a spark lover, a quant, a developer and a good man.
http://github.com/litaotao
nsform at
ErrorStreaming2.scala:396, took 8.218500 s
(org.apache.spark.scheduler.DAGScheduler)
Stages in job 9816 are completed too, according to the log.
But job 9816 is still shown as an active job in the web UI. Why?
How can I clear these remaining jobs?
On 3 Mar 2016, at 09:17, Shady Xu <shad...@gmail.com<mailto:shad...@gmail.com>>
wrote:
Hi all,
I am running Spark in yarn-client mode, but every time I access the web UI, the
browser redirects me to one of the worker nodes and shows nothing. The URL looks
like
http://h
Hi all,
I am running Spark in yarn-client mode, but every time I access the web UI,
the browser redirects me to one of the worker nodes and shows nothing. The
url looks like
http://hadoop-node31.company.com:8088/proxy/application_1453797301246_120264
.
I googled a lot and found some possible
Hi,
Is there a way to provide minThreads and maxThreads for the
Threadpool through jetty.xml for the Jetty that is used by the Spark Web
UI?
I am hitting an issue very similar to the issue described in
http://lifelongprogrammer.blogspot.com/2014/10/jetty-insufficient-threads
t;>>> port 8080.
>>>>> 16/02/19 03:07:32 INFO MasterWebUI: Started MasterWebUI at
>>>>> http://127.0.0.1:8080
>>>>> 16/02/19 03:07:32 WARN AbstractConnector: insufficient threads
>>>>> configured
>>>>> for S
6/02/19 03:07:32 INFO Utils: Successfully started service on port 6066.
>>>> 16/02/19 03:07:32 INFO StandaloneRestServer: Started REST server for
>>>> submitting applications on port 6066
>>>> 16/02/19 03:07:33 INFO Master: I have been elected leader! New state:
>>
er: I have been elected leader! New state:
>>> ALIVE
>>>
>>> --
>>> Through netstat I can see that port 8080 is Listening
>>> Now when I start firefox and access http://127.0.0.1:8080 , firefox
>>> just
>&
>> Through netstat I can see that port 8080 is Listening
>> Now when I start firefox and access http://127.0.0.1:8080 , firefox
>> just
>> hangs with the message
>>
>> Waiting for "127.0.0.1" and
Hi,
try http://OAhtvJ5MCA:8080
BR
On 2/19/16, 07:18, "vasbhat" wrote:
>OAhtvJ5MCA
Waiting for "127.0.0.1" and does not connect to UI.
How do I enable debug for the spark master daemon, to understand what's
happening.
Thanks
Vasanth
a...@thefilter.com]
> Sent: 05 February 2016 17:09
> To: 'Ted Yu'
> Cc: user@spark.apache.org
> Subject: RE: Can't view executor logs in web UI on Windows
>
> We have created JIRA ticket
> https://issues.apache.org/jira/browse/SPARK-13142 and will submit a pull
> request next
I have submitted a pull request: https://github.com/apache/spark/pull/11135.
Mark
-Original Message-
From: Mark Pavey [mailto:mark.pa...@thefilter.com]
Sent: 05 February 2016 17:09
To: 'Ted Yu'
Cc: user@spark.apache.org
Subject: RE: Can't view executor logs in web UI on Windows
We
executor logs in web UI on Windows
I did a brief search but didn't find relevant JIRA either.
You can create a JIRA and submit pull request for the fix.
Cheers
> On Feb 1, 2016, at 5:13 AM, Mark Pavey <mark.pa...@thefilter.com> wrote:
>
> I am running Spark on Windows. Whe
There have been changes to visibility of info in the UI between 1.4 and 1.5; I
can't say off the top of my head in which point versions they took place.
On Thu, Feb 4, 2016 at 12:07 AM, vimal dinakaran
wrote:
> No I am using DSE 4.8 which has spark 1.4. Is this a known issue ?
No I am using DSE 4.8 which has spark 1.4. Is this a known issue ?
On Wed, Jan 27, 2016 at 11:52 PM, Cody Koeninger wrote:
> Have you tried spark 1.5?
>
> On Wed, Jan 27, 2016 at 11:14 AM, vimal dinakaran
> wrote:
>
>> Hi ,
>> I am using spark 1.4 with
in 1.6.0.
>
> I haven't been able to find an open issue for this but if there is one could
> possibly submit a pull request for it.
>
Have you tried spark 1.5?
On Wed, Jan 27, 2016 at 11:14 AM, vimal dinakaran
wrote:
> Hi ,
> I am using spark 1.4 with direct kafka api . In my streaming ui , I am
> able to see the events listed in UI only if add stream.print() statements
> or else event rate and input
Hi,
I am using Spark 1.4 with the direct Kafka API. In my streaming UI, I am
able to see the events listed in the UI only if I add stream.print()
statements; otherwise the event rate and input events remain at 0 even
though the events get processed.
Without print statements, I have the action
If the application history is turned on, it should work, even through ssh
tunnel. Can you elaborate on what you mean by “it does not work?”
Also, are you able to see the application web UI while an application is
executing a job?
Mohammed
Author: Big Data Analytics with
Spark<h
Yes, I tried it, but it simply does not work.
So, my concern is to use an SSH tunnel to forward a port of the cluster to
a localhost port.
But in the Spark UI, there are two ports which I should forward using an
SSH tunnel.
Considering default ports, 8080 is the web UI port to come i
Hello, a question about the web UI log.
I could see the web interface log after forwarding the port on my cluster to
my local machine and clicking the completed application, but when I clicked
"application detail UI"
It happened to me. I do not know why. I also checked the sp
As I mentioned before, I am trying to see the Spark log on a cluster via an
SSH tunnel.
1) The error on the application details UI is probably from monitoring port
4044. The web UI port is 8088, right? So how could I see the job web UI view
and application details UI view in the web UI on my local machine
I am not sure whether you can copy the log files from Spark workers to your
local machine and view it from the Web UI. In fact, if you are able to copy the
log files locally, you can just view them directly in any text editor.
I suspect what you really want to see is the application history
Hi,
I'm trying to understand how Scheduling Delays are displayed in
Streaming page in web UI and think the values are displayed
incorrectly in the Timelines column. I'm only concerned with the
scheduling delays (on y axis) per batch times (x axis). It appears
that the values (on y axis
Hi,
We are using Spark on Amazon EMR 4.1. To access Spark web UI, we are using
the link in yarn resource manager, but we are seeing a blank page on it.
Further, using Firefox debugging we noticed that we got a HTTP 500 error in
response.
We have tried configuring proxy settings for AWS and also
Similar setup for Hue
http://gethue.com/using-nginx-to-speed-up-hue-3-8-0/
Might give you an idea.
--
Ruslan Dautkhanov
On Thu, Sep 17, 2015 at 9:50 AM, mjordan79 <renato.per...@gmail.com> wrote:
> Hello!
> I'm trying to set up a reverse proxy (using nginx) for the Spark Web UI
Hello!
I'm trying to set up a reverse proxy (using nginx) for the Spark Web UI.
I have 2 machines:
1) Machine A, with a public IP. This machine will be used to access Spark
Web UI on the Machine B through its private IP address.
2) Machine B, where Spark is installed (standalone master cluster, 1
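A minimal nginx sketch for Machine A, purely illustrative: the private IP and port are placeholders, and the Spark UI's absolute links may still need extra handling (e.g. the spark.ui.reverseProxy option available in Spark 2.1+):

```nginx
server {
    listen 80;

    location / {
        # Machine B's private address and the standalone master UI port
        proxy_pass http://172.31.0.10:8080/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```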
The web UI is at port 8080. 4040 will show something when you have a
running job or if you have configured the history server.
On Sep 1, 2015 8:57 PM, "Sunil Rathee" <ratheesunil...@gmail.com> wrote:
>
> Hi,
>
>
> localhost:4040 is not showing anything on the brow