[jira] [Commented] (TOREE-386) toree spark kernel --name parameter to spark-submit is not applied
[ https://issues.apache.org/jira/browse/TOREE-386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15869317#comment-15869317 ] Corey A Stubbs commented on TOREE-386: -- The lines mentioned above ultimately bubble down to here, https://github.com/apache/incubator-toree/blob/6ef0c8cec02a6b622e204ab0271321e0f8a75d38/kernel/src/main/scala/org/apache/toree/boot/layer/ComponentInitialization.scala#L104, where the spark application name is set. The proper fix for this should set the app name of the parameter if it is passed in. > toree spark kernel --name parameter to spark-submit is not applied > --- > > Key: TOREE-386 > URL: https://issues.apache.org/jira/browse/TOREE-386 > Project: TOREE > Issue Type: Bug >Reporter: Sachin Aggarwal > > this is my kernel.json > {code} > { > "language": "scala", > "display_name": "toree_special - Scala", > "env": { > "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client", > "SPARK_HOME": "spark_home", > "__TOREE_OPTS__": "", > "DEFAULT_INTERPRETER": "Scala", > "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip", > "PYTHON_EXEC": "python" > }, > "argv": [ > "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh", > "--profile", > "{connection_file}" > ] > } > {code} > the parameter that I added {color:red}--name MyAPP{color} is not applied I > still see app name in yarn resource ui as {color:red}IBM Spark Kernel{color} > update: In new version of toree {color:red}IBM Spark Kernel{color} is renamed > to {color:red}Apache Toree{color} -- This message was sent by Atlassian JIRA (v6.3.15#6346)
[jira] [Updated] (TOREE-386) toree spark kernel --name parameter to spark-submit is not applied
[ https://issues.apache.org/jira/browse/TOREE-386?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Sachin Aggarwal updated TOREE-386: -- Description: this is my kernel.json {code} { "language": "scala", "display_name": "toree_special - Scala", "env": { "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client", "SPARK_HOME": "spark_home", "__TOREE_OPTS__": "", "DEFAULT_INTERPRETER": "Scala", "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip", "PYTHON_EXEC": "python" }, "argv": [ "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh", "--profile", "{connection_file}" ] } {code} the parameter that I added {color:red}--name MyAPP{color} is not applied I still see app name in yarn resource ui as {color:red}IBM Spark Kernel{color} update: In new version of toree {color:red}IBM Spark Kernel{color} is renamed to {color:red}Apache Toree{color} was: this is my kernel.json {code} { "language": "scala", "display_name": "toree_special - Scala", "env": { "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client", "SPARK_HOME": "spark_home", "__TOREE_OPTS__": "", "DEFAULT_INTERPRETER": "Scala", "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip", "PYTHON_EXEC": "python" }, "argv": [ "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh", "--profile", "{connection_file}" ] } {code} the parameter that I added {color:red}--name MyAPP{color} is not applied I still see app name in yarn resource ui as {color:red}IBM Spark Kernel{color} update: In new verison of toree {color:red}IBM Spark Kernel{color} is renamed to {color:red}Apache Toree{color} > toree spark kernel --name parameter to spark-submit is not applied > --- > > Key: TOREE-386 > URL: https://issues.apache.org/jira/browse/TOREE-386 > Project: TOREE > Issue Type: Bug >Reporter: Sachin Aggarwal > > this is my kernel.json > {code} > { > "language": "scala", > "display_name": "toree_special - Scala", > "env": { > "SPARK_OPTS": 
"--name MyAPP --master yarn --deploy-mode client", > "SPARK_HOME": "spark_home", > "__TOREE_OPTS__": "", > "DEFAULT_INTERPRETER": "Scala", > "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip", > "PYTHON_EXEC": "python" > }, > "argv": [ > "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh", > "--profile", > "{connection_file}" > ] > } > {code} > the parameter that I added {color:red}--name MyAPP{color} is not applied I > still see app name in yarn resource ui as {color:red}IBM Spark Kernel{color} > update: In new version of toree {color:red}IBM Spark Kernel{color} is renamed > to {color:red}Apache Toree{color} -- This message was sent by Atlassian JIRA (v6.3.15#6346)
[jira] [Updated] (TOREE-386) toree spark kernel --name parameter to spark-submit is not applied
[ https://issues.apache.org/jira/browse/TOREE-386?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Sachin Aggarwal updated TOREE-386: -- Description: this is my kernel.json {code} { "language": "scala", "display_name": "toree_special - Scala", "env": { "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client", "SPARK_HOME": "spark_home", "__TOREE_OPTS__": "", "DEFAULT_INTERPRETER": "Scala", "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip", "PYTHON_EXEC": "python" }, "argv": [ "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh", "--profile", "{connection_file}" ] } {code} the parameter that I added {color:red}--name MyAPP{color} is not applied I still see app name in yarn resource ui as {color:red}IBM Spark Kernel{color} update: In new verison of toree {color:red}IBM Spark Kernel{color} is renamed to {color:red}Apache Toree{color} was: this is my kernel.json {code} { "language": "scala", "display_name": "toree_special - Scala", "env": { "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client", "SPARK_HOME": "spark_home", "__TOREE_OPTS__": "", "DEFAULT_INTERPRETER": "Scala", "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip", "PYTHON_EXEC": "python" }, "argv": [ "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh", "--profile", "{connection_file}" ] } {code} the parameter that I added {color:red}--name MyAPP{color} is not applied I still see app name in yarn resource ui as {color:red}IBM Spark Kernel{color} > toree spark kernel --name parameter to spark-submit is not applied > --- > > Key: TOREE-386 > URL: https://issues.apache.org/jira/browse/TOREE-386 > Project: TOREE > Issue Type: Bug >Reporter: Sachin Aggarwal > > this is my kernel.json > {code} > { > "language": "scala", > "display_name": "toree_special - Scala", > "env": { > "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client", > "SPARK_HOME": "spark_home", > "__TOREE_OPTS__": "", > 
"DEFAULT_INTERPRETER": "Scala", > "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip", > "PYTHON_EXEC": "python" > }, > "argv": [ > "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh", > "--profile", > "{connection_file}" > ] > } > {code} > the parameter that I added {color:red}--name MyAPP{color} is not applied I > still see app name in yarn resource ui as {color:red}IBM Spark Kernel{color} > update: In new verison of toree {color:red}IBM Spark Kernel{color} is renamed > to {color:red}Apache Toree{color} -- This message was sent by Atlassian JIRA (v6.3.15#6346)
[jira] [Comment Edited] (TOREE-386) toree spark kernel --name parameter to spark-submit is not applied
[ https://issues.apache.org/jira/browse/TOREE-386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15869274#comment-15869274 ] Sachin Aggarwal edited comment on TOREE-386 at 2/16/17 5:50 AM: Hi , I am using old verison if u see master branch its been updated now https://github.com/apache/incubator-toree/blob/master/protocol/src/main/scala/org/apache/toree/kernel/protocol/v5/SparkKernelInfo.scala#L46 commit history https://github.com/apache/incubator-toree/commit/1ea7b5671136221b115b82141b301bebffa301a0 but issue remains the same .. was (Author: sachin aggarwal): Hi , I am using old verison if u see master branch its been updated now https://github.com/apache/incubator-toree/blob/master/protocol/src/main/scala/org/apache/toree/kernel/protocol/v5/SparkKernelInfo.scala#L46 commit history https://github.com/apache/incubator-toree/commit/1ea7b5671136221b115b82141b301bebffa301a0 but issue remians the same .. > toree spark kernel --name parameter to spark-submit is not applied > --- > > Key: TOREE-386 > URL: https://issues.apache.org/jira/browse/TOREE-386 > Project: TOREE > Issue Type: Bug >Reporter: Sachin Aggarwal > > this is my kernel.json > {code} > { > "language": "scala", > "display_name": "toree_special - Scala", > "env": { > "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client", > "SPARK_HOME": "spark_home", > "__TOREE_OPTS__": "", > "DEFAULT_INTERPRETER": "Scala", > "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip", > "PYTHON_EXEC": "python" > }, > "argv": [ > "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh", > "--profile", > "{connection_file}" > ] > } > {code} > the parameter that I added {color:red}--name MyAPP{color} is not applied I > still see app name in yarn resource ui as {color:red}IBM Spark Kernel{color} -- This message was sent by Atlassian JIRA (v6.3.15#6346)
[jira] [Comment Edited] (TOREE-386) toree spark kernel --name parameter to spark-submit is not applied
[ https://issues.apache.org/jira/browse/TOREE-386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15869274#comment-15869274 ] Sachin Aggarwal edited comment on TOREE-386 at 2/16/17 5:51 AM: Hi [~jodersky], I am using old verison if u see master branch its been updated now https://github.com/apache/incubator-toree/blob/master/protocol/src/main/scala/org/apache/toree/kernel/protocol/v5/SparkKernelInfo.scala#L46 commit history https://github.com/apache/incubator-toree/commit/1ea7b5671136221b115b82141b301bebffa301a0 but issue remains the same .. was (Author: sachin aggarwal): Hi , I am using old verison if u see master branch its been updated now https://github.com/apache/incubator-toree/blob/master/protocol/src/main/scala/org/apache/toree/kernel/protocol/v5/SparkKernelInfo.scala#L46 commit history https://github.com/apache/incubator-toree/commit/1ea7b5671136221b115b82141b301bebffa301a0 but issue remains the same .. > toree spark kernel --name parameter to spark-submit is not applied > --- > > Key: TOREE-386 > URL: https://issues.apache.org/jira/browse/TOREE-386 > Project: TOREE > Issue Type: Bug >Reporter: Sachin Aggarwal > > this is my kernel.json > {code} > { > "language": "scala", > "display_name": "toree_special - Scala", > "env": { > "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client", > "SPARK_HOME": "spark_home", > "__TOREE_OPTS__": "", > "DEFAULT_INTERPRETER": "Scala", > "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip", > "PYTHON_EXEC": "python" > }, > "argv": [ > "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh", > "--profile", > "{connection_file}" > ] > } > {code} > the parameter that I added {color:red}--name MyAPP{color} is not applied I > still see app name in yarn resource ui as {color:red}IBM Spark Kernel{color} -- This message was sent by Atlassian JIRA (v6.3.15#6346)
[jira] [Comment Edited] (TOREE-386) toree spark kernel --name parameter to spark-submit is not applied
[ https://issues.apache.org/jira/browse/TOREE-386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15869274#comment-15869274 ] Sachin Aggarwal edited comment on TOREE-386 at 2/16/17 5:50 AM: Hi , I am using old verison if u see master branch its been updated now https://github.com/apache/incubator-toree/blob/master/protocol/src/main/scala/org/apache/toree/kernel/protocol/v5/SparkKernelInfo.scala#L46 commit history https://github.com/apache/incubator-toree/commit/1ea7b5671136221b115b82141b301bebffa301a0 but issue remians the same .. was (Author: sachin aggarwal): Hi , I am using old verison if u see master branch its been updated now https://github.com/apache/incubator-toree/blob/master/protocol/src/main/scala/org/apache/toree/kernel/protocol/v5/SparkKernelInfo.scala#L46 but issue remians the same .. > toree spark kernel --name parameter to spark-submit is not applied > --- > > Key: TOREE-386 > URL: https://issues.apache.org/jira/browse/TOREE-386 > Project: TOREE > Issue Type: Bug >Reporter: Sachin Aggarwal > > this is my kernel.json > {code} > { > "language": "scala", > "display_name": "toree_special - Scala", > "env": { > "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client", > "SPARK_HOME": "spark_home", > "__TOREE_OPTS__": "", > "DEFAULT_INTERPRETER": "Scala", > "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip", > "PYTHON_EXEC": "python" > }, > "argv": [ > "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh", > "--profile", > "{connection_file}" > ] > } > {code} > the parameter that I added {color:red}--name MyAPP{color} is not applied I > still see app name in yarn resource ui as {color:red}IBM Spark Kernel{color} -- This message was sent by Atlassian JIRA (v6.3.15#6346)
[jira] [Commented] (TOREE-386) toree spark kernel --name parameter to spark-submit is not applied
[ https://issues.apache.org/jira/browse/TOREE-386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15869274#comment-15869274 ] Sachin Aggarwal commented on TOREE-386: --- Hi , I am using old verison if u see master branch its been updated now https://github.com/apache/incubator-toree/blob/master/protocol/src/main/scala/org/apache/toree/kernel/protocol/v5/SparkKernelInfo.scala#L46 but issue remians the same .. > toree spark kernel --name parameter to spark-submit is not applied > --- > > Key: TOREE-386 > URL: https://issues.apache.org/jira/browse/TOREE-386 > Project: TOREE > Issue Type: Bug >Reporter: Sachin Aggarwal > > this is my kernel.json > {code} > { > "language": "scala", > "display_name": "toree_special - Scala", > "env": { > "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client", > "SPARK_HOME": "spark_home", > "__TOREE_OPTS__": "", > "DEFAULT_INTERPRETER": "Scala", > "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip", > "PYTHON_EXEC": "python" > }, > "argv": [ > "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh", > "--profile", > "{connection_file}" > ] > } > {code} > the parameter that I added {color:red}--name MyAPP{color} is not applied I > still see app name in yarn resource ui as {color:red}IBM Spark Kernel{color} -- This message was sent by Atlassian JIRA (v6.3.15#6346)
[jira] [Created] (TOREE-387) Kernel should not store SparkSession
Ryan Blue created TOREE-387: --- Summary: Kernel should not store SparkSession Key: TOREE-387 URL: https://issues.apache.org/jira/browse/TOREE-387 Project: TOREE Issue Type: Improvement Reporter: Ryan Blue Currently, the kernel creates and stores the SparkSession in a field to share between interpreters. If the user closes a SparkSession and creates a new one, then the Kernel still returns the original. Users may need to restart Spark sessions for long-running notebooks or to deal with Spark errors without losing datasets that have been pulled back to the notebook. I think that Toree should always return the current Spark session by calling {{SparkSession.builder.getOrCreate}}. -- This message was sent by Atlassian JIRA (v6.3.15#6346)
Re: Toree for 1.6.x is broken
By master, I mean the 0.1.x branch. Was trying to get the next vote started. On Wed, Feb 15, 2017, 4:44 PM Chip Senkbeil wrote: > Just built master, got all of the artifacts ready, blah blah blah. Tested > by installing the artifact from > https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc5/toree-pip/ > and > now it's failing with not being able to bind to an ephemeral port for the > sparkDriver. Can someone help me take a look at this? I just installed like > usual via `pip install apache-toree-0.1.0.tar.gz` and pointed to spark > distributions I downloaded (1.6.1 and 1.6.3) when running `jupyter toree > install --spark_home=...`. When launching a kernel, it fails with... > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not > bind on port 0. Attempting port 1. > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not > bind on port 0. Attempting port 1. > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not > bind on port 0. Attempting port 1. > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not > bind on port 0. Attempting port 1. > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not > bind on port 0. Attempting port 1. > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not > bind on port 0. Attempting port 1. > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not > bind on port 0. Attempting port 1. > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not > bind on port 0. Attempting port 1. > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not > bind on port 0. Attempting port 1. > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not > bind on port 0. Attempting port 1. > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not > bind on port 0. Attempting port 1. 
> 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not > bind on port 0. Attempting port 1. > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not > bind on port 0. Attempting port 1. > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not > bind on port 0. Attempting port 1. > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not > bind on port 0. Attempting port 1. > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not > bind on port 0. Attempting port 1. > 17/02/15 16:41:29 [ERROR] o.a.s.SparkContext - Error initializing > SparkContext. > java.net.BindException: Can't assign requested address: Service > 'sparkDriver' failed after 16 retries! > at sun.nio.ch.Net.bind0(Native Method) > at sun.nio.ch.Net.bind(Net.java:433) > at sun.nio.ch.Net.bind(Net.java:425) > at > sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223) > at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74) > at > io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125) > at > io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485) > at > io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089) > at > io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430) > at > io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415) > at > io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903) > at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198) > at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348) > at > io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357) > at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357) > at > 
io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) > at java.lang.Thread.run(Thread.java:745) >
Toree for 1.6.x is broken
Just built master, got all of the artifacts ready, blah blah blah. Tested by installing the artifact from https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc5/toree-pip/ and now it's failing with not being able to bind to an ephemeral port for the sparkDriver. Can someone help me take a look at this? I just installed like usual via `pip install apache-toree-0.1.0.tar.gz` and pointed to spark distributions I downloaded (1.6.1 and 1.6.3) when running `jupyter toree install --spark_home=...`. When launching a kernel, it fails with... 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1. 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1. 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1. 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1. 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1. 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1. 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1. 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1. 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1. 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1. 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1. 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1. 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1. 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1. 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1. 17/02/15 16:41:29 [ERROR] o.a.s.SparkContext - Error initializing SparkContext. java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries! at sun.nio.ch.Net.bind0(Native Method) at sun.nio.ch.Net.bind(Net.java:433) at sun.nio.ch.Net.bind(Net.java:425) at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223) at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74) at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125) at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485) at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089) at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430) at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415) at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903) at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198) at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348) at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357) at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357) at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) at java.lang.Thread.run(Thread.java:745)
[jira] [Commented] (TOREE-386) toree spark kernel --name parameter to spark-submit is not applied
[ https://issues.apache.org/jira/browse/TOREE-386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15868606#comment-15868606 ] Jakob Odersky commented on TOREE-386: - Hmm, I'm not sure where the name is coming from, a grep over toree didn't show any mention of "IBM". On which version of toree can this behaviour be observed? > toree spark kernel --name parameter to spark-submit is not applied > --- > > Key: TOREE-386 > URL: https://issues.apache.org/jira/browse/TOREE-386 > Project: TOREE > Issue Type: Bug >Reporter: Sachin Aggarwal > > this is my kernel.json > {code} > { > "language": "scala", > "display_name": "toree_special - Scala", > "env": { > "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client", > "SPARK_HOME": "spark_home", > "__TOREE_OPTS__": "", > "DEFAULT_INTERPRETER": "Scala", > "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip", > "PYTHON_EXEC": "python" > }, > "argv": [ > "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh", > "--profile", > "{connection_file}" > ] > } > {code} > the parameter that I added {color:red}--name MyAPP{color} is not applied I > still see app name in yarn resource ui as {color:red}IBM Spark Kernel{color} -- This message was sent by Atlassian JIRA (v6.3.15#6346)
[jira] [Updated] (TOREE-375) Incorrect fully qualified name for spark context
[ https://issues.apache.org/jira/browse/TOREE-375?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Jakob Odersky updated TOREE-375: Priority: Critical (was: Major) > Incorrect fully qualified name for spark context > > > Key: TOREE-375 > URL: https://issues.apache.org/jira/browse/TOREE-375 > Project: TOREE > Issue Type: Bug > Environment: Jupyter Notebook with Toree latest master > (1a9c11f5f1381c15b691a716acd0e1f0432a9a35) and Spark 2.0.2, Scala 2.11 >Reporter: Felix Schüler >Priority: Critical > > When running below snippet in a cell I get a compile error for the MLContext > Constructor. Somehow the fully qualified name of the SparkContext gets messed > up. > The same does not happen when I start a Spark shell with the --jars command > and create the MLContext there. > Snippet (the systemml jar is build with the latest master of SystemML): > {code} > %addjar > file:///home/felix/repos/incubator-systemml/target/systemml-0.13.0-incubating-SNAPSHOT.jar > -f > import org.apache.sysml.api.mlcontext._ > import org.apache.sysml.api.mlcontext.ScriptFactory._ > val ml = new MLContext(sc) > Starting download from > file:///home/felix/repos/incubator-systemml/target/systemml-0.13.0-incubating-SNAPSHOT.jar > Finished download of systemml-0.13.0-incubating-SNAPSHOT.jar > Name: Compile Error > Message: :25: error: overloaded method constructor MLContext with > alternatives: > (x$1: > org.apache.spark.api.java.JavaSparkContext)org.apache.sysml.api.mlcontext.MLContext > > (x$1: > org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.SparkContext)org.apache.sysml.api.mlcontext.MLContext > cannot be applied to > (org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.SparkContext) >val ml = new MLContext(sc) > ^ > StackTrace: > {code} -- This message was sent by Atlassian JIRA (v6.3.15#6346)
[jira] [Commented] (TOREE-336) Toree not working with Apache Spark 2.0.0
[ https://issues.apache.org/jira/browse/TOREE-336?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15867994#comment-15867994 ] Gino Bustelo commented on TOREE-336: Submitted https://github.com/apache/incubator-toree/pull/102 with new steps for testing out or installing toree. > Toree not working with Apache Spark 2.0.0 > - > > Key: TOREE-336 > URL: https://issues.apache.org/jira/browse/TOREE-336 > Project: TOREE > Issue Type: Bug > Environment: OSX and ubuntu-14.04, both running scala 2.10.4 and > spark 2.0.0 >Reporter: Tianhui Li > Original Estimate: 168h > Remaining Estimate: 168h > > Following the instructions on > https://github.com/apache/incubator-toree/blob/master/README.md, I run > ``` > pip install --pre toree > jupyter toree install --spark-home=$SPARK_HOME > ``` > I'm able to build fine. But upon starting the server and a new scala (or any > other type of notebook), I an error (provided below). This seems related to > using scala 2.10 rather than 2.11 (see > http://stackoverflow.com/questions/29339005/run-main-0-java-lang-nosuchmethoderror-scala-collection-immutable-hashset-emp > and > http://stackoverflow.com/questions/30536759/running-a-spark-application-in-intellij-14-1-3). > Below is the error: > $ jupyter notebook > [I 12:11:59.464 NotebookApp] Serving notebooks from local directory: > /Users/tianhui > [I 12:11:59.464 NotebookApp] 0 active kernels > [I 12:11:59.465 NotebookApp] The Jupyter Notebook is running at: > http://localhost:/ > [I 12:11:59.465 NotebookApp] Use Control-C to stop this server and shut down > all kernels (twice to skip confirmation). 
> [I 12:12:06.847 NotebookApp] 302 GET / (::1) 0.47ms > [I 12:12:10.591 NotebookApp] Creating new notebook in > [I 12:12:11.600 NotebookApp] Kernel started: > 20ca2e71-781b-4208-ad88-bc04c1ca37d6 > Starting Spark Kernel with > SPARK_HOME=/usr/local/Cellar/apache-spark/2.0.0/libexec/ > 16/09/03 12:12:12 [INFO] o.a.t.Main$$anon$1 - Kernel version: > 0.1.0.dev9-incubating-SNAPSHOT > 16/09/03 12:12:12 [INFO] o.a.t.Main$$anon$1 - Scala version: Some(2.10.4) > 16/09/03 12:12:12 [INFO] o.a.t.Main$$anon$1 - ZeroMQ (JeroMQ) version: 3.2.2 > 16/09/03 12:12:12 [INFO] o.a.t.Main$$anon$1 - Initializing internal actor > system > Exception in thread "main" java.lang.NoSuchMethodError: > scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet; > at akka.actor.ActorCell$.(ActorCell.scala:336) > at akka.actor.ActorCell$.(ActorCell.scala) > at akka.actor.RootActorPath.$div(ActorPath.scala:185) > at akka.actor.LocalActorRefProvider.(ActorRefProvider.scala:465) > at akka.actor.LocalActorRefProvider.(ActorRefProvider.scala:453) > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) > at > sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) > at > sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) > at java.lang.reflect.Constructor.newInstance(Constructor.java:423) > at > akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78) > at scala.util.Try$.apply(Try.scala:192) > at > akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73) > at > akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84) > at > akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84) > at scala.util.Success.flatMap(Try.scala:231) > at > akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84) > at 
akka.actor.ActorSystemImpl.liftedTree1$1(ActorSystem.scala:585) > at akka.actor.ActorSystemImpl.(ActorSystem.scala:578) > at akka.actor.ActorSystem$.apply(ActorSystem.scala:142) > at akka.actor.ActorSystem$.apply(ActorSystem.scala:109) > at > org.apache.toree.boot.layer.StandardBareInitialization$class.createActorSystem(BareInitialization.scala:71) > at org.apache.toree.Main$$anon$1.createActorSystem(Main.scala:35) > at > org.apache.toree.boot.layer.StandardBareInitialization$class.initializeBare(BareInitialization.scala:60) > at org.apache.toree.Main$$anon$1.initializeBare(Main.scala:35) > at > org.apache.toree.boot.KernelBootstrap.initialize(KernelBootstrap.scala:70) > at org.apache.toree.Main$delayedInit$body.apply(Main.scala:40) > at scala.Function0$class.apply$mcV$sp(Function0.scala:34) > at > scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12) > at scala.App$$anonfun$main$1.apply(App.scala:76) > at scala.App$$anonfun$main$1.apply(App.scala:76) > at scala.collection.immutabl
[jira] [Commented] (TOREE-336) Toree not working with Apache Spark 2.0.0
[ https://issues.apache.org/jira/browse/TOREE-336?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15867973#comment-15867973 ] Gino Bustelo commented on TOREE-336: The pipy project for toree is now gone. We need to update the README to explain how folks can use snapshot builds from the apache dist server. > Toree not working with Apache Spark 2.0.0 > - > > Key: TOREE-336 > URL: https://issues.apache.org/jira/browse/TOREE-336 > Project: TOREE > Issue Type: Bug > Environment: OSX and ubuntu-14.04, both running scala 2.10.4 and > spark 2.0.0 >Reporter: Tianhui Li > Original Estimate: 168h > Remaining Estimate: 168h > > Following the instructions on > https://github.com/apache/incubator-toree/blob/master/README.md, I run > ``` > pip install --pre toree > jupyter toree install --spark-home=$SPARK_HOME > ``` > I'm able to build fine. But upon starting the server and a new scala (or any > other type of notebook), I an error (provided below). This seems related to > using scala 2.10 rather than 2.11 (see > http://stackoverflow.com/questions/29339005/run-main-0-java-lang-nosuchmethoderror-scala-collection-immutable-hashset-emp > and > http://stackoverflow.com/questions/30536759/running-a-spark-application-in-intellij-14-1-3). > Below is the error: > $ jupyter notebook > [I 12:11:59.464 NotebookApp] Serving notebooks from local directory: > /Users/tianhui > [I 12:11:59.464 NotebookApp] 0 active kernels > [I 12:11:59.465 NotebookApp] The Jupyter Notebook is running at: > http://localhost:/ > [I 12:11:59.465 NotebookApp] Use Control-C to stop this server and shut down > all kernels (twice to skip confirmation). 
> [I 12:12:06.847 NotebookApp] 302 GET / (::1) 0.47ms
> [I 12:12:10.591 NotebookApp] Creating new notebook in
> [I 12:12:11.600 NotebookApp] Kernel started: 20ca2e71-781b-4208-ad88-bc04c1ca37d6
> Starting Spark Kernel with SPARK_HOME=/usr/local/Cellar/apache-spark/2.0.0/libexec/
> 16/09/03 12:12:12 [INFO] o.a.t.Main$$anon$1 - Kernel version: 0.1.0.dev9-incubating-SNAPSHOT
> 16/09/03 12:12:12 [INFO] o.a.t.Main$$anon$1 - Scala version: Some(2.10.4)
> 16/09/03 12:12:12 [INFO] o.a.t.Main$$anon$1 - ZeroMQ (JeroMQ) version: 3.2.2
> 16/09/03 12:12:12 [INFO] o.a.t.Main$$anon$1 - Initializing internal actor system
> Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
> 	at akka.actor.ActorCell$.<init>(ActorCell.scala:336)
> 	at akka.actor.ActorCell$.<init>(ActorCell.scala)
> 	at akka.actor.RootActorPath.$div(ActorPath.scala:185)
> 	at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:465)
> 	at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:453)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 	at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
> 	at scala.util.Try$.apply(Try.scala:192)
> 	at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
> 	at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
> 	at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
> 	at scala.util.Success.flatMap(Try.scala:231)
> 	at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
> 	at akka.actor.ActorSystemImpl.liftedTree1$1(ActorSystem.scala:585)
> 	at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:578)
> 	at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
> 	at akka.actor.ActorSystem$.apply(ActorSystem.scala:109)
> 	at org.apache.toree.boot.layer.StandardBareInitialization$class.createActorSystem(BareInitialization.scala:71)
> 	at org.apache.toree.Main$$anon$1.createActorSystem(Main.scala:35)
> 	at org.apache.toree.boot.layer.StandardBareInitialization$class.initializeBare(BareInitialization.scala:60)
> 	at org.apache.toree.Main$$anon$1.initializeBare(Main.scala:35)
> 	at org.apache.toree.boot.KernelBootstrap.initialize(KernelBootstrap.scala:70)
> 	at org.apache.toree.Main$delayedInit$body.apply(Main.scala:40)
> 	at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
> 	at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
> 	at scala.App$$anonfun$main$1.apply(App.scala:76)
> 	at scala.App$$anonfun$main$1.apply(App.scala:76)
>
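[Editorial note] The trace above is the classic symptom of mixing Scala 2.10 and 2.11 binaries: Akka jars compiled against 2.10 call `HashSet$.empty()` with a signature that the 2.11 standard library no longer provides. A minimal sketch, independent of Toree, for checking which Scala runtime a kernel actually launched with (Spark 2.0.x ships Scala 2.11 binaries, so a 2.10-built kernel cannot link against it):

```scala
// Print the Scala version of the running runtime. If this reports 2.10.x
// while Spark 2.0.x (built with Scala 2.11) is on the classpath, the
// binary mismatch surfaces at runtime as a NoSuchMethodError like the
// one in the trace above.
val runtimeScalaVersion: String = scala.util.Properties.versionNumberString
println(s"Runtime Scala version: $runtimeScalaVersion")

// Spark 2.0.x requires Scala 2.11.x binaries.
val compatibleWithSpark2: Boolean = runtimeScalaVersion.startsWith("2.11")
println(s"Binary-compatible with Spark 2.0.x: $compatibleWithSpark2")
```

Run as a script (`scala check.scala`) or paste into a `spark-shell` to see the version the shell itself was built with.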
[jira] [Created] (TOREE-386) toree spark kernel --name parameter to spark-submit is not applied
Sachin Aggarwal created TOREE-386:
-------------------------------------

             Summary: toree spark kernel --name parameter to spark-submit is not applied
                 Key: TOREE-386
                 URL: https://issues.apache.org/jira/browse/TOREE-386
             Project: TOREE
          Issue Type: Bug
            Reporter: Sachin Aggarwal


this is my kernel.json
{code}
{
  "language": "scala",
  "display_name": "toree_special - Scala",
  "env": {
    "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client",
    "SPARK_HOME": "spark_home",
    "__TOREE_OPTS__": "",
    "DEFAULT_INTERPRETER": "Scala",
    "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip",
    "PYTHON_EXEC": "python"
  },
  "argv": [
    "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh",
    "--profile",
    "{connection_file}"
  ]
}
{code}
The parameter I added, {color:red}--name MyAPP{color}, is not applied; I still see the app name in the YARN resource UI as {color:red}IBM Spark Kernel{color}.

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
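[Editorial note] On the mechanics behind this report: spark-submit turns `--name` into the `spark.app.name` configuration property, but the kernel sets its own application name later during initialization (the comment thread points at ComponentInitialization.scala), so the last write wins and the SPARK_OPTS value is discarded. A minimal stdlib sketch modeling that last-write-wins behavior; this is an illustration of the precedence problem, not Toree's actual code:

```scala
import scala.collection.mutable

// Model the config as a simple key-value store, as SparkConf effectively is.
val conf = mutable.Map[String, String]()

// Step 1: spark-submit applies "--name MyAPP" from SPARK_OPTS.
conf("spark.app.name") = "MyAPP"

// Step 2: the kernel unconditionally sets its own name afterwards,
// clobbering whatever the user passed in.
conf("spark.app.name") = "Apache Toree"

// The YARN UI shows the kernel's default, not the user's --name value.
println(conf("spark.app.name"))
```

The fix suggested in the comment thread follows directly: only set the default name when the property was not already supplied (e.g. check before overwriting in step 2).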