Supported Scala versions

2017-02-16 Thread Jakob Odersky
Hi everyone,

During a recent discussion with Marius
(https://github.com/apache/incubator-toree/pull/93), I found out that
Toree no longer supports being built with Scala 2.10.

I am all in favor of dropping the EOL Scala version. However, considering
that there are still various references to the old version and, more
importantly, version-specific files such as the interpreters, I wanted to
check on the mailing list what Toree's official standpoint on Scala
support is.

best,
--Jakob
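
For context, a hedged sketch of the standard sbt mechanism this discussion
touches on (illustrative only; the versions and paths below are assumptions,
not Toree's actual build definition):

// Illustrative build.sbt sketch only -- not Toree's actual build.
// Cross-building against several Scala versions, with an extra
// per-version source directory for version-specific files such as
// the interpreters mentioned above.
crossScalaVersions := Seq("2.10.6", "2.11.8")

// Picks up e.g. src/main/scala-2.10 or src/main/scala-2.11 depending
// on the Scala version currently being built.
unmanagedSourceDirectories in Compile +=
  (sourceDirectory in Compile).value / s"scala-${scalaBinaryVersion.value}"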


[jira] [Created] (TOREE-388) Add optional gradle build script

2017-02-16 Thread Ryan Blue (JIRA)
Ryan Blue created TOREE-388:
---

 Summary: Add optional gradle build script
 Key: TOREE-388
 URL: https://issues.apache.org/jira/browse/TOREE-388
 Project: TOREE
  Issue Type: New Feature
Reporter: Ryan Blue


We've been using a Gradle build instead of sbt. I'll open a PR with it, in case
anyone else is interested.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Resolved] (TOREE-382) Revamp the sbt build and consolidate dependencies

2017-02-16 Thread Jakob Odersky (JIRA)

 [ 
https://issues.apache.org/jira/browse/TOREE-382?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jakob Odersky resolved TOREE-382.
-
Resolution: Fixed

> Revamp the sbt build and consolidate dependencies
> -
>
> Key: TOREE-382
> URL: https://issues.apache.org/jira/browse/TOREE-382
> Project: TOREE
>  Issue Type: Sub-task
>Reporter: Jakob Odersky
>Assignee: Jakob Odersky
>Priority: Minor
>




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Re: Interpreter outputs

2017-02-16 Thread Chip Senkbeil
I'll take a look at them soon and give you some feedback. I was preparing the
next vote for the 0.1.0 release, which is why I've been a bit distracted.

On Thu, Feb 16, 2017 at 12:05 PM Ryan Blue 
wrote:

> Hi everyone,
>
> I've been putting together a proof-of-concept Toree deployment for our
> Spark users to test out. One of the things I needed to change was how
> interpreter outputs are displayed, allowing interpreters to show HTML or
> other representations of output objects. For anyone else interested, I've
> put up a PR with the changes for discussion:
>
>   https://github.com/apache/incubator-toree/pull/104
>
> There are also a few other PRs that fix tab completion, code completeness
> checks, syntax highlighting, and other usability issues. It would be great
> to get feedback from the community.
>
> Thanks!
>
> rb
>
> --
> Ryan Blue
> Software Engineer
> Netflix
>


Interpreter outputs

2017-02-16 Thread Ryan Blue
Hi everyone,

I've been putting together a proof-of-concept Toree deployment for our
Spark users to test out. One of the things I needed to change was how
interpreter outputs are displayed, allowing interpreters to show HTML or
other representations of output objects. For anyone else interested, I've
put up a PR with the changes for discussion:

  https://github.com/apache/incubator-toree/pull/104

There are also a few other PRs that fix tab completion, code completeness
checks, syntax highlighting, and other usability issues. It would be great
to get feedback from the community.

Thanks!

rb

-- 
Ryan Blue
Software Engineer
Netflix
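
A minimal, hypothetical sketch of the idea above (the names HtmlRenderable,
DisplayData and render are assumptions, not the API in PR #104): each result
is offered under several MIME types so the frontend can render the richest
representation it understands.

// Hypothetical sketch only -- not the API proposed in PR #104.
trait HtmlRenderable { def toHtml: String }

final case class DisplayData(representations: Map[String, String])

def render(result: Any): DisplayData = result match {
  case html: HtmlRenderable =>
    // HTML-aware values also carry a plain-text fallback.
    DisplayData(Map(
      "text/html"  -> html.toHtml,
      "text/plain" -> result.toString))
  case other =>
    // Everything else falls back to plain text only.
    DisplayData(Map("text/plain" -> other.toString))
}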


Re: [VOTE] Apache Toree 0.1.0 RC6

2017-02-16 Thread Marius van Niekerk
+1

Going to +1 this since I assisted with getting rid of some of the bundled
pieces that were blocking RC5.

On Thu, Feb 16, 2017, 12:14 Chip Senkbeil  wrote:

I'll go ahead and give a +1 since I've done a bit of testing using
Scala/SparkR to get this back up to speed.

Also resolved the remaining issues we had from the incubator general vote.
We're still bundling additional license information in the binary release,
but the source release no longer has the extra license info tacked onto the
LICENSE file. I've also fixed the source release such that it is no longer
dependent on being in a Git repository to be built nor does it require you
to have sbt installed.

On Thu, Feb 16, 2017 at 11:09 AM Chip Senkbeil 
wrote:

> Please vote to approve the release of the following candidate as Apache
> Toree version 0.1.0. Pay special attention to the LICENSE and NOTICE files
> since this is our first release.
>
> ## Information and Artifacts
>
> The tag to be voted on is v0.1.0-rc6
> (51fa49cb5898e0c5b7824f986382436b969cabc7), located here:
>
>
>
> https://github.com/apache/incubator-toree/commit/51fa49cb5898e0c5b7824f986382436b969cabc7
>
> All distribution packages, including signatures, digests, etc. can be found
> at:
>
> https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc6/
>
> Staging artifacts can be found at:
>
> https://repository.apache.org/content/repositories/orgapachetoree-1006
>
> ## Testing Instructions
>
> The fastest way to get up and running is to use Jupyter.
>
> 1. Install Jupyter if you haven't already (http://jupyter.org/install.html)
>
> 2. Grab the Apache Toree archive from
>
>
> https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc6/toree-pip/apache-toree-0.1.0.tar.gz
>
> 3. Install Apache Toree via `pip install apache-toree-0.1.0.tar.gz`
> followed by `jupyter toree install`
>
> - You need to set a valid Apache Spark 1.6.x home, which can be done via
> `jupyter toree install --spark_home=/usr/local/spark`
>
> - You may need to run with `sudo` for installation permission
>
> - For all installation options, run `jupyter toree install --help-all`
>
> 4. Run a Jupyter notebook server via `jupyter notebook`
>
> - If the notebook portion of Jupyter is not installed but Jupyter is,
> you can install via `pip install notebook`
>
> 5. Create a notebook using "Apache Toree - Scala" from the dropdown under
> new
>
> - If you want other interpreter profiles than Scala, you can change the
> interpreters via `jupyter toree install --interpreters=PySpark,SQL`
>
> 6. Run Scala/Spark commands such as `sc.parallelize(1 to 100).sum()` in the
> notebook
>
> ## Voting Instructions
>
> The vote is open for at least 72 hours and passes if a majority of at least
> 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Toree 0.1.0
> [ ] -1 Do not release this package because ...
>

-- 
regards
Marius van Niekerk


Re: Toree for 1.6.x is broken

2017-02-16 Thread Marius van Niekerk
Cool

On Thu, Feb 16, 2017, 12:00 Chip Senkbeil  wrote:

> Looks like there was a missing SparkR::: prefix for connExists. Maybe it became
> public in 2.0, but it was hidden in 1.6. Testing now before starting the vote
> for RC6.
>
> On Thu, Feb 16, 2017 at 10:54 AM Marius van Niekerk <
> marius.v.niek...@gmail.com> wrote:
>
> > The sparkR error might be something I did wrong when backporting the Spark
> > 2.0 R things
> >
> > On Thu, 16 Feb 2017 at 10:18 Chip Senkbeil 
> > wrote:
> >
> > > Turns out I needed to set my hostname on my machine to localhost. Doing
> > so
> > > fixed that error.
> > >
> > > As for the SparkR error, I'll take a look.
> > >
> > > On Thu, Feb 16, 2017 at 9:07 AM Gino Bustelo 
> wrote:
> > >
> > > > I tried your pip package at
> > > > https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc5/toree-pip/
> > > > using the `make test-release` target and it worked fine. I ran the magics test
> > > > notebook and everything worked except there is an issue when calling the
> > > > %SparkR magic. It ends up in an infinite loop printing
> > > >
> > > > Warning message:
> > > >
> > > > In rm(".sparkRcon", envir = .sparkREnv) : object '.sparkRcon' not found
> > > >
> > > > Error in sparkR.connect() : could not find function "connExists"
> > > >
> > > > Execution halted
> > > >
> > > > 17/02/15 23:15:51 [ERROR] o.a.t.k.i.s.SparkRProcessHandler - null process
> > > > exited: 1
> > > >
> > > > Loading required package: methods
> > > >
> > > >
> > > > Attaching package: ‘SparkR’
> > > >
> > > >
> > > > The following objects are masked from ‘package:stats’:
> > > >
> > > >
> > > > cov, filter, lag, na.omit, predict, sd, var
> > > >
> > > >
> > > > The following objects are masked from ‘package:base’:
> > > >
> > > >
> > > > colnames, colnames<-, endsWith, intersect, rank, rbind, sample,
> > > >
> > > > startsWith, subset, summary, table, transform
> > > >
> > > >
> > > > On Thu, Feb 16, 2017 at 9:03 AM Corey Stubbs 
> > wrote:
> > > >
> > > > > I was able to run from the pip package just fine.
> > > > >
> > > > > On Wed, Feb 15, 2017 at 4:50 PM Chip Senkbeil <
> > chip.senkb...@gmail.com
> > > >
> > > > > wrote:
> > > > >
> > > > > > By master, I mean the 0.1.x branch. Was trying to get the next
> vote
> > > > > > started.
> > > > > >
> > > > > > On Wed, Feb 15, 2017, 4:44 PM Chip Senkbeil <
> > chip.senkb...@gmail.com
> > > >
> > > > > > wrote:
> > > > > >
> > > > > > > Just built master, got all of the artifacts ready, blah blah blah. Tested
> > > > > > > by installing the artifact from
> > > > > > > https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc5/toree-pip/ and
> > > > > > > now it's failing with not being able to bind to an ephemeral port for the
> > > > > > > sparkDriver. Can someone help me take a look at this? I just installed like
> > > > > > > usual via `pip install apache-toree-0.1.0.tar.gz` and pointed to spark
> > > > > > > distributions I downloaded (1.6.1 and 1.6.3) when running `jupyter toree
> > > > > > > install --spark_home=...`. When launching a kernel, it fails with...
> > > > > > >
> > > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not
> > > > > > > bind on port 0. Attempting port 1.
> > > > > > > [the WARN line above repeats for every one of Spark's bind retries]

Re: [VOTE] Apache Toree 0.1.0 RC6

2017-02-16 Thread Chip Senkbeil
I'll go ahead and give a +1 since I've done a bit of testing using
Scala/SparkR to get this back up to speed.

Also resolved the remaining issues we had from the incubator general vote.
We're still bundling additional license information in the binary release,
but the source release no longer has the extra license info tacked onto the
LICENSE file. I've also fixed the source release such that it is no longer
dependent on being in a Git repository to be built nor does it require you
to have sbt installed.

On Thu, Feb 16, 2017 at 11:09 AM Chip Senkbeil 
wrote:

> Please vote to approve the release of the following candidate as Apache
> Toree version 0.1.0. Pay special attention to the LICENSE and NOTICE files
> since this is our first release.
>
> ## Information and Artifacts
>
> The tag to be voted on is v0.1.0-rc6
> (51fa49cb5898e0c5b7824f986382436b969cabc7), located here:
>
>
> https://github.com/apache/incubator-toree/commit/51fa49cb5898e0c5b7824f986382436b969cabc7
>
> All distribution packages, including signatures, digests, etc. can be found
> at:
>
> https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc6/
>
> Staging artifacts can be found at:
>
> https://repository.apache.org/content/repositories/orgapachetoree-1006
>
> ## Testing Instructions
>
> The fastest way to get up and running is to use Jupyter.
>
> 1. Install Jupyter if you haven't already (http://jupyter.org/install.html)
>
> 2. Grab the Apache Toree archive from
>
> https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc6/toree-pip/apache-toree-0.1.0.tar.gz
>
> 3. Install Apache Toree via `pip install apache-toree-0.1.0.tar.gz`
> followed by `jupyter toree install`
>
> - You need to set a valid Apache Spark 1.6.x home, which can be done via
> `jupyter toree install --spark_home=/usr/local/spark`
>
> - You may need to run with `sudo` for installation permission
>
> - For all installation options, run `jupyter toree install --help-all`
>
> 4. Run a Jupyter notebook server via `jupyter notebook`
>
> - If the notebook portion of Jupyter is not installed but Jupyter is,
> you can install via `pip install notebook`
>
> 5. Create a notebook using "Apache Toree - Scala" from the dropdown under
> new
>
> - If you want other interpreter profiles than Scala, you can change the
> interpreters via `jupyter toree install --interpreters=PySpark,SQL`
>
> 6. Run Scala/Spark commands such as `sc.parallelize(1 to 100).sum()` in the
> notebook
>
> ## Voting Instructions
>
> The vote is open for at least 72 hours and passes if a majority of at least
> 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Toree 0.1.0
> [ ] -1 Do not release this package because ...
>
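
As a usage example for step 6 above, a couple of smoke-test cells one might
run in the "Apache Toree - Scala" notebook (assuming, as Toree does, that the
SparkContext is pre-bound as `sc`):

// Smoke-test cells; sc is the SparkContext provided by the kernel.
val rdd = sc.parallelize(1 to 100)
rdd.sum()                          // expected: 5050.0
rdd.filter(_ % 2 == 0).count()     // expected: 50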


Re: Toree for 1.6.x is broken

2017-02-16 Thread Chip Senkbeil
Looks like there was a missing SparkR::: prefix for connExists. Maybe it became
public in 2.0, but it was hidden in 1.6. Testing now before starting the vote
for RC6.

On Thu, Feb 16, 2017 at 10:54 AM Marius van Niekerk <
marius.v.niek...@gmail.com> wrote:

> The sparkR error might be something I did wrong when backporting the Spark
> 2.0 R things
>
> On Thu, 16 Feb 2017 at 10:18 Chip Senkbeil 
> wrote:
>
> > Turns out I needed to set my hostname on my machine to localhost. Doing
> so
> > fixed that error.
> >
> > As for the SparkR error, I'll take a look.
> >
> > On Thu, Feb 16, 2017 at 9:07 AM Gino Bustelo  wrote:
> >
> > > I tried your pip package at
> > > https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc5/toree-pip/
> > > using the `make test-release` target and it worked fine. I ran the magics test
> > > notebook and everything worked except there is an issue when calling the
> > > %SparkR magic. It ends up in an infinite loop printing
> > >
> > > Warning message:
> > >
> > > In rm(".sparkRcon", envir = .sparkREnv) : object '.sparkRcon' not found
> > >
> > > Error in sparkR.connect() : could not find function "connExists"
> > >
> > > Execution halted
> > >
> > > 17/02/15 23:15:51 [ERROR] o.a.t.k.i.s.SparkRProcessHandler - null process
> > > exited: 1
> > >
> > > Loading required package: methods
> > >
> > >
> > > Attaching package: ‘SparkR’
> > >
> > >
> > > The following objects are masked from ‘package:stats’:
> > >
> > >
> > > cov, filter, lag, na.omit, predict, sd, var
> > >
> > >
> > > The following objects are masked from ‘package:base’:
> > >
> > >
> > > colnames, colnames<-, endsWith, intersect, rank, rbind, sample,
> > >
> > > startsWith, subset, summary, table, transform
> > >
> > >
> > > On Thu, Feb 16, 2017 at 9:03 AM Corey Stubbs 
> wrote:
> > >
> > > > I was able to run from the pip package just fine.
> > > >
> > > > On Wed, Feb 15, 2017 at 4:50 PM Chip Senkbeil <
> chip.senkb...@gmail.com
> > >
> > > > wrote:
> > > >
> > > > > By master, I mean the 0.1.x branch. Was trying to get the next vote
> > > > > started.
> > > > >
> > > > > On Wed, Feb 15, 2017, 4:44 PM Chip Senkbeil <
> chip.senkb...@gmail.com
> > >
> > > > > wrote:
> > > > >
> > > > > > Just built master, got all of the artifacts ready, blah blah blah. Tested
> > > > > > by installing the artifact from
> > > > > > https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc5/toree-pip/ and
> > > > > > now it's failing with not being able to bind to an ephemeral port for the
> > > > > > sparkDriver. Can someone help me take a look at this? I just installed like
> > > > > > usual via `pip install apache-toree-0.1.0.tar.gz` and pointed to spark
> > > > > > distributions I downloaded (1.6.1 and 1.6.3) when running `jupyter toree
> > > > > > install --spark_home=...`. When launching a kernel, it fails with...
> > > > > >
> > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not
> > > > > > bind on port 0. Attempting port 1.
> > > > > > [the WARN line above repeats for every one of Spark's bind retries]

Re: Toree for 1.6.x is broken

2017-02-16 Thread Marius van Niekerk
The sparkR error might be something I did wrong when backporting the Spark
2.0 R things

On Thu, 16 Feb 2017 at 10:18 Chip Senkbeil  wrote:

> Turns out I needed to set my hostname on my machine to localhost. Doing so
> fixed that error.
>
> As for the SparkR error, I'll take a look.
>
> On Thu, Feb 16, 2017 at 9:07 AM Gino Bustelo  wrote:
>
> > I tried your pip package at
> > https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc5/toree-pip/
> > using
> > the `make test-release` target and it worked fine. I ran the magics test
> > notebook and everything worked except there is an issue when calling the
> > %SparkR magic. It ends up in an infinite loop printing
> >
> > Warning message:
> >
> > In rm(".sparkRcon", envir = .sparkREnv) : object '.sparkRcon' not found
> >
> > Error in sparkR.connect() : could not find function "connExists"
> >
> > Execution halted
> >
> > 17/02/15 23:15:51 [ERROR] o.a.t.k.i.s.SparkRProcessHandler - null process
> > exited: 1
> >
> > Loading required package: methods
> >
> >
> > Attaching package: ‘SparkR’
> >
> >
> > The following objects are masked from ‘package:stats’:
> >
> >
> > cov, filter, lag, na.omit, predict, sd, var
> >
> >
> > The following objects are masked from ‘package:base’:
> >
> >
> > colnames, colnames<-, endsWith, intersect, rank, rbind, sample,
> >
> > startsWith, subset, summary, table, transform
> >
> >
> > On Thu, Feb 16, 2017 at 9:03 AM Corey Stubbs  wrote:
> >
> > > I was able to run from the pip package just fine.
> > >
> > > On Wed, Feb 15, 2017 at 4:50 PM Chip Senkbeil  >
> > > wrote:
> > >
> > > > By master, I mean the 0.1.x branch. Was trying to get the next vote
> > > > started.
> > > >
> > > > On Wed, Feb 15, 2017, 4:44 PM Chip Senkbeil  >
> > > > wrote:
> > > >
> > > > > Just built master, got all of the artifacts ready, blah blah blah. Tested
> > > > > by installing the artifact from
> > > > > https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc5/toree-pip/ and
> > > > > now it's failing with not being able to bind to an ephemeral port for the
> > > > > sparkDriver. Can someone help me take a look at this? I just installed like
> > > > > usual via `pip install apache-toree-0.1.0.tar.gz` and pointed to spark
> > > > > distributions I downloaded (1.6.1 and 1.6.3) when running `jupyter toree
> > > > > install --spark_home=...`. When launching a kernel, it fails with...
> > > > >
> > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not
> > > > > bind on port 0. Attempting port 1.
> > > > > [the WARN line above is repeated 16 times, once per bind retry]
> > > > > 17/02/15 16:41:29 [ERROR] o.a.s.SparkContext - Error initializing
> > > > > SparkContext.
> > > > > java.net.BindException: Can't assign requested address: Service
> > > > > 'sparkDriver' failed after 16 retries!
> 

Re: Toree for 1.6.x is broken

2017-02-16 Thread Chip Senkbeil
Turns out I needed to set my hostname on my machine to localhost. Doing so
fixed that error.

As for the SparkR error, I'll take a look.
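
In case it helps anyone hitting the same bind failure: a hedged alternative to
renaming the machine's hostname is to pin the driver address explicitly, e.g.
SPARK_LOCAL_IP in the environment or spark.driver.host in the Spark conf. A
minimal sketch, assuming loopback is the address you want:

// Sketch of pinning the driver address instead of changing the hostname.
// spark.driver.host and SPARK_LOCAL_IP are standard Spark settings; the
// loopback value below is only an example.
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .setAppName("toree-bind-check")          // example name
  .set("spark.driver.host", "127.0.0.1")   // pin the driver to loopback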

On Thu, Feb 16, 2017 at 9:07 AM Gino Bustelo  wrote:

> I tried your pip package at
> https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc5/toree-pip/
> using
> the `make test-release` target and it worked fine. I ran the magics test
> notebook and everything worked except there is an issue when calling the
> %SparkR magic. It ends up in an infinite loop printing
>
> Warning message:
>
> In rm(".sparkRcon", envir = .sparkREnv) : object '.sparkRcon' not found
>
> Error in sparkR.connect() : could not find function "connExists"
>
> Execution halted
>
> 17/02/15 23:15:51 [ERROR] o.a.t.k.i.s.SparkRProcessHandler - null process
> exited: 1
>
> Loading required package: methods
>
>
> Attaching package: ‘SparkR’
>
>
> The following objects are masked from ‘package:stats’:
>
>
> cov, filter, lag, na.omit, predict, sd, var
>
>
> The following objects are masked from ‘package:base’:
>
>
> colnames, colnames<-, endsWith, intersect, rank, rbind, sample,
>
> startsWith, subset, summary, table, transform
>
>
> On Thu, Feb 16, 2017 at 9:03 AM Corey Stubbs  wrote:
>
> > I was able to run from the pip package just fine.
> >
> > On Wed, Feb 15, 2017 at 4:50 PM Chip Senkbeil 
> > wrote:
> >
> > > By master, I mean the 0.1.x branch. Was trying to get the next vote
> > > started.
> > >
> > > On Wed, Feb 15, 2017, 4:44 PM Chip Senkbeil 
> > > wrote:
> > >
> > > > Just built master, got all of the artifacts ready, blah blah blah. Tested
> > > > by installing the artifact from
> > > > https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc5/toree-pip/ and
> > > > now it's failing with not being able to bind to an ephemeral port for the
> > > > sparkDriver. Can someone help me take a look at this? I just installed like
> > > > usual via `pip install apache-toree-0.1.0.tar.gz` and pointed to spark
> > > > distributions I downloaded (1.6.1 and 1.6.3) when running `jupyter toree
> > > > install --spark_home=...`. When launching a kernel, it fails with...
> > > >
> > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not
> > > > bind on port 0. Attempting port 1.
> > > > [the WARN line above is repeated 16 times, once per bind retry]
> > > > 17/02/15 16:41:29 [ERROR] o.a.s.SparkContext - Error initializing
> > > > SparkContext.
> > > > java.net.BindException: Can't assign requested address: Service
> > > > 'sparkDriver' failed after 16 retries!
> > > > at sun.nio.ch.Net.bind0(Native Method)
> > > > at sun.nio.ch.Net.bind(Net.java:433)
> > > > at sun.nio.ch.Net.bind(Net.java:425)
> > > > at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
> > > > at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
> > > > at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
> > > > at

Re: Toree for 1.6.x is broken

2017-02-16 Thread Gino Bustelo
I tried your pip package at
https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc5/toree-pip/
using the `make test-release` target and it worked fine. I ran the magics test
notebook and everything worked except there is an issue when calling the
%SparkR magic. It ends up in an infinite loop printing

Warning message:

In rm(".sparkRcon", envir = .sparkREnv) : object '.sparkRcon' not found

Error in sparkR.connect() : could not find function "connExists"

Execution halted

17/02/15 23:15:51 [ERROR] o.a.t.k.i.s.SparkRProcessHandler - null process
exited: 1

Loading required package: methods


Attaching package: ‘SparkR’


The following objects are masked from ‘package:stats’:


cov, filter, lag, na.omit, predict, sd, var


The following objects are masked from ‘package:base’:


colnames, colnames<-, endsWith, intersect, rank, rbind, sample,

startsWith, subset, summary, table, transform


On Thu, Feb 16, 2017 at 9:03 AM Corey Stubbs  wrote:

> I was able to run from the pip package just fine.
>
> On Wed, Feb 15, 2017 at 4:50 PM Chip Senkbeil 
> wrote:
>
> > By master, I mean the 0.1.x branch. Was trying to get the next vote
> > started.
> >
> > On Wed, Feb 15, 2017, 4:44 PM Chip Senkbeil 
> > wrote:
> >
> > > Just built master, got all of the artifacts ready, blah blah blah. Tested
> > > by installing the artifact from
> > > https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc5/toree-pip/ and
> > > now it's failing with not being able to bind to an ephemeral port for the
> > > sparkDriver. Can someone help me take a look at this? I just installed like
> > > usual via `pip install apache-toree-0.1.0.tar.gz` and pointed to spark
> > > distributions I downloaded (1.6.1 and 1.6.3) when running `jupyter toree
> > > install --spark_home=...`. When launching a kernel, it fails with...
> > >
> > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not
> > > bind on port 0. Attempting port 1.
> > > [the WARN line above is repeated 16 times, once per bind retry]
> > > 17/02/15 16:41:29 [ERROR] o.a.s.SparkContext - Error initializing
> > > SparkContext.
> > > java.net.BindException: Can't assign requested address: Service
> > > 'sparkDriver' failed after 16 retries!
> > > at sun.nio.ch.Net.bind0(Native Method)
> > > at sun.nio.ch.Net.bind(Net.java:433)
> > > at sun.nio.ch.Net.bind(Net.java:425)
> > > at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
> > > at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
> > > at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
> > > at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
> > > at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
> > > at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
> > > at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
> > > at

[jira] [Commented] (TOREE-386) spark kernel `--name test` or `--conf spark.app.name=test` parameter to spark_opts is not applied

2017-02-16 Thread Corey A Stubbs (JIRA)

[ 
https://issues.apache.org/jira/browse/TOREE-386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15869901#comment-15869901
 ] 

Corey A Stubbs commented on TOREE-386:
--

[~sachin aggarwal], sorry for the confusion. It is not an issue with Spark. I
believe it is an issue with Toree, where the line I mentioned above is
overwriting the argument you are passing in.
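
For illustration, a hypothetical sketch (not Toree's actual code) of the kind of
guard that would avoid clobbering a user-supplied application name:

{code}
// Hypothetical sketch -- not Toree's actual code. Only fall back to a
// default application name when the user has not already set one via
// SPARK_OPTS (--name ...) or --conf spark.app.name=...
import org.apache.spark.SparkConf

def withDefaultAppName(conf: SparkConf, default: String = "Apache Toree"): SparkConf =
  if (conf.contains("spark.app.name")) conf  // keep the user's value
  else conf.setAppName(default)              // otherwise apply the default
{code}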

> spark kernel `--name test` or `--conf spark.app.name=test` parameter to 
> spark_opts is not applied 
> --
>
> Key: TOREE-386
> URL: https://issues.apache.org/jira/browse/TOREE-386
> Project: TOREE
>  Issue Type: Bug
>Reporter: Sachin Aggarwal
>
> this is my kernel.json
> {code}
> {
>   "language": "scala",
>   "display_name": "toree_special - Scala",
>   "env": {
> "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client",
> "SPARK_HOME": "spark_home",
> "__TOREE_OPTS__": "",
> "DEFAULT_INTERPRETER": "Scala",
> "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip",
> "PYTHON_EXEC": "python"
>   },
>   "argv": [
> "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh",
> "--profile",
> "{connection_file}"
>   ]
> }
> {code}
> the parameter that I added {color:red}--name MyAPP{color} is not applied. I
> still see the app name in the YARN resource UI as {color:red}IBM Spark Kernel{color}.
> Update: in the new version of Toree, {color:red}IBM Spark Kernel{color} is renamed
> to {color:red}Apache Toree{color}.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (TOREE-386) spark kernel `--name test` or `--conf spark.app.name=test` parameter to spark_opts is not applied

2017-02-16 Thread Sachin Aggarwal (JIRA)

 [ 
https://issues.apache.org/jira/browse/TOREE-386?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sachin Aggarwal updated TOREE-386:
--
Summary: spark kernel `--name test` or `--conf spark.app.name=test` 
parameter to spark_opts is not applied   (was: toree spark kernel `--name test` 
or `--conf spark.app.name=test` parameter to spark_opts is not applied )

> spark kernel `--name test` or `--conf spark.app.name=test` parameter to 
> spark_opts is not applied 
> --
>
> Key: TOREE-386
> URL: https://issues.apache.org/jira/browse/TOREE-386
> Project: TOREE
>  Issue Type: Bug
>Reporter: Sachin Aggarwal
>
> this is my kernel.json
> {code}
> {
>   "language": "scala",
>   "display_name": "toree_special - Scala",
>   "env": {
> "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client",
> "SPARK_HOME": "spark_home",
> "__TOREE_OPTS__": "",
> "DEFAULT_INTERPRETER": "Scala",
> "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip",
> "PYTHON_EXEC": "python"
>   },
>   "argv": [
> "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh",
> "--profile",
> "{connection_file}"
>   ]
> }
> {code}
> the parameter that I added {color:red}--name MyAPP{color} is not applied. I
> still see the app name in the YARN resource UI as {color:red}IBM Spark Kernel{color}.
> Update: in the new version of Toree, {color:red}IBM Spark Kernel{color} is renamed
> to {color:red}Apache Toree{color}.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (TOREE-386) toree spark kernel `--name test` or `--conf spark.app.name=test` parameter to spark_opts is not applied

2017-02-16 Thread Sachin Aggarwal (JIRA)

 [ 
https://issues.apache.org/jira/browse/TOREE-386?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sachin Aggarwal updated TOREE-386:
--
Summary: toree spark kernel `--name test` or `--conf spark.app.name=test` 
parameter to spark_opts is not applied   (was: toree spark kernel --name 
test``` or --conf spark.app.name=test``` parameter to spark_opts is not 
applied )

> toree spark kernel `--name test` or `--conf spark.app.name=test` parameter to 
> spark_opts is not applied 
> 
>
> Key: TOREE-386
> URL: https://issues.apache.org/jira/browse/TOREE-386
> Project: TOREE
>  Issue Type: Bug
>Reporter: Sachin Aggarwal
>
> this is my kernel.json
> {code}
> {
>   "language": "scala",
>   "display_name": "toree_special - Scala",
>   "env": {
> "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client",
> "SPARK_HOME": "spark_home",
> "__TOREE_OPTS__": "",
> "DEFAULT_INTERPRETER": "Scala",
> "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip",
> "PYTHON_EXEC": "python"
>   },
>   "argv": [
> "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh",
> "--profile",
> "{connection_file}"
>   ]
> }
> {code}
> the parameter that I added {color:red}--name MyAPP{color} is not applied. I
> still see the app name in the YARN resource UI as {color:red}IBM Spark Kernel{color}.
> Update: in the new version of Toree, {color:red}IBM Spark Kernel{color} is renamed
> to {color:red}Apache Toree{color}.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (TOREE-386) toree spark kernel --name/--conf spark.app.name=test parameter to spark_opts is not applied

2017-02-16 Thread Sachin Aggarwal (JIRA)

 [ 
https://issues.apache.org/jira/browse/TOREE-386?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sachin Aggarwal updated TOREE-386:
--
Summary: toree spark kernel --name/--conf spark.app.name=test parameter to 
spark_opts is not applied   (was: toree spark kernel --name parameter to 
spark-submit is not applied )

> toree spark kernel --name/--conf spark.app.name=test parameter to spark_opts 
> is not applied 
> 
>
> Key: TOREE-386
> URL: https://issues.apache.org/jira/browse/TOREE-386
> Project: TOREE
>  Issue Type: Bug
>Reporter: Sachin Aggarwal
>
> this is my kernel.json
> {code}
> {
>   "language": "scala",
>   "display_name": "toree_special - Scala",
>   "env": {
> "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client",
> "SPARK_HOME": "spark_home",
> "__TOREE_OPTS__": "",
> "DEFAULT_INTERPRETER": "Scala",
> "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip",
> "PYTHON_EXEC": "python"
>   },
>   "argv": [
> "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh",
> "--profile",
> "{connection_file}"
>   ]
> }
> {code}
> the parameter that I added {color:red}--name MyAPP{color} is not applied. I
> still see the app name in the YARN resource UI as {color:red}IBM Spark Kernel{color}.
> Update: in the new version of Toree, {color:red}IBM Spark Kernel{color} is renamed
> to {color:red}Apache Toree{color}.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (TOREE-386) toree spark kernel --name parameter to spark-submit is not applied

2017-02-16 Thread Sachin Aggarwal (JIRA)

[ 
https://issues.apache.org/jira/browse/TOREE-386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15869602#comment-15869602
 ] 

Sachin Aggarwal commented on TOREE-386:
---

I have also created a JIRA for Spark:
https://issues.apache.org/jira/browse/SPARK-19624

> toree spark kernel --name parameter to spark-submit is not applied 
> ---
>
> Key: TOREE-386
> URL: https://issues.apache.org/jira/browse/TOREE-386
> Project: TOREE
>  Issue Type: Bug
>Reporter: Sachin Aggarwal
>
> this is my kernel.json
> {code}
> {
>   "language": "scala",
>   "display_name": "toree_special - Scala",
>   "env": {
> "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client",
> "SPARK_HOME": "spark_home",
> "__TOREE_OPTS__": "",
> "DEFAULT_INTERPRETER": "Scala",
> "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip",
> "PYTHON_EXEC": "python"
>   },
>   "argv": [
> "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh",
> "--profile",
> "{connection_file}"
>   ]
> }
> {code}
> the parameter that I added {color:red}--name MyAPP{color} is not applied. I
> still see the app name in the YARN resource UI as {color:red}IBM Spark Kernel{color}.
> Update: in the new version of Toree, {color:red}IBM Spark Kernel{color} is renamed
> to {color:red}Apache Toree{color}.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (TOREE-386) toree spark kernel --name parameter to spark-submit is not applied

2017-02-16 Thread Sachin Aggarwal (JIRA)

[ 
https://issues.apache.org/jira/browse/TOREE-386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15869472#comment-15869472
 ] 

Sachin Aggarwal commented on TOREE-386:
---

Hi [~Lull3rSkat3r], do you mean setting spark.app.name in the Spark conf?

I have tried adding {code}--conf spark.app.name=testName{code} and this also has
no effect.



> toree spark kernel --name parameter to spark-submit is not applied 
> ---
>
> Key: TOREE-386
> URL: https://issues.apache.org/jira/browse/TOREE-386
> Project: TOREE
>  Issue Type: Bug
>Reporter: Sachin Aggarwal
>
> this is my kernel.json
> {code}
> {
>   "language": "scala",
>   "display_name": "toree_special - Scala",
>   "env": {
> "SPARK_OPTS": "--name MyAPP --master yarn --deploy-mode client",
> "SPARK_HOME": "spark_home",
> "__TOREE_OPTS__": "",
> "DEFAULT_INTERPRETER": "Scala",
> "PYTHONPATH": "spark_home/python:spark_home/python/lib/py4j-0.9-src.zip",
> "PYTHON_EXEC": "python"
>   },
>   "argv": [
> "/root/.local/share/jupyter/kernels/toree_special_scala/bin/run.sh",
> "--profile",
> "{connection_file}"
>   ]
> }
> {code}
> the parameter that I added {color:red}--name MyAPP{color} is not applied. I
> still see the app name in the YARN resource UI as {color:red}IBM Spark Kernel{color}.
> Update: in the new version of Toree, {color:red}IBM Spark Kernel{color} is renamed
> to {color:red}Apache Toree{color}.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)