Looks like there was a missing SparkR::: for connExists. Maybe it became
public in 2.0, but it was hidden in 1.6. Testing now before starting the vote
for RC6.
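
For context, the kind of change in question is just qualifying the call,
since R only resolves exported names when they are unqualified. A minimal
sketch (argument name is illustrative; the actual Toree bootstrap script may
differ):

    # 'connExists' is not exported by SparkR 1.6, so a bare call fails with
    # 'could not find function "connExists"'.
    connExists(env)            # error when the helper is not exported
    SparkR:::connExists(env)   # ':::' reaches the package-internal function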

On Thu, Feb 16, 2017 at 10:54 AM Marius van Niekerk <
marius.v.niek...@gmail.com> wrote:

> The SparkR error might be something I did wrong when backporting the Spark
> 2.0 R changes
>
> On Thu, 16 Feb 2017 at 10:18 Chip Senkbeil <chip.senkb...@gmail.com>
> wrote:
>
> > Turns out I needed to set my hostname on my machine to localhost. Doing so
> > fixed that error.
> >
> > As for the SparkR error, I'll take a look.
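> >
> > An equivalent way to pin the driver to the loopback interface, offered only
> > as a sketch rather than a verified fix, is Spark's standard SPARK_LOCAL_IP
> > environment variable set before launching the kernel:
> >
> >     # sketch: 127.0.0.1 assumes an all-local setup
> >     export SPARK_LOCAL_IP=127.0.0.1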
> >
> > On Thu, Feb 16, 2017 at 9:07 AM Gino Bustelo <g...@bustelos.com> wrote:
> >
> > > I tried your pip package at
> > > https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc5/toree-pip/
> > > using the `make test-release` target and it worked fine. I ran the magics
> > > test notebook and everything worked except there is an issue when calling
> > > the %SparkR magic. It ends up in an infinite loop printing
> > >
> > > Warning message:
> > >
> > > In rm(".sparkRcon", envir = .sparkREnv) : object '.sparkRcon' not found
> > >
> > > Error in sparkR.connect() : could not find function "connExists"
> > >
> > > Execution halted
> > >
> > > 17/02/15 23:15:51 [ERROR] o.a.t.k.i.s.SparkRProcessHandler - null process exited: 1
> > >
> > > Loading required package: methods
> > >
> > >
> > > Attaching package: ‘SparkR’
> > >
> > >
> > > The following objects are masked from ‘package:stats’:
> > >
> > >
> > >     cov, filter, lag, na.omit, predict, sd, var
> > >
> > >
> > > The following objects are masked from ‘package:base’:
> > >
> > >
> > >     colnames, colnames<-, endsWith, intersect, rank, rbind, sample,
> > >
> > >     startsWith, subset, summary, table, transform
> > >
> > >
> > > On Thu, Feb 16, 2017 at 9:03 AM Corey Stubbs <cas5...@gmail.com> wrote:
> > >
> > > > I was able to run from the pip package just fine.
> > > >
> > > > On Wed, Feb 15, 2017 at 4:50 PM Chip Senkbeil <chip.senkb...@gmail.com> wrote:
> > > >
> > > > > By master, I mean the 0.1.x branch. Was trying to get the next vote
> > > > > started.
> > > > >
> > > > > On Wed, Feb 15, 2017, 4:44 PM Chip Senkbeil <chip.senkb...@gmail.com> wrote:
> > > > >
> > > > > > Just built master, got all of the artifacts ready, blah blah blah.
> > > > > > Tested by installing the artifact from
> > > > > > https://dist.apache.org/repos/dist/dev/incubator/toree/0.1.0/rc5/toree-pip/
> > > > > > and now it's failing with not being able to bind to an ephemeral port
> > > > > > for the sparkDriver. Can someone help me take a look at this? I just
> > > > > > installed like usual via `pip install apache-toree-0.1.0.tar.gz` and
> > > > > > pointed to Spark distributions I downloaded (1.6.1 and 1.6.3) when
> > > > > > running `jupyter toree install --spark_home=...`. When launching a
> > > > > > kernel, it fails with...
> > > > > >
> > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
> > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
> > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
> > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
> > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
> > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
> > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
> > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
> > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
> > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
> > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
> > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
> > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
> > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
> > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
> > > > > > 17/02/15 16:41:29 [WARN] o.a.s.u.Utils - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
> > > > > > 17/02/15 16:41:29 [ERROR] o.a.s.SparkContext - Error initializing SparkContext.
> > > > > > java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
> > > > > > at sun.nio.ch.Net.bind0(Native Method)
> > > > > > at sun.nio.ch.Net.bind(Net.java:433)
> > > > > > at sun.nio.ch.Net.bind(Net.java:425)
> > > > > > at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
> > > > > > at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
> > > > > > at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
> > > > > > at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
> > > > > > at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
> > > > > > at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
> > > > > > at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
> > > > > > at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
> > > > > > at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
> > > > > > at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
> > > > > > at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
> > > > > > at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
> > > > > > at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
> > > > > > at java.lang.Thread.run(Thread.java:745)
> > > > > >
> > > > >
> > > >
> > >
> >
> --
> regards
> Marius van Niekerk
>
