[jira] [Commented] (TOREE-336) Toree not working with Apache Spark 2.0.0

2017-02-11 Thread JIRA

[ 
https://issues.apache.org/jira/browse/TOREE-336?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15862389#comment-15862389
 ] 

Marc Carré commented on TOREE-336:
--

Hi All, many thanks for Apache Toree and for the workarounds:
{code} 
$ cd /path/to/incubator-toree
$ make clean release APACHE_SPARK_VERSION=2.1.0
$ pip install toree --no-index 
--find-links=/path/to/incubator-toree/dist/toree-pip/toree-0.2.0.dev1.tar.gz
{code}
definitely "worked on my machine"^(tm)^.
When can we expect a release to be pushed to PyPI? ({{toree-0.1.0.dev8}} is 
pretty old...)

> Toree not working with Apache Spark 2.0.0
> -
>
> Key: TOREE-336
> URL: https://issues.apache.org/jira/browse/TOREE-336
> Project: TOREE
>  Issue Type: Bug
> Environment: OSX and Ubuntu 14.04, both running Scala 2.10.4 and 
> Spark 2.0.0
>Reporter: Tianhui Li
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> Following the instructions on 
> https://github.com/apache/incubator-toree/blob/master/README.md, I run
> ```
> pip install --pre toree
> jupyter toree install --spark-home=$SPARK_HOME
> ```
> I'm able to build fine.  But upon starting the server and a new Scala (or 
> any other type of) notebook, I get an error (provided below).  This seems 
> related to using Scala 2.10 rather than 2.11 (see 
> http://stackoverflow.com/questions/29339005/run-main-0-java-lang-nosuchmethoderror-scala-collection-immutable-hashset-emp
>  and 
> http://stackoverflow.com/questions/30536759/running-a-spark-application-in-intellij-14-1-3).
>   Below is the error:
> $ jupyter notebook
> [I 12:11:59.464 NotebookApp] Serving notebooks from local directory: 
> /Users/tianhui
> [I 12:11:59.464 NotebookApp] 0 active kernels 
> [I 12:11:59.465 NotebookApp] The Jupyter Notebook is running at: 
> http://localhost:/
> [I 12:11:59.465 NotebookApp] Use Control-C to stop this server and shut down 
> all kernels (twice to skip confirmation).
> [I 12:12:06.847 NotebookApp] 302 GET / (::1) 0.47ms
> [I 12:12:10.591 NotebookApp] Creating new notebook in 
> [I 12:12:11.600 NotebookApp] Kernel started: 
> 20ca2e71-781b-4208-ad88-bc04c1ca37d6
> Starting Spark Kernel with 
> SPARK_HOME=/usr/local/Cellar/apache-spark/2.0.0/libexec/
> 16/09/03 12:12:12 [INFO] o.a.t.Main$$anon$1 - Kernel version: 
> 0.1.0.dev9-incubating-SNAPSHOT
> 16/09/03 12:12:12 [INFO] o.a.t.Main$$anon$1 - Scala version: Some(2.10.4)
> 16/09/03 12:12:12 [INFO] o.a.t.Main$$anon$1 - ZeroMQ (JeroMQ) version: 3.2.2
> 16/09/03 12:12:12 [INFO] o.a.t.Main$$anon$1 - Initializing internal actor 
> system
> Exception in thread "main" java.lang.NoSuchMethodError: 
> scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
>   at akka.actor.ActorCell$.(ActorCell.scala:336)
>   at akka.actor.ActorCell$.(ActorCell.scala)
>   at akka.actor.RootActorPath.$div(ActorPath.scala:185)
>   at akka.actor.LocalActorRefProvider.(ActorRefProvider.scala:465)
>   at akka.actor.LocalActorRefProvider.(ActorRefProvider.scala:453)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>   at 
> akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
>   at scala.util.Try$.apply(Try.scala:192)
>   at 
> akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
>   at 
> akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
>   at 
> akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
>   at scala.util.Success.flatMap(Try.scala:231)
>   at 
> akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
>   at akka.actor.ActorSystemImpl.liftedTree1$1(ActorSystem.scala:585)
>   at akka.actor.ActorSystemImpl.(ActorSystem.scala:578)
>   at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
>   at akka.actor.ActorSystem$.apply(ActorSystem.scala:109)
>   at 
> org.apache.toree.boot.layer.StandardBareInitialization$class.createActorSystem(BareInitialization.scala:71)
>   at org.apache.toree.Main$$anon$1.createActorSystem(Main.scala:35)
>   at 
> org.apache.toree.boot.layer.StandardBareInitialization$class.initializeBare(BareInitialization.scala:60)
>   at org.apache.toree.Main$$anon$1.initializeBare(Main.scala:35)
>   at 
> org.apache.toree.boot.KernelBootstrap.initialize(KernelBootstrap.scala:70)
>   at org.apache.toree.Main$delayedInit$body.apply(Main.scala:40)
>   at scala.Functio

[jira] [Closed] (TOREE-361) Spark examples that use Spark 2 fail because docker image contains 1.6

2017-02-11 Thread Jakob Odersky (JIRA)

 [ 
https://issues.apache.org/jira/browse/TOREE-361?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jakob Odersky closed TOREE-361.
---
Resolution: Fixed

Fixed in a pull request.

> Spark examples that use Spark 2 fail because docker image contains 1.6
> --
>
> Key: TOREE-361
> URL: https://issues.apache.org/jira/browse/TOREE-361
> Project: TOREE
>  Issue Type: Bug
>Reporter: Kevin Bates
>Priority: Minor
>  Labels: build, easyfix, newbie
>
> Spark2-dependent examples (magic-tutorial.ipynb) don't work because the 
> docker image referenced in the Makefile contains Spark 1.6. As a result, 
> imports of spark.implicits._ (and references to SparkSession) fail.  
> Workaround: override with a specific tag of a more recent image (e.g., 
> `make dev IMAGE=jupyter/all-spark-notebook:2410ad57203a`), based on 
> examination of 
> https://github.com/jupyter/docker-stacks/wiki/Docker-build-history.
> The default (no override) should render working examples.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Linking JIRA with GitHub

2017-02-11 Thread Jakob Odersky
Hi,
does anyone know how we can link JIRA with GitHub, so that pull requests
with a title of the form [TOREE-X] close issue X when merged?
Basically I'm looking for something similar to the way Spark handles
issues and pull requests.

cheers,
--Jakob


Re: Linking JIRA with GitHub

2017-02-11 Thread Marius van Niekerk
So Spark has a whole bunch of scripts that you can run to do the merges and
close things. For something a little lighter, we could see about adapting the
ones from Airflow or Arrow, since they looked quite nice the last time I
checked.

On Sat, Feb 11, 2017, 15:30 Jakob Odersky  wrote:

> Hi,
> does anyone how we can link jira with github, so that pull requests
> with a title of the form [TOREE-X] will close the issue X when merged.
> Basically I'm looking for something similar to the way Spark handles
> issues and pull requests.
>
> cheers,
> --Jakob
>
-- 
regards
Marius van Niekerk


Classloader exceptions when testing

2017-02-11 Thread Jakob Odersky
Hi everyone,

I may have found a fix for the classloader issues I'm having when
running unit tests
(https://groups.google.com/forum/#!topic/akka-user/U0mLX3mCmAk).
Basically the issue appears to happen when a test suite that uses the
Akka TestKit base class does not define an explicit classloader.

I.e. the following will result in a cryptic error:

class ExecuteRequestHandlerSpec extends TestKit(
  ActorSystem("ExecuteRequestHandlerSpec")) { //tests }

whereas explicitly defining a classloader for the actor system works fine:

class ExecuteRequestHandlerSpec extends TestKit(
  ActorSystem("ExecuteRequestHandlerSpec",
org.apache.toree.Main.getClass.getClassLoader)) { //tests }

I have seen the latter approach used in various tests; however, the
classloaders are not specified consistently across all tests. My proposed
fix is to make the classloaders explicit everywhere (I'm thinking of
creating a utility mixin instead of using TestKit directly).

I'm not very well versed in all the class loader magic that happens
within Spark, Akka and sbt, and would greatly appreciate any input as
to why the exceptions are raised in the first place and whether my
proposal is actually a clean fix or just some quick hack.
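For what it's worth, the ambiguity can be observed without Akka at all: the
JVM keeps a mutable per-thread context classloader (which, if I recall
correctly, Akka falls back to when no loader is passed to ActorSystem), and
under sbt's layered classloaders it need not match the loader that defined
the test class. A standalone pure-Scala probe (not Toree code; names are
made up for illustration):

```scala
// Standalone probe, no Akka or Toree dependencies.
object ClassLoaderProbe {
  def main(args: Array[String]): Unit = {
    // The loader that actually defined this class:
    val defining = ClassLoaderProbe.getClass.getClassLoader
    // The per-thread context loader; under sbt test runs this can be a
    // different, layered loader that does not see the test classes.
    val context = Thread.currentThread.getContextClassLoader
    println(s"same loader: ${defining eq context}")
    // Pinning the context loader to the defining one removes the ambiguity,
    // which is effectively what passing an explicit classloader to
    // ActorSystem (as in the snippet above) achieves:
    Thread.currentThread.setContextClassLoader(defining)
    assert(Thread.currentThread.getContextClassLoader eq defining)
    println("context classloader pinned")
  }
}
```

Whether the two loaders differ depends on the runner, which is exactly why
making the choice explicit in every suite seems safer than relying on the
fallback.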

cheers,
--Jakob


[jira] [Closed] (TOREE-372) stream corruption caused by big-endian and little-endian

2017-02-11 Thread Jakob Odersky (JIRA)

 [ 
https://issues.apache.org/jira/browse/TOREE-372?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jakob Odersky closed TOREE-372.
---
Resolution: Won't Fix

The issue isn't related to Toree. However, a mixed-endian environment may 
become possible by running Toree in YARN-client mode once TOREE-369 is 
implemented.
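The kind of corruption reported below is plain byte-order mismatch at a
serialization boundary, which a few lines of standalone JVM code (not Toree
code) can reproduce:

```scala
import java.nio.{ByteBuffer, ByteOrder}

object EndianDemo {
  def main(args: Array[String]): Unit = {
    val value = 0x0A0B0C0D
    // Serialize the same int under both byte orders:
    val big = ByteBuffer.allocate(4).order(ByteOrder.BIG_ENDIAN).putInt(value).array()
    val little = ByteBuffer.allocate(4).order(ByteOrder.LITTLE_ENDIAN).putInt(value).array()
    def hex(bs: Array[Byte]): String = bs.map(b => f"${b & 0xff}%02x").mkString(" ")
    println(hex(big))    // 0a 0b 0c 0d
    println(hex(little)) // 0d 0c 0b 0a
    // A big-endian reader consuming little-endian bytes recovers the
    // wrong value, i.e. the "unreadable output" symptom:
    val misread = ByteBuffer.wrap(little).order(ByteOrder.BIG_ENDIAN).getInt
    println(misread == value) // false
  }
}
```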

> stream corruption caused by big-endian and little-endian
> ---
>
> Key: TOREE-372
> URL: https://issues.apache.org/jira/browse/TOREE-372
> Project: TOREE
>  Issue Type: Bug
>Reporter: Wang enzhong
>Priority: Critical
>
> We currently run Spark on z/OS, which is a big-endian platform, and 
> jupyter+toree on x86, which is a little-endian platform.  The output 
> from Spark is unreadable because of the differing byte order. 
> If we run Spark on z/OS and jupyter+toree on another big-endian platform, 
> there is no such error. 
> I've done some investigation: Toree appears to use Akka's ByteString, 
> which does handle endianness, so I don't know why Toree does not work in 
> our case. 
> Please help us look into the problem. Due to a tight project schedule, we 
> would much appreciate advice on how to fix or avoid the problem if 
> changing the code will take some time. Many thanks 
> in advance. 



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Re: Classloader exceptions when testing

2017-02-11 Thread Marius van Niekerk
That's great news.

I recall some other parts of the Akka usage where migrating the code to Spark
2.0 / Scala 2.11 required specifying a bunch of classloaders in the source.

On Sat, 11 Feb 2017 at 17:45 Jakob Odersky  wrote:

> Hi everyone,
>
> I may have found a fix for the classloader issues I'm having when
> running unit tests
> (https://groups.google.com/forum/#!topic/akka-user/U0mLX3mCmAk).
> Basically the issue appears to happen when a test suite that uses the
> Akka TestKit base class does not define an explicit classloader.
>
> I.e. the following will result in a cryptic error:
>
> class ExecuteRequestHandlerSpec extends TestKit(
>   ActorSystem("ExecuteRequestHandlerSpec")) { //tests }
>
> whereas explicitly defining a classloader for the actor system works fine:
>
> class ExecuteRequestHandlerSpec extends TestKit(
>   ActorSystem("ExecuteRequestHandlerSpec",
> org.apache.toree.Main.getClass.getClassLoader)) { //tests }
>
> I have seen that latter approach used in various tests, however the
> classloaders are not specified consistently in all tests. My proposed
> approach to fixing the tests is to make classloaders explicit
> everywhere (thinking about creating a utility mixin, instead of using
> TestKit directly).
>
> I'm not very well versed in all the class loader magic that happens
> within Spark, Akka and sbt, and would greatly appreciate any input as
> to why the exceptions are raised in the first place and whether my
> proposal is actually a clean fix or just some quick hack.
>
> cheers,
> --Jakob
>
-- 
regards
Marius van Niekerk