Hi,
Thanks for your answer. I need an 'offline' solution, so only 2 works for
me. I have already tried printing to a PDF, but the layout is not great.
Some cells have long line output, which sets a huge width for the document, and
if I remove the output from cells, I remove the plot ... I know I can cherry
Hi moon,
In the end I managed to refactor my code so that I didn't need the
InterpreterContext or the paragraph ID at all. As my use case for it has sort
of disappeared, I won't file an issue for it now. Other people might need to
have access to the ZeppelinContext or InterpreterContext for their
Hi Moon,
here is a simplified example. Both the object and the trait are defined in
the same section.
```
object JsonKey {
  sealed abstract class JsonKey(_name: String) {
    def name = _name
  }
  case object Email extends JsonKey("email")
  case object FacebookId extends JsonKey("facebookId") // "facebookId" assumed; the original message was cut off here
}
```
I tried the code with both scala-2.10.4 and spark-shell, and the results
were the same. It looks like the Scala REPL does not allow such usage.
Someone with better Scala experience on this mailing list or in the Scala
community may be able to help you. http://www.scala-lang.org/community/
Let me know if there're
Hi Moon,
thank you for pointing me in the right direction. Knowing that this is an
inherent limitation of Scala worksheets, it was much easier to debug and to
find a workaround. Here is my take on it:
```
case class JsonKey(name: String)
object Email extends JsonKey("email")
object FacebookId extends JsonKey("facebookId") // "facebookId" assumed; the original message was cut off here
```
Thanks so much for the clarification. Perfectly clear now!
On Thu, Oct 15, 2015 at 4:41 PM, Silvio Fiorito <
silvio.fior...@granturing.com> wrote:
> Hi Chad,
>
> So with that Angular code I posted you can then create an Angular variable
> called “locations” and set it from a separate Scala
Silvio, thanks for the examples.
I'm a bit [totally] new when it comes to working with Angular. I'm running
your map example and I get the map to display just fine, but how would I
add markers to the map from a separate paragraph?
Thanks,
Chad
On Tue, Oct 6, 2015 at 10:48 AM, Silvio Fiorito <
Hi Chad,
So with that Angular code I posted you can then create an Angular variable
called “locations” and set it from a separate Scala paragraph, like so:
```
case class Loc(desc: String, lat: Double, lon: Double)
val locations = Array(Loc("Test", 24.4, 49.8))
z.angularBind("locations", locations)
```
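For the consuming side, here is a minimal sketch of what a separate `%angular` paragraph could look like, using Zeppelin's Angular display system to read the bound variable. The markup below is illustrative only (it just lists the locations); wiring the data into actual map markers depends on the map library used in the original example:

```html
%angular
<!-- Illustrative sketch: iterates over the "locations" variable bound
     via z.angularBind in the Scala paragraph above. -->
<div ng-repeat="loc in locations">
  {{loc.desc}}: ({{loc.lat}}, {{loc.lon}})
</div>
```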
I have an existing zeppelin build (which I built using spark 1.4). I want
to try it now with Spark 1.5.1.
According to the GitHub documentation (
https://github.com/apache/incubator-zeppelin) - "If you want to use system
provided Spark and Hadoop, export SPARK_HOME and HADOOP_HOME in
Hi Sourav,
1.5.1 was recently added to the list of supported versions.
Internally, the Spark interpreter checks against a list of supported
versions before running.
This list is here:
spark/src/main/java/org/apache/zeppelin/spark/SparkVersion.java
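As a rough illustration of the idea (this is a hypothetical sketch, not the actual Zeppelin code; the real logic and the real list of supported versions live in SparkVersion.java above, and the version strings here are examples only):

```scala
// Hypothetical sketch of an interpreter gating on supported Spark versions.
object SparkVersionCheck {
  // Illustrative list; consult SparkVersion.java for what Zeppelin actually supports.
  private val supportedVersions = Set("1.4.0", "1.4.1", "1.5.0", "1.5.1")

  // Returns true if the reported Spark version is on the supported list.
  def isSupported(versionString: String): Boolean =
    supportedVersions.contains(versionString.trim)
}
```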
This commit introduced the new spark