Downgrade is for the case where the latest version has some critical bugs
and users want to downgrade to an old version.
Jongyoul Lee wrote on Tue, Jul 11, 2017 at 2:35 AM:
> I haven't thought about downgrading, but at least, we need to
> provide a way to install interpreters from the UI.
I'm using a Mac. I don't know what the problem was; I just switched to
Safari. Not sure whether it was caused by some other issue.
I was trying to run spark compiled for scala 2.11 with 3rd party 2.10
libraries at that time.
It was causing super-non-obvious exceptions. I stopped doing that and it
solved the problem.
I successfully added a Maven snapshot repository and was able to resolve
the dependencies. Unfortunately, I have since published new versions to the
repository and restarted the interpreter, yet the new artifact is not being
pulled in.
I set it up using the following template
Thanks for telling me that. I'll also test it with Chrome. Are you by any
chance using it on Windows? I've never heard about this issue, so I'm just
asking to find a clue.
On Mon, 10 Jul 2017 at 17:10 Serega Sheypak wrote:
> It was Chrome, probably Version 59.0.3071.115 (Official
We DID provide two statements in a paragraph for a while, but this feature
looks broken.
On Tue, 11 Jul 2017 at 03:14 wrote:
> Best as I recall, you can't have two SQL statements in one Zeppelin note.
> Try separating them.
I haven't thought about downgrading, but at least, we need to provide
a way to install interpreters from the UI. And do you think we need to be
able to remove interpreters as well? We can unbind some existing
interpreters and also delete some interpreter settings.
On Mon, 10 Jul 2017 at 17:16 Jeff
For the Oracle JDBC driver we had to feed ojdbc7.jar
into SPARK_SUBMIT_OPTIONS through the --jars parameter
and into ZEPPELIN_INTP_CLASSPATH_OVERRIDES, like:
zeppelin-env.sh:
export SPARK_SUBMIT_OPTIONS=". . . --jars /var/lib/sqoop/ojdbc7.jar"
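Putting the two settings together, a minimal zeppelin-env.sh sketch might look like the following. The jar path matches the one quoted above; the exact value needed for ZEPPELIN_INTP_CLASSPATH_OVERRIDES is an assumption, since that part of the original message was cut off.

```shell
# zeppelin-env.sh (sketch)
# Pass the Oracle JDBC jar to spark-submit so the driver can load it.
export SPARK_SUBMIT_OPTIONS="--jars /var/lib/sqoop/ojdbc7.jar"
# Also put it on the interpreter process classpath (assumed value;
# the original message truncated this line).
export ZEPPELIN_INTP_CLASSPATH_OVERRIDES="/var/lib/sqoop/ojdbc7.jar"
```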
Best as I recall, you can't have two SQL statements in one Zeppelin note.
Try separating them.
On Wed, Jul 5, 2017 at 7:13 AM -0400, "Iavor Jelev" wrote:
Hi everyone,
first off - I'm new to Zeppelin, but I already love
Hi
We want to use a JDBC driver with PySpark through Zeppelin. Not the custom
JDBC interpreter, but from sqlContext, where we can read into a DataFrame.
I added the JDBC driver jar to Zeppelin's spark-submit options via "--jars",
but it still says the driver class was not found.
Does it have to reside
Never mind, I forgot that it's in the interpreter settings.
https://cloud.githubusercontent.com/assets/5082742/20110797/c6852202-a60b-11e6-8264-93437a58f752.gif
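For reference, reading from Oracle into a DataFrame via sqlContext (rather than the JDBC interpreter) can be sketched as below. This assumes ojdbc7.jar is already on the classpath as discussed earlier; the URL, table name, and credentials are placeholders, and sqlContext is the one the Zeppelin Spark interpreter provides.

```python
# Sketch: read an Oracle table into a Spark DataFrame through JDBC.
# The host, service name, table, and credentials below are placeholders.
df = (sqlContext.read.format("jdbc")
      .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL")
      .option("dbtable", "SCHEMA.SOME_TABLE")
      .option("driver", "oracle.jdbc.OracleDriver")
      .option("user", "scott")
      .option("password", "tiger")
      .load())
df.show()
```

If the driver class still isn't found, the jar usually needs to be registered as an interpreter dependency (interpreter settings) in addition to --jars.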
2017-07-10 10:46 GMT+02:00 Serega Sheypak :
> Super stupid question, sorry.
> I can't find button / link to Spark