Re: Zeppelin 0.8.2 New Spark Interpreter
Number 2 under http://zeppelin.apache.org/docs/0.8.2/interpreter/spark.html is the best guide. spark.jars.packages can be set on the interpreter. I had to add export SPARK_SUBMIT_OPTIONS="--repositories " to zeppelin-env.sh to add my repo to the mix.

On Fri, Nov 8, 2019 at 5:11 AM Anton Kulaga wrote:

--
Mark Bidewell
http://www.linkedin.com/in/markbidewell
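For readers who want the pieces above in one place, here is a minimal sketch of the combination Mark describes. The repository URL and artifact coordinate are simply the ones from Anton's question, used as illustrative values; the actual URL after --repositories was trimmed in the mail above, so substitute your own:

    # conf/zeppelin-env.sh -- extra resolvers are passed through to spark-submit
    export SPARK_SUBMIT_OPTIONS="--repositories https://dl.bintray.com/comp-bio-aging/main"

    # Spark interpreter setting (a plain property, not the Dependencies section of the UI)
    spark.jars.packages    group.research.aging:spark-extensions_2.11:0.0.7.2

After restarting the interpreter, spark-submit should resolve the coordinate from the extra repository in addition to Maven Central.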
Re: Zeppelin 0.8.2 New Spark Interpreter
Are there clear instructions on how to use the spark.jars.packages property? For instance, if I want to depend on the Bintray repo https://dl.bintray.com/comp-bio-aging/main with "group.research.aging:spark-extensions_2.11:0.0.7.2" as a dependency, what should I do with the new interpreter?

On 2019/10/12 01:18:09, Jeff Zhang wrote:
Re: Zeppelin 0.8.2 New Spark Interpreter
Glad to hear that.

On Sat, Oct 12, 2019 at 1:30 AM, Mark Bidewell wrote:

--
Best Regards

Jeff Zhang
Re: Zeppelin 0.8.2 New Spark Interpreter
Just wanted to say "thanks"! Using spark.jars.packages, etc. worked great!

On Fri, Oct 11, 2019 at 9:45 AM Jeff Zhang wrote:

--
Mark Bidewell
http://www.linkedin.com/in/markbidewell
Re: Zeppelin 0.8.2 New Spark Interpreter
That's right, the documentation should also be updated.

On Fri, Oct 11, 2019 at 9:28 PM, Mark Bidewell wrote:

--
Best Regards

Jeff Zhang
Re: Zeppelin 0.8.2 New Spark Interpreter
Also, the interpreter setting UI is still listed as the first way to handle dependencies in the documentation; maybe it should be marked as deprecated?

http://zeppelin.apache.org/docs/0.8.2/interpreter/spark.html

On Thu, Oct 10, 2019 at 9:58 PM Jeff Zhang wrote:

--
Mark Bidewell
http://www.linkedin.com/in/markbidewell
Re: Zeppelin 0.8.2 New Spark Interpreter
It looks like many users are still used to specifying Spark dependencies in the interpreter setting UI; spark.jars and spark.jars.packages seem too difficult to understand and not transparent enough. So I created ticket https://issues.apache.org/jira/browse/ZEPPELIN-4374 so that users can still set dependencies in the interpreter setting UI.

On Fri, Oct 11, 2019 at 9:54 AM, Jeff Zhang wrote:

--
Best Regards

Jeff Zhang
Re: Zeppelin 0.8.2 New Spark Interpreter
As I said above, try to set them via spark.jars and spark.jars.packages.

Don't set them here:

[image: image.png]

On Fri, Oct 11, 2019 at 9:35 AM, Mark Bidewell wrote:

--
Best Regards

Jeff Zhang
Re: Zeppelin 0.8.2 New Spark Interpreter
I was specifying them in the interpreter settings in the UI.

On Thu, Oct 10, 2019 at 9:30 PM Jeff Zhang wrote:

--
Mark Bidewell
http://www.linkedin.com/in/markbidewell
Re: Zeppelin 0.8.2 New Spark Interpreter
How do you specify your Spark interpreter dependencies? You need to specify them via the spark.jars or spark.jars.packages property for non-local mode.

On Fri, Oct 11, 2019 at 3:45 AM, Mark Bidewell wrote:

--
Best Regards

Jeff Zhang
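To make the two properties concrete (the values below are made-up placeholders, not taken from this thread): spark.jars takes a comma-separated list of jar paths or URLs that are shipped as-is, while spark.jars.packages takes Maven coordinates in groupId:artifactId:version form that Spark resolves from a repository when the session starts:

    spark.jars             /path/to/my-udfs.jar,hdfs:///libs/other-lib.jar
    spark.jars.packages    com.example:my-library_2.11:1.0.0

Both can be set as properties of the Spark interpreter and should take effect after the interpreter is restarted.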
Zeppelin 0.8.2 New Spark Interpreter
I am running some initial tests of Zeppelin 0.8.2 and I am seeing some weird issues with dependencies. When I use the old interpreter, everything works as expected. When I use the new interpreter, classes in my interpreter dependencies cannot be resolved when connecting to a master that is not local[*]. I did not encounter issues with either interpreter on 0.8.1.

Has anyone else seen this?

Thanks!

--
Mark Bidewell
http://www.linkedin.com/in/markbidewell
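Not a fix, but for anyone comparing the two interpreters against different masters, a quick way to see what the session was actually launched with is to print the relevant settings from a notebook paragraph. This is plain SparkContext/SparkConf usage, nothing Zeppelin-specific:

    %spark
    // Master URL the interpreter actually connected to
    println(sc.master)
    // Jars and Maven packages passed to this session (empty string if unset)
    println(sc.getConf.get("spark.jars", ""))
    println(sc.getConf.get("spark.jars.packages", ""))

If spark.jars.packages comes back empty against the remote master but populated against local[*], the dependency was never handed to spark-submit in the first place.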