Re: ipython interpreter - ipython not installed

2020-06-30 Thread Jeff Zhang
Do you have multiple versions of Python installed? You need to set
zeppelin.python to the Python that actually has ipython installed.
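
For example, something like this (a sketch; the env name and paths are
assumptions based on your mail, so adjust to your setup):

  # Find the python that actually has ipython, e.g. the conda env you listed:
  conda activate py38
  which python                  # e.g. /opt/conda/envs/py38/bin/python
  python -c "import IPython; print(IPython.__version__)"
  python -m jupyter --version   # jupyter must be importable from the same python

  # Then point zeppelin.python at that absolute path in the python
  # interpreter settings (Interpreter menu -> python), e.g.:
  #   zeppelin.python = /opt/conda/envs/py38/bin/python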


David Boyd wrote on Wed, Jul 1, 2020 at 12:16 PM:

> All:
> I am trying to run the ipython example in the tutorials.
> I have ipython installed.  From the conda environment:
>    - ipython=7.16.1=py38h5ca1d4c_0
>    - ipython_genutils=0.2.0=py38_0
>
> I am getting the following error:
>
> > [stack trace snipped; identical to the trace in the original message below]
>
> Says ipython is not installed, but it clearly is.
>

ipython interpreter - ipython not installed

2020-06-30 Thread David Boyd

All:
   I am trying to run the ipython example in the tutorials.
I have ipython installed.  From the conda environment:
  - ipython=7.16.1=py38h5ca1d4c_0
  - ipython_genutils=0.2.0=py38_0

I am getting the following error:

org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException: Fail to open JupyterKernelInterpreter: org.apache.zeppelin.interpreter.InterpreterException: Kernel prerequisite is not meet: ipython is not installed.
    at org.apache.zeppelin.jupyter.JupyterKernelInterpreter.open(JupyterKernelInterpreter.java:116)
    at org.apache.zeppelin.python.IPythonInterpreter.open(IPythonInterpreter.java:109)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:760)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:668)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
    at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:130)
    at org.apache.zeppelin.scheduler.FIFOScheduler.lambda$runJobInScheduler$0(FIFOScheduler.java:39)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:76)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:760)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:668)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
    at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:130)
    at org.apache.zeppelin.scheduler.FIFOScheduler.lambda$runJobInScheduler$0(FIFOScheduler.java:39)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.zeppelin.interpreter.InterpreterException: Fail to open JupyterKernelInterpreter: org.apache.zeppelin.interpreter.InterpreterException: Kernel prerequisite is not meet: ipython is not installed.
    at org.apache.zeppelin.jupyter.JupyterKernelInterpreter.open(JupyterKernelInterpreter.java:116)
    at org.apache.zeppelin.python.IPythonInterpreter.open(IPythonInterpreter.java:109)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:760)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:668)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
    at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:130)
    at org.apache.zeppelin.scheduler.FIFOScheduler.lambda$runJobInScheduler$0(FIFOScheduler.java:39)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
    at org.apache.zeppelin.jupyter.JupyterKernelInterpreter.open(JupyterKernelInterpreter.java:132)
    at org.apache.zeppelin.python.IPythonInterpreter.open(IPythonInterpreter.java:109)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
    ... 8 more
Caused by: org.apache.zeppelin.interpreter.InterpreterException: Kernel prerequisite is not meet: ipython is not installed.
    at org.apache.zeppelin.jupyter.JupyterKernelInterpreter.open(JupyterKernelInterpreter.java:116)
    ... 10 more


Says ipython is not installed, but it clearly is.


Re: Installing python packages to support tutorial

2020-06-30 Thread David Boyd

All:

  I got around the problem by going onto my server, installing each
package one at a time, and then exporting the environment to a file. I
then used that file to perform my installs via puppet.
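
For the archives, the workflow was roughly this (a sketch; the env name
and package order are illustrative):

conda activate py37
conda install -c conda-forge numpy        # one package at a time...
conda install -c conda-forge pandas       # ...so conflicts surface early
# (and so on for the remaining packages)
conda env export -n py37 > environment.lock.yml   # exact solved versions
# then, on the other nodes (here, via puppet):
conda env create -f environment.lock.yml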

On 6/30/2020 7:23 PM, David Boyd wrote:

[quoted message snipped; the full original appears below under
"Installing python packages to support tutorial"]

Re: Installing python packages to support tutorial

2020-06-30 Thread Jeff Zhang
It might be due to conflicts between Python packages. You can refer to
this script, which we use for the Zeppelin CI:
https://github.com/apache/zeppelin/blob/master/testing/install_external_dependencies.sh
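
If you want to stay with conda instead, one approach that tends to reduce
solver conflicts is strict channel priority plus installing in small
batches (a sketch, not the CI script itself; the env name is illustrative):

conda config --set channel_priority strict
conda create -n py37 -c conda-forge python=3.7
conda activate py37
conda install -c conda-forge numpy pandas matplotlib
conda install -c conda-forge jupyter grpcio protobuf
conda install -c conda-forge seaborn bokeh holoviews altair plotnine keras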


David Boyd wrote on Wed, Jul 1, 2020 at 7:23 AM:

> [quoted message snipped; the full original appears below under
> "Installing python packages to support tutorial"]

Installing python packages to support tutorial

2020-06-30 Thread David Boyd

All:

   So I am setting up a Python-based virtual environment to support
zeppelin.

Has anyone successfully set up a virtual environment with all the
packages for the python tutorial? If so, how? An hour plus after I ran
conda to set up the env, I got massive conflict errors.

I created an environment.yml file with all the packages that are
referenced in the tutorial, as shown below:


name: py37
channels:
   - conda-forge
dependencies:
   - python=3.7
   - numpy
   - pandas
   - jupyter
   - grpcio
   - protobuf
   - matplotlib
   - seaborn
   - bokeh
   - holoviews
   - altair
   - keras
   - ggplot
   - plotnine

Using conda, I attempted to create the environment:

conda env create -f environment.yml

An hour plus later, I got back the following list of incompatibilities:
UnsatisfiableError: The following specifications were found to be 
incompatible with each other:


Output in format: Requested package -> Available versions

Package python conflicts for:
pandas -> python-dateutil[version='>=2.6.1'] -> python[version='3.7.*|3.8.*']
plotnine -> python[version='2.7.*|3.5.*|3.6.*|>=3.5.0|>=3.6.0']
jupyter -> ipykernel -> python[version='3.4.*|>=3|>=3.4|>=3.5']
holoviews -> bokeh[version='>=1.1.0'] -> python[version='>=2.7|>=3|>=3.8,<3.9.0a0']
protobuf -> python[version='2.7.*|3.5.*|3.6.*|>=2.7,<2.8.0a0|>=3.6,<3.7.0a0|>=3.7,<3.8.0a0|>=3.8,<3.9.0a0|>=3.5,<3.6.0a0|3.4.*']
ggplot -> cycler -> python[version='>=2.7,<2.8.0a0|>=3.6,<3.7.0a0|>=3.7,<3.8.0a0|>=3.8,<3.9.0a0|>=3.5,<3.6.0a0']
protobuf -> python_abi=3.8[build=*_cp38] -> python[version='3.7.*|3.8.*']
keras -> h5py -> python[version='3.7.*|>=3.8,<3.9.0a0|3.8.*']
bokeh -> python[version='2.7.*|3.5.*|3.6.*|>=2.7,<2.8.0a0|>=3.6,<3.7.0a0|>=3.7,<3.8.0a0|>=3.8,<3.9.0a0|>=3.5,<3.6.0a0']
altair -> python[version='2.7.*|3.4.*|3.5.*|3.6.*|>=2.7,<2.8.0a0|>=3.5|>=3.6|>=3.8,<3.9.0a0|>=3.6,<3.7.0a0|>=3.7,<3.8.0a0|>=3.5,<3.6.0a0']
pandas -> python[version='2.7.*|3.5.*|3.6.*|>=2.7,<2.8.0a0|>=3.6,<3.7.0a0|>=3.7,<3.8.0a0|>=3.8,<3.9.0a0|>=3.5,<3.6.0a0|3.4.*']
keras -> python[version='2.7.*|3.5.*|3.6.*|>=2.7,<2.8.0a0|>=3.6|>=3.6,<3.7.0a0|>=3.7,<3.8.0a0|>=3.5,<3.6.0a0|3.4.*']
grpcio -> python[version='>=2.7,<2.8.0a0|>=3.6,<3.7.0a0|>=3.7,<3.8.0a0|>=3.8,<3.9.0a0|>=3.5,<3.6.0a0']
matplotlib -> pyqt -> python[version='3.6.*|<3']
matplotlib -> python[version='2.7.*|3.4.*|3.5.*|>=2.7,<2.8.0a0|>=3.6,<3.7.0a0|>=3.7,<3.8.0a0|>=3.8,<3.9.0a0|>=3.5,<3.6.0a0']
seaborn -> statsmodels[version='>=0.8.0'] -> python[version='>=3.8,<3.9.0a0']
python=3.7
seaborn -> python[version='2.7.*|3.5.*|3.6.*|>=3.6|3.4.*|>=3.5,<3.6.0a0|>=3.7,<3.8.0a0|>=2.7,<2.8.0a0|>=3.6,<3.7.0a0']
jupyter -> python[version='2.7.*|3.5.*|3.6.*|>=3.8,<3.9.0a0|>=3.5,<3.6.0a0|>=3.6,<3.7.0a0|>=2.7,<2.8.0a0|>=3.7,<3.8.0a0']
grpcio -> python_abi=3.7[build=*_cp37m] -> python[version='2.7.*|3.5.*|3.6.*|3.7.*|3.4.*|3.8.*']
ggplot -> python[version='2.7.*|3.5.*|3.6.*|3.4.*']
plotnine -> descartes[version='>=1.1.0'] -> python[version='3.4.*|>=2.7,<2.8.0a0|>=3.6,<3.7.0a0|>=3.8,<3.9.0a0|>=3.7,<3.8.0a0|>=3.5,<3.6.0a0|>=3.5']
holoviews -> python[version='2.7.*|3.5.*|3.6.*|3.4.*|>=3.5,<3.6.0a0|>=3.7,<3.8.0a0|>=2.7,<2.8.0a0|>=3.6,<3.7.0a0']
bokeh -> jinja2[version='>=2.7'] -> python[version='3.4.*|3.7.*|>=3.5|3.8.*']

Package setuptools conflicts for:
altair -> jinja2 -> setuptools[version='>=18.5']
keras -> setuptools
ggplot -> brewer2mpl -> setuptools
holoviews -> ipython[version='>=5.4.0'] -> setuptools[version='>=18.5']
grpcio -> setuptools
matplotlib -> setuptools
protobuf -> setuptools
plotnine -> matplotlib-base[version='>=3.1.1'] -> setuptools
python=3.7 -> pip -> setuptools
bokeh -> jinja2[version='>=2.7'] -> setuptools
seaborn -> matplotlib-base[version='>=2.1.2'] -> setuptools

Package enum34 conflicts for:
keras -> tensorflow -> enum34[version='>=1.1.6']
matplotlib -> pyqt -> enum34
altair -> traitlets -> enum34
grpcio -> enum34[version='>=1.0.4']

Package python-dateutil conflicts for:
plotnine -> matplotlib-base[version='>=3.1.1'] -> python-dateutil[version='>=2.5.*|>=2.6.1']
holoviews -> bokeh[version='>=1.1.0'] -> python-dateutil[version='>=2.1|>=2.6.1|>=2.5.*']
matplotlib -> python-dateutil
pandas -> python-dateutil[version='>=2.5.*|>=2.6.1']
bokeh -> python-dateutil[version='>=2.1']
ggplot -> matplotlib-base -> python-dateutil[version='>=2.5.*|>=2.6.1']
seaborn -> matplotlib-base[version='>=2.1.2'] -> python-dateutil[version='>=2.5.*|>=2.6.1']
bokeh -> matplotlib -> python-dateutil[version='>=2.5.*|>=2.6.1']
altair -> pandas -> python-dateutil[version='>=2.5.*|>=2.6.1']

Package functools32 conflicts for:
plotnine -> matplotlib[version='>=2.1.0'] -> functools32
seaborn -> matplotlib-base[version='>=2.1.2'] -> functools32
ggplot -> matplotlib-base -> functools32
holoviews -> matplotlib-base[version='>=2.2'] -> functools32
matplotlib -> functools32
bokeh -> matplotlib -> functools32
altair -> jsonschema -> functools32

Package expat conflicts for:
pandas -> pypy3.6[version='>=7.3.1'] -> exp
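
Reading the output above, the tightest pin appears to be ggplot: it only
resolves to python[version='2.7.*|3.5.*|3.6.*|3.4.*'], which can never
satisfy the python=3.7 spec, so that package alone makes the solve
unsatisfiable (the enum34 and functools32 sections point the same way;
both are Python 2 backports). A sketch of the same environment.yml with
ggplot dropped (plotnine provides essentially the same grammar-of-graphics
API and does support 3.7):

name: py37
channels:
   - conda-forge
dependencies:
   - python=3.7
   - numpy
   - pandas
   - jupyter
   - grpcio
   - protobuf
   - matplotlib
   - seaborn
   - bokeh
   - holoviews
   - altair
   - keras
   - plotnine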

Re: Error starting spark interpreter with 0.9.0

2020-06-30 Thread Jeff Zhang
Which Spark version do you use? And could you check the Spark interpreter
log file? It is in ZEPPELIN_HOME/logs/zeppelin-interpreter-spark-*.log
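
For example (a sketch; paths taken from your earlier mails):

/opt/spark/spark-current/bin/spark-submit --version
# read the most recent spark interpreter log:
tail -n 200 "$(ls -t $ZEPPELIN_HOME/logs/zeppelin-interpreter-spark-*.log | head -1)"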

David Boyd wrote on Tue, Jun 30, 2020 at 11:11 PM:

> [quoted message snipped; the full original appears below under
> "Error starting spark interpreter with 0.9.0"]

Error starting spark interpreter with 0.9.0

2020-06-30 Thread David Boyd

All:

   Just trying to get 0.9.0 to work and running into all sorts of issues.
Previously I had set SPARK_MASTER to yarn-client so it would use my
existing YARN cluster. That threw an error about yarn-client being
deprecated in Spark 2.0, so I switched it to local. I now get an error
about the interpreter not starting, with the following output in the note:


org.apache.zeppelin.interpreter.InterpreterException: java.io.IOException: Fail to launch interpreter process:
Interpreter launch command: /opt/spark/spark-current/bin/spark-submit
  --class org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer
  --driver-class-path ":/opt/zeppelin/zeppelin-current/interpreter/spark/*::/opt/hadoop/hadoop-current/share/hadoop/common/sources/:/opt/hadoop/hadoop-current/share/hadoop/common/sources/:/opt/zeppelin/zeppelin-current/interpreter/zeppelin-interpreter-shaded-0.9.0-SNAPSHOT-shaded.jar
  /opt/zeppelin/zeppelin-current/interpreter/zeppelin-interpreter-shaded-0.9.0-SNAPSHOT.jar:/opt/zeppelin/zeppelin-current/interpreter/spark/spark-interpreter-0.9.0-SNAPSHOT.jar:/opt/hadoop/hadoop-current/etc/hadoop"
  --driver-java-options " -Dfile.encoding=UTF-8
    -Dlog4j.configuration='file:///opt/zeppelin/zeppelin-current/conf/log4j.properties'
    -Dlog4j.configurationFile='file:///opt/zeppelin/zeppelin-current/conf/log4j2.properties'
    -Dzeppelin.log.file='/opt/zeppelin/zeppelin-current/logs/zeppelin-interpreter-spark-dspc_demo-zeppelin-dspcnode11.dspc.incadencecorp.com.log'"
  --driver-memory 4G --executor-memory 6G
  --conf spark\.serializer\=org\.apache\.spark\.serializer\.KryoSerializer
  --conf spark\.executor\.memory\=1G --conf spark\.app\.name\=Zeppelin
  --conf spark\.executor\.instances\=5 --conf spark\.master\=local\[\*\]
  --conf spark\.sql\.crossJoin\.enabled\=true --conf spark\.cores\.max\=10
  /opt/zeppelin/zeppelin-current/interpreter/spark/spark-interpreter-0.9.0-SNAPSHOT.jar
  10.1.50.111 33591 "spark-dspc_demo" :
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/zeppelin/zeppelin-0.9.0-SNAPSHOT/interpreter/spark/spark-interpreter-0.9.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/spark/spark-2.4.3.bdp-1-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.open(RemoteInterpreter.java:134)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.getFormType(RemoteInterpreter.java:281)
    at org.apache.zeppelin.notebook.Paragraph.jobRun(Paragraph.java:412)
    at org.apache.zeppelin.notebook.Paragraph.jobRun(Paragraph.java:72)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
    at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:130)
    at org.apache.zeppelin.scheduler.RemoteScheduler$JobRunner.run(RemoteScheduler.java:180)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Fail to launch interpreter process:
Interpreter launch command: /opt/spark/spark-current/bin/spark-submit
  --class org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer
  --driver-class-path ":/opt/zeppelin/zeppelin-current/interpreter/spark/*::/opt/hadoop/hadoop-current/share/hadoop/common/sources/:/opt/hadoop/hadoop-current/share/hadoop/common/sources/:/opt/zeppelin/zeppelin-current/interpreter/zeppelin-interpreter-shaded-0.9.0-SNAPSHOT-shaded.jar
  /opt/zeppelin/zeppelin-current/interpreter/zeppelin-interpreter-shaded-0.9.0-SNAPSHOT.jar:/opt/zeppelin/zeppelin-current/interpreter/spark/spark-interpreter-0.9.0-SNAPSHOT.jar:/opt/hadoop/hadoop-current/etc/hadoop"
  --driver-java-options " -Dfile.encoding=UTF-8
    -Dlog4j.configuration='file:///opt/zeppelin/zeppelin-current/conf/log4j.properties'
    -Dlog4j.configurationFile='file:///opt/zeppelin/zeppelin-current/conf/log4j2.properties'
    -Dzeppelin.log.file='/opt/zeppelin/zeppelin-current/logs/zeppelin-interpreter-spark-dspc_demo-zeppelin-dspcnode11.dspc.incadencecorp.com.log'"
  --driver-memory 4G --executor-memory 6G
  --conf spark\.serializer\=org\.apache\.spark\.serializer\.KryoSerializer
  --conf spark\.executor\.memory\=1G --conf spark\.app\.name\=Zeppelin
  --conf spark\.executor\.instances\=5 --conf spark\.master\=local\[\*\]
  --conf spark\.sql\.crossJoin\.enabled
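
If the goal is still the existing YARN cluster, the deprecated yarn-client
value is replaced by a separate master and deploy mode. A sketch of what I
believe the 0.9 equivalent looks like (verify the property names against
your build's docs):

# conf/zeppelin-env.sh
export SPARK_HOME=/opt/spark/spark-current
export HADOOP_CONF_DIR=/opt/hadoop/hadoop-current/etc/hadoop

# spark interpreter settings (Interpreter menu -> spark):
#   spark.master             yarn
#   spark.submit.deployMode  client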

Question about 0.9.0 - Interpreters downloading dependencies

2020-06-30 Thread David Boyd

All:

   Very much a newbie question, but one I have not encountered before.
I am working with a build from source of 0.9.0, from the branch-0.9
branch on GitHub. I am running this on AWS on an EC2 node.

In the log file when I start up, I get a bunch of messages about
interpreters downloading dependencies, but I cannot see any of them
change state to READY in the log. How do I tell if the downloads are
taking place?

I assume this only happens on the first start-up after a clean install?
I assume it is downloading the files from the maven URL specified in
zeppelin-site.xml?

Where are the downloaded dependencies stored? (I have limited space on
some file systems in production.)

Is there documentation on how to "pre-download" these and have them
installed with the software?
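
One way I can think of to watch the downloads (a sketch; the local-repo
default and property name are my reading of the 0.9 config template, so
verify against conf/zeppelin-site.xml.template):

# Downloaded interpreter dependencies appear to land under
# ZEPPELIN_HOME/local-repo (property zeppelin.dep.localrepo):
watch -n 5 'du -sh $ZEPPELIN_HOME/local-repo/*'

# To relocate them to a larger filesystem, in conf/zeppelin-site.xml:
#   <property>
#     <name>zeppelin.dep.localrepo</name>
#     <value>/data/zeppelin/local-repo</value>
#   </property>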


--
= mailto:db...@incadencecorp.com 
David W. Boyd
VP,  Data Solutions
10432 Balls Ford, Suite 240
Manassas, VA 20109
office:   +1-703-552-2862
cell: +1-703-402-7908
== http://www.incadencecorp.com/ 
ISO/IEC JTC1 SC42/WG2, editor ISO/IEC 20546, ISO/IEC 20547-1
Chair INCITS TG Big Data
Co-chair NIST Big Data Public Working Group Reference Architecture
First Robotic Mentor - FRC, FTC - www.iliterobotics.org
Board Member- USSTEM Foundation - www.usstem.org

The information contained in this message may be privileged
and/or confidential and protected from disclosure.
If the reader of this message is not the intended recipient
or an employee or agent responsible for delivering this message
to the intended recipient, you are hereby notified that any
dissemination, distribution or copying of this communication
is strictly prohibited.  If you have received this communication
in error, please notify the sender immediately by replying to
this message and deleting the material from any computer.



RE: Using PAM with Zeppelin

2020-06-30 Thread Somanath Jeeva
Hi Stéphane

Thanks. Using the Linux group name in roles worked for me.

With Regards

Somanath Thilak J

From: stephane.d...@orange.com  
Sent: Tuesday, June 30, 2020 12:15
To: users@zeppelin.apache.org
Subject: RE: Using PAM with Zeppelin

Hello Jeeva,

In the shiro.ini file, I've set some options like this:

/api/configurations/** = authc, roles[zepadmin]

This means that only people in the zepadmin group can modify the
configuration, for example. zepadmin is a Unix group which, by the way,
contains some LDAP users; I don't have permissions on the LDAP directory
to create additional groups, but this does the job.

Stéphane
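
For the archive, a fuller shiro.ini sketch of this setup (the PamRealm
class name and service are from the Zeppelin shiro docs as I recall them;
verify against your conf/shiro.ini.template):

[main]
pamRealm = org.apache.zeppelin.realm.PamRealm
pamRealm.service = sshd

[urls]
/api/configurations/** = authc, roles[zepadmin]
/** = authc

roles[zepadmin] then maps to membership in the Unix group "zepadmin".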

From: Somanath Jeeva [mailto:somanath.je...@ericsson.com] 
Sent: Tuesday, June 30, 2020 07:03
To: users@zeppelin.apache.org  
Subject: Using PAM with Zeppelin

Hi,

I am trying to use Zeppelin on Red Hat Linux 7 with PAM-based
authentication.

Based on the documentation I am able to enable PAM authentication, but I
could not find information on how to make an OS user or group an admin.

Is there any way to make OS users admin users?

With Regards

Somanath Thilak J


