Re: PostgreSQL JDBC Connections

2017-01-05 Thread t p
YMMV and I don’t think my approach will work for your use case, but here is a suggestion based on what I’ve done. In the first Zeppelin paragraph you can register tables with code such as: %spark val example = sqlContext.read.format("jdbc").options( Map("url" ->
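The preview above is cut off, but the approach it describes — reading a PostgreSQL table through the Spark JDBC data source and registering it for SQL queries — can be sketched as follows. The URL, table name, and credentials are placeholders, not values from the thread:

```scala
// Options map for the Spark JDBC data source, as in the truncated %spark
// paragraph above. All connection values here are hypothetical.
val jdbcOptions = Map(
  "url"     -> "jdbc:postgresql://dbhost:5432/mydb?user=me&password=secret", // placeholder
  "dbtable" -> "public.example",                                             // placeholder
  "driver"  -> "org.postgresql.Driver"
)

// Inside a Zeppelin %spark paragraph, with a live sqlContext, this would be:
//   val example = sqlContext.read.format("jdbc").options(jdbcOptions).load()
//   example.registerTempTable("example")  // then queryable from %sql paragraphs

assert(jdbcOptions("driver") == "org.postgresql.Driver")
```

Once registered, analysts can query the temp table from `%sql` paragraphs without touching Scala.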

Re: PostgreSQL JDBC Connections

2017-01-05 Thread Benjamin Kim
We are using the JDBC interpreter. The business analysts only know SQL and run ad-hoc queries for their report exports to CSV. Cheers, Ben > On Jan 5, 2017, at 2:21 PM, t p wrote: > > Are you using JDBC or the PSQL interpreter? I had encountered something > similar
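For the JDBC-interpreter setup described above, the interpreter is configured with connection properties in Zeppelin's interpreter settings. A minimal sketch, assuming a PostgreSQL target (host, database, and user names below are hypothetical):

```properties
# Zeppelin JDBC interpreter properties (values are placeholders)
default.driver    org.postgresql.Driver
default.url       jdbc:postgresql://dbhost:5432/reports
default.user      analyst
default.password  ********
```

With this in place, analysts run plain SQL in `%jdbc` paragraphs and export results to CSV from the Zeppelin UI.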

Re: PostgreSQL JDBC Connections

2017-01-05 Thread t p
Are you using JDBC or the PSQL interpreter? I had encountered something similar while using the PSQL interpreter and I had to restart Zeppelin. My experience using PSQL (PostgreSQL, HAWQ) was not as good as using Spark/Scala wrappers (JDBC data source) to connect via JDBC and then register
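For anyone hitting the same hang, the restart mentioned above is done with Zeppelin's daemon script (path relative to the Zeppelin installation directory):

```shell
# Restart the Zeppelin daemon to clear a stuck interpreter
bin/zeppelin-daemon.sh restart
```

Individual interpreters can also be restarted from the interpreter settings page in the Zeppelin UI, which avoids disrupting other users' notebooks.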

pyspark can't run through

2017-01-05 Thread Alec Lee
Hello all, I recently came across a good tool, Zeppelin; it is easy to use. But I have some trouble making pyspark work on my server. The code below used to work fine, but for no apparent reason it now raises errors like permission denied. Code %pyspark import pandas ##
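The preview is truncated before the actual error, but a common cause of "permission denied" in a previously working `%pyspark` paragraph is the user running the Zeppelin daemon losing write access to the scratch directories Spark and Python use. A hedged diagnostic sketch (the directory list is an assumption, not taken from the thread):

```python
import os
import tempfile

def check_writable(paths):
    """Return the subset of paths the current user cannot write to."""
    return [p for p in paths if not os.access(p, os.W_OK)]

# Directories pyspark commonly needs to write to (assumed, adjust for your setup):
suspects = [
    tempfile.gettempdir(),        # Spark spills and py4j temp files land here
    os.path.expanduser("~"),      # Python egg cache for packages like pandas
]

blocked = check_writable(suspects)
print("not writable:", blocked)
```

If any path shows up as not writable, fixing ownership/permissions for the Zeppelin daemon's user (or pointing Spark's local dirs at a writable location) is a reasonable first step.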