RE: Spark-sql showing no table

2016-07-13 Thread Vikash Kumar
I am creating a sqlContext from the existing sc. var tables = sqlContext.sql("show tables") Thanks and regards, Vikash Kumar From: Mohit Jaggi [mailto:mohitja...@gmail.com] Sent: Wednesday, July 13, 2016 10:24 PM To: users@zeppelin.apache.org Subject: Re: Spark-sql showing no table
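
For reference, a minimal sketch of the pattern being described, assuming the Spark 1.x API shipped with Zeppelin 0.6 (the path and table names below are illustrative, not from the thread). In Spark 1.x, temporary tables live in the catalog of the SQLContext they were registered on, so tables registered on a freshly created SQLContext are not visible to the sqlContext that Zeppelin's %sql interpreter queries:

    // Creating a second SQLContext from the existing sc, as described above.
    // Temp tables registered here stay in this context's own catalog.
    val mySqlContext = new org.apache.spark.sql.SQLContext(sc)
    val df = mySqlContext.read.json("/tmp/example.json")   // illustrative source
    df.registerTempTable("my_table")

    // %sql (and other paragraphs) use Zeppelin's own sqlContext, so this
    // will not list "my_table" registered on mySqlContext above.
    val tables = sqlContext.sql("show tables")
    tables.show()

Registering the table on the sqlContext the notebook already provides, rather than on a newly created one, keeps it visible to %sql.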

Connection refused when creating remote interpreter

2016-07-13 Thread Jeff Zhang
I use zeppelin 0.6 but hit this weird issue when running the tutorial notebook. It seems this happens when creating the remote interpreter. This is a very basic feature to me, and I am not sure why it would not work (actually I don't have this issue in master), so I suspect whether it is caused by

Re: Order of paragraphs vs. different interpreters (spark vs. pyspark)

2016-07-13 Thread CloverHearts
Nice to meet you. I have created a . Do you need the feature to run all paragraphs in a note? I think that function is needed. I will implement it. Thank you. From: xiufeng liu Reply to: Date:

Re: Order of paragraphs vs. different interpreters (spark vs. pyspark)

2016-07-13 Thread Hyung Sung Shim
Hi, I think you can run the workflows that you defined just by 'run' paragraph, and I believe the view functionality is going to get better. :) On Thursday, July 14, 2016, xiufeng liu wrote: > It is easy to change the code. I did it myself and use it as an ETL tool. It > is very powerful

Re: Order of paragraphs vs. different interpreters (spark vs. pyspark)

2016-07-13 Thread xiufeng liu
It is easy to change the code. I did it myself and use it as an ETL tool. It is very powerful. Afancy On Wednesday, July 13, 2016, Ahmed Sobhi wrote: > I think this PR addresses what I need. Case 2 seems to describe the issue > I'm having, if I'm reading it correctly. > > The

Re: Clear results from Zeppelin notebook json

2016-07-13 Thread Ahmed Sobhi
I could not reproduce this with 0.6.0. I reran several trials with 0.5.6, and it now works as expected. I was trying both exporting and checking the notebooks directory at the same time, and that got me confused, since directly grabbing the notebook json from the notebooks directory didn't seem

Re: Pass parameters to paragraphs via URL

2016-07-13 Thread TEJA SRIVASTAV
PS: typo. On Wed, Jul 13, 2016 at 9:03 PM TEJA SRIVASTAV wrote: > We do have a workaround for that, but validate it. > You need to use angular binding to achieve it: > %angular > > var > scope=angular.element(document.getElementById("main")).scope().$root.compiledScope; >

Re: Pass parameters to paragraphs via URL

2016-07-13 Thread TEJA SRIVASTAV
We do have a workaround for that, but validate it. You need to use angular binding to achieve it: %angular var scope = angular.element(document.getElementById("main")).scope().$root.compiledScope; scope.getLocationParams = function(){ var pairs = window.location.search.substring(1).split("&"), obj =

Re: Order of paragraphs vs. different interpreters (spark vs. pyspark)

2016-07-13 Thread xiufeng liu
You have to change the source code to add dependencies between running paragraphs. I think it is a really interesting feature; for example, it can be used as an ETL tool. But, unfortunately, there is no configuration option right now. /afancy On Wed, Jul 13, 2016 at 12:27 PM, Ahmed Sobhi

Order of paragraphs vs. different interpreters (spark vs. pyspark)

2016-07-13 Thread Ahmed Sobhi
Hello, I have been working on a large Spark Scala notebook. I recently had the requirement to produce graphs/plots out of this data. Python and PySpark seemed like a natural fit, but since I've already invested a lot of time and effort into the Scala version, I want to restrict my usage of python
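
One common way to bridge the two languages, and the reason paragraph ordering matters here, is to materialize the Scala result under a name that later %pyspark or %sql paragraphs can look up. A minimal sketch, assuming Zeppelin's default shared Spark interpreter group where %spark and %pyspark share one SparkContext; the table name and path are illustrative:

    // %spark paragraph (Scala): prepare the data once and expose it by name.
    val metrics = sqlContext.read.json("/tmp/metrics.json")
    metrics.registerTempTable("metrics")

    // A later %pyspark paragraph could then pick up the same data, e.g.
    //   df = sqlContext.table("metrics").toPandas()
    // which is why the Scala paragraph must run before the Python one,
    // i.e. the cross-interpreter ordering this thread is asking about.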

Re: Pass parameters to paragraphs via URL

2016-07-13 Thread Rajesh Balamohan
+1 on this. I am not sure if this is possible. If so, it would be really helpful. ~Rajesh.B On Fri, Jul 8, 2016 at 11:33 PM, on wrote: > Hi, > > I am trying to pass parameters via URL to a published paragraph (and to > run it after that), e.g., I would like to get

Spark-sql showing no table

2016-07-13 Thread Vikash Kumar
Hi all, I am using spark with scala to read phoenix tables and register them as temporary tables, which I am able to do. After that, when I am running the query: %sql show tables It's giving all possible output, but when I am
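
A rough sketch of the setup described here, assuming the phoenix-spark connector is on the Spark interpreter's classpath and Spark 1.x APIs; the table name and ZooKeeper URL are placeholders, not values from the thread:

    // Load a Phoenix table as a DataFrame via the phoenix-spark connector.
    val df = sqlContext.read
      .format("org.apache.phoenix.spark")
      .option("table", "MY_TABLE")
      .option("zkUrl", "zk-host:2181")
      .load()

    // Register it so %sql paragraphs can query it by name.
    df.registerTempTable("my_table")

After this, a %sql paragraph running "show tables" should list my_table, provided it was registered on the same sqlContext that %sql uses.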

RE: Phoenix Interpreter in 0.6 release

2016-07-13 Thread Vikash Kumar
Thank you dear. I will go for that commit. From: Jongyoul Lee [mailto:jongy...@gmail.com] Sent: Wednesday, July 13, 2016 11:43 AM To: users@zeppelin.apache.org Subject: Re: Phoenix Interpreter in 0.6 release Okay, I see your situation. You can use that feature with 'phoenix.TenantId' in your

Re: Phoenix Interpreter in 0.6 release

2016-07-13 Thread Jongyoul Lee
Okay, I see your situation. You can use that feature by setting 'phoenix.TenantId' in your interpreter tab; the properties starting with 'phoenix.' will be passed on after extracting the 'phoenix.' prefix. Try it again and let me know the result. The last commit before removing PhoenixInterpreter is