… Then you use that context to execute the queries sent through your
REST API.
Mohammed
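
(A minimal sketch of that pattern, assuming a standalone PySpark
driver; the app name and handler function below are illustrative, not
from the thread:)

    # Build the SQLContext once at startup and reuse it for every
    # query the REST layer hands over.
    from pyspark import SparkContext
    from pyspark.sql import SQLContext

    sc = SparkContext(appName="query-service")  # name is illustrative
    sqlContext = SQLContext(sc)

    def handle_query(sql):
        # Called once per REST request; the long-lived context is shared.
        return sqlContext.sql(sql).collect()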
From: Alaa Ali [mailto:contact.a...@gmail.com]
Sent: Sunday, November 23, 2014 12:37 PM
To: user@spark.apache.org
Subject: Creating a front-end for output from Spark/PySpark
Hello. Okay, so I'm working on a project to run analytic processing
using Spark or PySpark. Right now, I connect to the shell and execute
my commands. The very first part of my commands is to create a SQL
JDBC connection and cursor to pull data from Apache Phoenix, do some
processing on the returned …
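
(For reference, that connection-and-cursor step might look like the
sketch below. It assumes the phoenixdb adapter and a Phoenix Query
Server on localhost; the URL and table name are made up:)

    import phoenixdb

    # Connect to a Phoenix Query Server; URL and table are assumptions.
    conn = phoenixdb.connect("http://localhost:8765/", autocommit=True)
    cursor = conn.cursor()
    cursor.execute("SELECT id, value FROM MY_TABLE LIMIT 100")
    rows = cursor.fetchall()
    conn.close()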
Alaa,
One option is to use Spark as a cache, importing a subset of the data
from HBase/Phoenix that fits in memory and using JdbcRDD to fetch more
data on a cache miss. The front end can be built with PySpark and
Flask, either as a REST API translating JSON requests into the Spark
SQL dialect, or simply …
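
(A minimal sketch of the Flask-as-REST-API idea, assuming data pulled
from Phoenix has already been registered as a temp table with the
SQLContext; the route, port, and payload shape are illustrative:)

    from flask import Flask, request, jsonify
    from pyspark import SparkContext
    from pyspark.sql import SQLContext

    sc = SparkContext(appName="phoenix-front-end")  # illustrative name
    sqlContext = SQLContext(sc)
    # Assumes a table was registered beforehand, e.g.
    # schemaRDD.registerTempTable("events")

    app = Flask(__name__)

    @app.route("/query", methods=["POST"])
    def run_query():
        # Expects a JSON body such as {"sql": "SELECT ... FROM events"}
        sql = request.get_json()["sql"]
        rows = sqlContext.sql(sql).collect()
        # Rows behave like tuples, so they serialize cleanly as lists
        return jsonify(rows=[list(r) for r in rows])

    if __name__ == "__main__":
        app.run(port=5000)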