This looks potentially interesting.
Can you please clarify the following:

1. Does Quark allow querying online data in the data warehouse together with archived data stored in HDFS via Hive?
2. Does Quark work with columnar commercial data warehouses such as SAP Sybase IQ (http://www.sap.com/pc/tech/database/software/sybase-iq-big-data-management/index.html)?
3. Sybase IQ does not require cubes (unlike SSAS etc.); will this work? My point is that we would shift the archived data from Sybase IQ to HDFS and query those tables there (a rough sketch of this pattern is included at the end of this message).

Thanks,

Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

Sybase ASE 15 Gold Medal Award 2008
A Winning Strategy: Running the Most Critical Financial Data on ASE 15
http://login.sybase.com/files/Product_Overviews/ASE-Winning-Strategy-091908.pdf

Author of "A Practitioner’s Guide to Upgrading to Sybase ASE 15", ISBN 978-0-9563693-0-7
Co-author of "Sybase Transact SQL Guidelines Best Practices", ISBN 978-0-9759693-0-4

Publications due shortly:
Complex Event Processing in Heterogeneous Environments, ISBN 978-0-9563693-3-8
Oracle and Sybase, Concepts and Contrasts, ISBN 978-0-9563693-1-4 (volume one out shortly)

http://talebzadehmich.wordpress.com

From: Rajat Venkatesh [mailto:[email protected]]
Sent: 27 January 2016 10:57
To: [email protected]
Subject: Quark - Transactional data in Hive, Mat. Views or Cubes in DWH

I am a developer at Qubole and I want to introduce an open source project, Quark: https://github.com/qubole/quark. If you are using Apache Hive alongside data warehouses such as Vertica, Greenplum or Redshift, Quark simplifies access to data for data analysts.

Two concrete examples where Quark is useful:

1. Hot data is stored in a data warehouse (Redshift, Vertica, etc.) and cold data is stored in HDFS and accessed through Apache Hive.
2. Cubes are stored in Redshift and the base tables are stored in HDFS.

Data analysts submit queries to Quark through a JDBC application such as Apache Zeppelin or SQLLine, and Quark reroutes each query to the optimal dataset. Note that Quark is *not* a federation engine: it does not join data across databases. It can integrate with Presto or Hive for federation, but the preferred option is to run each query in a single datastore.

Here is an example of Quark with Hive on EMR and Redshift: https://github.com/qubole/quark/blob/master/examples/EMR.md

If this sounds interesting, we want to hear from you. We are available at [email protected] and https://gitter.im/qubole/quark
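To make the analyst-facing access path above concrete, a minimal JDBC client sketch in Java might look like the following. It assumes the Quark JDBC driver jar is on the classpath; the JDBC URL format, model file path and table name are illustrative placeholders rather than values from the Quark documentation (the README and the EMR example linked above describe the real setup).

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class QuarkClientSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder URL: the exact Quark JDBC URL format and the model/config
        // file it points at are documented in the Quark repository, not here.
        String url = "jdbc:quark:fat:json:/path/to/quark-model.json";

        // The Quark JDBC driver jar must be on the classpath; add an explicit
        // Class.forName(...) here if the driver does not register itself.
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             // One logical query over a hypothetical "orders" table; Quark
             // decides whether it runs against the warehouse (hot data or
             // cube) or against Hive (cold data).
             ResultSet rs = stmt.executeQuery(
                 "SELECT order_date, SUM(amount) "
                 + "FROM orders "
                 + "GROUP BY order_date")) {
            while (rs.next()) {
                System.out.println(rs.getString(1) + "\t" + rs.getDouble(2));
            }
        }
    }
}

The same query could just as well be submitted from Zeppelin or SQLLine; the point is that the analyst writes against one logical schema and the routing decision happens inside Quark.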

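On the archival question (point 3 above): assuming the Sybase IQ tables are unloaded to delimited files and copied into HDFS, a minimal sketch of exposing them to Hive (and hence to Quark) as an external table could look like this, here driven through the standard HiveServer2 JDBC driver. The host, HDFS path, column list and delimiter are illustrative assumptions, not taken from any existing setup.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class ArchiveTableSketch {
    public static void main(String[] args) throws Exception {
        // Standard HiveServer2 JDBC driver and URL; adjust host, port,
        // database and credentials for the actual cluster.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        String url = "jdbc:hive2://hiveserver2.example.com:10000/default";

        try (Connection conn = DriverManager.getConnection(url, "hive", "");
             Statement stmt = conn.createStatement()) {
            // External table over files unloaded from Sybase IQ; dropping the
            // table later leaves the archived files in HDFS untouched.
            stmt.execute(
                "CREATE EXTERNAL TABLE IF NOT EXISTS archive_orders ("
                + "  order_id BIGINT, order_date DATE, amount DECIMAL(18,2)) "
                + "ROW FORMAT DELIMITED FIELDS TERMINATED BY '|' "
                + "STORED AS TEXTFILE "
                + "LOCATION '/data/archive/sybase_iq/orders'");
        }
    }
}

Once the external table exists, the archived data is queryable through Hive like any other table, which is the cold-data side of the hot/cold split described in Quark's first example.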