Thanks Michael. I missed that point, as well as the integration of SQL within the Scala shell (by setting up the SQLContext). Looking forward to feature parity in future releases (Shark -> Spark SQL). Cheers.
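For anyone else landing on this thread: below is a minimal sketch of what "setting the SQLContext" in the Scala shell looks like on Spark 1.0, based on the linked sql-programming-guide. The Person case class and the sample data are just illustrative; only `sc` is provided by spark-shell itself.

```scala
// In spark-shell (Spark 1.0), `sc` is already defined.
import org.apache.spark.sql.SQLContext

// Wrap the existing SparkContext in a SQLContext.
val sqlContext = new SQLContext(sc)
import sqlContext.createSchemaRDD // implicit conversion: RDD of case classes -> SchemaRDD

// Hypothetical schema and data, purely for illustration.
case class Person(name: String, age: Int)
val people = sc.parallelize(Seq(Person("Alice", 30), Person("Bob", 25)))

// Register the RDD as a table and query it with SQL.
people.registerAsTable("people") // Spark 1.0 API (later renamed registerTempTable)
val adults = sqlContext.sql("SELECT name FROM people WHERE age >= 18")
adults.collect().foreach(println)
```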
From: mich...@databricks.com
Date: Thu, 10 Jul 2014 16:20:20 -0700
Subject: Re: EC2 Cluster script. Shark install fails
To: user@spark.apache.org

There is no version of Shark that is compatible with Spark 1.0; however, Spark SQL does come included automatically. More information here:

http://databricks.com/blog/2014/07/01/shark-spark-sql-hive-on-spark-and-the-future-of-sql-on-spark.html
http://spark.apache.org/docs/latest/sql-programming-guide.html

On Thu, Jul 10, 2014 at 5:51 AM, Jason H <jas...@developer.net.nz> wrote:

Hi,

I'm just going through the process of installing Spark 1.0.0 on EC2 and noticed that the script throws an error when installing Shark:

Unpacking Spark
~/spark-ec2
Initializing shark
~ ~/spark-ec2
ERROR: Unknown Shark version

The install completes in the end, but Shark is skipped entirely. I'm looking for info on the best way to add it manually now that the cluster is set up. Is there no Shark version compatible with 1.0.0 or with this script? Any suggestions appreciated.