(-dev) The KVStore API is private to Spark; it's not really meant to be used by others. You're free to try, and there are javadocs on the different interfaces, but it's not a general-purpose database, so you'll need to figure out things like that yourself.
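For the timestamp filtering asked about below: rather than reading the LevelDB files directly, you can filter the REST API's output on the client side. A rough sketch in Python; the /api/v1/applications endpoint and its timestamp format are from the public API, but the sample payload here is made up for illustration, and if I remember right the endpoint also accepts minDate/maxDate query parameters so the server can do this filtering for you.

```python
import json
from datetime import datetime, timezone

# Hypothetical sample of the JSON an /api/v1/applications call returns;
# the field names follow the public REST API, the values are made up.
SAMPLE_RESPONSE = json.dumps([
    {"id": "app-1", "name": "job-a",
     "attempts": [{"startTime": "2018-05-07T10:00:00.000GMT", "completed": True}]},
    {"id": "app-2", "name": "job-b",
     "attempts": [{"startTime": "2018-05-08T10:00:00.000GMT", "completed": True}]},
])

def parse_spark_ts(ts):
    # Timestamps come back like "2018-05-07T10:00:00.000GMT"; strip the
    # literal "GMT" suffix and treat the value as UTC.
    return datetime.strptime(ts[:-3], "%Y-%m-%dT%H:%M:%S.%f").replace(tzinfo=timezone.utc)

def apps_started_after(payload, cutoff):
    # Keep applications whose first attempt started at or after `cutoff`.
    return [app["id"] for app in json.loads(payload)
            if parse_spark_ts(app["attempts"][0]["startTime"]) >= cutoff]

print(apps_started_after(SAMPLE_RESPONSE,
                         datetime(2018, 5, 8, tzinfo=timezone.utc)))
# prints ['app-2']
```

In a real setup you'd fetch the payload with an HTTP GET against the driver's UI port or the history server instead of using the hard-coded sample.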
On Tue, May 8, 2018 at 9:53 PM, Anshi Shrivastava <anshi.shrivast...@exadatum.com> wrote:
> Hi Marcelo, Dev,
>
> Thanks for your response.
> I have used SparkListeners to fetch the metrics (the public REST API uses
> the same), but to monitor these metrics over time I have to persist them
> (using Spark's KVStore library). Is there a way to fetch data from this
> KVStore (which uses LevelDB for storage) and filter it by timestamp?
>
> Thanks,
> Anshi
>
> On Mon, May 7, 2018 at 9:51 PM, Marcelo Vanzin [via Apache Spark User List]
> <ml+s1001560n32114...@n3.nabble.com> wrote:
>>
>> On Mon, May 7, 2018 at 1:44 AM, Anshi Shrivastava
>> <[hidden email]> wrote:
>> > I've found a KVStore wrapper which stores all the metrics in a LevelDB
>> > store. This KVStore wrapper is available as a Spark dependency, but we
>> > cannot access the metrics directly from Spark since they are all
>> > private.
>>
>> I'm not sure exactly what you're trying to do, but there's a
>> public REST API that exposes all the data Spark keeps about
>> applications. There's also a programmatic status tracker
>> (SparkContext.statusTracker) that's easier to use from within the
>> running Spark app, but has a lot less info.
>>
>> > Can we use this store to store our own metrics?
>>
>> No.
>>
>> > Also can we retrieve these metrics based on timestamp?
>>
>> Only if the REST API has that feature; I don't remember off the top
>> of my head.
>>
>> --
>> Marcelo
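As for storing your own metrics over time: since the internal KVStore is off limits, the usual workaround is to have your SparkListener write into a store you own, keyed by timestamp. A rough sketch using sqlite3; the listener wiring itself (e.g. an onStageCompleted callback) is omitted, and the schema and metric names are made up for illustration.

```python
import sqlite3
import time

# A tiny timestamp-keyed metrics store you control, as an alternative to
# Spark's private KVStore. Schema and metric names are illustrative only.
class MetricsStore:
    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS metrics ("
            " ts REAL NOT NULL, name TEXT NOT NULL, value REAL NOT NULL)")
        self.db.execute("CREATE INDEX IF NOT EXISTS ix_ts ON metrics (ts)")

    def record(self, name, value, ts=None):
        # A SparkListener callback would call this with each metric sample.
        self.db.execute("INSERT INTO metrics VALUES (?, ?, ?)",
                        (ts if ts is not None else time.time(), name, value))
        self.db.commit()

    def between(self, start_ts, end_ts):
        # Timestamp-range retrieval, which the private KVStore doesn't expose.
        cur = self.db.execute(
            "SELECT ts, name, value FROM metrics"
            " WHERE ts >= ? AND ts < ? ORDER BY ts", (start_ts, end_ts))
        return cur.fetchall()

store = MetricsStore()
store.record("stage.runtime_ms", 1200.0, ts=100.0)
store.record("stage.runtime_ms", 900.0, ts=200.0)
print(store.between(150.0, 250.0))
# prints [(200.0, 'stage.runtime_ms', 900.0)]
```

Any store with an indexed timestamp column works the same way; the point is just that the persistence layer is yours rather than Spark's.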
--
Marcelo

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org