Guil; I can see what kind of thing you are trying to do. Perhaps a solution is already available using EVN? In EVN, you can create a "working copy", a kind of sandbox that captures all user edits. In the background, scripts capture the edits and target specific named graphs with provenance information about who made the change and when.
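Conceptually, that means the change data itself can be queried, for example to see who touched which triples and when. The query below is only a sketch of that idea; the ?changeGraph variable and the ex: properties are illustrative placeholders, not EVN's actual change-tracking vocabulary:

# Sketch only: the property names below are illustrative placeholders,
# not EVN's actual change-tracking schema.
PREFIX ex: <http://example.org/changes#>

SELECT ?s ?p ?o ?editor ?date
WHERE {
  GRAPH ?changeGraph {
    ?s ?p ?o .
  }
  ?changeGraph ex:editedBy ?editor ;
               ex:editedOn ?date .
}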
EVN is available in TBC-ME. Go to the TBL Personal Server console at http://localhost:8083/tbl and choose the link to preview EVN. There is a "Guide to the Enterprise Vocabulary Net" on the home page of EVN. You may be particularly interested in the sections on Editing a Working Copy, Reviewing and publishing your changes, etc.

In terms of your previous question, I don't think deleting directly from <http://tb-session> is the best choice for isolating where triples came from. The way to view a SPARQLMotion script is as a stream of triples: you start by importing a set of triples, and subsequent modules manipulate what is contained in that set. Two useful modules for that purpose are ApplyConstruct and FilterByConstruct. An ApplyConstruct with replace="true" modifies the triple stream so that only the triples in the CONSTRUCT clause are returned. For example, given a set of input triples and an ApplyConstruct module with query="CONSTRUCT {?s :myProp ?o} WHERE {?s ?p ?o}" and replace="true", the output will contain only triples with :myProp as the property. FilterByConstruct is the other one, and you have already demonstrated use of that. Given this and the use of GRAPH (e.g. a FilterByConstruct with query="CONSTRUCT {?s ?p ?o} WHERE { GRAPH <mygraph> {?s ?p ?o} }" will remove all triples from that graph from the script's triple stream; see the sketch below), you may be able to narrow the stream down to just the triples created in the session, which can then be saved to a file, etc.

I think EVN is worth looking at and can provide some hints on the effort required to save session changes and apply them back to the Production Copy.
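To make that concrete, the FilterByConstruct query could look roughly like the sketch below. The graph URI <http://example.org/importedModel> is just a placeholder for whichever model you imported with ImportRDFFromWorkspace; substitute whatever your script actually uses:

# Removes the imported model's triples from the script's triple stream,
# leaving only the triples created during the session.
# <http://example.org/importedModel> is a placeholder graph URI.
CONSTRUCT {
  ?s ?p ?o .
}
WHERE {
  GRAPH <http://example.org/importedModel> {
    ?s ?p ?o .
  }
}

Whatever survives the filter can then be handed to something like sml:ExportToRDFFile to be saved out.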
-- Scott

On Apr 14, 6:19 pm, Guilherme Scomparim <[email protected]> wrote:
> Hi Scott,
>
> You are right, the main objective is to be able to save in a separate
> file only the new triples created as instantiations of the meta-model,
> without the meta-model and other supporting triples used during the
> process.
>
> Sorry for not making this clearer before.
>
> Thanks in advance,
> Guil
>
> On Apr 15, 3:05 am, Scott Henninger <[email protected]> wrote:
> > Guil; I'm not quite understanding what you're trying to accomplish.
> > Is it that when users enter data you want to save the changes in a
> > separate file?
> >
> > -- Scott
> >
> > On Apr 14, 7:32 am, Guilherme Scomparim <[email protected]> wrote:
> > > Hi Scott and Irene,
> > >
> > > Finally I managed to get my scripts to work in the Ensemble
> > > application.
> > >
> > > It was a combination of things that you mentioned that guided me to
> > > being able to play with data in the <http://tb-session> graph and
> > > also in the current script graph.
> > >
> > > Unfortunately I had to find a workaround for a funny result I am
> > > dealing with.
> > >
> > > In case I have an sml:PerformUpdate with the following query after
> > > an sml:ImportRDFFromWorkspace:
> > >
> > > MODIFY GRAPH <http://tb-session>
> > > DELETE {
> > >   ?s ?p ?o .
> > > }
> > > INSERT {
> > > }
> > > WHERE {
> > >   ?s ?p ?o .
> > > }
> > >
> > > If the URI imported is any URI previously imported that is not the
> > > Data Graph currently opened by Ensemble, the query above works fine,
> > > deleting the imported triples from <http://tb-session>, and I can
> > > then identify the current triples in the <http://tb-session> graph,
> > > without the deleted imported model, using the following query:
> > >
> > > SELECT ?subject ?predicate ?object
> > > WHERE {
> > >   GRAPH <http://tb-session> {
> > >     ?subject ?predicate ?object .
> > >   } .
> > > }
> > >
> > > However, if I import the current Data Graph opened by Ensemble, it
> > > seems that no triples are deleted from <http://tb-session>. Is this
> > > normal, or am I doing something wrong?
> > >
> > > The workaround was then to create a complex query to find the
> > > possible triples that would be created using the meta-model. This is
> > > not that elegant or reusable, but it is working.
> > >
> > > The final solution then is to:
> > > Import the Data Graph model using sml:ImportRDFFromWorkspace.
> > > Delete all triples using an sml:FilterByConstruct with
> > >
> > > CONSTRUCT {
> > >   ?s ?p ?o .
> > > }
> > > WHERE {
> > >   ?s ?p ?o .
> > > }
> > >
> > > Then, using an sml:IterateOverSelect, I inserted the query that
> > > selects the possible triples created in the model, using
> > > WHERE { GRAPH <http://tb-session> ... }.
> > >
> > > After that I just created a series of sml:ApplyConstruct modules,
> > > linked as the body, that build the URIs of the current
> > > instantiations of the model in the new file.
> > >
> > > I then create the base URI and the imports triples, and export the
> > > new file using sml:ExportToRDFFile.
> > >
> > > I would also like to say that I am impressed with the debug
> > > functionality available in TBC for Ensemble applications that use
> > > SPARQLMotion scripts. Every day I like using TBC more, and I am
> > > looking forward to testing the new features of the 3.5 version.
> > >
> > > Thanks again,
> > > Guil
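P.S. For the step where Guil creates the base URI and the imports triples before exporting, one option is an sml:ApplyConstruct (leaving replace off so that, if I remember the defaults right, the constructed triples are added to the stream) with a query roughly like the one below. Both URIs are placeholders for whatever your application uses:

# Adds an ontology header and an owl:imports statement to the stream
# before sml:ExportToRDFFile writes the file. Both URIs are placeholders.
PREFIX owl: <http://www.w3.org/2002/07/owl#>

CONSTRUCT {
  <http://example.org/newFile> a owl:Ontology ;
      owl:imports <http://example.org/metaModel> .
}
WHERE {
}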
