On 06/03/15 02:21, Saikat Maitra wrote:
  Hello,

I wanted to clarify a few points related to a separate service endpoint for
the SPARQL Query Caching feature. I am considering taking up either of the
approaches below to implement the cache service endpoint. I am open to both
approaches and wanted to discuss further.

Option 1 : Create a separate micro service, deployed in Tomcat or run with a
Jetty server, that acts as an external cache service and caches the SPARQL
query results. The SPARQL_UberServlet checks the external service; if cached
data is available it serves that, otherwise it sends the query to
SPARQL_Query, copies the result, and populates the cache in the separate
micro service.

To check my understanding, let me put that in my own words: this essentially fronts the Fuseki server with another webapp that intercepts all requests to the Fuseki server. ("All requests" because it has to see updates to manage the cache.)
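To make that fronting flow concrete, here is a minimal sketch of the decision logic, with the HTTP forwarding abstracted behind a function so it stays self-contained. All names here (CachingFront, handleQuery, handleUpdate) are hypothetical, not Fuseki or servlet API:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.UnaryOperator;

// Hypothetical sketch of the fronting webapp's logic (option 1).
// 'backend' stands in for forwarding the request to the real Fuseki server.
class CachingFront {
    private final Map<String, String> cache = new ConcurrentHashMap<>();
    private final UnaryOperator<String> backend;

    CachingFront(UnaryOperator<String> backend) {
        this.backend = backend;
    }

    // Query path: serve from cache, else forward to Fuseki and cache the result.
    String handleQuery(String sparqlQuery) {
        return cache.computeIfAbsent(sparqlQuery, backend);
    }

    // Update path: forward to Fuseki, then invalidate -- this is why the
    // front must see *all* requests, not just queries.
    String handleUpdate(String sparqlUpdate) {
        String response = backend.apply(sparqlUpdate);
        cache.clear();
        return response;
    }
}
```

The whole-cache clear on update is deliberately crude; a real front would need a more selective invalidation policy.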

Option 2 : Create a separate servlet, such as Sparql_Cache, that runs as part
of the Fuseki server, and send all SPARQL queries to the Sparql_Cache
servlet, which serves cached data if available, or otherwise contacts
Sparql_Query to get the latest data and responds with that.

Again, to check my understanding: this is a new servlet, or servlet filter, that sits between SPARQL_UberServlet and the incoming request and decides whether the request can be answered from the cache or must be passed on.

So the difference between option 1 and option 2 is whether it is another webapp external to Fuseki, or code inside Fuseki.

Both will work - personally, I'd go for option 2 because the Sparql_Cache servlet can have detailed access to the internals of Fuseki, for example the service registry and the admin interface. It can also be integrated into the UI (in Fuseki2), so the admin console can be extended to manage the cache - see the stats page in the Fuseki2 UI. If it's external, the management will be a separate function.
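As a sketch of what that internal access buys: knowing which dataset an update is addressed to (via the service registry), an in-process Sparql_Cache could key its cache per dataset and invalidate only the dataset the update touches, rather than clearing everything. All names below are hypothetical, not actual Fuseki classes:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical per-dataset query cache for an in-process Sparql_Cache
// servlet (option 2). Not Fuseki API -- just a sketch of the data structure.
class DatasetQueryCache {
    // dataset name -> (query string -> cached result)
    private final Map<String, Map<String, String>> byDataset = new ConcurrentHashMap<>();

    String get(String dataset, String query) {
        Map<String, String> m = byDataset.get(dataset);
        return (m == null) ? null : m.get(query);
    }

    void put(String dataset, String query, String result) {
        byDataset.computeIfAbsent(dataset, d -> new ConcurrentHashMap<>())
                 .put(query, result);
    }

    // Called when an update arrives for one dataset: drop only that
    // dataset's entries, leaving other datasets' cached results intact.
    void invalidate(String dataset) {
        byDataset.remove(dataset);
    }
}
```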

Does that make sense?

        Andy


Please let me know your feedback.

Regards
Saikat
