On 16/03/15 16:07, Stephen Allen wrote:
This looks like a possible combination of two bugs. In
TextDocProducerTriples, there is a ThreadLocal field called inTransaction that
should be created as a subclass of ThreadLocal overriding the initialValue()
method to return false. It is not doing so.
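
A minimal, self-contained sketch of the idiom described above (illustration only; the class name and main() here are made up for the example, this is not the actual jena-text code):
```
// Illustration: a ThreadLocal<Boolean> whose per-thread initial value is false,
// so a thread that never called start() reads "not in a transaction" instead of null.
public class InTransactionFlagExample {
    private static final ThreadLocal<Boolean> inTransaction = new ThreadLocal<Boolean>() {
        @Override
        protected Boolean initialValue() {
            return Boolean.FALSE;   // default for every thread
        }
    };

    public static void main(String[] args) {
        System.out.println(inTransaction.get());   // prints "false", never null
        inTransaction.set(Boolean.TRUE);           // what start() would do
        System.out.println(inTransaction.get());   // prints "true"
    }
}
```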

This would be OK if the start()/finish() methods were being called as part
of a transaction, but for some reason they are not. I don't know why they
are not; my guess is that the inference graph is not interacting well
with the transaction boundaries.

-Stephen

Hi Stephen,

Inference does not know about transactions - for anything non-transactional, Fuseki wraps it in a DatasetGraphWithLock (in HttpAction) that uses an MRSW lock to fake it. That does not support abort. That might be the cause of the start/finish problem.

(if it were not for inference, this could be a recording datasetgraph that could abort - that might still work but I haven't looked deeply enough)
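
A hedged sketch of what that lock-based wrapping looks like, to show why abort fails - the class and factory names here are the Jena 2.x API as I recall them, not Fuseki's actual code:
```
import com.hp.hpl.jena.query.ReadWrite;
import com.hp.hpl.jena.sparql.core.DatasetGraph;
import com.hp.hpl.jena.sparql.core.DatasetGraphFactory;
import com.hp.hpl.jena.sparql.core.DatasetGraphWithLock;

public class LockTransactionExample {
    public static void main(String[] args) {
        DatasetGraph plain = DatasetGraphFactory.createMem();        // no real transactions
        DatasetGraphWithLock dsg = new DatasetGraphWithLock(plain);  // MRSW-lock wrapper

        dsg.begin(ReadWrite.WRITE);   // takes the write lock
        try {
            // ... changes go straight to the wrapped dataset under the lock ...
            dsg.commit();             // releases the lock; there is nothing to flush
        } catch (Exception ex) {
            dsg.abort();              // cannot undo the changes already applied:
                                      // "Can't abort a write lock-transaction"
        } finally {
            dsg.end();
        }
    }
}
```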

Fuseki is also careful to parse data first, so that the action does not crash partway through an update because of syntax errors. That's not what is happening here.

        Andy

On Mon, Mar 16, 2015 at 11:55 AM, Andy Seaborne <a...@apache.org> wrote:

Hi Yang,

That's really useful.

Was that in both cases, i.e.
    :text_dataset -> <#tdb_graph> -> <#tdb_ds>
and
    :text_dataset -> <#tdb_ds>
or just the latter?



1.1.1 to 1.1.2 is a maintenance update to Fuseki1.

The Fuseki2 server is restructured to make the security work and to add the admin
operations, so it is quite different from, and much better structured than,
Fuseki1.  That said, it reuses a lot of code from Fuseki1 - it is a
separate code base now.  Server-wise it is supposed to be compatible with
Fuseki1 start-up, though it has better ways to run as a server, hopefully
making it easier to use in deployment environments. And it can run as a WAR.

With the new UI, the cycle of start server, create dataset, load data, and make
queries can be done without config files.  The UI is new and not entirely finished.


         Andy


On 16/03/15 14:44, Yang Yuanzhe wrote:

Hi Andy,

Ha, your suggestion is precisely to the point. It works after removing the
RDFS reasoner, in both 1.1.2 and 2! So it seems the problem is on the RDFS
side. From 1.1.1 to 2 is a huge leap, but is there also a big
change from 1.1.1 to 1.1.2? They worked together very well before.

Regards,
Yang

On 03/16/2015 02:29 PM, Andy Seaborne wrote:

Thanks - that suggests it is not Fuseki itself but other changes in
this release.

I don't understand why it is aborting at all; uploads aren't supposed
to abort.

One more question:

What happens if there is no RDFS? Either use the TDB dataset directly for
:text_dataset, or just remove the RDFS wrapper on the graph, so that
<#tdb_inf_ds> points at <#tdb_graph> directly.
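
For example (an untested sketch, reusing the resource names from the configuration quoted further down):
```
# Variant 1: text dataset over the TDB dataset directly (no inference).
:text_dataset a text:TextDataset ;
      text:dataset   <#tdb_ds> ;
      text:index     <#textIndexLucene> ;
      .

# Variant 2: keep <#tdb_inf_ds>, but make its default graph the plain
# TDB graph instead of the RDFS InfModel.
<#tdb_inf_ds> a ja:RDFDataset ;
      ja:defaultGraph       <#tdb_graph> ;
      .
```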

(OK - that's two questions really. Sorry to do this
question-by-question, but to produce a reproducible test case the details
matter. There is quite a lot of machinery here, so a start is to change
some of it and see if it is still broken.)

     Andy


On 16/03/15 12:55, Yang Yuanzhe wrote:

Hi Andy,

Thanks for the reminder. We were using 1.1.1, but I just gave 1.1.2 a try.
The result was almost the same:
```
13:47:38 INFO  [1] Upload: Graph: default (6 triple(s))
13:47:38 WARN  Exception during abort (operation attempts to continue): Can't abort a write lock-transaction
13:47:38 INFO  [1] 500 Server Error (222 ms)
```

Regards,
Yang

On 03/16/2015 01:31 PM, Andy Seaborne wrote:

Hi Yang,

One quick question - does your configuration work with Fuseki 1.1.2?
It was released at the same time as Fuseki2 (last Friday).

It would help to know whether this is a text indexing issue or a Fuseki
server issue.  2.0.0 and 1.1.2 use the same text indexing code.

     Andy

On 16/03/15 12:03, Yang Yuanzhe wrote:

Hi there,

We upgraded to Fuseki 2 as soon as it was released. However, the
configuration file we were using for Fuseki 1 is no longer working (or
compatible?) with Fuseki 2. We spent quite a lot of time trying to tune it,
but we failed. There is no sufficient documentation for this issue on your
website either, which is why we are writing to you for help.
Sorry to trouble you, and thank you very much in advance.

We want to use TDB with RDFS reasoning and full-text search; the
corresponding config file used in Fuseki 1 is as follows:

```
@prefix :        <#> .
@prefix fuseki: <http://jena.apache.org/fuseki#> .
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix tdb: <http://jena.hpl.hp.com/2008/tdb#> .
@prefix ja: <http://jena.hpl.hp.com/2005/11/Assembler#> .
@prefix text: <http://jena.apache.org/text#> .
@prefix spatial: <http://jena.apache.org/spatial#> .
@prefix skos: <http://www.w3.org/2004/02/skos/core#> .

[] a fuseki:Server ;
     fuseki:services (
       <#tdb>
     ) .

# Custom code.
[] ja:loadClass "com.hp.hpl.jena.tdb.TDB" .

# TDB
tdb:DatasetTDB  rdfs:subClassOf  ja:RDFDataset .
tdb:GraphTDB    rdfs:subClassOf  ja:Model .

<#tdb> a fuseki:Service ;
      fuseki:name              "tdb" ;             # http://host/tdb
      fuseki:serviceQuery      "sparql" ;          # SPARQL query service
      fuseki:serviceQuery      "query" ;
      fuseki:serviceUpdate     "update" ;
      fuseki:serviceUpload     "upload" ;          # Non-SPARQL upload service
      fuseki:serviceReadGraphStore    "get" ;
      fuseki:serviceReadWriteGraphStore    "data" ;
      fuseki:dataset           :text_dataset ;
      .

# ---- RDFS Inference models
# These must be incorporated in a dataset in order to use them.
# All in one file.
<#tdb_inf_ds> a ja:RDFDataset ;
      ja:defaultGraph       <#tdb_inf> ;
      .

<#tdb_inf> a ja:InfModel ;
      rdfs:label "RDFS Inference Model" ;
      ja:baseModel <#tdb_graph> ;
      ja:reasoner
           [ ja:reasonerURL
<http://jena.hpl.hp.com/2003/RDFSExptRuleReasoner> ]
      .

<#tdb_graph> a tdb:GraphTDB ;
      tdb:dataset <#tdb_ds> .

# A TDB dataset used for RDF storage
<#tdb_ds> a tdb:DatasetTDB;
      tdb:location "Data";
      .

## Initialize text query
[] ja:loadClass       "org.apache.jena.query.text.TextQuery" .
# A TextDataset is a regular dataset with a text index.
text:TextDataset      rdfs:subClassOf   ja:RDFDataset .
# Lucene index
text:TextIndexLucene  rdfs:subClassOf   text:TextIndex .
# Solr index
#text:TextIndexSolr    rdfs:subClassOf   text:TextIndex .

:text_dataset a text:TextDataset ;
      text:dataset   <#tdb_inf_ds> ;
      text:index     <#textIndexLucene> ;
      .

<#textIndexLucene> a text:TextIndexLucene ;
      #text:directory <file:Text> ;
      text:directory "mem" ;
      text:entityMap <#entMap> ;
      .

<#entMap> a text:EntityMap ;
      text:entityField      "uri" ;
      text:defaultField     "text" ;
      text:map (
           [ text:field "text" ; text:predicate rdfs:label ]
           ) .
```

Here is some test input:
```
@prefix cc: <http://creativecommons.org/ns#> .
@prefix dct: <http://purl.org/dc/terms/> .
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
@prefix locn: <http://www.w3.org/ns/locn#> .
@prefix org: <http://www.w3.org/ns/org#> .
@prefix owl: <http://www.w3.org/2002/07/owl#> .
@prefix pav: <http://purl.org/pav/> .
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix schema: <http://schema.org/> .
@prefix skos: <http://www.w3.org/2004/02/skos/core#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

<http://id.vlaanderen.be/licentie/open-data-licentie-tegen-billijke-vergoeding-1.2#id>
    rdf:type cc:License ;
    cc:legalcode <http://opendataforum.info/Docs/licenties/GOD_BV.htm> ;
    dct:title "Modellicentie 3 - Open Data Licentie tegen Billijke Vergoeding - v1.2"@nl ;
    pav:hasEarlierVersion <http://id.vlaanderen.be/licentie/open-data-licentie-tegen-billijke-vergoeding#id> ;
    pav:version "1.2"^^xsd:string ;
    rdfs:label "Modellicentie 3 - Open Data Licentie tegen Billijke Vergoeding"@nl ;
.
```

We delete the existing datasets, start a new instance with the config file,
upload the test input to the dataset, and we are stuck:
```
[2015-03-16 12:21:44] Fuseki     INFO  [12] Filename: test.ttl, Content-Type=text/turtle, Charset=null => Turtle : Count=6 Triples=6 Quads=0
[2015-03-16 12:21:44] HttpAction WARN  Exception during abort (operation attempts to continue): Can't abort a write lock-transaction
[2015-03-16 12:21:44] Fuseki     INFO  [12] 500 Server Error (199 ms)
```

If we ask Fuseki to list all triples now, we discover that the
"rdfs:label" triple exists, but all the other triples are missing.

So we tried an indirect approach: with the "rdfs:label" triple removed from
the input, the upload succeeds; we then insert that triple separately:
```
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
INSERT DATA {
  <http://id.vlaanderen.be/licentie/open-data-licentie-tegen-billijke-vergoeding-1.2#id>
      rdfs:label "Modellicentie 3 - Open Data Licentie tegen Billijke Vergoeding"@nl .
}
```

However, we got an exception:
```
[2015-03-16 12:33:37] Fuseki     INFO  [14] POST http://localhost:3030/tdb/update
[2015-03-16 12:33:37] Fuseki     INFO  [14] POST /tdb :: 'update' :: [application/x-www-form-urlencoded charset=UTF-8] ?
[2015-03-16 12:33:37] HttpAction WARN  Exception during abort (operation attempts to continue): Can't abort a write lock-transaction
java.lang.NullPointerException
      at org.apache.jena.query.text.TextDocProducerTriples.change(TextDocProducerTriples.java:66)
      at com.hp.hpl.jena.sparql.core.DatasetGraphMonitor.record(DatasetGraphMonitor.java:194)
      at com.hp.hpl.jena.sparql.core.DatasetGraphMonitor.add$(DatasetGraphMonitor.java:120)
      at com.hp.hpl.jena.sparql.core.DatasetGraphMonitor.add(DatasetGraphMonitor.java:96)
      at com.hp.hpl.jena.sparql.core.DatasetGraphWrapper.add(DatasetGraphWrapper.java:88)
      at com.hp.hpl.jena.sparql.modify.UpdateEngineWorker.addToGraphStore(UpdateEngineWorker.java:555)
      at com.hp.hpl.jena.sparql.modify.UpdateEngineWorker.visit(UpdateEngineWorker.java:315)
      at com.hp.hpl.jena.sparql.modify.request.UpdateDataInsert.visit(UpdateDataInsert.java:27)
      at com.hp.hpl.jena.sparql.modify.UpdateVisitorSink.send(UpdateVisitorSink.java:46)
      at com.hp.hpl.jena.sparql.modify.UpdateVisitorSink.send(UpdateVisitorSink.java:26)
      at org.apache.jena.atlas.iterator.Iter.sendToSink(Iter.java:695)
      at org.apache.jena.atlas.iterator.Iter.sendToSink(Iter.java:702)
      at com.hp.hpl.jena.sparql.modify.UpdateProcessorBase.execute(UpdateProcessorBase.java:61)
      at com.hp.hpl.jena.update.UpdateAction.execute$(UpdateAction.java:234)
      at com.hp.hpl.jena.update.UpdateAction.execute(UpdateAction.java:224)
      at com.hp.hpl.jena.update.UpdateAction.execute(UpdateAction.java:204)
      at com.hp.hpl.jena.update.UpdateAction.execute(UpdateAction.java:186)
      at org.apache.jena.fuseki.servlets.SPARQL_Update.execute(SPARQL_Update.java:222)
      at org.apache.jena.fuseki.servlets.SPARQL_Update.executeForm(SPARQL_Update.java:197)
      at org.apache.jena.fuseki.servlets.SPARQL_Update.perform(SPARQL_Update.java:105)
      at org.apache.jena.fuseki.servlets.ActionSPARQL.executeLifecycle(ActionSPARQL.java:130)
      at org.apache.jena.fuseki.servlets.SPARQL_UberServlet.serviceDispatch(SPARQL_UberServlet.java:324)
      at org.apache.jena.fuseki.servlets.SPARQL_UberServlet.executeAction(SPARQL_UberServlet.java:236)
      at org.apache.jena.fuseki.servlets.ActionSPARQL.execCommonWorker(ActionSPARQL.java:84)
      at org.apache.jena.fuseki.servlets.ActionBase.doCommon(ActionBase.java:81)
      at org.apache.jena.fuseki.servlets.FusekiFilter.doFilter(FusekiFilter.java:71)
      at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1632)
      at org.apache.shiro.web.servlet.ProxiedFilterChain.doFilter(ProxiedFilterChain.java:61)
      at org.apache.shiro.web.servlet.AdviceFilter.executeChain(AdviceFilter.java:108)
      at org.apache.shiro.web.servlet.AdviceFilter.doFilterInternal(AdviceFilter.java:137)
      at org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:125)
      at org.apache.shiro.web.servlet.ProxiedFilterChain.doFilter(ProxiedFilterChain.java:66)
      at org.apache.shiro.web.servlet.AbstractShiroFilter.executeChain(AbstractShiroFilter.java:449)
      at org.apache.shiro.web.servlet.AbstractShiroFilter$1.call(AbstractShiroFilter.java:365)
      at org.apache.shiro.subject.support.SubjectCallable.doCall(SubjectCallable.java:90)
      at org.apache.shiro.subject.support.SubjectCallable.call(SubjectCallable.java:83)
      at org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:383)
      at org.apache.shiro.web.servlet.AbstractShiroFilter.doFilterInternal(AbstractShiroFilter.java:362)
      at org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:125)
      at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1624)
      at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:550)
      at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
      at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:568)
      at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:221)
      at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1110)
      at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:479)
      at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:183)
      at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1044)
      at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
      at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
      at org.eclipse.jetty.server.Server.handle(Server.java:459)
      at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:281)
      at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:232)
      at org.eclipse.jetty.io.AbstractConnection$1.run(AbstractConnection.java:505)
      at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607)
      at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536)
      at java.lang.Thread.run(Thread.java:745)
[2015-03-16 12:33:37] Fuseki     INFO  [14] 500 Server Error (163 ms)
```

At that point we stopped our trials because we could not find any more
possibilities. Does anybody have a clue about this? Thank you in advance
and have a nice day.

Regards,
Yang








