Hi Eliot,

Thanks for your observations, and some first debugging.

The idmap file is only created, and required, if UPDINDEX is enabled. At
least, that is how it should work: maybe the file is not being created in time.
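
If that turns out to be the cause, one thing you could try (a minimal sketch,
reusing the db:optimize call from your own code below; 'analytics' is only a
placeholder for your actual database name) is to rebuild the index structures
once with UPDINDEX enabled, so that the idmap file exists on disk before
further updates:

  (: rebuild all index structures and enable the incremental id index;
     replace 'analytics' with the real database name :)
  db:optimize('analytics', true(), map { 'updindex': true() })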

I haven’t managed to reproduce the bug yet. If you are more successful,
please let us know.

Cheers,
Christian


On Wed, Jan 24, 2024 at 12:22 AM Eliot Kimber <eliot.kim...@servicenow.com>
wrote:

> I found the proximate cause: Not having UPDINDEX set to true() on the
> failing server.
>
> When I set UPDINDEX to true on the target database, then the load
> succeeded.
>
> Tracking through the Java code, it looks like a failure to initialize the
> idmap member of the DiskData object. This code looks like it could be the
> issue:
>
> // open data and indexes
> init();
> if(meta.updindex) {
>   idmap = new IdPreMap(meta.dbFile(DATAIDP));
>   if(meta.textindex) textIndex = new UpdatableDiskValues(this, IndexType.TEXT);
>   if(meta.attrindex) attrIndex = new UpdatableDiskValues(this, IndexType.ATTRIBUTE);
>   if(meta.tokenindex) tokenIndex = new UpdatableDiskValues(this, IndexType.TOKEN);
> } else {
>   if(meta.textindex) textIndex = new DiskValues(this, IndexType.TEXT);
>   if(meta.attrindex) attrIndex = new DiskValues(this, IndexType.ATTRIBUTE);
>   if(meta.tokenindex) tokenIndex = new DiskValues(this, IndexType.TOKEN);
> }
> if(meta.ftindex) ftIndex = new FTIndex(this);
>
>
>
> The idmap is only set if meta.updindex is true.
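>
> A minimal sequence that I would expect to hit the same code path (an untested
> sketch; 'test' is just a throwaway database name):
>
>   (: step 1: create a database with UPDINDEX off :)
>   db:create('test', <x/>, 'doc.xml', map { 'updindex': false() })
>
>   (: step 2, run as a separate query: ask db:optimize to turn UPDINDEX on;
>      this is the call that produced the NPE for me :)
>   db:optimize('test', true(), map { 'updindex': true() })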
>
>
>
> Cheers,
>
>
>
> E.
>
> _____________________________________________
>
> *Eliot Kimber*
>
> Sr Staff Content Engineer
>
> O: 512 554 9368
>
> M: 512 554 9368
>
> servicenow.com <https://www.servicenow.com>
>
> LinkedIn <https://www.linkedin.com/company/servicenow> | Twitter
> <https://twitter.com/servicenow> | YouTube
> <https://www.youtube.com/user/servicenowinc> | Facebook
> <https://www.facebook.com/servicenow>
>
>
>
> *From: *Eliot Kimber <eliot.kim...@servicenow.com>
> *Date: *Tuesday, January 23, 2024 at 11:51 AM
> *To: *basex-talk@mailman.uni-konstanz.de <
> basex-talk@mailman.uni-konstanz.de>
> *Subject: *Consistent NPE loading content on one server, other server
> works
>
> I have two servers running the same code, both on 10.7. I have a REST API
> handler that takes data as input and stores it in a database, applying some
> preprocessing to the data first.
>
> The data successfully loads on my dev server and consistently fails on the
> production server, with this Java failure:
>
> [INFO] Posting "/Users/eliot.kimber/git-basex/product-content-analytics/analytics/adobe/adobe-analytics-Utah_Pages_Viewed_Aug2023-clean.csv"
> [INFO]   as filename "adobe-analytics-Utah_Pages_Viewed_Aug2023-clean.csv"
> [INFO]   to Mirabel server "http://mirabel.corp.service-now.com:9984"...
>
> Unexpected error: Improper use? Potential bug? Your feedback is welcome:
>
> Contact: basex-talk@mailman.uni-konstanz.de
>
> Version: BaseX 10.7
>
> Java: Oracle Corporation, 17.0.8
>
> OS: Linux, amd64
>
> Stack Trace:
>
> java.lang.NullPointerException: Cannot invoke "org.basex.index.IdPreMap.write(org.basex.io.IOFile)" because "this.idmap" is null
>   at org.basex.data.DiskData.write(DiskData.java:151)
>   at org.basex.data.DiskData.close(DiskData.java:160)
>   at org.basex.core.Datas.unpin(Datas.java:52)
>   at org.basex.core.cmd.Close.close(Close.java:45)
>   at org.basex.core.cmd.OptimizeAll.optimizeAll(OptimizeAll.java:124)
>   at org.basex.query.up.primitives.db.DBOptimize.apply(DBOptimize.java:119)
>   at org.basex.query.up.DataUpdates.applyDbUpdates(DataUpdates.java:213)
>   at org.basex.query.up.DataUpdates.apply(DataUpdates.java:172)
>   at org.basex.query.up.ContextModifier.apply(ContextModifier.java:120)
>   at org.basex.query.up.Updates.apply(Updates.java:179)
>   at org.basex.query.QueryContext.update(QueryContext.java:663)
>   at org.basex.query.QueryContext.lambda$iter$4(QueryContext.java:357)
>   at org.basex.query.QueryContext.run(QueryContext.java:766)
>   at org.basex.query.QueryContext.iter(QueryContext.java:357)
>   at org.basex.http.restxq.RestXqResponse.serialize(RestXqResponse.java:78)
>   at org.basex.http.web.WebResponse.create(WebResponse.java:58)
>   at org.basex.http.restxq.RestXqServlet.run(RestXqServlet.java:72)
>   at org.basex.http.BaseXServlet.service(BaseXServlet.java:69)
>   at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
>   at org.eclipse.jetty.servlet.ServletHolder$NotAsync.service(ServletHolder.java:1459)
>   at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
>   at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656)
>   at org.eclipse.jetty.servlets.CrossOriginFilter.handle(CrossOriginFilter.java:319)
>   at org.eclipse.jetty.servlets.CrossOriginFilter.doFilter(CrossOriginFilter.java:273)
>   at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:201)
>   at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
>   at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
>   at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
>   at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:600)
>   at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
>   at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
>   at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
>   at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
>   at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
>   at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
>   at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
>   at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
>   at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
>   at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
>   at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
>   at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
>   at org.eclipse.jetty.server.Server.handle(Server.java:516)
>   at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
>   at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
>   at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
>   at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
>   at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
>   at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
>   at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
>   at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
>   at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
>   at java.base/java.lang.Thread.run(Thread.java:833)
>
> [INFO] Data loaded.
>
>
>
> The two servers are essentially identical Linux servers, although the
> production server has more resources.
>
> Any idea what might cause this failure or what I can do to diagnose it?
>
> There must be some non-obvious difference between these two servers, but I
> don’t know what to look for.
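>
> One way to surface such a difference (a sketch; I'm assuming the UPDINDEX
> setting is included in the db:info output, and 'analytics' stands in for the
> real database name) would be to dump the stored database properties on both
> servers and diff them:
>
>   (: print the database and index properties, including whether UPDINDEX
>      was enabled when the database was created or last optimized :)
>   prof:dump(db:info('analytics'))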
>
> The code doing the load to a database is:
>
>     let $reportPath as xs:string :=
>       ``[/`{$analyticsmgmt:csvPathRoot}`/`{$source}`/`{$reportType}`/`{$reportName}`.xml]``
>     let $msg := prof:dump(``[[INFO] Storing analytics report "`{$reportPath}`"]``)
>     let $csv := $csv transform with {
>       insert node attribute {'timestamp'} {$timeStampStr} into ./*
>     }
>     let $csv := $csv transform with {
>       analyticsmgmt:dispatchAnalysticsStorePreProcessing($reportName, ., $debug)
>     }
>     return
>       try {
>         (db:put($analyticsDb, $csv, $reportPath),
>          db:optimize($analyticsDb,
>            true(),
>            map {
>              'attrindex' : true(),
>              'tokenindex' : true(),
>              'updindex' : true()
>            })
>         )
>       } catch * {
>         util:logToConsole(
>           'analyticsmgmt:storeAnalyticsCsvReport',
>           ``[`{$err:code}`: `{$err:description}`]``,
>           'error'
>         )
>       }
>
>
>
> Thanks,
>
>
>
> Eliot
>
