How are you constructing the Stream: with classes, or using a Streaming
Expression?
In either case, can you post the code or the expression?
Are there more errors in the logs? This NPE occurs when an underlying
stream is null, which leads me to believe there would be
Can you provide a sample expression that would be able to reproduce this?
Are you able to try a newer version by chance - I know we've fixed a few
NPEs recently, maybe https://issues.apache.org/jira/browse/SOLR-14700
On Thu, Jan 21, 2021 at 4:13 PM ufuk yılmaz
wrote:
> Solr version 8.4. I’m
that syntax isn’t a syntax that Solr intentionally recognizes at all.
At very best, the select handler is using the default field (if it’s defined).
That said, not having a “q” parameter means all bets are off. On a local
copy of 7.3 that I have lying around I get a valid response, but using a
Yes, we are sure that this is not a typo.
Actually we did more experiments and found that
1) https://hostname:8983/solr/my_collection/select?ids=169455599|1
2) https://hostname:8983/solr/my_collection/select?q=id:169455599|1
3) https://hostname:8983/solr/my_collection/get?ids=169455599|1
1)
Hmm, an NPE is weird in any case, but assuming “ids” is a field, your syntax
is wrong
q=ids:111|222
or
q=ids:(111 222) would do too.
Are you sure you used this syntax before? Or is it a typo?
Erick
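For what it's worth, the special characters in those suggested forms (the pipe, the parentheses, the space) need URL encoding when sent over HTTP. A small sketch of building the two query strings Erick suggests (hostname and collection are assumptions):

```python
from urllib.parse import urlencode

base = "http://localhost:8983/solr/my_collection/select"  # hypothetical endpoint

# The two forms suggested above; urlencode percent-encodes ':', '|', '(' and ')'.
q1 = urlencode({"q": "ids:111|222"})
q2 = urlencode({"q": "ids:(111 222)"})

print(base + "?" + q1)  # the pipe becomes %7C, so it survives the HTTP request intact
print(base + "?" + q2)
```

Sending the pipe unencoded is exactly the kind of thing that can look fine in a browser (which encodes it for you) but break in a script.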
> On Sep 3, 2020, at 1:02 PM, Louis wrote:
>
> We are using SolrCloud 7.7.2 and having some
This should be considered a bug. Feel free to file a jira for this.
Joel Bernstein
http://joelsolr.blogspot.com/
On Tue, Jun 4, 2019 at 9:16 AM aus...@3bx.org.INVALID
wrote:
> Just wanted to provide a bit more information on this issue after
> experimenting a bit more.
>
> The error I've
Just wanted to provide a bit more information on this issue after
experimenting a bit more.
The error I've described below only seems to occur when I'm
collapsing/expanding on an integer field. If I switch the field type to a
string, no errors occur if there are missing field values within the
Probably SOLR-11770 and/or SOLR-11792.
In the meantime, ensure that the field has stored=true set and
ensure that there are terms.
You'll probably have to re-index, though.
Best,
Erick
On Wed, Jul 18, 2018 at 10:38 AM, babuasian wrote:
> Hi, Running solr version 6.5. Trying to get tf-idf values of a term
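For the tf-idf use case above, a hypothetical schema.xml entry along the lines of Erick's advice might look like this (field name and type are assumptions; `termVectors="true"` is what TermVectorComponent relies on, and stored="true" matches the suggestion above):

```xml
<!-- Hypothetical field definition: stored per the advice above, with term
     vectors enabled so tf-idf values can be returned. Re-index after changing. -->
<field name="text" type="text_general" indexed="true" stored="true" termVectors="true"/>
```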
I went ahead and resolved the jira - it was never seen again by us in later
versions of Solr. There are a number of bug fixes since the 6.2 release, so
I personally recommend updating!
On Wed, Nov 22, 2017 at 11:48 AM, Pushkar Raste
wrote:
> As mentioned in the JIRA,
As mentioned in the JIRA, exception seems to be coming from a log
statement. The issue was fixed in 6.3; here is the relevant line from 6.3:
https://github.com/apache/lucene-solr/blob/releases/lucene-solr/6.3.0/solr/core/src/java/org/apache/solr/update/PeerSync.java#L707
On Wed, Nov 22, 2017 at
Right, if there's no "fixed version" mentioned and if the resolution
is "unresolved", it's not in the code base at all. But that JIRA is
not apparently reproducible, especially on more recent versions than
6.2. Is it possible to test a more recent version? (6.6.2 would be my
recommendation.)
Erick
My bad. I found it at https://issues.apache.org/jira/browse/SOLR-9453
But I could not find it in CHANGES.txt, perhaps because it's not yet resolved.
On Tue, Nov 21, 2017 at 9:15 AM, Erick Erickson
wrote:
> Did you check the JIRA list? Or CHANGES.txt in more recent
Did you check the JIRA list? Or CHANGES.txt in more recent versions?
On Tue, Nov 21, 2017 at 1:13 AM, S G wrote:
> Hi,
>
> We are running 6.2 version of Solr and hitting this error frequently.
>
> Error while trying to recover.
Joel:
Would it make sense to throw a more informative error when the stream
context wasn't set? Maybe an explicit check in open() or some such?
Erick
On Fri, Jul 14, 2017 at 8:25 AM, Joe Obernberger
wrote:
> Still stuck on this one. I suspect there is something
Still stuck on this one. I suspect there is something I'm not setting
in the StreamContext. I'm not sure what to put for these two?
context.put("core", this.coreName);
context.put("solr-core", req.getCore());
Also not sure which package the ClassifyStream class is in. The error I'm
getting is:
If you can include the stack trace and version of Solr we can see what's
causing the exception.
Joel Bernstein
http://joelsolr.blogspot.com/
On Thu, Jul 13, 2017 at 4:33 PM, Joe Obernberger <
joseph.obernber...@gmail.com> wrote:
> Thanks for this. I'm now trying to use stream for classify, but
Thanks for this. I'm now trying to use stream for classify, but am
getting an ArrayIndexOutOfBounds error on the stream.open(). I'm
setting the streamFactory up, and including
.withFunctionName("classify", ClassifyStream.class) - but is that class
in org.apache.solr.handler?
Thank you Joel - that was it.
context = new StreamContext();
context.setSolrClientCache(StaticInfo.getSingleton(props).getClientCache());
context.workerID = 0;
context.numWorkers = 1;
context.setModelCache(StaticInfo.getSingleton(props).getModelCache());
Then:
This is the working code snippet I have, if that helps:
public static void main(String[] args) throws IOException
{
    String clause;
    TupleStream stream;
    List<Tuple> tuples;
    StreamContext streamContext = new StreamContext();
    SolrClientCache solrClientCache = new SolrClientCache();
It's most likely that you're not setting the StreamContext. New versions of
Solr expect the StreamContext to be set before the stream is opened. The
SolrClientCache also needs to be present in the StreamContext. You can take a
look at how the StreamHandler does this for an example:
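A minimal sketch of that pattern, assuming a SolrCloud zkHost at localhost:9983 and a collection named my_collection (both hypothetical); the essential point is that the SolrClientCache goes into the StreamContext, and the context is set on the stream before open():

```java
import java.io.IOException;

import org.apache.solr.client.solrj.io.SolrClientCache;
import org.apache.solr.client.solrj.io.Tuple;
import org.apache.solr.client.solrj.io.stream.CloudSolrStream;
import org.apache.solr.client.solrj.io.stream.StreamContext;
import org.apache.solr.client.solrj.io.stream.TupleStream;
import org.apache.solr.common.params.ModifiableSolrParams;

public class StreamContextExample {
  public static void main(String[] args) throws IOException {
    // The cache must be set on the context, and the context on the stream,
    // BEFORE open() is called; otherwise newer Solr versions NPE.
    SolrClientCache cache = new SolrClientCache();
    StreamContext context = new StreamContext();
    context.setSolrClientCache(cache);

    ModifiableSolrParams params = new ModifiableSolrParams();
    params.set("q", "*:*");
    params.set("fl", "id");
    params.set("sort", "id asc");

    // zkHost and collection name are assumptions for illustration.
    TupleStream stream = new CloudSolrStream("localhost:9983", "my_collection", params);
    stream.setStreamContext(context);
    try {
      stream.open();
      while (true) {
        Tuple t = stream.read();
        if (t.EOF) {
          break;
        }
        System.out.println(t.getString("id"));
      }
    } finally {
      stream.close();
      cache.close();
    }
  }
}
```

This mirrors what StreamHandler does internally, which is why expressions run through the handler work while hand-constructed streams without a context fail.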
Yes, I'm aware that building an index is expensive and I will remove
"buildOnStartup" once I have it working. The field I added was an
attempt to get it working...
I have attached my latest version of solrconfig.xml and schema.xml (both
are in the same attachment), except that I have removed
From: Mark Fenbers [mailto:mark.fenb...@noaa.gov]
Sent: 12 October 2015 12:14
To: solr-user@lucene.apache.org
Subject: Re: NullPointerException
On 10/12/2015 5:38 AM, Duck Geraint (ext) GBJH wrote:
> "When I use the Admin UI (v5.3.0), and check the spellcheck.build box"
> Out of interes
"When I use the Admin UI (v5.3.0), and check the spellcheck.build box"
Out of interest, where is this option within the Admin UI? I can't find
anything like it in mine...
Do you get the same issue by submitting the build command directly with
something like this instead:
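(For reference, the standard way to trigger a rebuild directly is a request with `spellcheck.build=true`; the host and handler name below are assumptions:)

```
http://localhost:8983/solr/spell?q=test&spellcheck=true&spellcheck.build=true
```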
On 10/12/2015 5:38 AM, Duck Geraint (ext) GBJH wrote:
"When I use the Admin UI (v5.3.0), and check the spellcheck.build box"
Out of interest, where is this option within the Admin UI? I can't find
anything like it in mine...
This is in the expanded options that open up once I put a checkmark in
Hi - yes it is worth a ticket as the javadoc says it is ok:
http://lucene.apache.org/solr/4_10_1/solr-core/org/apache/solr/schema/ExternalFileField.html
-Original message-
From: Matthew Nigl <matthew.n...@gmail.com>
Sent: Wednesday 8th October 2014 14:48
To: solr-user@lucene.apache.org
Thanks Markus. I initially interpreted the line "It's OK to have a keyField
value that can't be found in the index" as meaning that the key field value
in the external file does not have to exist as a term in the index.
On 8 October 2014 23:56, Markus Jelsma <markus.jel...@openindex.io> wrote:
It seems to be a modified row and referenced in EvaluatorBag.
I am not familiar with either.
Sent from my iPad
On Nov 22, 2013, at 3:05 AM, Adrien RUFFIE <a.ruf...@e-deal.com> wrote:
Hello all,
I have performed a full indexing with Solr, but when I try to perform an
incremental indexing
If this went away when you made your id field into a string type rather
than analyzed then it's probably not worth a JIRA...
Erick
On Thu, Nov 8, 2012 at 11:39 AM, Otis Gospodnetic
otis.gospodne...@gmail.com wrote:
Looks like a bug. If Solr 4.0, maybe this needs to be in JIRA along with
some sample data you indexed + your schema, so one can reproduce it.
Otis
--
Search Analytics - http://sematext.com/search-analytics/index.html
Performance Monitoring - http://sematext.com/spm/index.html
On Thu, Nov 8,
Hi all,
wow, this is weird...
now before I file the JIRA issue - one thing I forgot to mention is that
I am using the edismax query parser.
I've just done the following:
1) searched with edismax parser:
well, to sum it up... it doesn't really matter if I use standard or
dismax, at the moment both give me NullPointers for the same query,
although I didn't change anything since it was working ... it seems
totally random, sometimes it works a couple of times, sometimes it
doesn't :(
Weird...
Hi again,
well it turns out that it still doesn't work ...
Sometimes it works (i.e. for some cores), sometimes I still get the
nullpointer - e.g. if I create a new core and use the same settings as a
working one, but index different data, then I add a synonym (e.g. foo
= bar) and activate
Sure, no problem, I'll submit a JIRA entry :)
Am 21.09.2010 16:13, schrieb Robert Muir:
I don't think you should get an error like this from SynonymFilter... would
you mind opening a JIRA issue?
On Tue, Sep 21, 2010 at 9:49 AM, Stefan Moises <moi...@shoptimax.de> wrote:
Hi again,
well it
doh, looks like I only forgot to add the spellcheck component to my
edismax request handler... now it works with:
...
<arr name="last-components">
  <str>spellcheck</str>
  <str>elevator</str>
</arr>
What's strange is that spellchecking seemed to work *without* that
entry, too
Cheers,
Stefan
Am
Ouch! Absolutely correct - quoting the URL fixed it. Thanks for saving me a
sleepless night!
cheers - rene
2010/7/26 Chris Hostetter <hossman_luc...@fucit.org>
: However, when I'm trying this very URL with curl within my (perl) script,
I
: receive a NullPointerException:
: CURL-COMMAND: curl
: However, when I'm trying this very URL with curl within my (perl) script, I
: receive a NullPointerException:
: CURL-COMMAND: curl -sL
:
http://localhost:8983/solr/select?indent=on&version=2.2&q=*&fq=ListId%3A881&start=0&rows=0&fl=*%2Cscore&qt=standard&wt=standard
it appears you aren't quoting the
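A quick illustration of the quoting point (hypothetical URL; the key is that in an unquoted command line the shell, not curl, consumes the `&`):

```shell
# Unquoted, the shell treats each '&' as the background operator, so curl
# would only receive the URL up to the first '&' (here, just q=*):
#   curl -sL http://localhost:8983/solr/select?q=*&fq=ListId%3A881&rows=0   # wrong

# Quoted, the entire query string is passed through to curl intact:
url='http://localhost:8983/solr/select?q=*&fq=ListId%3A881&rows=0'
echo "$url"
```

The symptom of the unquoted form is exactly what was reported: Solr sees a request missing most of its parameters and can NPE or return something unexpected.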
: never keep a <str name="maxOptimizedCommitsToKeep">0</str>.
:
: It is better not to mention the deletionPolicy at all. The
: defaults are usually fine.
if setting the keep values to 0 results in NPEs we should do one (if not
both) of the following...
1) change the init code to warn/fail if
Hi Shalin,
Thanks for your reply. Please see below.
On Jan 18, 2010, at 4:19 AM, Shalin Shekhar Mangar wrote:
On Wed, Jan 13, 2010 at 12:51 AM, Stephen Weiss
<swe...@stylesight.com> wrote:
...
When we replicate
manually (via the admin page) things seem to go well. However, when
On Wed, Jan 13, 2010 at 12:51 AM, Stephen Weiss <swe...@stylesight.com> wrote:
Hi Solr List,
We're trying to set up java-based replication with Solr 1.4 (dist tarball).
We are running this to start with on a pair of test servers just to see how
things go.
There's one major problem we can't
When you copy paste config from wiki, just copy what you need.
excluding documentation and comments
On Wed, Jan 13, 2010 at 12:51 AM, Stephen Weiss <swe...@stylesight.com> wrote:
Hi Solr List,
We're trying to set up java-based replication with Solr 1.4 (dist tarball).
We are running this to
Just to clarify, the error is being thrown FROM a search, DURING an update.
This error is making distributed SOLR close to unusable for me. Any ideas?
Does SOLR fail on searches if one node takes too long to respond?
hossman wrote:
: Hi,
: I'm running a distributed solr index (3
: Hi,
: I'm running a distributed solr index (3 nodes) and have noticed frequent
: exceptions thrown during updates. The exception (see below for full trace)
what do you mean by "during updates"? ... QueryComponent isn't used at all
when updating the index, so there may be a misunderstanding here.
What version are you using? If a nightly build, from when?
Thanks
Erick
On Wed, Dec 2, 2009 at 12:53 PM, smock <harish.agar...@gmail.com> wrote:
Hi,
I'm running a distributed solr index (3 nodes) and have noticed frequent
exceptions thrown during updates. The exception (see below for full
I think it might be to do with the library itself
I downloaded semanticvectors-1.22 and compiled from source. Then created a demo
corpus using
java org.apache.lucene.demo.IndexFiles against the lucene src directory
I then ran java pitt.search.semanticvectors.BuildIndex against the index and
On Thu, Jul 30, 2009 at 9:45 PM, Andrew Clegg <andrew.cl...@gmail.com> wrote:
Erik Hatcher wrote:
On Jul 30, 2009, at 11:54 AM, Andrew Clegg wrote:
<entity dataSource="filesystem" name="domain_pdb"
  url="${domain.pdb_code}-noatom.xml" processor="XPathEntityProcessor"
  forEach="/">
Hi Andrew,
your inner entity uses an XML type datasource. The default entity
processor is the SQL one, however.
For your inner entity, you have to specify the correct entity processor
explicitly. You do that by adding the attribute processor, and the
value is the classname of the processor
Chantal Ackermann wrote:
Hi Andrew,
your inner entity uses an XML type datasource. The default entity
processor is the SQL one, however.
For your inner entity, you have to specify the correct entity processor
explicitly. You do that by adding the attribute processor, and the
value
On Jul 30, 2009, at 11:54 AM, Andrew Clegg wrote:
<entity dataSource="filesystem" name="domain_pdb"
  url="${domain.pdb_code}-noatom.xml" processor="XPathEntityProcessor"
  forEach="/">
  <field column="content"
    xpath="//*[local-name()='structCategory']/*[local-name()='struct']/
Hi Andrew,
my experience with XPathEntityProcessor is non-existent. ;-)
Just after a quick look at the method that throws the exception:
private void addField0(String xpath, String name, boolean multiValued,
    boolean isRecord) {
  List<String> paths = new
Erik Hatcher wrote:
On Jul 30, 2009, at 11:54 AM, Andrew Clegg wrote:
<entity dataSource="filesystem" name="domain_pdb"
  url="${domain.pdb_code}-noatom.xml" processor="XPathEntityProcessor"
  forEach="/">
  <field column="content"
Chantal Ackermann wrote:
my experience with XPathEntityProcessor is non-existent. ;-)
Don't worry -- your hints put me on the right track :-)
I got it working with:
<entity dataSource="filesystem" name="domain_pdb"
  url="${domain.pdb_code}-noatom.xml"
On Jul 30, 2009, at 12:19 PM, Andrew Clegg wrote:
Don't worry -- your hints put me on the right track :-)
I got it working with:
<entity dataSource="filesystem" name="domain_pdb"
  url="${domain.pdb_code}-noatom.xml" processor="XPathEntityProcessor"
  forEach="/datablock">
  <field
It's very easy to write your own entity processor. At least, that is my
experience with extending the SQLEntityProcessor to my needs. So, maybe
you'd be better off subclassing the xpath processor and handling the
xpath in a way you can keep your configuration straightforward.
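As a sketch of that suggestion (the class name, field names, and method shape below are assumptions based on the DataImportHandler-era `org.apache.solr.handler.dataimport` API, not code from this thread):

```java
import java.util.Map;

import org.apache.solr.handler.dataimport.XPathEntityProcessor;

// Hypothetical subclass: post-process each row produced by the XPath
// processor so the data-config xpath expressions can stay simple.
public class FlatteningXPathProcessor extends XPathEntityProcessor {
  @Override
  public Map<String, Object> nextRow() {
    Map<String, Object> row = super.nextRow();
    if (row != null) {
      // e.g. move an awkward xpath-derived column to a cleaner field name
      Object v = row.remove("content");
      if (v != null) {
        row.put("body", v);
      }
    }
    return row;
  }
}
```

In data-config.xml you would then reference the subclass by its fully qualified name in the entity's processor attribute.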
Andrew Clegg
Do you have an index where this exception happens consistently, eg
when you try to optimize? Can you post that somewhere?
Also, which exact JRE version are you using?
Mike
On Sun, Mar 29, 2009 at 1:28 PM, Sameer Maggon <mag...@gmail.com> wrote:
In our application, we are getting
I dunno if the problem is with the date. Are cdt and mdt date fields in the DB?
On Fri, Sep 26, 2008 at 12:58 AM, Shalin Shekhar Mangar
[EMAIL PROTECTED] wrote:
I'm not sure about why the NullPointerException is coming. Is that the whole
stack trace?
The mdt and cdt are date in schema.xml but the
Hi,
Yes, cdt mdt are the date in MYSQL DB
Date: Fri, 26 Sep 2008 13:58:24 +0530
From: [EMAIL PROTECTED]
To: solr-user@lucene.apache.org
Subject: Re: NullPointerException
I dunno if the problem is with the date. Are cdt and mdt date fields in the DB?
On Fri, Sep 26, 2008 at 12:58 AM, Shalin
I'm not sure about why the NullPointerException is coming. Is that the whole
stack trace?
The mdt and cdt are dates in schema.xml, but the format in the log is
wrong. Look at the DateFormatTransformer in DataImportHandler, which can
format strings in your database into the correct date format.
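For reference, a DIH entity using that transformer might look like the following (table name, column names, and the format pattern are assumptions; `dateTimeFormat` is the transformer's attribute):

```xml
<entity name="item" transformer="DateFormatTransformer"
        query="SELECT id, cdt, mdt FROM items">
  <!-- Parse the database's string dates into Solr date values; the pattern
       below is hypothetical and must match what is actually stored. -->
  <field column="cdt" dateTimeFormat="yyyy-MM-dd HH:mm:ss"/>
  <field column="mdt" dateTimeFormat="yyyy-MM-dd HH:mm:ss"/>
</entity>
```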
: I'm just looking into transitioning from solr 1.2 to 1.3 (trunk). I
: have some legacy handler code (called AdvancedRequestHandler) that
: used to work with 1.2 but now throws an exception using 1.3 (latest
: nightly build).
This is an interesting use case that wasn't really considered
: I'm just looking into transitioning from solr 1.2 to 1.3 (trunk). I
: have some legacy handler code (called AdvancedRequestHandler) that
: used to work with 1.2 but now throws an exception using 1.3 (latest
: nightly build). The exception is this:
The short answer is: right after you call
Otis,
Thanks for the response, that list should be very useful!
Charlie
-Original Message-
From: Otis Gospodnetic [mailto:[EMAIL PROTECTED]
Sent: Wednesday, May 02, 2007 11:13 AM
To: solr-user@lucene.apache.org
Subject: Re: NullPointerException (not schema related)
Charlie
Nevermind this...looks like my problem was tagging the args as an
str node instead of an arr node. Thanks anyway!
Charlie
-Original Message-
From: Charlie Jackson [mailto:[EMAIL PROTECTED]
Sent: Tuesday, May 01, 2007 12:02 PM
To: solr-user@lucene.apache.org
Subject: NullPointerException
: <listener event="postCommit" class="solr.RunExecutableListener">
:   <str name="exe">snapshooter</str>
:   <str name="dir">/usr/local/Production/solr/solr/bin/</str>
:   <bool name="wait">true</bool>
: </listener>
: the directory. However, when I committed data to the index, I was
: getting No such
On 5/1/07, Charlie Jackson [EMAIL PROTECTED] wrote:
This is what came in the solrconfig.xml file with just a minor tweak to
the directory. However, when I committed data to the index, I was
getting "No such file or directory" errors from the Runtime.exec call. I
verified all of the permissions,