stats.field using Config API

2019-01-17 Thread Antelmo Aguilar
Hi all,

I am trying to set multiple stats.field parameters to get the min and max
of multiple fields in one request.  I am able to do this in the URL:
stats.field=statsfield1&stats.field=statsfield2&stats.field=statsfield3
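For reference, a small Python sketch of how the working URL's repeated
parameters can be generated with any client that supports list-valued
params (field names as above):

```python
from urllib.parse import urlencode

# Build the repeated stats.field parameters from a list of (key, value)
# pairs; urlencode emits one key=value pair per tuple, in order.
params = [
    ("stats", "true"),
    ("stats.field", "statsfield1"),
    ("stats.field", "statsfield2"),
    ("stats.field", "statsfield3"),
]
query_string = urlencode(params)
print(query_string)
# stats=true&stats.field=statsfield1&stats.field=statsfield2&stats.field=statsfield3
```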

I would like to replicate this in the configoverlay.json file, but I
can't get it working.  I am only able to set one stats.field by using
something like this:

"requestHandler": {
  "name": "/query",
  "class": "solr.SearchHandler",
  "defaults": {
    "echoParams": "all",
    "df": "text"
  },
  "invariants": {
    "wt": "json",
    "json.nl": "map",
    "rows": 0,
    "stats": "true",
    "stats.field": "statsfield1"
  }
}

The issue is that the above request handler only allows me to set one
stats.field parameter.  If I add another stats.field parameter underneath,
it just overrides the value instead of adding another stats.field to the
request.
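One thing I have been meaning to try (I have not verified that the Config
API accepts this) is supplying a JSON array as the value, since arrays are
how multi-valued parameters are written in some other JSON-based Solr APIs:

```json
"invariants": {
  "wt": "json",
  "json.nl": "map",
  "rows": 0,
  "stats": "true",
  "stats.field": ["statsfield1", "statsfield2", "statsfield3"]
}
```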

Any help would be appreciated.

-Antelmo


Re: Reason Why Query Does Not Work

2018-09-18 Thread Antelmo Aguilar
Hi Alex and Erick,

We could possibly put them in fq, but the way we set everything up would
make that hard to do; going that route might be the only option, though.

I did take a look at the parsed query and this is the difference:

This is the one that works:
"-WithinPrefixTreeQuery(fieldName=collection_date_range,queryShape=[2000 TO
2018-09-18],detailLevel=9,prefixGridScanLevel=7)
-WithinPrefixTreeQuery(fieldName=collection_date_range,queryShape=[1960 TO
1998-09-18],detailLevel=9,prefixGridScanLevel=7)
+IntersectsPrefixTreeQuery(fieldName=collection_season,queryShape=1999-05,detailLevel=9,prefixGridScanLevel=8)"

This is the one that does not work:
"+(-WithinPrefixTreeQuery(fieldName=collection_date_range,queryShape=[2000
TO 2018-09-18],detailLevel=9,prefixGridScanLevel=7)
-WithinPrefixTreeQuery(fieldName=collection_date_range,queryShape=[1960 TO
1998-09-18],detailLevel=9,prefixGridScanLevel=7))
+IntersectsPrefixTreeQuery(fieldName=collection_season,queryShape=1999-05,detailLevel=9,prefixGridScanLevel=8)"

If someone can tell just by looking at these queries why I get no results
from the second one, I would appreciate it.  From looking at the page Erick
pointed out, I do not think it covers my case: ((-X AND -Y) AND Z)
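For reference, the fix that article describes for pure-negative groups is to
anchor them to all documents with *:*; a sketch of the failing query
rewritten that way (untested against our schema):

```
( (*:* -{!field f=collection_date_range op=Within v='[2000-01-01 TO 2018-09-18]'}
       -{!field f=collection_date_range op=Within v='[1960-01-01 TO 1998-09-18]'})
  AND collection_season:[1999-05 TO 1999-05] )
```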

Sorry for the trouble and thanks again!

Best,
Antelmo

On Tue, Sep 18, 2018 at 2:56 PM, Erick Erickson 
wrote:

> Also, Solr does _not_ implement strict Boolean logic, although with
> appropriate parentheses you can get it to look like Boolean logic.
> See: https://lucidworks.com/2011/12/28/why-not-and-or-and-not/.
>
> Additionally, for _some_ clauses a pure-not query is translated into
> *:* -pure_not_query which is helpful, but occasionally confusing.
>
> Best,
> Erick
> On Tue, Sep 18, 2018 at 11:43 AM Alexandre Rafalovitch
>  wrote:
> >
> > Have a look at what debug shows in the parsed query. I think every
> > bracket is quite significant actually and you are generating a
> > different type of clause.
> >
> > Also, have you thought about putting those individual clauses into
> > 'fq' instead of jointly into 'q'? This may give you faster search too,
> > as Solr will not have to worry about ranking.
> >
> > Regards,
> >Alex.
> >
> > On 18 September 2018 at 14:38, Antelmo Aguilar  wrote:
> > > Hi,
> > >
> > > I am doing some date queries and I was wondering if there is some way
> of
> > > getting this query to work.
> > >
> > > ( ( !{!field f=collection_date_range op=Within v='[2000-01-01 TO
> > > 2018-09-18]'} AND !{!field f=collection_date_range op=Within
> v='[1960-01-01
> > > TO 1998-09-18]'} ) AND collection_season:([1999-05 TO 1999-05]) )
> > >
> > > I understand that I could just not do NOT queries and instead search
> for
> > > 1998-09-18 TO 2000-01-01, but doing NOT queries gives me more results
> > > (e.g. records that do not have collection_date_range defined).
> > >
> > > If I remove the parentheses enclosing the NOT queries, it works; with
> > > the parentheses, the query does not return results.  So the query
> > > below does work:
> > >
> > > ( !{!field f=collection_date_range op=Within v='[2000-01-01 TO
> > > 2018-09-18]'} AND !{!field f=collection_date_range op=Within
> v='[1960-01-01
> > > TO 1998-09-18]'} AND collection_season:([1999-05 TO 1999-05]) )
> > >
> > > Any insight would be appreciated.  I really do not see the reason why
> the
> > > parenthesis enclosing the NOT queries would cause it to not return
> results.
> > >
> > > Best,
> > > Antelmo
>


Reason Why Query Does Not Work

2018-09-18 Thread Antelmo Aguilar
Hi,

I am doing some date queries and I was wondering if there is some way of
getting this query to work.

( ( !{!field f=collection_date_range op=Within v='[2000-01-01 TO
2018-09-18]'} AND !{!field f=collection_date_range op=Within v='[1960-01-01
TO 1998-09-18]'} ) AND collection_season:([1999-05 TO 1999-05]) )

I understand that I could just not do NOT queries and instead search for
1998-09-18 TO 2000-01-01, but doing NOT queries gives me more results (e.g.
records that do not have collection_date_range defined).

If I remove the parentheses enclosing the NOT queries, it works; with the
parentheses, the query does not return results.  So the query below does
work:

( !{!field f=collection_date_range op=Within v='[2000-01-01 TO
2018-09-18]'} AND !{!field f=collection_date_range op=Within v='[1960-01-01
TO 1998-09-18]'} AND collection_season:([1999-05 TO 1999-05]) )

Any insight would be appreciated.  I really do not see why the parentheses
enclosing the NOT queries would cause it to return no results.

Best,
Antelmo


Re: Date Query Using Local Params

2018-09-10 Thread Antelmo Aguilar
Hi Erik,

Thank you! I did mess with the v parameter, but I was doing it wrong.  I
was doing this: v='([2013-07-08 TO 2013-07-09] OR [2013-07-21 TO
2013-07-25])'

Anyways, I needed to use the "fq" parameter and I did this fq=({!field
f=collection_date_range op=Within v='[2013-07-08 TO 2013-07-09]'} OR
{!field f=collection_date_range op=Within v='[2017-01-10 TO 2018-06-17]'})
and it worked like I would expect.

Thank you for all your help!

Best,
Antelmo

On Mon, Sep 10, 2018 at 4:18 PM, Erik Hatcher 
wrote:

> When using the {!...} syntax, and combining it with other clauses, the
> expression parsed needs to come from a local-param `v` parameter
> (otherwise, without `v`, the parser eats the rest of the string after the
> closing curly bracket).  So you could do something like this:
>
>
> q={!field f=collection_date_range op=Within v='[2013-07-08 TO
> 2013-07-09]'} OR {!field
> f=collection_date_range op=Within v='[2013-07-21 TO 2013-07-25]'}
>
> Or you could do this sort of thing, which allows the date ranges to be
> parameterized:
>
> q={!field f=collection_date_range op=Within v=$range1} OR {!field
> f=collection_date_range op=Within v=$range2}
>  &range1=[2013-07-08 TO 2013-07-09]
>  &range2=[2013-07-21 TO 2013-07-25]
>
> Erik
>
>
>
>
>
> > On Sep 10, 2018, at 3:59 PM, Antelmo Aguilar  wrote:
> >
> > Hi Shawn,
> >
> > Thank you.  So just to confirm, there is no way for me to use an OR
> > operator with also using the "within" op parameter described in the
> bottom
> > of this page?
> >
> > https://lucene.apache.org/solr/guide/6_6/working-with-dates.html#WorkingwithDates-MoreDateRangeFieldDetails
> >
> > I appreciate your response.
> >
> > Best,
> > Antelmo
> >
> > On Mon, Sep 10, 2018 at 3:51 PM, Shawn Heisey 
> wrote:
> >
> >> On 9/10/2018 1:21 PM, Antelmo Aguilar wrote:
> >>
> >>> Hi,
> >>>
> >>> I have a question.  I am trying to use the "within" op parameter in a
> Date
> >>> Search.  This works like I would expect: {!field
> f=collection_date_range
> >>> op=Within}[2013-07-08 TO 2013-07-09]
> >>>
> >>> I would like to use an OR with the query though, something like this:
> >>> {!field
> >>> f=collection_date_range op=Within}[2013-07-08 TO 2013-07-09] OR {!field
> >>> f=collection_date_range op=Within}[2013-07-21 TO 2013-07-25]
> >>>
> >>> However, I tried different approaches and none of them worked.  Is
> there a
> >>> way of doing something like this for querying dates using the "within"
> op
> >>> parameter?
> >>>
> >>
> >> I don't think the field parser can do this.  Also, usually it's not
> >> possible to use localparams in a second query clause like that --
> >> localparams must almost always be the very first thing in the "q"
> >> parameter, or they will not be interpreted as localparams.  Use the
> >> standard (lucene) parser without localparams.  The q parameter should
> look
> >> like this:
> >>
> >> collection_date_range:[2013-07-08 TO 2013-07-09] OR
> >> collection_date_range:[2013-07-21 TO 2013-07-25]
> >>
> >> If the default operator hasn't been changed (which would mean it is
> using
> >> OR), then you could remove the "OR" from that.
> >>
> >> Thanks,
> >> Shawn
> >>
> >>
>
>


Re: Date Query Using Local Params

2018-09-10 Thread Antelmo Aguilar
Hi Shawn,

Thank you.  So just to confirm, there is no way for me to use an OR
operator with also using the "within" op parameter described in the bottom
of this page?

https://lucene.apache.org/solr/guide/6_6/working-with-dates.html#WorkingwithDates-MoreDateRangeFieldDetails

I appreciate your response.

Best,
Antelmo

On Mon, Sep 10, 2018 at 3:51 PM, Shawn Heisey  wrote:

> On 9/10/2018 1:21 PM, Antelmo Aguilar wrote:
>
>> Hi,
>>
>> I have a question.  I am trying to use the "within" op parameter in a Date
>> Search.  This works like I would expect: {!field f=collection_date_range
>> op=Within}[2013-07-08 TO 2013-07-09]
>>
>> I would like to use an OR with the query though, something like this:
>> {!field
>> f=collection_date_range op=Within}[2013-07-08 TO 2013-07-09] OR {!field
>> f=collection_date_range op=Within}[2013-07-21 TO 2013-07-25]
>>
>> However, I tried different approaches and none of them worked.  Is there a
>> way of doing something like this for querying dates using the "within" op
>> parameter?
>>
>
> I don't think the field parser can do this.  Also, usually it's not
> possible to use localparams in a second query clause like that --
> localparams must almost always be the very first thing in the "q"
> parameter, or they will not be interpreted as localparams.  Use the
> standard (lucene) parser without localparams.  The q parameter should look
> like this:
>
> collection_date_range:[2013-07-08 TO 2013-07-09] OR
> collection_date_range:[2013-07-21 TO 2013-07-25]
>
> If the default operator hasn't been changed (which would mean it is using
> OR), then you could remove the "OR" from that.
>
> Thanks,
> Shawn
>
>


Date Query Using Local Params

2018-09-10 Thread Antelmo Aguilar
Hi,

I have a question.  I am trying to use the "within" op parameter in a Date
Search.  This works like I would expect: {!field f=collection_date_range
op=Within}[2013-07-08 TO 2013-07-09]

I would like to use an OR with the query though, something like this: {!field
f=collection_date_range op=Within}[2013-07-08 TO 2013-07-09] OR {!field
f=collection_date_range op=Within}[2013-07-21 TO 2013-07-25]

However, I tried different approaches and none of them worked.  Is there a
way of doing something like this for querying dates using the "within" op
parameter?

Thanks,
Antelmo


Not possible to use NOT queries with Solr Export Handler?

2018-08-23 Thread Antelmo Aguilar
Hello,

I asked this question in the IRC channel, but I had to leave, so I was not
able to wait for a response.  I am sending it through here instead in the
hope that someone can give me some insight into the issue I am experiencing.

So in our Solr setup, we use the Solr Export request handler.  Our users
are able to construct NOT queries and export the results.  We were testing
this feature and noticed that NOT queries do not return anything, while
normal queries do return results.  Is there a reason for this, or am I
missing something that will allow NOT queries to work with the Export
handler?
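In case it is relevant, the one workaround I know of for pure-negative
queries elsewhere in Solr is to anchor them to all documents explicitly; I
have not confirmed whether this helps with the Export handler (the field
and value below are made up):

```
q=*:* -some_field:some_value
```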

I would really appreciate the help.

-Antelmo


Re: Issue Using JSON Facet API Buckets in Solr 6.6

2018-02-23 Thread Antelmo Aguilar
Hi Yonik,

Good to hear you were able to reproduce it.  Looking forward for the fix.
Will use the version of Solr that works in the meantime.

-Antelmo

On Thu, Feb 22, 2018 at 5:10 PM, Yonik Seeley  wrote:

> I've reproduced the issue and opened
> https://issues.apache.org/jira/browse/SOLR-12020
>
> -Yonik
>
>
>
> On Thu, Feb 22, 2018 at 11:03 AM, Yonik Seeley  wrote:
> > Thanks Antelmo, I'm trying to reproduce this now.
> > -Yonik
> >
> >
> > On Mon, Feb 19, 2018 at 10:13 AM, Antelmo Aguilar 
> wrote:
> >> Hi all,
> >>
> >> I was wondering if the information I sent is sufficient to look into the
> >> issue.  Let me know if you need anything else from me please.
> >>
> >> Thanks,
> >> Antelmo
> >>
> >> On Thu, Feb 15, 2018 at 1:56 PM, Antelmo Aguilar 
> wrote:
> >>
> >>> Hi,
> >>>
> >>> Here are two pastebins.  The first is the full complete response with
> the
> >>> search parameters used.  The second is the stack trace from the logs:
> >>>
> >>> https://pastebin.com/rsHvKK63
> >>>
> >>> https://pastebin.com/8amxacAj
> >>>
> >>> I am not using any custom code or plugins with the Solr instance.
> >>>
> >>> Please let me know if you need anything else and thanks for looking
> into
> >>> this.
> >>>
> >>> -Antelmo
> >>>
> >>> On Wed, Feb 14, 2018 at 12:56 PM, Yonik Seeley 
> wrote:
> >>>
> >>>> Could you provide the full stack trace containing "Invalid Date
> >>>> String"  and the full request that causes it?
> >>>> Are you using any custom code/plugins in Solr?
> >>>> -Yonik
> >>>>
> >>>>
> >>>> On Mon, Feb 12, 2018 at 4:55 PM, Antelmo Aguilar 
> wrote:
> >>>> > Hi,
> >>>> >
> >>>> > I was using the following part of a query to get facet buckets so
> that I
> >>>> > can use the information in the buckets for some post-processing:
> >>>> >
> >>>> > "json":
> >>>> > "{\"filter\":[\"bundle:pop_sample\",\"has_abundance_data_b:
> >>>> true\",\"has_geodata:true\",\"${project}\"],\"facet\":{\"ter
> >>>> m\":{\"type\":\"terms\",\"limit\":-1,\"field\":\"${term:spec
> >>>> ies_category}\",\"facet\":{\"collection_dates\":{\"type\":\
> >>>> "terms\",\"limit\":-1,\"field\":\"collection_date\",\"facet\
> >>>> ":{\"collection\":
> >>>> > {\"type\":\"terms\",\"field\":\"collection_assay_id_s\",\"fa
> >>>> cet\":{\"abnd\":\"sum(div(sample_size_i,
> >>>> > collection_duration_days_i))\""
> >>>> >
> >>>> > Sorry if it is hard to read.  Basically what it was doing was
> getting
> >>>> the
> >>>> > following buckets:
> >>>> >
> >>>> > First bucket will be categorized by "Species category" by default
> >>>> unless we
> >>>> > pass in the request the "term" parameter which we will categories
> the
> >>>> first
> >>>> > bucket by whatever "term" is set to.  Then inside this first
> bucket, we
> >>>> > create another buckets of the "Collection date" category.  Then
> inside
> >>>> the
> >>>> > "Collection date" category buckets, we would use some functions to
> do
> >>>> some
> >>>> > calculations and return those calculations inside the "Collection
> date"
> >>>> > category buckets.
> >>>> >
> >>>> > This query is working fine in Solr 6.2, but I upgraded our instance
> of
> >>>> Solr
> >>>> > 6.2 to the latest 6.6 version.  However it seems that upgrading to
> Solr
> >>>> 6.6
> >>>> > broke the above query.  Now it complains when trying to create the
> >>>> buckets
> >>>> > of the "Collection date" category.  I get the following error:
> >>>> >
> >>>> > Invalid Date String:'Fri Aug 01 00:00:00 UTC 2014'
> >>>> >
> >>>> > It seems that when creating the buckets of a date field, it does
> some
> >>>> > conversion of the way the date is stored and causes the error to
> appear.
> >>>> > Does anyone have an idea as to why this error is happening?  I would
> >>>> really
> >>>> > appreciate any help.  Hopefully I was able to explain my issue well.
> >>>> >
> >>>> > Thanks,
> >>>> > Antelmo
> >>>>
> >>>
> >>>
>


Re: Issue Using JSON Facet API Buckets in Solr 6.6

2018-02-19 Thread Antelmo Aguilar
Hi all,

I was wondering if the information I sent is sufficient to look into the
issue.  Let me know if you need anything else from me please.

Thanks,
Antelmo

On Thu, Feb 15, 2018 at 1:56 PM, Antelmo Aguilar  wrote:

> Hi,
>
> Here are two pastebins.  The first is the full complete response with the
> search parameters used.  The second is the stack trace from the logs:
>
> https://pastebin.com/rsHvKK63
>
> https://pastebin.com/8amxacAj
>
> I am not using any custom code or plugins with the Solr instance.
>
> Please let me know if you need anything else and thanks for looking into
> this.
>
> -Antelmo
>
> On Wed, Feb 14, 2018 at 12:56 PM, Yonik Seeley  wrote:
>
>> Could you provide the full stack trace containing "Invalid Date
>> String"  and the full request that causes it?
>> Are you using any custom code/plugins in Solr?
>> -Yonik
>>
>>
>> On Mon, Feb 12, 2018 at 4:55 PM, Antelmo Aguilar  wrote:
>> > Hi,
>> >
>> > I was using the following part of a query to get facet buckets so that I
>> > can use the information in the buckets for some post-processing:
>> >
>> > "json":
>> > "{\"filter\":[\"bundle:pop_sample\",\"has_abundance_data_b:
>> true\",\"has_geodata:true\",\"${project}\"],\"facet\":{\"ter
>> m\":{\"type\":\"terms\",\"limit\":-1,\"field\":\"${term:spec
>> ies_category}\",\"facet\":{\"collection_dates\":{\"type\":\
>> "terms\",\"limit\":-1,\"field\":\"collection_date\",\"facet\
>> ":{\"collection\":
>> > {\"type\":\"terms\",\"field\":\"collection_assay_id_s\",\"fa
>> cet\":{\"abnd\":\"sum(div(sample_size_i,
>> > collection_duration_days_i))\""
>> >
> >> > Sorry if it is hard to read.  Basically what it was doing was getting
>> the
>> > following buckets:
>> >
>> > First bucket will be categorized by "Species category" by default
>> unless we
>> > pass in the request the "term" parameter which we will categories the
>> first
>> > bucket by whatever "term" is set to.  Then inside this first bucket, we
>> > create another buckets of the "Collection date" category.  Then inside
>> the
>> > "Collection date" category buckets, we would use some functions to do
>> some
>> > calculations and return those calculations inside the "Collection date"
>> > category buckets.
>> >
>> > This query is working fine in Solr 6.2, but I upgraded our instance of
>> Solr
>> > 6.2 to the latest 6.6 version.  However it seems that upgrading to Solr
>> 6.6
>> > broke the above query.  Now it complains when trying to create the
>> buckets
>> > of the "Collection date" category.  I get the following error:
>> >
>> > Invalid Date String:'Fri Aug 01 00:00:00 UTC 2014'
>> >
>> > It seems that when creating the buckets of a date field, it does some
>> > conversion of the way the date is stored and causes the error to appear.
>> > Does anyone have an idea as to why this error is happening?  I would
>> really
>> > appreciate any help.  Hopefully I was able to explain my issue well.
>> >
>> > Thanks,
>> > Antelmo
>>
>
>


Re: Issue Using JSON Facet API Buckets in Solr 6.6

2018-02-15 Thread Antelmo Aguilar
Hi,

Here are two pastebins.  The first is the full complete response with the
search parameters used.  The second is the stack trace from the logs:

https://pastebin.com/rsHvKK63

https://pastebin.com/8amxacAj

I am not using any custom code or plugins with the Solr instance.

Please let me know if you need anything else and thanks for looking into
this.

-Antelmo

On Wed, Feb 14, 2018 at 12:56 PM, Yonik Seeley  wrote:

> Could you provide the full stack trace containing "Invalid Date
> String"  and the full request that causes it?
> Are you using any custom code/plugins in Solr?
> -Yonik
>
>
> On Mon, Feb 12, 2018 at 4:55 PM, Antelmo Aguilar  wrote:
> > Hi,
> >
> > I was using the following part of a query to get facet buckets so that I
> > can use the information in the buckets for some post-processing:
> >
> > "json":
> > "{\"filter\":[\"bundle:pop_sample\",\"has_abundance_data_
> b:true\",\"has_geodata:true\",\"${project}\"],\"facet\":{\"
> term\":{\"type\":\"terms\",\"limit\":-1,\"field\":\"${term:
> species_category}\",\"facet\":{\"collection_dates\":{\"type\
> ":\"terms\",\"limit\":-1,\"field\":\"collection_date\",\"
> facet\":{\"collection\":
> > {\"type\":\"terms\",\"field\":\"collection_assay_id_s\",\"
> facet\":{\"abnd\":\"sum(div(sample_size_i,
> > collection_duration_days_i))\""
> >
> > Sorry if it is hard to read.  Basically what it was doing was getting the
> > following buckets:
> >
> > First bucket will be categorized by "Species category" by default unless
> we
> > pass in the request the "term" parameter which we will categories the
> first
> > bucket by whatever "term" is set to.  Then inside this first bucket, we
> > create another buckets of the "Collection date" category.  Then inside
> the
> > "Collection date" category buckets, we would use some functions to do
> some
> > calculations and return those calculations inside the "Collection date"
> > category buckets.
> >
> > This query is working fine in Solr 6.2, but I upgraded our instance of
> Solr
> > 6.2 to the latest 6.6 version.  However it seems that upgrading to Solr
> 6.6
> > broke the above query.  Now it complains when trying to create the
> buckets
> > of the "Collection date" category.  I get the following error:
> >
> > Invalid Date String:'Fri Aug 01 00:00:00 UTC 2014'
> >
> > It seems that when creating the buckets of a date field, it does some
> > conversion of the way the date is stored and causes the error to appear.
> > Does anyone have an idea as to why this error is happening?  I would
> really
> > appreciate any help.  Hopefully I was able to explain my issue well.
> >
> > Thanks,
> > Antelmo
>


Re: Issue Using JSON Facet API Buckets in Solr 6.6

2018-02-14 Thread Antelmo Aguilar
Hello,

I just wanted to follow up on this issue I am having, in case it got lost.
I have been trying to figure this out, and so far the only solution I have
found is using the older version.

If you need more details from me, please let me know.  I would really
appreciate any help.

Best,
Antelmo

On Feb 12, 2018 4:55 PM, "Antelmo Aguilar"  wrote:

> Hi,
>
> I was using the following part of a query to get facet buckets so that I
> can use the information in the buckets for some post-processing:
>
> "json": "{\"filter\":[\"bundle:pop_sample\",\"has_abundance_data_
> b:true\",\"has_geodata:true\",\"${project}\"],\"facet\":{\"
> term\":{\"type\":\"terms\",\"limit\":-1,\"field\":\"${term:
> species_category}\",\"facet\":{\"collection_dates\":{\"type\
> ":\"terms\",\"limit\":-1,\"field\":\"collection_date\",\"facet\":{\"collection\":
> {\"type\":\"terms\",\"field\":\"collection_assay_id_s\",\"
> facet\":{\"abnd\":\"sum(div(sample_size_i, collection_duration_days_i))\"
> "
>
> Sorry if it is hard to read.  Basically what it was doing was getting the
> following buckets:
>
> First bucket will be categorized by "Species category" by default unless
> we pass in the request the "term" parameter which we will categories the
> first bucket by whatever "term" is set to.  Then inside this first bucket,
> we create another buckets of the "Collection date" category.  Then inside
> the "Collection date" category buckets, we would use some functions to do
> some calculations and return those calculations inside the "Collection
> date" category buckets.
>
> This query is working fine in Solr 6.2, but I upgraded our instance of
> Solr 6.2 to the latest 6.6 version.  However it seems that upgrading to
> Solr 6.6 broke the above query.  Now it complains when trying to create the
> buckets of the "Collection date" category.  I get the following error:
>
> Invalid Date String:'Fri Aug 01 00:00:00 UTC 2014'
>
> It seems that when creating the buckets of a date field, it does some
> conversion of the way the date is stored and causes the error to appear.
> Does anyone have an idea as to why this error is happening?  I would really
> appreciate any help.  Hopefully I was able to explain my issue well.
>
> Thanks,
> Antelmo
>


Issue Using JSON Facet API Buckets in Solr 6.6

2018-02-12 Thread Antelmo Aguilar
Hi,

I was using the following part of a query to get facet buckets so that I
can use the information in the buckets for some post-processing:

"json":
"{\"filter\":[\"bundle:pop_sample\",\"has_abundance_data_b:true\",\"has_geodata:true\",\"${project}\"],\"facet\":{\"term\":{\"type\":\"terms\",\"limit\":-1,\"field\":\"${term:species_category}\",\"facet\":{\"collection_dates\":{\"type\":\"terms\",\"limit\":-1,\"field\":\"collection_date\",\"facet\":{\"collection\":
{\"type\":\"terms\",\"field\":\"collection_assay_id_s\",\"facet\":{\"abnd\":\"sum(div(sample_size_i,
collection_duration_days_i))\""
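For readability, here is the same facet request with the string escaping
removed and whitespace added; the trailing closing braces appear cut off in
the mail, so they are restored here:

```json
{
  "filter": [
    "bundle:pop_sample",
    "has_abundance_data_b:true",
    "has_geodata:true",
    "${project}"
  ],
  "facet": {
    "term": {
      "type": "terms",
      "limit": -1,
      "field": "${term:species_category}",
      "facet": {
        "collection_dates": {
          "type": "terms",
          "limit": -1,
          "field": "collection_date",
          "facet": {
            "collection": {
              "type": "terms",
              "field": "collection_assay_id_s",
              "facet": {
                "abnd": "sum(div(sample_size_i, collection_duration_days_i))"
              }
            }
          }
        }
      }
    }
  }
}
```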

Sorry if it is hard to read.  Basically, what it was doing was getting the
following buckets:

The first bucket will be categorized by "Species category" by default,
unless we pass the "term" parameter in the request, in which case we
categorize the first bucket by whatever "term" is set to.  Then inside this
first bucket, we create more buckets by the "Collection date" category.
Then inside the "Collection date" category buckets, we use some functions
to do some calculations and return those calculations inside the
"Collection date" category buckets.

This query works fine in Solr 6.2, but I upgraded our instance from Solr
6.2 to the latest 6.6 version, and it seems that upgrading to Solr 6.6
broke the above query.  Now it complains when trying to create the buckets
of the "Collection date" category.  I get the following error:

Invalid Date String:'Fri Aug 01 00:00:00 UTC 2014'

It seems that when creating the buckets of a date field, it does some
conversion of the way the date is stored and causes the error to appear.
Does anyone have an idea as to why this error is happening?  I would really
appreciate any help.  Hopefully I was able to explain my issue well.

Thanks,
Antelmo


Re: Broken Feature in Solr 6.6

2018-01-30 Thread Antelmo Aguilar
Hi Joel,

Thank you!  Changing the class from SearchHandler to ExportHandler worked.
I appreciate you looking into it.

-Antelmo
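Concretely, the change amounted to something like this in the handler
definition (the handler name and the fq invariant are hypothetical
stand-ins for our setup; the class line is the actual fix, and the other
entries mirror the implicit /export configuration Joel quoted):

```json
"requestHandler": {
  "name": "/export-samples",
  "class": "solr.ExportHandler",
  "components": ["query"],
  "defaults": {
    "wt": "json"
  },
  "invariants": {
    "rq": "{!xport}",
    "distrib": false,
    "fq": "bundle:pop_sample"
  }
}
```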

On Tue, Jan 30, 2018 at 10:43 AM, Joel Bernstein  wrote:

> I think the best approach is to use the /export handler. The wt=xsort I
> believe has been removed from the system. The configuration for the /export
> handler uses wt=json now.
>
> The configurations in the implicitPlugins.js look like this:
>
> "/export": {
>   "class": "solr.ExportHandler",
>   "useParams": "_EXPORT",
>   "components": [
>     "query"
>   ],
>   "defaults": {
>     "wt": "json"
>   },
>   "invariants": {
>     "rq": "{!xport}",
>     "distrib": false
>   }
> }
>
>
>
>
>
>
>
> Joel Bernstein
> http://joelsolr.blogspot.com/
>
> On Tue, Jan 30, 2018 at 8:23 AM, Antelmo Aguilar  wrote:
>
> > Hi Joel,
> >
> > I apologize, I should have been more specific.  We do not use the export
> > handler that is defined by Solr.  We use a couple of export handlers that we
> > defined using the convention explained in the ticket that implemented the
> > feature.
> >
> > We did this because we have "categories" of things we export so there are
> > additional invariants for each category so we do not have to worry about
> > them when constructing the query.
> >
> > It seems that with version 6.6, these custom export handlers do not work
> > anymore.
> >
> > Best,
> > Antelmo
> >
> >
> > On Jan 29, 2018 7:37 PM, "Joel Bernstein"  wrote:
> >
> > There was a change in the configs between 6.1 and 6.6. If you upgraded
> you
> > system and kept the old configs then the /export handler won't work
> > properly. Check solrconfig.xml and remove any reference to the /export
> > handler. You also don't need to specify the rq or wt when you access the
> > /export handler anymore. This should work fine:
> >
> > http://host:port/solr/collection/export?q=*:*&fl=exp_id_s&sort=exp_id_s+asc
> >
> > Joel Bernstein
> > http://joelsolr.blogspot.com/
> >
> > On Mon, Jan 29, 2018 at 4:59 PM, Antelmo Aguilar 
> wrote:
> >
> > > Hi All,
> > >
> > > I was using this feature in Solr 6.1:
> > > https://issues.apache.org/jira/browse/SOLR-5244
> > >
> > > It seems that this feature is broken in Solr 6.6.  If I do this query
> in
> > > Solr 6.1, it works as expected.
> > >
> > > q=*:*&fl=exp_id_s&rq={!xport}&wt=xsort&sort=exp_id_s+asc
> > >
> > > However, doing the same query in Solr 6.6 does not return all the
> > results.
> > > It just returns 10 results.
> > >
> > > Also, it seems that the wt=xsort parameter does not do anything since
> it
> > > returns the results in xml format.  In 6.1 it returned the results in
> > > JSON.  I asked the same question in the IRC channel and they told me that
> it
> > is
> > > supposed to still work the same way.  Had to leave so hopefully someone
> > can
> > > help me out through e-mail.  I would really appreciate it.
> > >
> > > Thank you,
> > > Antelmo
> > >
> >
>


Re: Broken Feature in Solr 6.6

2018-01-30 Thread Antelmo Aguilar
Hi Joel,

I apologize, I should have been more specific.  We do not use the export
handler that is defined by Solr.  We use a couple of export handlers that
we defined using the convention explained in the ticket that implemented
the feature.

We did this because we have "categories" of things we export so there are
additional invariants for each category so we do not have to worry about
them when constructing the query.

It seems that with version 6.6, these custom export handlers do not work
anymore.

Best,
Antelmo


On Jan 29, 2018 7:37 PM, "Joel Bernstein"  wrote:

There was a change in the configs between 6.1 and 6.6. If you upgraded you
system and kept the old configs then the /export handler won't work
properly. Check solrconfig.xml and remove any reference to the /export
handler. You also don't need to specify the rq or wt when you access the
/export handler anymore. This should work fine:

http://host:port/solr/collection/export?q=*:*&fl=exp_id_s&sort=exp_id_s+asc

Joel Bernstein
http://joelsolr.blogspot.com/

On Mon, Jan 29, 2018 at 4:59 PM, Antelmo Aguilar  wrote:

> Hi All,
>
> I was using this feature in Solr 6.1:
> https://issues.apache.org/jira/browse/SOLR-5244
>
> It seems that this feature is broken in Solr 6.6.  If I do this query in
> Solr 6.1, it works as expected.
>
> q=*:*&fl=exp_id_s&rq={!xport}&wt=xsort&sort=exp_id_s+asc
>
> However, doing the same query in Solr 6.6 does not return all the results.
> It just returns 10 results.
>
> Also, it seems that the wt=xsort parameter does not do anything, since it
> returns the results in XML format.  In 6.1 it returned the results in
> JSON.  I asked the same question in the IRC channel and they told me that
> it is supposed to still work the same way.  I had to leave, so hopefully
> someone can help me out through e-mail.  I would really appreciate it.
>
> Thank you,
> Antelmo
>


Broken Feature in Solr 6.6

2018-01-29 Thread Antelmo Aguilar
Hi All,

I was using this feature in Solr 6.1:
https://issues.apache.org/jira/browse/SOLR-5244

It seems that this feature is broken in Solr 6.6.  If I do this query in
Solr 6.1, it works as expected.

q=*:*&fl=exp_id_s&rq={!xport}&wt=xsort&sort=exp_id_s+asc

However, doing the same query in Solr 6.6 does not return all the results.
It just returns 10 results.

Also, it seems that the wt=xsort parameter does not do anything, since it
returns the results in XML format.  In 6.1 it returned the results in
JSON.  I asked the same question in the IRC channel and they told me that
it is supposed to still work the same way.  I had to leave, so hopefully
someone can help me out through e-mail.  I would really appreciate it.

Thank you,
Antelmo


Status on updating log4j to log4j2

2017-05-09 Thread Antelmo Aguilar
Hi,

I noticed that you guys are working on upgrading log4j to log4j2:
https://issues.apache.org/jira/browse/SOLR-7887

I was wondering if there is any priority on doing this, since it has been
several months since the last comment.  It would be nice, as log4j2 seems
to make it easier to make the logs GELF compliant.

Thank you!


Help Indexing Large File

2015-12-14 Thread Antelmo Aguilar
Hello,

I am trying to index a very large file in Solr (around 5 GB).  However, I
get out-of-memory errors using curl.  I tried using the post script and had
some success with it.  After indexing several hundred thousand records,
though, I got the following error message:

*SimplePostTool: FATAL: IOException while posting data:
java.io.IOException: too many bytes written*

Would it be possible to get some help on where I can start looking to solve
this issue?  I tried finding some type of log that would give me more
information, but have not had any luck.  The only logs I was able to find
related to this error were the logs from Solr, and I assume those give the
"server" perspective rather than the "client's" perspective of the error.
I would really appreciate the help.
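One idea I am considering in the meantime is splitting the input into
smaller batches before posting, so no single request approaches the size
that triggers the error; a rough Python sketch (the batch size and the use
of integers as stand-in records are made up):

```python
def batch_records(records, batch_size):
    """Group an iterable of records into lists of at most batch_size items."""
    batch = []
    for record in records:
        batch.append(record)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # emit the final, possibly short, batch
        yield batch

# Each batch can then be sent as its own POST (e.g. with bin/post or curl)
# instead of one multi-gigabyte request.
for i, batch in enumerate(batch_records(range(10), 4)):
    print(i, len(batch))
# prints: 0 4 / 1 4 / 2 2
```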

Thanks,
Antelmo