Re: Modify partial configsets using API

2019-05-08 Thread Shawn Heisey
On 5/8/2019 10:50 AM, Mike Drob wrote: Solr Experts, Is there an existing API to modify just part of my configset, for example synonyms or stopwords? I see that there is the schema API, but that is pretty specific in scope. Not sure if I should be looking at configset API to upload a zip with a

Modify partial configsets using API

2019-05-08 Thread Mike Drob
Solr Experts, Is there an existing API to modify just part of my configset, for example synonyms or stopwords? I see that there is the schema API, but that is pretty specific in scope. Not sure if I should be looking at configset API to upload a zip with a single file, or if there are more granul

Re: Search using filter query on multivalued fields

2019-05-03 Thread David Hastings
ent:[SALT_20 TO *]. That’s not very flexible and you have to > normalize (i.e. 1% couldn’t be SALT_1), so “it depends”. > > The point is that you have to index cleverly to do what you want. > > Best, > Erick > > > On May 3, 2019, at 6:26 AM, Srinivas Kashyap > wrote:

Re: Search using filter query on multivalued fields

2019-05-03 Thread Erick Erickson
(i.e. 1% couldn’t be SALT_1), so “it depends”. The point is that you have to index cleverly to do what you want. Best, Erick > On May 3, 2019, at 6:26 AM, Srinivas Kashyap wrote: > > Hi, > > I have indexed data as shown below using DIH: > > "INGREDIEN

Search using filter query on multivalued fields

2019-05-03 Thread Srinivas Kashyap
Hi, I have indexed data as shown below using DIH: "INGREDIENT_NAME": [ "EGG", "CANOLA OIL", "SALT" ], "INGREDIENT_NO": [ "550", "297", "314"

Re: Spatial Search using two separate fields for lat and long

2019-04-13 Thread Alexandre Rafalovitch
latitude and > > longitude fields. I want to use those two separate fields for searching > > with a bounding box. Is this possible (not using deprecated LatLonType) or > > do I need to combine them into one single field when indexing? The reason I > > want to keep the fiel

Re: Spatial Search using two separate fields for lat and long

2019-04-13 Thread David Smiley
oper http://www.linkedin.com/in/davidwsmiley On Wed, Apr 3, 2019 at 1:54 AM Tim Hedlund wrote: > Hi all, > > I'm importing documents (rows in excel file) that includes latitude and > longitude fields. I want to use those two separate fields for searching > with a bounding box. Is this p

Using solrconfig for json facet sorting

2019-04-11 Thread sagandhi
Hi, Is it possible to configure sorting on json.facet in solrconfig.xml just like for traditional facets? Thanks, Soham -- Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html

Spatial Search using two separate fields for lat and long

2019-04-03 Thread Tim Hedlund
Hi all, I'm importing documents (rows in excel file) that includes latitude and longitude fields. I want to use those two separate fields for searching with a bounding box. Is this possible (not using deprecated LatLonType) or do I need to combine them into one single field when indexing

Re: Using copyFields

2019-03-28 Thread Sharmadha
Thanks. Adding the default field in solrconfig.xml worked. -- Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html
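
The fix being described amounts to adding a df (default field) entry to the handler defaults in solrconfig.xml. A minimal sketch, assuming the catch-all field from this thread is named cfield1 and the stock /select handler is used:

    <requestHandler name="/select" class="solr.SearchHandler">
      <lst name="defaults">
        <str name="rows">10</str>
        <str name="df">cfield1</str>
      </lst>
    </requestHandler>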

Re: Using copyFields

2019-03-28 Thread Toke Eskildsen
On Thu, 2019-03-28 at 05:03 -0700, Sharmadha wrote: > I created my own field named "cfield1" of type text_general and added > a copyField with src = "*" and dest = "cfield1". Posted the films > data. After this , on typing "comedy" in query field , the query > doesn't fetch results. You need to te

Re: Using copyFields

2019-03-28 Thread Sharmadha
As mentioned in https://lucene.apache.org/solr/guide/7_7/solr-tutorial.html#create-a-catchall-copy-field , instead of having a copyField on src="*" ,dest ="_text_" , I added a copyField with src="*" ,dest ="cfield". when I did copyField on src="*" ,dest ="_text_" , on firing query=comedy , it list

Re: Using copyFields

2019-03-28 Thread Jörn Franke
What do you mean does not fetch results? It returns the found documents, but not the text content? In this case you need to store the field. Is comedy a stop word defined by you? > Am 28.03.2019 um 13:03 schrieb Sharmadha : > > Following solr tutorial , > https://lucene.apache.org/solr/guide/7_

Using copyFields

2019-03-28 Thread Sharmadha
Following solr tutorial , https://lucene.apache.org/solr/guide/7_7/solr-tutorial.html#create-a-catchall-copy-field ,without adding copyField with src = "*" and dest = "_text_" , I created my own field named "cfield1" of type text_general and added a copyField with src = "*" and dest = "cfield1". Po
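
The setup described above corresponds to a schema along these lines (a sketch; everything beyond the names given in the thread is an assumption — in particular the destination field must be multiValued to accept copies from every source field):

    <field name="cfield1" type="text_general" indexed="true" stored="false" multiValued="true"/>
    <copyField source="*" dest="cfield1"/>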

Questions on nested child document split using the "split" parameter

2019-03-27 Thread Zheng Lin Edwin Yeo
wse/SOLR-12633>: When JSON data is sent to Solr with nested child documents split using the "split" parameter, the child docs will now be associated to their parents by the field/label string used in the JSON instead of anonymously. Most users probably won't notice the distinction s

Re: Different behavior when using function queries

2019-03-18 Thread Erik Hatcher
If you have no documents in the results, there’s nothing to attach the function result to. `fl` is the list of fields to show in matched documents. You have no matched documents. Erik > On Mar 18, 2019, at 07:55, Ashish Bisht wrote: > > Can someone please explain the below behavio

Different behavior when using function queries

2019-03-18 Thread Ashish Bisht
Can someone please explain the below behavior. For different q parameters the function query response differs, although the function queries are the same: http://:8983/solr/SCSpell/select?q="*market place*"&defType=edismax&qf=spellcontent&wt=json&rows=1&fl=internet_of_things:if(exists(query({!edismax v='"internet

Re: Using solr graph to traverse N relationships

2019-03-13 Thread Pratik Patel
Problem #1 can probably be solved by using "fetch" function. ( https://lucene.apache.org/solr/guide/6_6/stream-decorators.html#fetch) Problem #2 and #3 can be solved by normalizing the graph connections and by applying cartesianProduct on multi valued field, as described here. htt
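
A rough sketch of how those two decorators combine (collection and field names are placeholders, not from the thread): fetch() pulls stored fields that /export cannot return, and cartesianProduct() flattens a multi-valued field into one tuple per value:

    cartesianProduct(
      fetch(people,
            search(people, q="*:*", qt="/export", fl="id,friendIds", sort="id asc"),
            fl="name",
            on="id=id"),
      friendIds,
      productSort="friendIds asc")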

Using solr graph to traverse N relationships

2019-03-13 Thread Nightingale, Jonathan A (US)
erflow.com/questions/55130208/using-solr-graph-to-traverse-n-relationships I'm investigating if I can use an existing solr store to do graph traversal. It would be ideal to not have to duplicate the data in a graph store. I was playing with the solr streaming capabilities and the nodes (gatherNo

RE: Index database with SolrJ using xml file directly throws an error

2019-03-04 Thread sami
Thanks James, it works! -- Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html

RE: Index database with SolrJ using xml file directly throws an error

2019-03-01 Thread Dyer, James
Instead of dataConfig=data-config.xml, use config=data-config.xml. From: sami Sent: Friday, March 1, 2019 3:05 AM To: solr-user@lucene.apache.org Subject: RE: Index database with SolrJ using xml file directly throws an error Hi James, Thanks for your reply. I am not absolutely sure I
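
In other words, the DIH configuration file name is passed through the config request parameter rather than dataConfig. A hedged example of such a request (host, core name and handler path are placeholders):

    curl "http://localhost:8983/solr/mycore/dataimport?command=full-import&config=data-config.xml"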

RE: Index database with SolrJ using xml file directly throws an error

2019-03-01 Thread sami
lutely fine. But, I want to use SolrJ API instead of using the inbuilt execute function. The data-config.xml and solrconfig.xml works fine with my database. I am using the same data-config.xml file and solrconfig.xml file to do the indexing with program mentioned in my query. String url =

RE: Index database with SolrJ using xml file directly throws an error

2019-02-28 Thread Dyer, James
e filename (eg. data-config.xml). This file is the DIH configuration, not solrconfig.xml as you are using. It is just the filename, or path starting at the base configuration directory, not a full path as you are using. Unless you want users to override the DIH configuration at request time, it is

Re: Index database with SolrJ using xml file directly throws an error

2019-02-28 Thread Erick Erickson
dex my database using SolrJ Java API. I have already tried > to use DIH directly from the Solr server. It works and indexes well. But > when I would like to use the same XML config file with SolrJ it throws an > error. > > **Solr version 7.6.0 SolrJ 7.6.0** > > Here

Index database with SolrJ using xml file directly throws an error

2019-02-28 Thread sami
I would like to index my database using SolrJ Java API. I have already tried to use DIH directly from the Solr server. It works and indexes well. But when I would like to use the same XML config file with SolrJ it throws an error. **Solr version 7.6.0 SolrJ 7.6.0** Here is the full code I am

Re: Is anyone using proxy caching in front of solr?

2019-02-25 Thread Walter Underwood
wood >> wun...@wunderwood.org >> http://observer.wunderwood.org/ (my blog) >> >>> On Feb 25, 2019, at 7:57 AM, Edward Ribeiro >> wrote: >>> >>> Maybe you could add a length filter factory to filter out queries with 2 >> or >>>

Re: Is anyone using proxy caching in front of solr?

2019-02-25 Thread Michael Gibney
19, at 7:57 AM, Edward Ribeiro > wrote: > > > > Maybe you could add a length filter factory to filter out queries with 2 > or > > 3 characters using > > > https://lucene.apache.org/solr/guide/7_4/filter-descriptions.html#FilterDescriptions-LengthFilter > > ? &

Re: Is anyone using proxy caching in front of solr?

2019-02-25 Thread Walter Underwood
019, at 7:57 AM, Edward Ribeiro wrote: > > Maybe you could add a length filter factory to filter out queries with 2 or > 3 characters using > https://lucene.apache.org/solr/guide/7_4/filter-descriptions.html#FilterDescriptions-LengthFilter > ? > > PS: this filter requires a max

Re: Is anyone using proxy caching in front of solr?

2019-02-25 Thread Edward Ribeiro
Maybe you could add a length filter factory to filter out queries with 2 or 3 characters using https://lucene.apache.org/solr/guide/7_4/filter-descriptions.html#FilterDescriptions-LengthFilter ? PS: this filter requires a max length too. Edward Em qui, 21 de fev de 2019 04:52, Furkan KAMACI
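
For reference, the filter linked above is declared in a field type's analyzer chain; a minimal sketch (the field type name and the min/max bounds are illustrative — the factory requires both attributes, as noted in the message):

    <fieldType name="text_min_length" class="solr.TextField" positionIncrementGap="100">
      <analyzer>
        <tokenizer class="solr.StandardTokenizerFactory"/>
        <filter class="solr.LowerCaseFilterFactory"/>
        <filter class="solr.LengthFilterFactory" min="4" max="255"/>
      </analyzer>
    </fieldType>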

Re: Is anyone using proxy caching in front of solr?

2019-02-20 Thread Furkan KAMACI
cters in the search field solr starts > serving hits. > Of course this generates a lot of "unnecessary" queries (in the sense that > they are never shown to the user) which is why I started thinking about > using something like squid or varnish to cache a bunch of these 2-4 >

Is anyone using proxy caching in front of solr?

2019-02-20 Thread Joakim Hansson
own to the user) which is why I started thinking about using something like squid or varnish to cache a bunch of these 2-4 character queries. It seems most stuff I find about it is from pretty old sources, but as far as I know solrcloud doesn't have distributed cache support. Our indexes aren

Re: Using the terms component in Solr Cloud gives random result

2019-02-01 Thread Markus Kalkbrenner
I’ll answer my own question: setting distrib=true solved the issue … mostly. Our client requests wt=json&json.nl=flat, but the result isn’t flat! As soon as I set distrib=true the json response is formatted as a map instead of being flat. Did I find a bug or is this a known limitati
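
Combining the request from the original post further down with the workaround described here, the full request would look roughly like this (host and port as in the techproducts example):

    http://localhost:8983/solr/techproducts/terms?terms=true&terms.fl=name&distrib=true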

Using the terms component in Solr Cloud gives random result

2019-01-31 Thread Markus Kalkbrenner
Hi, I tried to use the terms component with the techproducts example in cloud mode and was surprised that the results toggle for each request. The response to http://localhost:8983/solr/techproducts/terms?terms=true&terms.fl=name permanently toggles between these two results: { "responseHead

Re: Error using collapse parser with /export

2019-01-29 Thread Rahul Goswami
- The > collapsing itself is causing the failure (or did I not understand your > question right?) > 2) After exporting is it possible to unique the records using the > unique Streaming Expression? (This can't be done since we require the > unique document in a group subject to a

Limit facet terms based on a substring using the JSON facet API

2019-01-29 Thread Tom Van Cuyck
Hi In the old Solr facet API there are the facet.contains and facet.conains.ignoreCase parameters to limit the facet values to those terms containing the specified substring. Is there an equivalent option in the JSON facet API? Or is there a way to obtain the same behavior with the JSON API? I can

Re: Error using collapse parser with /export

2019-01-27 Thread Rahul Goswami
Hi Joel, Thanks for responding to the query. Answers to your questions: 1) After collapsing is it not possible to use the /select handler? - The collapsing itself is causing the failure (or did I not understand your question right?) 2) After exporting is it possible to unique the records using

Re: Error using collapse parser with /export

2019-01-21 Thread Joel Bernstein
handler? 2) After exporting is it possible to unique the records using the unique Streaming Expression? Either of those cases would be the typical uses of these features. Joel Bernstein http://joelsolr.blogspot.com/ On Sun, Jan 20, 2019 at 10:13 PM Rahul Goswami wrote: > Hello, > > Follo
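
A sketch of the second option, reusing the field names from the original post further down (the collection name is a placeholder): sort the exported stream by the collapse field with time as a tie-breaker, then keep the first tuple per group with unique:

    unique(
      search(myCollection,
             q="*:*",
             qt="/export",
             fl="id_field,time",
             sort="id_field asc, time desc"),
      over="id_field")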

Re: Error using collapse parser with /export

2019-01-20 Thread Rahul Goswami
an 17, 2019 at 4:58 PM Rahul Goswami wrote: > Hello, > > I am using SolrCloud on Solr 7.2.1. > I get the NullPointerException in the Solr logs (in ExportWriter.java) > when the /stream handler is invoked with a search() streaming expression > with qt="/export" contain

Error using collapse parser with /export

2019-01-17 Thread Rahul Goswami
Hello, I am using SolrCloud on Solr 7.2.1. I get the NullPointerException in the Solr logs (in ExportWriter.java) when the /stream handler is invoked with a search() streaming expression with qt="/export" containing fq="{!collapse field=id_field sort="time desc"} (

stats.field using Config API

2019-01-17 Thread Antelmo Aguilar
lay.json file, but I can't get it working. I am only able to set one of the stats field by using something like this: "requesthanlder": { "name": "/query", "class": "solr.SearchHandler", "defaults": {

Re: Logging fails when starting Solr in Windows using solr.cmd

2019-01-15 Thread Oskar
I faced the same issue as jakob with solr-7.6.0, eclipse-2018-12 (4.10.0), Java 1.8.0_191: *Solution:* In eclipse Run Configuration run-solr remove "file:" from Argument -Dlog4j.configurationFile="file:${workspace_loc:solr-7.6.0}/solr/server/resources/log4j2.xml" -- Sent from: http://lucene.47

Re: Federated / Distributed search using Solr

2018-12-29 Thread Erick Erickson
bq. I have 3 indexes and all three use different schema. I assume that the user can't indicate a field to search against or this is a total non-starter. Even if you are just sending bare search terms to the different collections, how are you going to compare returns from them? Scores aren't compa

Re: Federated / Distributed search using Solr

2018-12-29 Thread Gus Heck
If the indexes are on the same cluster, you can use an alias for querying. Updating via aliases doesn't work well (updates go to the first collection listed only) unless it's a time routed alias. http://lucene.apache.org/solr/guide/7_6/collections-api.html#createalias On Sat, Dec 29, 2018 at 4:
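
A hedged example of the CREATEALIAS call referenced above (alias and collection names are placeholders):

    curl "http://localhost:8983/solr/admin/collections?action=CREATEALIAS&name=all-content&collections=index1,index2,index3"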

Federated / Distributed search using Solr

2018-12-29 Thread Steven White
Hi everyone, Does Solr support federated / distributed search when the schema of the 2 or more indexes are different? I have 3 indexes and all three use different schema. However, the unique key field name is the same across all 3 indexes. I need to search on all 3 indexes and return results as

Re: Geofilt and distance measurement problems using SpatialRecursivePrefixTreeFieldType field type

2018-12-23 Thread David Smiley
our data. In my defence that is > far from obvious in the documentation. > > Thanks again for your help. > > Cheers, > Peter. > > -Original Message- > From: David Smiley [mailto:david.w.smi...@gmail.com] > Sent: 21 December 2018 04:44 > To:

RE: Geofilt and distance measurement problems using SpatialRecursivePrefixTreeFieldType field type

2018-12-21 Thread Peter Lancaster
ation. Thanks again for your help. Cheers, Peter. -Original Message- From: David Smiley [mailto:david.w.smi...@gmail.com] Sent: 21 December 2018 04:44 To: solr-user@lucene.apache.org Subject: Re: Geofilt and distance measurement problems using SpatialRecursivePrefixTreeFieldType field

RE: Geofilt and distance measurement problems using SpatialRecursivePrefixTreeFieldType field type

2018-12-21 Thread Peter Lancaster
Hi David, Thanks for coming back to me. When using rpt fields I believe you do need to use a space between Lat and Lon to indicate a point; for rpt fields commas are used to separate points in a polygon. See https://archive.apache.org/dist/lucene/solr/ref-guide/apache-solr-ref-guide-5.5.pdf

Re: Geofilt and distance measurement problems using SpatialRecursivePrefixTreeFieldType field type

2018-12-20 Thread David Smiley
Lancaster < peter.lancas...@findmypast.com> wrote: > I am currently using Solr 5.5.2 and implementing a GeoSpatial search that > returns results within a radius in Km of a specified LatLon. Using a field > of type solr.LatLonType and a geofilt query this gives good results but is > much

Geofilt and distance measurement problems using SpatialRecursivePrefixTreeFieldType field type

2018-12-13 Thread Peter Lancaster
I am currently using Solr 5.5.2 and implementing a GeoSpatial search that returns results within a radius in Km of a specified LatLon. Using a field of type solr.LatLonType and a geofilt query this gives good results but is much slower than our regular queries. Using a bbox query is faster but

Using streaming expressions with a pre-existing schema (aka migrating a standalone instance to SolrCloud)

2018-12-10 Thread Guillaume Rossolini
Hi there, This is about undocumented restrictions about using streaming expressions (in the sense that I haven't found the right documentation). ** Setup I just followed the documentation to start SolrCloud on my local machine, and I made it so it would replace the previous standalone ser

RE: Nested Documents without using "type" field ? Possible or Not ?

2018-12-06 Thread Bruno Mannina
-user@lucene.apache.org Objet : Nested Documents without using "type" field ? Possible or Not ? Hello, I would like to use SOLR to index the Cooperative Patent Classification, The CPC has a hierarchical structure and it can have more than 20 level. It's a basic structure without Type of n

Nested Documents without using "type" field ? Possible or Not ?

2018-12-05 Thread Bruno Mannina
Hello, I would like to use SOLR to index the Cooperative Patent Classification, The CPC has a hierarchical structure and it can have more than 20 level. It's a basic structure without Type of nested doc. i.e: A -> A01 -> A01B -> A01B3/00 -> A01B3/40 -> A01B3/4025 . A -> A01 -> A01L -> A01L1

Re: Solr Setup using NRT and PULL replicas

2018-12-02 Thread Daniel Carrasco
about new PULL nodes. > > > > We've for now Solr 7.2.1, but we're planning to migrate to Solr 7.5, and > > I've read on Solr guide that recommended setups are: > > > >- All NRT > >- All TLOG > >- Some TLOG with PULL replicas > >

Re: Solr Setup using NRT and PULL replicas

2018-12-02 Thread Edward Ribeiro
and > I've read on Solr guide that recommended setups are: > >- All NRT >- All TLOG >- Some TLOG with PULL replicas > > We're not fully convinced about TLOG replicas because we've read something > about index problems if a node goes down suddenly,

Solr Setup using NRT and PULL replicas

2018-11-30 Thread Daniel Carrasco
G - Some TLOG with PULL replicas We're not fully convinced about TLOG replicas because we've read something about index problems if a node goes down suddenly, or using kill -9 (just what the init.d script does if takes long to stop/restart), and is the leader, or about the increase in

Re: solr is using TLS1.0

2018-11-30 Thread Jan Høydahl
ucene.apache.org > Date: 22-11-2018 12:53 > Subject: Re: solr is using TLS1.0 > > > > > Hi Anchal, > > the IBM JVM behaves differently in the TLS setup then the Oracle JVM. If > you search for IBM Java TLS 1.2 you find tons of reports of problems >

Re: solr is using TLS1.0

2018-11-29 Thread Anchal Sharma2
user@lucene.apache.org Date: 22-11-2018 12:53 Subject:Re: solr is using TLS1.0 Hi Anchal, the IBM JVM behaves differently in the TLS setup then the Oracle JVM. If you search for IBM Java TLS 1.2 you find tons of reports of problems with that. In most cases you can get around that using

Re: Autoscaling using triggers to create new replicas

2018-11-27 Thread Daniel Carrasco
Hello, Finally I've found the way to do it. Limiting the number of replicas per node using policies is the trick: curl -X POST -H 'Content-Type: application/json' 'http://localhost:8983/api/cluster/autoscaling' --data-binary '{ "set-cluster-policy": [{
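
The command above is cut off by the archive; a cluster policy of that shape might look like the following (the rule itself is an assumption based on the description of limiting replicas per node — here at most one replica of any shard per node):

    curl -X POST -H 'Content-Type: application/json' \
         'http://localhost:8983/api/cluster/autoscaling' --data-binary '{
      "set-cluster-policy": [
        {"replica": "<2", "shard": "#EACH", "node": "#ANY"}
      ]
    }'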

Autoscaling using triggers to create new replicas

2018-11-26 Thread Daniel Carrasco
dmin/collections?action=CREATE&name=test&numShards=1&replicationFactor=1&maxShardsPerNode=1 " curl " http://127.0.0.1:8983/solr/admin/collections?action=CREATE&name=test2&numShards=1&replicationFactor=1&maxShardsPerNode=1 " I've added the two trigger

Re: Solr Cloud - Store Data using multiple drives -2

2018-11-25 Thread Shawn Heisey
, the precise way in which you want to use multiple drives is not possible.  You cannot split a single core between two storage locations. Another detail:  If your disk is actually full, then you are using Solr improperly. In order to ensure everything works properly, you should have enough

Re: Solr Cloud - Store Data using multiple drives -2

2018-11-22 Thread Charlie Hull
On 22/11/2018 11:50, Tech Support wrote: Dear Solr Team, I am using SOLR 7.5.0 on Windows (SOLR Cloud). My primary need is: if the current data storage drive is full, I need to use another drive without moving the existing data into the new location. If I add the new dataDir

Solr Cloud - Store Data using multiple drives -2

2018-11-22 Thread Tech Support
Dear Solr Team, I am using SOLR 7.5.0 on Windows (SOLR Cloud). My primary need is: if the current data storage drive is full, I need to use another drive without moving the existing data into the new location. If I add the new

Re: solr is using TLS1.0

2018-11-21 Thread Hendrik Haddorp
Hi Anchal, the IBM JVM behaves differently in the TLS setup then the Oracle JVM. If you search for IBM Java TLS 1.2 you find tons of reports of problems with that. In most cases you can get around that using the system property "com.ibm.jsse2.overrideDefaultTLS" as documented he
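
On the Solr side such a system property would normally be passed through the JVM options, e.g. in solr.in.sh (or solr.in.cmd on Windows). A sketch — the property name comes from the message above, its placement and value are assumptions:

    SOLR_OPTS="$SOLR_OPTS -Dcom.ibm.jsse2.overrideDefaultTLS=true"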

Re: solr is using TLS1.0

2018-11-21 Thread Anchal Sharma2
Hi Shawn, Thanks for your reply. Here are the details about the Java we are using: java version "1.8.0_151" IBM J9 VM (build 2.9, JRE 1.8.0 AIX ppc64-64 Compressed References 20171102_369060 (JIT enabled, AOT enabled) I have already patched the policy jars. And I tried to comme

Re: Solr Cloud - Store Data using multiple drives

2018-11-21 Thread Shawn Heisey
On 11/21/2018 7:32 AM, Tech Support wrote: As per your suggestion, I added the dataDir in the core.properties file. It creates the data directory in the new location. But only the newly added data is accessible. If I move the existing index files into the new location, only then am I able to
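
For context, the dataDir override mentioned here lives in the core's core.properties file; a sketch with placeholder values:

    # core.properties (values are placeholders)
    name=mycore
    dataDir=D:/solr-data/mycore/data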

Re: Solr Cloud - Store Data using multiple drives

2018-11-21 Thread Alexandre Rafalovitch
gt; > > Thanks , > > Karthick Ramu > > > > From: Tech Support [mailto:techsupp...@sardonyx.in] > Sent: Monday, November 19, 2018 7:15 PM > To: 'solr-user@lucene.apache.org' > Subject: Solr Cloud - Store Data using multiple drives > > > > Hel

Re: Solr Cloud - Store Data using multiple drives

2018-11-21 Thread Erick Erickson
le to read the data. > > Is it possible to read both the existing and new data, without moving > existing data into new data location ? > > > Thanks , > > Karthick Ramu > > > > From: Tech Support [mailto:techsupp...@sardonyx.in] > Sent: Monday, November 19, 2018 7:15

RE: Solr Cloud - Store Data using multiple drives

2018-11-21 Thread Tech Support
Solr Cloud - Store Data using multiple drives Hello Solr Team, I am using Solr 7.5. Indexed data is stored in the Solr installation directory. I need the following features; is it possible to achieve the following scenarios in SOLR Cloud? 1. If the disk's free space is used up, is it p

Re: solr is using TLS1.0

2018-11-20 Thread Shawn Heisey
On 11/20/2018 3:02 AM, Anchal Sharma2 wrote: I have enabled SSL for solr using the steps mentioned on the Lucene website. And though the solr console URL is now secure (https), it is still using TLS v1.0. I have tried a few things to force SSL to use the TLS1.2 protocol, but they have not worked for me

solr is using TLS1.0

2018-11-20 Thread Anchal Sharma2
Hi All, I have enabled SSL for solr using the steps mentioned on the Lucene website. And though the solr console URL is now secure (https), it is still using TLS v1.0. I have tried a few things to force SSL to use the TLS1.2 protocol, but they have not worked for me. While trying to do the same, I have

Re: Solr Cloud - Store Data using multiple drives

2018-11-19 Thread Shawn Heisey
On 11/19/2018 6:44 AM, Tech Support wrote: 1. If the disk's free space is used up, is it possible to configure another drive? That is, if the C drive's free space is gone, I need to configure the D drive. I need to read the data from both the C and D drives. There is no automated way to do this.

Re: Solr Cloud - Store Data using multiple drives

2018-11-19 Thread Alexandre Rafalovitch
This seems very similar to: https://lists.apache.org/thread.html/48b6dcb20058de29936616633b88d21e1b6f6a32bc968d161eae4a21@%3Csolr-user.lucene.apache.org%3E Regards, Alex. On Mon, 19 Nov 2018 at 11:15, Tech Support wrote: > > Hello Solr Team, > > > > I am using Solr 7.5. , I

Solr Cloud - Store Data using multiple drives

2018-11-19 Thread Tech Support
Hello Solr Team, I am using Solr 7.5. Indexed data is stored in the Solr installation directory. I need the following features; is it possible to achieve the following scenarios in SOLR Cloud? 1. If the disk's free space is used up, is it possible to configure another drive? Which means

Re: Trouble using the MIGRATE command in the collections API on solr 7.3.1

2018-11-17 Thread yogesh.sh
I am facing exactly the same issue: everything looks good and the response is a success, but the documents are not migrated. Was this issue resolved? I am using SOLR version 5.5. Thanks, Yogesh -- Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html

Re: Trouble using the MIGRATE command in the collections API on solr 7.3.1

2018-11-17 Thread yogesh.sh
I am getting exactly the same issue; I am using SOLR version 5.5. Please suggest what the fix for this issue was. Documents are not getting moved to the target collection even after getting a success message. Thanks, Yogesh -- Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068

Re: How to log full URL using Jetty RequestLogHandler in Solr 7

2018-11-08 Thread dimaf
Yes, jetty.xml looks the same for Solr 6 and Solr 7, no difference for the log request section: diff ~dbl/solr/solr-6.6.0/server/etc/jetty.xml ~dbl/solr/solr-7.4.0/server/etc/jetty.xml 45c45 < --- > name="solr.jetty.threads.idle.timeout" default="12"/> 107a108,115 > >

Re: SolrCloud Using Solrconfig.xml Instead of Configoverlay.json for RequestHandler QF

2018-11-07 Thread Jan Høydahl
clean solr. -- Jan Høydahl, search solution architect Cominvent AS - www.cominvent.com > 5. nov. 2018 kl. 21:00 skrev Corey Ellsworth : > > Hello, > > I'm using Solr Cloud 6.6. I have a situation where I have a RequestHandler > configuration that exists in both the

Re: How to log full URL using Jetty RequestLogHandler in Solr 7

2018-11-07 Thread Shawn Heisey
staying in the channel to inform people of the change.  It was recommended to open an issue here: https://github.com/eclipse/jetty.project/issues/new I do not have the precise information about the Solr/Jetty versions you're running, which will be needed for an issue.  Are you usin

Re: How to log full URL using Jetty RequestLogHandler in Solr 7

2018-11-07 Thread dimaf
Shawn, thanks a lot. -- Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html

Re: How to log full URL using Jetty RequestLogHandler in Solr 7

2018-11-07 Thread Shawn Heisey
On 11/7/2018 8:01 AM, dimaf wrote: After migration from Solr 6 to Solr 7, Jetty RequestLogHandler logs only the path and parameters of URL instead of saving full URL as it does in Solr 6. So the question is how Jetty/Solr can be configured to log full URL? That is 100 percent Jetty config. 

How to log full URL using Jetty RequestLogHandler in Solr 7

2018-11-07 Thread dimaf
After migration from Solr 6 to Solr 7, Jetty RequestLogHandler logs only the path and parameters of URL instead of saving full URL as it does in Solr 6. Solr 6: 127.0.0.1 - - [07/Nov/2018:09:34:27 -0700] "GET //localhost:/solr/admin/collections?action=CLUSTERSTATUS&wt=json HTTP/1.1" 200 3088

SolrCloud Using Solrconfig.xml Instead of Configoverlay.json for RequestHandler QF

2018-11-05 Thread Corey Ellsworth
Hello, I'm using Solr Cloud 6.6. I have a situation where I have a RequestHandler configuration that exists in both the solrconfig.xml file and configoverlay.json file (We inherited this application and are not sure why it is set up like this). From reading the documentation, it seem

Re: Edismax query returning the same number of results using AND as it does with OR

2018-10-26 Thread Shawn Heisey
Followup: I had a theory that Nicky tested, and I think what was observed confirms the theory. TL;DR: In previous versions, I think there was a bug where the presence of boolean operators caused edismax to ignore the mm parameter, and only rely on the boolean operator(s). After that bug got
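
One practical consequence of that change is that mm is best set explicitly rather than implied by the operators in q. A hedged illustration of an edismax request that pins mm (handler, fields and values are placeholders):

    http://localhost:8983/solr/mycore/select?defType=edismax&qf=title%20body&mm=1&q=dog%20OR%20kiwi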

RE: Reading data using Tika to Solr

2018-10-26 Thread Martin Frank Hansen (MHQ)
Hi Tim, Thanks again, I will update Tika and try it again. -Original Message- From: Tim Allison Sent: 26. oktober 2018 12:53 To: solr-user@lucene.apache.org Subject: Re: Reading data using Tika to Solr Ha...emails passed in the ether. As you saw, we added the RecursiveParserWrapper a

Re: Reading data using Tika to Solr

2018-10-26 Thread Tim Allison
concatenates contents, throws out attachment metadata and silently swallows attachment exceptions. On Fri, Oct 26, 2018 at 6:25 AM Martin Frank Hansen (MHQ) wrote: > Hi again, > > Never mind, I got manage to get the content of the msg-files as well using > the following link as inspirat

Re: Reading data using Tika to Solr

2018-10-26 Thread Tim Allison
how do I get it to read the attachments as well? > > -Original Message- > From: Tim Allison > Sent: 25. oktober 2018 21:57 > To: solr-user@lucene.apache.org > Subject: Re: Reading data using Tika to Solr > > If you’re processing actual msg (not eml), you’ll also nee

RE: Reading data using Tika to Solr

2018-10-26 Thread Martin Frank Hansen (MHQ)
Hi again, Never mind, I got manage to get the content of the msg-files as well using the following link as inspiration: https://wiki.apache.org/tika/RecursiveMetadata But thanks again for all your help! -Original Message- From: Martin Frank Hansen (MHQ) Sent: 26. oktober 2018 10:14 To

RE: Reading data using Tika to Solr

2018-10-26 Thread Martin Frank Hansen (MHQ)
Hi Tim, It is msg files and I added tika-app-1.14.jar to the build path - and now it works 😊 But how do I get it to read the attachments as well? -Original Message- From: Tim Allison Sent: 25. oktober 2018 21:57 To: solr-user@lucene.apache.org Subject: Re: Reading data using Tika to

Re: Edismax query returning the same number of results using AND as it does with OR

2018-10-25 Thread Zheng Lin Edwin Yeo
Hi, What is your full query path or URL that you pass for the query? And how is your setting like for the edismax in your solrconfig.xml? Regards, Edwin On Fri, 26 Oct 2018 at 06:24, Nicky Mastin wrote: > > Oddity with edismax and queries involving boolean operators. Here's the > "parsedquery

Edismax query returning the same number of results using AND as it does with OR

2018-10-25 Thread Nicky Mastin
Oddity with edismax and queries involving boolean operators. Here's the "parsedquery_toString" from two different queries: input: "dog AND kiwi": https://apaste.info/gaQl input: "dog OR kiwi": https://apaste.info/sBwa Both queries return the same number of results (389). The query with OR was

Re: Reading data using Tika to Solr

2018-10-25 Thread Tim Allison
t; Thanks for your answers, I can see that my mail got messed up on the way > through the server. It looked much more readable at my end 😉 The > attachment simply included my build-path. > > @Erick I am compiling the program using Netbeans at the moment. > > I updated to tika-1.7

RE: Reading data using Tika to Solr

2018-10-25 Thread Martin Frank Hansen (MHQ)
Hi Erick and Tim, Thanks for your answers, I can see that my mail got messed up on the way through the server. It looked much more readable at my end 😉 The attachment simply included my build-path. @Erick I am compiling the program using Netbeans at the moment. I updated to tika-1.7 but that

Re: Reading data using Tika to Solr

2018-10-25 Thread Tim Allison
To follow up w Erick’s point, there are a bunch of transitive dependencies from tika-parsers. If you aren’t using maven or similar build system to grab the dependencies, it can be tricky to get it right. If you aren’t using maven, and you can afford the risks of jar hell, consider using tika-app

Re: Reading data using Tika to Solr

2018-10-25 Thread Erick Erickson
nsen (MHQ) wrote: > > Hi, > > > > I am trying to read content of msg-files using Tika and index these in Solr, > however I am having some problems with the OfficeParser(). I keep getting the > error java.lang.NoClassDefFoundError for the OfficeParser, even though both > tika-

Reading data using Tika to Solr

2018-10-25 Thread Martin Frank Hansen (MHQ)
Hi, I am trying to read the content of msg-files using Tika and index these in Solr; however, I am having some problems with the OfficeParser(). I keep getting the error java.lang.NoClassDefFoundError for the OfficeParser, even though both tika-core and tika-parsers are included in the build path

Re: Storing & using feature vectors

2018-10-22 Thread Ken Krugler
> I know a couple of people who have worked on solutions. And I've used a > couple of hacks: > > - You can hack together something that does cosine similarity using the > term frequency & query boosts DelimitedTermFreqFilterFactory. Basically the > term frequency becomes a

Re: Storing & using feature vectors

2018-10-19 Thread Doug Turnbull
This is a pretty big hole in Lucene-based search right now that many practitioners have struggled with I know a couple of people who have worked on solutions. And I've used a couple of hacks: - You can hack together something that does cosine similarity using the term frequency & que

Storing & using feature vectors

2018-10-19 Thread Ken Krugler
Hi all, [I posted on the Lucene list two days ago, but didn’t see any response - checking here for completeness] I’ve been looking at directly storing feature vectors and providing scoring/filtering support. This is for vectors consisting of (typically 300 - 2048) floats or doubles. It’s fol

Re: Named entity extraction/correlation using Semantic Knowledge Graph

2018-10-18 Thread Pratik Patel
I am on the lookout for ideas too, but I was thinking of using some NER technique to index named entities in a specific field and then use the Semantic Knowledge Graph on that specific field, i.e. limit SKG queries to that field only. I am not sure however if this would produce the desired results. I don't

Highlight documents using group.query?

2018-10-18 Thread atawfik
Hi, if I am using a group.query to get documents, is there a way to highlight the documents matching group.query using the matching query itself? If I am not mistaken, solr currently highlights documents using the main query passed via the request q parameter? -- Sent from: http://lucene
