there.
Very appreciative of your thoughts.
Nick
On Friday, October 17, 2014 4:57:52 PM UTC-7, Nick Tackes wrote:
Hello, I am experimenting with word_delimiter and have an example with a
special character that is indexed. The character is in the type table for
the word delimiter. Analysis of the tokenization looks good, but when I
attempt to do a match query, it doesn't seem to respect the tokenization.
The tail of my query, and the cleanup command:

  : {
      "query" : "HER2@"
    }
  }
}
}'
curl -X DELETE localhost:9200/specialchars
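As a rough illustration (plain Python, not Elasticsearch itself), the effect of a word_delimiter type table entry such as `@ => ALPHANUM` is to stop treating that character as a delimiter, so a token like `HER2@` survives analysis whole. The `tokenize` helper below is my own sketch of that behavior:

```python
# Sketch of word_delimiter type-table behavior (illustrative, not the ES code).
# A character remapped to ALPHANUM is treated as part of a word rather than
# as a split point.

def tokenize(text, type_table=None):
    """Split on any character that is neither alphanumeric nor remapped."""
    type_table = type_table or {}
    tokens, current = [], ""
    for ch in text:
        if ch.isalnum() or type_table.get(ch) == "ALPHANUM":
            current += ch
        else:
            if current:
                tokens.append(current)
            current = ""
    if current:
        tokens.append(current)
    return tokens

# Without the mapping, "@" acts as a delimiter:
print(tokenize("HER2@ mutation"))                     # ['HER2', 'mutation']
# With a type table entry "@ => ALPHANUM":
print(tokenize("HER2@ mutation", {"@": "ALPHANUM"}))  # ['HER2@', 'mutation']
```

Note that for a match query to see the same token, the query side must run through the same analyzer as the index side; a mismatch there is a common cause of this symptom.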
I am passing in a full phrase and using a shingle filter to match sub-phrases
in that string. What I would like is to be able to 'mark' the initial request
with what is matched and what is not. Is there a way to highlight, or to get
the offset position of where the match was made within the phrase?
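One way to picture the desired 'marking', sketched here in plain Python under the assumption that matched phrases can be located literally in the request string (`mark_matches` is my own illustrative helper, not an Elasticsearch API):

```python
# Compute character offsets of matched sub-phrases so the caller can tell
# matched spans from unmatched ones. Longer phrases win over shorter
# overlapping ones ("Lung Cancer" beats "Lung" and "Cancer").

def mark_matches(text, phrases):
    lowered = text.lower()
    spans = []
    for p in sorted(phrases, key=len, reverse=True):  # prefer longer phrases
        start = lowered.find(p.lower())
        while start != -1:
            end = start + len(p)
            # keep the span only if it does not overlap an accepted one
            if not any(s < end and start < e for s, e in spans):
                spans.append((start, end))
            start = lowered.find(p.lower(), end)
    return sorted(spans)

text = "EGFR related lung cancer"
print(mark_matches(text, ["EGFR", "Lung Cancer", "Lung", "Cancer"]))
# [(0, 4), (13, 24)]  -> "EGFR" and "lung cancer" matched, "related" not
```

Elasticsearch highlighting can return comparable offsets when the field stores them; this sketch just shows the shape of the answer.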
I have created a gist with an analyzer that uses a shingle filter in an
attempt to match sub-phrases.
For instance I have entries in the table with discrete phrases like
EGFR
Lung Cancer
Lung
Cancer
and I want to match these when searching the phrase 'EGFR related lung
cancer'.
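A sketch of what the shingle filter produces for that phrase (plain Python, function and variable names mine): 1- and 2-word shingles of the query text, which can then be intersected with the dictionary phrases:

```python
# Generate word shingles (n-grams of tokens), as a shingle filter would,
# then keep the ones that appear in the phrase dictionary.

def shingles(tokens, min_size=1, max_size=2):
    out = []
    for size in range(min_size, max_size + 1):
        for i in range(len(tokens) - size + 1):
            out.append(" ".join(tokens[i:i + size]))
    return out

phrases = {"egfr", "lung cancer", "lung", "cancer"}  # dictionary, lowercased
tokens = "egfr related lung cancer".split()
matched = [s for s in shingles(tokens) if s in phrases]
print(matched)  # ['egfr', 'lung', 'cancer', 'lung cancer']
```

The shingle sizes here assume `min_shingle_size: 1` and `max_shingle_size: 2`; a real mapping would set those on the filter.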
On July 23, 2014 9:37:03 AM UTC-5, Nick Tackes wrote:
Not a direct match, but I did locate this grammar for Lucene:
https://github.com/thoward/lucene-query-parser.js/blob/master/lucene-query.grammar
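To give a feel for what such a grammar covers, here is a tiny, illustrative lexer (plain Python, far less complete than the linked PEG grammar) that splits a Lucene-style query string into fielded-phrase and operator tokens:

```python
# Minimal, illustrative lexer for Lucene-style query strings.
# It recognizes field:"quoted phrase", bare quoted phrases, parentheses,
# and bare terms/operators; the real grammar handles much more
# (boosts, ranges, fuzziness, escaping, ...).
import re

TOKEN = re.compile(r'\w+:"[^"]*"|"[^"]*"|[()]|[^\s()]+')

def lex(query):
    return TOKEN.findall(query)

print(lex('title:"lung cancer" AND status:active'))
# ['title:"lung cancer"', 'AND', 'status:active']
```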
On Tuesday, April 22, 2014 1:26:23 PM UTC-7, Lukáš Vlček wrote:
Hi,
is there any formal grammar of the Query DSL for Elasticsearch available? (may