[ https://issues.apache.org/jira/browse/SOLR-2155?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13185224#comment-13185224 ]

David Smiley commented on SOLR-2155:
------------------------------------

Srikanth,
# According to the table on Wikipedia, a geohash length of 8 translates to +/- 19 
meters of error.  This means that a filter query may erroneously match points that 
are <= 19 meters outside the query shape, or erroneously exclude points that are 
<= 19 meters inside it.  The accuracy is tied to how the data is indexed, not to 
the query shape; see the first sketch after this list.
# At query time, if you only do geospatial filters, then there is *no* RAM 
requirement for this field.  *If* you sort by distance or influence relevancy 
by distance, then all of the indexed points are brought into memory "un-inverted" 
(per document) so that the center of the query shape can be compared against every 
indexed point of each matching document.  LatLonType does the same thing for the 
same reason, but this code is multiValue aware, using the closest indexed point a 
document has to the query shape.  When a commit happens in Solr, this in-memory 
cache is invalidated and garbage collected, and the data is loaded into memory 
again the next time a distance sort or function is used.  The second sketch after 
this list illustrates the filter-vs-sort distinction.
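
For reference, here is a small self-contained sketch (not part of the patch; the 
class and constant names are made up for illustration) that reproduces the numbers 
behind that Wikipedia table, i.e. how the geohash length chosen at index time fixes 
the cell size and therefore the worst-case error:

{code:java}
// A back-of-the-envelope check of the Wikipedia precision table (illustrative only;
// this class is not part of the patch).  A geohash character encodes 5 bits, and the
// bits alternate longitude/latitude, so the cell size -- and thus the worst-case
// error -- is fixed by the geohash length used at index time.
public class GeohashPrecision {

  // Rough meters per degree of latitude (and of longitude at the equator).
  private static final double METERS_PER_DEGREE = 111320.0;

  public static void main(String[] args) {
    int length = 8;                      // geohash length used when indexing
    int totalBits = 5 * length;          // 5 bits per base-32 character
    int lonBits = (totalBits + 1) / 2;   // longitude gets the first (and any odd extra) bit
    int latBits = totalBits / 2;

    double lonErrDeg = 360.0 / Math.pow(2, lonBits) / 2;  // half the cell width
    double latErrDeg = 180.0 / Math.pow(2, latBits) / 2;  // half the cell height

    System.out.printf("length %d: lat +/- %.6f deg (~%.1f m), lon +/- %.6f deg (~%.1f m at the equator)%n",
        length, latErrDeg, latErrDeg * METERS_PER_DEGREE,
        lonErrDeg, lonErrDeg * METERS_PER_DEGREE);
    // Prints roughly: lat +/- 0.000086 deg (~9.6 m), lon +/- 0.000172 deg (~19.1 m at the equator)
  }
}
{code}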
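
To illustrate the filter-vs-sort distinction in point 2, here is a hedged SolrJ 
sketch using the standard Solr 3.x spatial parameters (geofilt, geodist()) and a 
hypothetical multi-valued field named "locations"; the exact field type and 
parameter names this patch exposes may differ:

{code:java}
// Illustrative only: standard Solr 3.x spatial syntax (geofilt + geodist()) against a
// hypothetical multi-valued geohash field named "locations".  The exact field type and
// parameter names exposed by this patch may differ.
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;

public class GeoFilterAndSortExample {
  public static void main(String[] args) throws Exception {
    SolrServer server = new CommonsHttpSolrServer("http://localhost:8983/solr");

    SolrQuery q = new SolrQuery("*:*");
    // Filter only: this walks the term index and needs no per-field RAM.
    q.addFilterQuery("{!geofilt}");
    q.set("sfield", "locations");    // hypothetical multi-valued geohash field
    q.set("pt", "42.36,-71.06");     // center of the query shape
    q.set("d", "5");                 // radius in km

    // Sorting by distance is what pulls the indexed points into memory.
    q.addSortField("geodist()", SolrQuery.ORDER.asc);

    QueryResponse rsp = server.query(q);
    System.out.println("matches: " + rsp.getResults().getNumFound());
  }
}
{code}

The filter query alone only walks the term index; it is the geodist() sort that 
forces the per-document points into memory as described above.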
                
> Geospatial search using geohash prefixes
> ----------------------------------------
>
>                 Key: SOLR-2155
>                 URL: https://issues.apache.org/jira/browse/SOLR-2155
>             Project: Solr
>          Issue Type: Improvement
>            Reporter: David Smiley
>         Attachments: GeoHashPrefixFilter.patch, GeoHashPrefixFilter.patch, 
> GeoHashPrefixFilter.patch, 
> SOLR-2155_GeoHashPrefixFilter_with_sorting_no_poly.patch, SOLR.2155.p3.patch, 
> SOLR.2155.p3tests.patch, Solr2155-1.0.2-project.zip, 
> Solr2155-1.0.3-project.zip, Solr2155-for-1.0.2-3.x-port.patch
>
>
> There currently isn't a solution in Solr for doing geospatial filtering on 
> documents that have a variable number of points.  This scenario occurs when 
> there is location extraction (e.g. via a "gazetteer") occurring on free text.  
> None, one, or many geospatial locations might be extracted from any given 
> document and users want to limit their search results to those occurring in a 
> user-specified area.
> I've implemented this by furthering the GeoHash based work in Lucene/Solr 
> with a geohash prefix based filter.  A geohash refers to a lat-lon box on the 
> earth.  Each successive character added further subdivides the box into a 4x8 
> (or 8x4 depending on the even/odd length of the geohash) grid.  The first 
> step in this scheme is figuring out which geohash grid squares cover the 
> user's search query.  I've added various extra methods to GeoHashUtils (and 
> added tests) to assist in this purpose.  The next step is an actual Lucene 
> Filter, GeoHashPrefixFilter, that uses these geohash prefixes in 
> TermsEnum.seek() to skip to relevant grid squares in the index.  Once a 
> matching geohash grid is found, the points therein are compared against the 
> user's query to see if it matches.  I created an abstraction GeoShape 
> extended by subclasses named PointDistance... and CartesianBox.... to support 
> different queried shapes so that the filter need not care about these details.
> This work was presented at LuceneRevolution in Boston on October 8th.
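
For readers new to geohashes, the core property the description above relies on is 
that every geohash prefix names a lat-lon box, each appended character subdivides 
that box into 32 children, and a point lies inside a grid square exactly when its 
geohash starts with that square's prefix.  Below is a minimal decoding sketch 
written from the public geohash spec (not the patch's GeoHashUtils) that makes this 
concrete:

{code:java}
// A minimal decoding sketch written from the public geohash spec (not the patch's
// GeoHashUtils): every prefix names a lat-lon box, and each appended character
// subdivides that box into 32 children, which is what lets a string-prefix match
// stand in for "point is inside this grid square".
public class GeohashBox {
  private static final String BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz";

  /** Returns {minLat, maxLat, minLon, maxLon} for the cell named by the prefix. */
  static double[] boundingBox(String geohash) {
    double minLat = -90, maxLat = 90, minLon = -180, maxLon = 180;
    boolean lonBit = true;  // bits alternate: longitude, latitude, longitude, ...
    for (char c : geohash.toCharArray()) {
      int cd = BASE32.indexOf(c);
      for (int bit = 4; bit >= 0; bit--) {
        int b = (cd >> bit) & 1;
        if (lonBit) {
          double mid = (minLon + maxLon) / 2;
          if (b == 1) minLon = mid; else maxLon = mid;
        } else {
          double mid = (minLat + maxLat) / 2;
          if (b == 1) minLat = mid; else maxLat = mid;
        }
        lonBit = !lonBit;
      }
    }
    return new double[] {minLat, maxLat, minLon, maxLon};
  }

  public static void main(String[] args) {
    // "drt2" covers a box over the Boston area; "drt2y" is one of its 32 children.
    System.out.println(java.util.Arrays.toString(boundingBox("drt2")));
    System.out.println(java.util.Arrays.toString(boundingBox("drt2y")));
  }
}
{code}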

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: 
https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira

        
