Hi,

I'm currently using Solr to index a moderate amount of data with
the help of Logstash and the solr_http contrib output plugin.

Solr is receiving documents; I've got Banana as a web interface, and I
am running it with a schemaless core.

One of the Logstash filters I'm using is geoip, with the following
setup:

  geoip {
    source => "subject_ip"
    database => "/opt/logstash/vendor/geoip/GeoLiteCity.dat"
    target => "geoip"
    fields => ["latitude", "longitude"]
  }

However, this created a single "string" field called geoip with the value:
{"latitude"=>2.0, "longitude"=>13.0, "location"=>[2.0, 13.0]}

This is "meant" to become three "sub" fields:
geoip.latitude => 2.0
geoip.longitude => 13.0
geoip.location => 2.0, 13.0

The above setup worked when Logstash was feeding into Elasticsearch:
geoip.location was populated correctly as a field in its own right.

Given that it did work with ES, I assume the first issue is that either
Solr does not know how to expand such a value into separate fields with
their own values, or I simply have not configured Solr correctly (I'm
betting on the latter).
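
To narrow down which it is, I figure I can dump each event to stdout
before it reaches the solr_http output and see whether geoip is still a
nested hash at that point. A minimal sketch of what I mean (untested):

  output {
    # Debug-only output, to run alongside solr_http: prints each
    # event as a Ruby hash so the structure of the geoip field is
    # visible before it hits Solr.
    stdout { codec => rubydebug }
  }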

I have only been using Solr for about 8 hours (installed today); I had
to try something, as no amount of tweaking would resolve the indexing
performance issues I had with ES. I'm now indexing the same amount of
data into Solr in near real time on the exact same machine that was
running ES, where indexing would stall after about 2 hours.

The whole point of the geoip filter is to get geoip.location, which
will be the location field used by bettermap in the Banana web
interface.
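
If it turns out the plugin simply stringifies nested hashes, the
workaround I have in mind is to flatten the hash with mutate before the
output stage. This is an untested sketch, and the "lat,lon" string I
build for geoip.location is only my guess at a format Solr/bettermap
will accept:

  mutate {
    # Copy the nested geoip values into flat, dot-named fields
    # (sprintf field references, so these land as strings)...
    add_field => {
      "geoip.latitude"  => "%{[geoip][latitude]}"
      "geoip.longitude" => "%{[geoip][longitude]}"
      "geoip.location"  => "%{[geoip][latitude]},%{[geoip][longitude]}"
    }
    # ...then drop the original hash so solr_http never sees it.
    remove_field => ["geoip"]
  }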

I am not running SiLK.
I am running Solr 5.1 and Logstash 1.4.

Regards,
Daniel
