Re: Copying fields to a geopoint type?
I can use it in Kibana 3 but not in Kibana 4. Any idea why?

On Wednesday, 1 April 2015 20:31:05 UTC+1, Pascal VINCENT wrote:

> I finally came up with:
>
>     if [latitude] and [longitude] {
>       mutate {
>         add_field => [ "[location]", "%{longitude}" ]
>         add_field => [ "[location]", "%{latitude}" ]
>       }
>       mutate {
>         convert => [ "[location]", "float" ]
>       }
>     }
Re: Copying fields to a geopoint type?
Were you ever able to figure out a solution to this? I'm in a similar boat.

On Thursday, September 11, 2014 at 2:14:29 AM UTC-7, Kushal Zamkade wrote:

> Hello, I have created a location field using the code below:
>
>     if [latitude] and [longitude] {
>       mutate {
>         rename => [ "latitude", "[location][lat]", "longitude", "[location][lon]" ]
>       }
>     }
>
> But when I check the location field's type, it has not been created as geo_point, and when I try to search on it as a geo_point I get the error below. Can you help me resolve this?
>
>     QueryParsingException[[logstash-2014.09.11] failed to find geo_point field [location1]];
Re: Copying fields to a geopoint type?
I finally came up with:

    if [latitude] and [longitude] {
      mutate {
        add_field => [ "[location]", "%{longitude}" ]
        add_field => [ "[location]", "%{latitude}" ]
      }
      mutate {
        convert => [ "[location]", "float" ]
      }
    }

Note that longitude is added first: when a geo_point value is given as an array, Elasticsearch expects it in [lon, lat] order.
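To sanity-check what those two mutate blocks do to an event, here is a small Python sketch (an illustration only, not Pascal's code; the event dict and sample coordinates are made up):

```python
def add_location(event):
    """Mimic the config above: add_field appends longitude then latitude
    to "location" (repeated add_field calls build an array in Logstash),
    then convert casts every element to float."""
    if "latitude" in event and "longitude" in event:
        event.setdefault("location", [])
        event["location"].append(event["longitude"])
        event["location"].append(event["latitude"])
        event["location"] = [float(v) for v in event["location"]]
    return event

event = add_location({"latitude": "48.8566", "longitude": "2.3522"})
print(event["location"])  # [2.3522, 48.8566] -- [lon, lat] order
```

The result is a two-element float array with longitude first, which is exactly the array form a geo_point mapping accepts.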
Re: Copying fields to a geopoint type?
Woah, crazy, never would've thought of that. Thanks a lot for following up!

On Wed, Apr 1, 2015 at 12:31 PM, Pascal VINCENT wrote:

> I finally came up with:
>
>     if [latitude] and [longitude] {
>       mutate {
>         add_field => [ "[location]", "%{longitude}" ]
>         add_field => [ "[location]", "%{latitude}" ]
>       }
>       mutate {
>         convert => [ "[location]", "float" ]
>       }
>     }

--Alex
Re: Copying fields to a geopoint type?
Hello, I have created a location field using the code below:

    if [latitude] and [longitude] {
      mutate {
        rename => [ "latitude", "[location][lat]", "longitude", "[location][lon]" ]
      }
    }

But when I check the location field's type, it has not been created as geo_point, and when I try to search on it as a geo_point I get the error below. Can you help me resolve this?

    QueryParsingException[[logstash-2014.09.11] failed to find geo_point field [location1]];

On Thursday, April 10, 2014 2:42:22 AM UTC+5:30, Pascal VINCENT wrote:

> Hi, I have included logstash in my stack and started to play with it. I'm sure it can do the trick I was looking for, and much more. Thank you... [waiting for your blog post :)] Pascal.
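For anyone hitting the same QueryParsingException: the rename filter only changes the document's shape; the index mapping must already declare location as geo_point, otherwise dynamic mapping types it as an object containing two plain numbers. With daily logstash-* indices this is usually handled with an index template along these lines (a sketch for the Elasticsearch 1.x APIs in use in this thread; the template name is made up):

```
PUT _template/logstash-geo
{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}
```

The template must be in place before an index is created; already-existing indices keep their old mapping.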
Re: Copying fields to a geopoint type?
Hi, I have included logstash in my stack and started to play with it. I'm sure it can do the trick I was looking for, and much more. Thank you... [waiting for your blog post :)]

Pascal.

On Mon, Apr 7, 2014 at 9:38 AM, Alexander Reelsen wrote:

> Hey, I don't know about your stack, but maybe logstash would be a good idea to add in there. It is more flexible than the csv river and features a CSV input as well. [...]
Re: Copying fields to a geopoint type?
Hey, I don't know about your stack, but maybe logstash would be a good idea to add in there. It is more flexible than the csv river and features a CSV input as well, and you can easily change the structure of the data you want to index. This is how the logstash config would look:

    if [latitude] and [longitude] {
      mutate {
        rename => [ "latitude", "[location][lat]", "longitude", "[location][lon]" ]
      }
    }

I am currently working on a blog post on how to utilize elasticsearch, logstash and kibana on CSV-based data, which covers exactly this, and hope to release it soonish on the .org blog. Stay tuned! :-)

--Alex

On Thu, Apr 3, 2014 at 12:21 AM, Pascal VINCENT wrote:

> Hi, I'm new to elasticsearch. My usecase is to load a csv file containing some agencies with geo location. [...]
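As a rough illustration of what that rename does to an event (a sketch, not Logstash itself; the sample field values are made up):

```python
def rename_to_geo(event):
    """Mimic: rename latitude -> [location][lat], longitude -> [location][lon]."""
    if "latitude" in event and "longitude" in event:
        event["location"] = {
            "lat": event.pop("latitude"),
            "lon": event.pop("longitude"),
        }
    return event

doc = rename_to_geo({"id": "42", "latitude": 48.8566, "longitude": 2.3522})
print(doc["location"])  # {'lat': 48.8566, 'lon': 2.3522}
```

That {"lat": ..., "lon": ...} object is the other form a geo_point mapping accepts, alongside the [lon, lat] array.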
Copying fields to a geopoint type?
Hi, I'm new to elasticsearch. My use case is to load a CSV file containing agencies with geo locations; each line looks like:

    id;label;address;zipcode;city;region;latitude;longitude;(and some other fields)

I'm using the csv river plugin to index the file. My mapping is:

    {
      "office": {
        "properties": {
          (first fields omitted...)
          "latitude":  { "type": "double" },
          "longitude": { "type": "double" },
          "location":  { "type": "geo_point", "lat_lon": true }
        }
      }
    }

I'd like to index the location .lat and .lon values from the latitude and longitude fields. I tried copy_to with no success:

    "latitude":  { "type": "double", "copy_to": "location.lat" },
    "longitude": { "type": "double", "copy_to": "location.lon" },

Is there any way to feed the location property from the latitude and longitude fields at indexing time? The point is that I don't want to modify the input CSV file to adapt it to the GeoJSON format (i.e. concatenate lat and lon into one field in the CSV file).

Thank you for any hints.

Pascal.
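For reference, the document shape the geo_point mapping needs can also be produced outside the river; here is a minimal Python sketch (the column order follows the post, but the sample values and the trailing "other fields" handling are made up) that parses one semicolon-separated line into a document with a combined location field:

```python
import csv

# Columns as described in the post; the trailing "other fields" are omitted here.
FIELDS = ["id", "label", "address", "zipcode", "city", "region", "latitude", "longitude"]

def row_to_doc(line):
    """Parse one semicolon-separated line into a doc whose location field
    matches what a geo_point mapping accepts ({"lat": ..., "lon": ...})."""
    row = next(csv.reader([line], delimiter=";"))
    doc = dict(zip(FIELDS, row))
    doc["location"] = {
        "lat": float(doc.pop("latitude")),
        "lon": float(doc.pop("longitude")),
    }
    return doc

doc = row_to_doc("1;HQ;1 rue de Rivoli;75001;Paris;IDF;48.8566;2.3522")
print(doc["location"])  # {'lat': 48.8566, 'lon': 2.3522}
```

Indexing documents of this shape against the mapping above avoids touching the CSV file itself, at the cost of a preprocessing step in front of the indexer.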