Hello all,

I'm a newbie, and I'm having trouble changing a field's datatype in Elasticsearch so that Kibana can use it.

I read in a CSV with logstash. Here is a sample of that CSV:

DateTime,Session,Event,Data/Duration
2014-05-12T21:51:44,1399945863,Pressure,7.00



Here is my logstash config:

input {
  file {
    path => "/elk/Samples/CPAP_07_14_2014/CSV/SleepSheep_07_14_2014_no_header.csv"
    start_position => "beginning"
  }
}


filter {
  csv {
    columns => ["DateTime","Session","Event","Data/Duration"]
  }
}


output {
  elasticsearch {
    host => "localhost"
  }
  stdout { codec => rubydebug }
}



Whenever the data reaches Elasticsearch, the mapping shows the 
"Data/Duration" field as a string rather than a float, which prevents 
Kibana from using it for graphing. I tried to use PUT on Elasticsearch to 
overwrite the mapping, but it won't let me.
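In case it helps, this is roughly the mapping change I tried (the index and type names here are from memory, since Logstash was writing to its default daily index):

PUT /logstash-2014.07.14/_mapping/logs
{
  "properties": {
    "Data/Duration": { "type": "float" }
  }
}

Elasticsearch rejected it, apparently because the field already exists as a string in that index.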


Where should I configure this datatype? In the CSV filter, in the output, 
or in Elasticsearch itself?
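For what it's worth, I ran across the mutate filter's convert option while searching. If the filter block is the right place, I was imagining something like this (untested on my end):

filter {
  csv {
    columns => ["DateTime","Session","Event","Data/Duration"]
  }
  mutate {
    convert => [ "Data/Duration", "float" ]
  }
}

Is that the recommended approach, or should the type be fixed on the Elasticsearch side with a template before the index is created?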

Thanks,
Barry

-- 
You received this message because you are subscribed to the Google Groups 
"elasticsearch" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to elasticsearch+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/elasticsearch/5fac5f75-bcd3-4900-8d0a-94c930e7935c%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.