Date filter fails to parse timestamps #95
Hmm... If it's already a timestamp type then maybe we should add a way to copy field values around? I'm not sure about having the date filter do this, because it would focus on copying only fields with a timestamp type. Alternately, maybe have the jdbc input allow users to specify what column should provide the @timestamp. I agree this is funky behavior and that we should make it possible (and easy) to do what you are describing. Thoughts?
Or not have timestamp types at all? I like to think of Logstash as a JSON processor, and a timestamp type just doesn't fit in. Internally it's of course fine if the values are stored as timestamps, but when that is exposed to users it easily leads to confusion.
Another user having a bad time: https://discuss.elastic.co/t/multiple-problems-with-logstash/90554
I'm thinking about how we can make this not a problem for users. My thought is that you shouldn't even want to use the date filter: you should just be able to have the JDBC plugin store the timestamp in @timestamp.
How would the plugin know which column to store in @timestamp? What if the result set contains multiple columns that should be stored as timestamps in ES?
What's wrong with the original suggestion? I just hit this myself and it took me a while to figure out what was going on.
I've hit this and am not able to get my log timestamps to replace the timestamp attribute. https://discuss.elastic.co/t/date-filter/118128/6
The JDBC input will convert known Time-ish JDBC datatypes (Timestamp, Date) to Ruby Time objects. When these are added to the Event they are converted to internal Logstash Timestamp objects.

There is one additional case to be considered: when the DB records times not in UTC. In this case one can set the jdbc input's timezone option so the values are converted correctly; a workaround for the bug is to not set that option.

JDBC input users often have to CAST their "timestamp" column to a string so they can use the date filter to set @timestamp. OTOH we could extend the jdbc input to have the user specify which column goes into @timestamp. However, I'm also open to extending the date filter to offer TZ conversion and to handle values that are already timestamps.
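The CAST approach can be sketched as a pipeline whose SQL statement casts the column to text before Logstash ever sees it. This is only a sketch: the connection settings, table and column names are placeholders, the CAST syntax shown is PostgreSQL's, and the date pattern must match whatever string form your particular database produces.

```
input {
  jdbc {
    # placeholder connection settings
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_user => "me"
    # CAST the timestamp column so it arrives in Logstash as a plain string
    statement => "SELECT id, CAST(log_date AS varchar) AS log_date FROM events"
  }
}
filter {
  date {
    # adjust the pattern to the string form your database emits
    match => ["log_date", "yyyy-MM-dd HH:mm:ss"]
  }
}
```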
In case anyone is struggling with this, the solution is below. The suggested workarounds on discuss.elastic.co don't quite cut it.

    filter {
      mutate {
        convert => {"log_date" => "string"}
      }
      date {
        match => ["log_date", "ISO8601"]
      }
    }

This converts the date from SQL to a string, then parses it and assigns it to @timestamp.
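As a standalone illustration (outside Logstash), here is roughly what that mutate + date pair does, in plain Ruby. The field name log_date and the sample time are made up for the example:

```ruby
require "time"

# Stand-in for the value the jdbc input produces: a Ruby Time object.
db_time = Time.utc(2018, 1, 15, 9, 30, 0)

# mutate { convert => {"log_date" => "string"} } -- timestamp becomes a string
log_date = db_time.iso8601            # "2018-01-15T09:30:00Z"

# date { match => ["log_date", "ISO8601"] } -- string is parsed back to a time
timestamp = Time.iso8601(log_date)

puts log_date
puts timestamp == db_time             # true: the round trip is lossless
```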
Thank you @LucidObscurity, your workaround worked perfectly.
The above workaround unfortunately doesn't work with JSON log files.
Any idea how to do it?
@Constantin07 This issue only deals with events coming from the jdbc input (or similar inputs that produce timestamp values). Parsing timestamps from JSON files is an entirely different problem. Please ask for help with that at discuss.elastic.co. |
When using the jdbc input to fetch events from a database, timestamp fields end up as timestamp fields in Logstash. One could argue that this is a feature, but it causes confusion since those fields apparently can't be processed by the date filter. Could we either call to_s on the source value or check if the source already is a timestamp and, if so, just copy it to the destination field?
See https://discuss.elastic.co/t/trouble-matching-timestamp/83768 for an example.
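A minimal Ruby sketch of the copy-through behaviour requested above. resolve_timestamp is a hypothetical helper, not the actual date filter code, and for brevity it only parses ISO8601 strings:

```ruby
require "time"

# Hypothetical helper sketching the requested behaviour: if the source is
# already a Time-like value, copy it through unchanged; otherwise parse it.
def resolve_timestamp(value)
  return value.getutc if value.is_a?(Time)  # already a timestamp: just copy
  Time.iso8601(value).getutc                # otherwise parse as ISO8601
end

t = Time.utc(2017, 6, 1, 12, 0, 0)
puts resolve_timestamp(t)                        # works on a timestamp value
puts resolve_timestamp("2017-06-01T12:00:00Z")   # and on its string form
puts resolve_timestamp(t) == resolve_timestamp(t.iso8601)  # true
```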