Why don't my logs have the expected timestamp?

By default, Datadog generates a timestamp and stores it in a date attribute when logs are received by our intake API.

However, this default timestamp does not always reflect the actual value contained in the log itself; this article describes how to override it.

1. Displayed timestamp

The first thing to understand is how the log timestamp (visible in the Log Explorer and in the top section of the contextual panel) is generated.

Timestamps are stored in UTC and displayed in the user's local timezone.
On the screenshot above, my local profile is set to UTC+1, so the reception time of this log was 11:06:16.807 UTC.
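This UTC-to-local conversion can be sketched in a few lines of Python (the timestamp below is the hypothetical reception time from the screenshot):

```python
from datetime import datetime, timezone, timedelta

# Log reception time as stored by Datadog, in UTC (hypothetical value)
utc_time = datetime(2023, 3, 1, 11, 6, 16, 807000, tzinfo=timezone.utc)

# A user profile set to UTC+1 displays the same instant shifted by one hour
local_time = utc_time.astimezone(timezone(timedelta(hours=1)))

print(utc_time.strftime("%H:%M:%S.%f")[:-3])    # 11:06:16.807
print(local_time.strftime("%H:%M:%S.%f")[:-3])  # 12:06:16.807
```

The stored instant never changes; only its rendering depends on the profile's timezone.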

You can check your user settings to see whether the issue is simply an incorrect timezone on your profile:

If the timezone setting is correct, we can instead extract the timestamp from the message itself to override the default log date. This works for both raw and JSON logs.

2. Raw logs

     2.1 Extract the timestamp value with a parser

While writing a parsing rule for your logs, you need to extract the timestamp into a specific attribute.

More general information about parsing is available in our documentation and best-practice articles. We also have examples specific to date parsing here (make sure you take the timezone into account, as shown in the example).

For the above log, we would use the following rule with the "date()" matcher to extract the date and pass it into a custom date attribute:
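As a sketch, here is what such a rule could look like. The log line, the rule name, and the attribute names other than date are hypothetical; note the z in the date pattern so the timezone is taken into account:

```
# Hypothetical raw log:
#   Mar 01 06:01:03 EST my-service Starting up
# Parsing rule ("MyParsingRule" is an arbitrary name) using the date() matcher
# to extract the timestamp into a custom date attribute:
MyParsingRule %{date("MMM dd HH:mm:ss z"):date} %{notSpace:service} %{data:message}
```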


     2.2 Define a Log Date Remapper

The value is now stored in a date attribute. We just need to add a Log Date Remapper to override the official log timestamp with the value of that attribute.

All new logs that are processed by that pipeline should now have the correct timestamp. 

Note: Any modification on the pipeline only impacts new logs as all the processing is done at ingestion.

The following log, generated at 06:01:03 EST, which corresponds to 11:01:03 UTC, is correctly displayed as 12:01:03 (the display timezone is UTC+1 in my case).

3. JSON logs

     3.1 Supported Date formats 

JSON logs are automatically parsed in Datadog.

The log date is one of the reserved attributes in Datadog, which means JSON logs that use these attributes have their values treated specially, in this case to derive the log's date. You can change the default remapping for those attributes at the top of your pipeline, as explained here.

So let's imagine that the actual timestamp of the log is contained in the attribute mytimestamp.

To make sure this attribute's value is used to override the log date, we simply need to add it to the list of date attributes.
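For instance, such a JSON log might look like the following sketch (all attribute names and values besides mytimestamp are illustrative):

```
{
  "mytimestamp": "2023-03-01T11:01:03.000Z",
  "status": "info",
  "message": "Service started"
}
```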

The date remapper looks for each of the reserved attributes in the order in which they are configured in the reserved attribute mapping, so to be 100% sure that our "mytimestamp" attribute will be used to derive the date, we can place it first in the list. 

Note: Any modification on the pipeline only impacts new logs as all the processing is done at ingestion.

There are specific date formats to respect for the remapping to work. The recognized date formats are: ISO 8601, UNIX (the millisecond epoch format), and RFC 3164.
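As a sketch, here is how a single instant renders in each of these formats (the RFC 3164 rendering below is an approximation; strict syslog timestamps space-pad the day of month):

```python
from datetime import datetime, timezone

# One instant, rendered in the three formats the date remapper recognizes
instant = datetime(2023, 3, 1, 11, 1, 3, tzinfo=timezone.utc)

iso8601 = instant.strftime("%Y-%m-%dT%H:%M:%S.000Z")  # ISO 8601
unix_ms = int(instant.timestamp() * 1000)             # UNIX epoch in milliseconds
rfc3164 = instant.strftime("%b %d %H:%M:%S")          # RFC 3164 (syslog-style)

print(iso8601)   # 2023-03-01T11:01:03.000Z
print(unix_ms)   # 1677668463000
print(rfc3164)   # Mar 01 11:01:03
```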

If your date format is not one of the above (and your logs therefore still do not have the right timestamp), there is still a solution.

     3.2 Custom Date format

If the format is not supported by the remapper by default, we need to parse it and convert it to a supported format. To do this, we need a parser processor that applies only to our attribute.

If you do not have a pipeline filtered on those logs yet, create one and add a processor. (Note: under the Advanced settings, set this processor to apply only to the custom "mytimestamp" attribute.)

Then define the right parsing rule for your date format. Examples are available here.
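As a sketch, assuming the mytimestamp attribute holds a value in a format like 01/Mar/2023:06:01:03 -0500 (both the value and the rule name here are hypothetical), the rule could be:

```
# Applied only to the mytimestamp attribute (via the processor's Advanced settings),
# this converts the custom format into a supported date attribute:
myDateRule %{date("dd/MMM/yyyy:HH:mm:ss Z"):date}
```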

Add a Log Date Remapper and the timestamp will finally be correct on new logs.

