
Adding a Log Type

A Log Type is a clear definition of the format in which an application writes logs. Site24x7 supports more than 30 log types by default. Enable AppLogs, create a Log Profile, associate a Log Type with it, and start collecting, analyzing, and managing your logs. You can also create custom log types in addition to the ones Site24x7 supports by default.


Adding a custom log type

If your log type isn't in the list of supported log types, you can create and define a custom log type. Go to Admin > AppLogs > Log Types > Add Log Type.

  1. Display Name: Enter a display name.
  2. Log Type: Give a name for your Log Type.
  3. Retention: The number of days for which the logs will be available for search in Site24x7.
  4. Maximum Upload Limit: The maximum amount of logs (for this log type) that you can upload during the current billing cycle.
  5. Auto Discovery: Enable the toggle to automatically look for this log format on any new servers associated with this log type and start uploading those logs.
  6. Sample Logs: Provide three sample lines of your log entry so that we can discover the log pattern. (Note: The sample must contain at least a Date field for your logs to be parsed.)
    While uploading a multiline log, add <NewLine> to break up the multiline log, and also enter a log pattern.

  7. Log Pattern: You can edit any column in the Sample Output field by editing the respective field name in the Log Pattern section.
    Every field name should start and end with $ (Ex: $Message$)
    If you encounter a Date Pattern Mismatch error, you can define a custom date pattern.
    • Custom log pattern can be given in the Log Pattern section with the following syntax:
      LogPattern: $FieldName:DataType:Format$
      Attribute    Description
      Field Name   A descriptive name for the attribute.
      Data Type    The type of data associated with a Field Name. A Field Name can be of the following data types:
                   - Number
                   - String
                   - Date (folder or file | different language | without date value | without date and time value)
                   - Decimal
                   - IP (IPv4 or IPv6)
                   - Word
                   - Config
                   - Pattern
      Format       Needed only for the Date data type; the other data types do not require a format.
    • Defining a Number Field 

      ($FieldName:Number$) 
      Here, Number is the Data Type of the value associated with the Field Name. 
    • Defining a String Field

      ($FieldName$) or ($FieldName:String$) 
      Here, String is the text associated with the Field Name.
      (Note: String is the default data type, and hence it need not be separately mentioned).
    • Defining a Date Field

      ($FieldName:Date:Format$) 
      Here, FieldName is the variable name and Date is the Data Type of that variable. A Date variable must always be defined with a Format.
      $DateTime:date:EEEE MMM dd HH:mm:ss.SSS yyyy$
      For example:
      • Tuesday Sep 19 13:34:56.123 2007 - The format should be EEEE MMM dd HH:mm:ss.SSS yyyy
      • Sep 19 2007 13:34:56 123456 PST - The format should be MMM dd yyyy HH:mm:ss SSSSSS z
      • 19-09-07 1:34:56 pm -0800 - The format should be dd-MM-yy (or) dd-MM-y hh:mm:ss a Z
      • 13:34:56,262 - The format should be HH:mm:ss,D
      • Tue September 19 13:34:56 - The format should be EEE MMMM dd HH:mm:ss
      • For the Unix time (in seconds) 1190234095, the format should be $DateTime:date:unix$

      Supported Date Formats:

      Format Requirement                        Date Format   Example
      Year - 2 digits                           yy (or) y     17 or 7
      Year - 4 digits                           yyyy          2017
      Month - 2 digits                          MM            07
      Month - 3 letters                         MMM           Sep
      Month name in full                        MMMM          September
      Date                                      dd            19
      Hours in a day (0-12)                     hh            1
      Hours in a day (0-23)                     HH            13
      Minutes in an hour                        mm            34
      Seconds in a minute                       ss            56
      Milliseconds in a second                  SSS           123
      Time zone (+0800; -1100)                  Z             -0800
      Time zone (PST)                           z             PST
      Time zone (-08:00)                        X             +01:00
      Day in a year                             D             262
      Day name - 3 letters                      EEE           Tue
      Day name in full                          EEEE          Tuesday
      AM/PM/am/pm                               a             pm
      Unix time - seconds since epoch           unix          1190234095
      Unix time - milliseconds since epoch      unixm         1190234095123
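
      These format letters largely follow the same conventions as Java's SimpleDateFormat (unix and unixm are AppLogs-specific tokens). If you are unsure whether a format string matches your timestamps, a quick local check such as the illustrative snippet below can help; the class name and sample value are hypothetical.

      import java.text.SimpleDateFormat;
      import java.util.Date;
      import java.util.Locale;

      public class DateFormatCheck {
          public static void main(String[] args) throws Exception {
              // The pattern letters below (dd, MM, yy, hh, mm, ss, a, Z) are the
              // same ones listed in the table above.
              SimpleDateFormat fmt = new SimpleDateFormat("dd-MM-yy hh:mm:ss a Z", Locale.ENGLISH);
              Date parsed = fmt.parse("19-09-07 1:34:56 pm -0800");
              System.out.println(parsed); // prints the parsed date in the local time zone
          }
      }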

      Fetching date value from a folder or a file name

      The folder name generally contains only the year, month, and date fields. The date value is fetched only if the log lines contain the hour, minute, and second values.

      $DateTime:date:@folder(yyyy-MM-dd)HH:mm:ss$
      $DateTime:date:@file(yyyy-MM-dd)HH:mm:ss$
      $DateTime:date:@filepath(yyyy-MM-dd)HH:mm:ss$

      For example,
      Sample Log

      11:10:11 CassandraDaemon:init Logging initialized
      11:10:12 YamlConfigurationLoader:load Loading settings from file
      11:10:13 DatabaseDescriptor:data Data files directories

      Log Pattern
      $DateTime:date:@folder(yyyy-MM-dd)HH:mm:ss$ $ClassName$:$Method$ $Message$

      File Name: D:\MyWebApp\2020-01-15\process.log

      Here, the date value is present in the parent folder of the log file; hence @folder is mentioned in the log pattern.
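
      As a rough illustration of what @folder does (not the agent's actual implementation), the snippet below combines the date taken from the parent folder name with the time found in the log line, using the path and time from the example above.

      import java.time.LocalDateTime;
      import java.time.format.DateTimeFormatter;

      public class FolderDateSketch {
          public static void main(String[] args) {
              String filePath = "D:\\MyWebApp\\2020-01-15\\process.log";
              String[] parts = filePath.split("\\\\");
              String folderDate = parts[parts.length - 2];     // 2020-01-15, from the parent folder
              String lineTime = "11:10:11";                    // from the log line itself
              LocalDateTime ts = LocalDateTime.parse(folderDate + " " + lineTime,
                      DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss"));
              System.out.println(ts);                          // 2020-01-15T11:10:11
          }
      }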

      Collecting logs without the date value in the log line

      At times, log lines will have only the time field and not the date value. In such cases, you have to configure the date pattern below to collect logs.

      Sample Log

      11:10:11 CassandraDaemon:init Logging initialized
      11:10:12 YamlConfigurationLoader:load Loading settings from file
      11:10:13 DatabaseDescriptor:data Data files directories

      Log Pattern
      $DateTime:date:@filedate(yyyy-MM-dd)HH:mm:ss$ $ClassName$:$Method$ $Message$

      Here, @filedate will take the date value from the file's last modified date.
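
      A minimal sketch of the same idea (illustration only, not the agent's code, and the file name is hypothetical): the date comes from the file's last-modified timestamp and the time from the log line.

      import java.nio.file.Files;
      import java.nio.file.Path;
      import java.time.LocalDate;
      import java.time.LocalDateTime;
      import java.time.LocalTime;
      import java.time.ZoneId;

      public class FileDateSketch {
          public static void main(String[] args) throws Exception {
              Path logFile = Path.of("process.log");                   // hypothetical log file
              LocalDate fileDate = Files.getLastModifiedTime(logFile)  // date from last-modified time
                      .toInstant().atZone(ZoneId.systemDefault()).toLocalDate();
              LocalTime lineTime = LocalTime.parse("11:10:11");        // time from the log line
              System.out.println(LocalDateTime.of(fileDate, lineTime));
          }
      }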

      Collecting logs without the date and time value in the log line

      At times, log lines will have neither the date nor the time value. In such cases, you have to configure the date pattern below to collect logs.

      Sample Log

      CassandraDaemon:init Logging initialized
      YamlConfigurationLoader:load Loading settings from file
      DatabaseDescriptor:data Data files directories

      Log Pattern
      $DateTime:date:agent_time$ $ClassName$:$Method$ $Message$

      Here, agent_time will take the agent-installed machine's current time while reading the logs.

      Parsing a date value in a different language

      For example, the log lines below contain a date value in Portuguese.

      Sample Logs
      Log Entry: 00:00:07 quinta-feira, 10 outubro 2019 Iniciando recebimento de mensagem
      Log Entry: 00:00:07 quinta-feira, 10 outubro 2019 Buscando mensagems na fila Quantidade=0
      Log Entry: 00:00:08 quinta-feira, 10 outubro 2019 Sucesso ao buscar quantidade: CM_OK

      Log Pattern
      Log Entry: $DateTime:date:pt(HH:mm:ss EEEE, dd MMMM yyyy)$ $Message$
      Here, "pt" denotes the language code for Portuguese.

      Refer to this document for locale codes for different languages.
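
      As a small sketch of what the "pt" language code implies (illustration only, relying on Java's built-in Portuguese locale data rather than anything AppLogs-specific):

      import java.text.SimpleDateFormat;
      import java.util.Date;
      import java.util.Locale;

      public class LocaleDateSketch {
          public static void main(String[] args) throws Exception {
              // "pt" selects the Portuguese locale, which is what lets
              // "quinta-feira" (Thursday) and "outubro" (October) be parsed.
              SimpleDateFormat fmt = new SimpleDateFormat("HH:mm:ss EEEE, dd MMMM yyyy",
                      Locale.forLanguageTag("pt"));
              Date parsed = fmt.parse("00:00:07 quinta-feira, 10 outubro 2019");
              System.out.println(parsed);
          }
      }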

    • Defining a Decimal Field

      ($FieldName:Decimal$)
      Here, Decimal is the Data Type of the value associated with the Field Name. Ex: 165.5
    • Defining an IP Field

      ($FieldName:ip$)
      Here, IP is the Data Type of the value associated with the Field Name. It can be either an IPv4 or IPv6 value. 
      Ex: 192.0.2.1, 2001:0db8:85a3:0:0:8a2e:0370:7334
    • Defining a Word Field

      ($FieldName:word$)
      Here, Word is the Data Type of the value associated with the Field Name. Word is simply a subset of String, but the field should contain only one word. If more than one word exists, it should be defined as String.
    • Defining a Config field

      ($FieldName:config:@file$)
      Here, @file is the config type associated with the Field Name.
      Ex: @folder, @file, @ip, @host

      ($FieldName:config:@filepath$)
      Ex: C:\Program Files\cassandra\logs\server.log
      Here, if you mention @filepath, Site24x7 AppLogs takes the complete path of the file and inserts it into that field.
      If you want to capture only a specific folder from the path (Cassandra, for example), you can define the field as below:
      $FieldName:config:@filepath:2$
    • Defining a Pattern field

      This data type is exclusive to JSON files and is used to define the pattern for any of the JSON object values in the same log.
      Pattern 1:
      json $log:pattern:$RemoteHost$ $RemoteLogName$ $RemoteUser$ [$DateTimefield:date:dd/EEE/yyyy:HH:mm:ss$] $Method$ $RequestURI$ $Protocol$ $Status:number$ $ResponseSize:number$ $Referer$ $UserAgent$$ $stream$ $time$

      Here, the date field is inside the data type pattern field.

      Pattern 2:
      json $log:pattern:$RemoteHost$ $RemoteLogName$ $RemoteUser$ [$DateTimefield$] $Method$ $RequestURI$ $Protocol$ $Status:number$ $ResponseSize:number$ $Referer$ $UserAgent$$ $stream$ $time:date:yyyy-mm-dd'T'HH:mm:ss.SSS'Z'$

      Here, the date field is outside the data type pattern field.

      Sample log: 

      {"log":"172.21.163.159 - - [27/Jul/2020:19:53:11] GET /test.txt HTTP/1.1 200 12 - Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36","stream":"stdout","time":"2020-07-28T11:29:54.295671087Z"}
      {"log":"172.21.163.159 - - [27/Jul/2020:19:53:11] GET /test.txt HTTP/1.1 200 12 - Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36","stream":"stdout","time":"2020-07-28T11:29:54.295671087Z"}
      {"log":"172.21.163.159 - - [27/Jul/2020:19:53:11] GET /test.txt HTTP/1.1 200 12 - Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36","stream":"stdout","time":"2020-07-28T11:29:54.295671087Z"}

    • Escape Special Characters

      If the log line has special characters, use ESC(<special character>).

      Sample log with special characters:
      2022-01-12 22:00:29,793 GMT*16.2*Message
      2022-01-12 22:00:29,793 GMT*16.2*Message
      2022-01-12 22:00:29,793 GMT*16.2*Message

      Then the log pattern should be defined as:

      $Datetime:date:yyyy-MM-dd HH:mm:ss,S z$ESC(*)$Version$ESC(*)$Message$
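
      As a rough illustration only (not how the AppLogs parser works internally), the ESC(*) entries in that pattern mark literal * separators, and the fields between them could be pulled out with a regular expression like this:

      import java.util.regex.Matcher;
      import java.util.regex.Pattern;

      public class EscPatternSketch {
          public static void main(String[] args) {
              String line = "2022-01-12 22:00:29,793 GMT*16.2*Message";
              // Each field runs up to the next literal '*' separator.
              Matcher m = Pattern.compile("^([^*]+)\\*([^*]+)\\*(.*)$").matcher(line);
              if (m.matches()) {
                  System.out.println("Datetime: " + m.group(1)); // 2022-01-12 22:00:29,793 GMT
                  System.out.println("Version : " + m.group(2)); // 16.2
                  System.out.println("Message : " + m.group(3)); // Message
              }
          }
      }
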
  8. Send logs to AppLogs via an HTTPS endpoint.
  9. Field Configurations:

    • In the Sample Output table, hover over a field name, then click the Field Configuration settings icon that appears.
    • In the ThreadId - Field Configurations window that opens, choose the required field from the left pane and fill in the options on the right side.
    • Display Name: The name is automatically populated based on the field you choose from the left pane.
    • Unit for this Field: Choose a suitable unit. This option is for number fields only.
    • Toggle the following options to Yes if you wish to enable them. For all the conditions you choose here (except Ignore this Field), you can define the values in the Filter Log Lines option below:
      • Enable Groupby: Group the entries with the same value.
      • Character Length for Groupby: Specify the number of characters to be displayed in the Groupby query output. You can add up to 200 characters for a field.
      • Mark as Unique Field: Specify this field as a unique field if it contains unique values. You can find the particular unique field highlighted as a link in the AppLogs Search page. Click on the link to view the logs corresponding to that particular field.
      • Hide this Field from Search Result: Hide this particular field when you view the search results.
      • Enable Masking: Toggle to Yes to enable masking. Provide the expression for the data to be masked as a capture group in the regex, along with the mask string (see the sketch after this list).
      • Enable Hashing: Toggle to Yes to enable hashing. Provide the expression and include the data to be hashed as a capture group in the regex. Learn more about masking and hashing.

    Filter Log Lines at Source:

    If you have chosen options other than Ignore this Field in the step above, you can define the values under the Filter Log Lines option. This filter can be applied to more than one of the options above.

    • Select Log Lines only if this Field: Choose Matches or Doesn't Match based on how you want to add the value below.
    • Any of these Values: Enter a value for the specified condition.
    • Ignore this Field at Source: Toggle to Yes to ignore that particular field on the agent side itself, before uploading.
    • Click Apply.
  10. Click Save and associate the log type with a Log Profile. You can then start searching your logs.
    The storage duration for your logs is fixed at 30 days. To increase the log retention period, you can re-index logs. Learn more.
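
The masking option described in the Field Configurations step works on a regex capture group and a mask string. The snippet below is a minimal sketch of that idea only; the field, regex, and mask string are hypothetical, and this is not Site24x7's implementation.

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MaskSketch {
    public static void main(String[] args) {
        // Hypothetical example: mask the user part of an email address.
        // The capture group marks the data to be replaced by the mask string.
        Pattern p = Pattern.compile("(\\S+)@example\\.com");
        String line = "login failed for alice@example.com from 192.0.2.1";
        Matcher m = p.matcher(line);
        StringBuilder masked = new StringBuilder();
        int last = 0;
        while (m.find()) {
            masked.append(line, last, m.start(1)).append("*****");
            last = m.end(1);
        }
        masked.append(line.substring(last));
        System.out.println(masked); // login failed for *****@example.com from 192.0.2.1
    }
}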

Sample log patterns:

The following log patterns are supported:

  • Log pattern with space-separated fields:

In this pattern, all the fields are simply separated by spaces.

Sample log:
3489 M 04 Mar 09:13:40.537 # WARNING: The TCP backlog setting of 511 cannot be enforced

Log pattern:

$PID:number$ $Role$ $DateTime:date$ $LogLevel$ $Message$
Field Name   Field Value
PID          3489
Role         M
DateTime     04 Mar 09:13:40.537
LogLevel     #
Message      WARNING: The TCP backlog setting of 511 cannot be enforced
  • Log pattern with some default character:

In cases where a fixed letter or word is repeated in every log line, it can be excluded by mentioning it literally in the log pattern. Characters such as [, ], *, and : are excluded in the example below.

Sample log:
2017/08/01 01:05:50 [error] 28148#1452: *154 FastCGI sent in stderr

Log pattern:

$DateTime:date$ [$LogLevel$] $ProcessId:number$#$ThreadId:number$: *$UniqueId:number$ $Message$
Field Name   Field Value
DateTime     2017/08/01 01:05:50
LogLevel     error
ProcessId    28148
ThreadId     1452
UniqueId     154
Message      FastCGI sent in stderr
  • Log pattern with custom date format:

If the sample log contains a different date format, specify the exact date format in the log pattern.

Sample log: 
demo_user demo_db 192.168.22.10 58241 2018-01-08 11:58:23 AEDT FATAL: no pghba.conf entry for host

Log pattern: 

$User$ $DB$ $RemoteIP$ $PID$ $DateTime:date:yyyy-MM-dd HH:mm:ss z$ $LogLevel$: $Message$
Field Name   Field Value
User         demo_user
DB           demo_db
RemoteIP     192.168.22.10
PID          58241
DateTime     2018-01-08 11:58:23 AEDT
LogLevel     FATAL
Message      no pghba.conf entry for host
  • Log pattern with some field exclusion:

In some cases, not all log lines have the same number of fields: some lines may have five fields while others have four. In such cases, enclose the field that may be missing between '!' symbols in the log pattern. In the following example, "ProcessId" is missing in the second log line, so that portion is marked as optional in the log pattern.

Sample log: 
Aug  7 07:35:02 log-host systemd[1]: Stopping CUPS Scheduler
Aug  7 08:40:02 log-host kernel: 817216.167300] audit: type=1400

Log pattern: 

$DateTime:date$ $Host$ $Application$![$ProcessId$]!: $Message$
Field Name   Field Value - Line 1       Field Value - Line 2
DateTime     Aug  7 07:35:02            Aug  7 08:40:02
Host         log-host                   log-host
Application  systemd                    kernel
ProcessId    1                          -
Message      Stopping CUPS Scheduler    817216.167300] audit: type=1400
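
As a rough illustration of how the second sample pattern above (the one with default characters) maps onto field extraction, the snippet below pulls the same fields out of that log line with an equivalent regular expression; this is only a sketch, not the AppLogs engine.

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class DefaultCharPatternSketch {
    public static void main(String[] args) {
        String line = "2017/08/01 01:05:50 [error] 28148#1452: *154 FastCGI sent in stderr";
        // The literal [, ], #, : and * characters from the log pattern are kept
        // as literals here as well.
        Pattern p = Pattern.compile(
                "^(?<DateTime>\\S+ \\S+) \\[(?<LogLevel>\\w+)\\] (?<ProcessId>\\d+)#(?<ThreadId>\\d+): \\*(?<UniqueId>\\d+) (?<Message>.*)$");
        Matcher m = p.matcher(line);
        if (m.matches()) {
            System.out.println(m.group("LogLevel") + " pid=" + m.group("ProcessId")
                    + " tid=" + m.group("ThreadId") + " -> " + m.group("Message"));
        }
    }
}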


