FortiGate Azure Technical Learning
YaseAraf
Staff

A guide to sending your logs from FortiAnalyzer to Microsoft Sentinel using the Azure Monitor Agent (AMA).

 

 

Introduction

 

Forwarding logs to FortiAnalyzer (FAZ) or a dedicated logging server is a widely recommended best practice to ensure centralized visibility, efficient monitoring, and enhanced threat analysis. However, some clients may require forwarding these logs to additional centralized hubs, such as Microsoft Sentinel, for further integration with their broader SIEM solutions. This dual-forwarding approach provides redundancy, advanced analytics, and supports diverse compliance or operational needs.

This guide presents three distinct scenarios for integrating FortiAnalyzer with Microsoft Sentinel, leveraging the following methods:

 

  • Azure Monitor Agent (AMA)
  • Log Ingestion API with Fluent Bit
  • Fluentd Plugin

Each approach is designed to meet specific use cases, ensuring seamless log forwarding and enhanced visibility within your security ecosystem.

 

Azure Monitor Agent (AMA)

 

Data Flow

 

To ingest Syslog messages from FortiAnalyzer into Microsoft Sentinel, a dedicated Linux machine is configured to serve as a proxy server for log collection and forwarding to the Microsoft Sentinel workspace.

The Linux machine is structured with two key components:

  • Syslog Daemon (Log Collector): Using either rsyslog or syslog-ng, this daemon performs two functions:

    • Actively listens for Syslog messages originating from FortiAnalyzer on TCP/UDP port 514.
    • Sends the logs to the Azure Monitor Agent (AMA) on localhost, over TCP port 28330.
  • Azure Monitor Agent (AMA): The agent parses the logs and then sends them to your Microsoft Sentinel (Log Analytics) workspace over HTTPS (TCP port 443).

This setup also requires a Data Collection Rule (DCR) to define:

  • The Linux machine as a resource for log collection.
  • The Syslog table in the Log Analytics workspace as the destination for the collected logs.
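For reference, the collector-side rsyslog configuration that the AMA installer script generates is conceptually similar to the sketch below; the exact file path and directives vary by distribution and installer version:

```conf
# Sketch only: listen for FortiAnalyzer syslog on TCP/UDP 514
module(load="imtcp")
module(load="imudp")
input(type="imtcp" port="514")
input(type="imudp" port="514")

# Relay everything received to the Azure Monitor Agent on localhost
*.* @@127.0.0.1:28330
```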

For more details, please review this link.

FAZ-AMA-DataFlow.png

 

Deployment and Setup

 

Prerequisites

  • Log Analytics Workspace link.
  • Microsoft Sentinel onboarded with the Log Analytics Workspace link.
  • Dedicated Linux VM link.
  • FortiGate with FortiAnalyzer Integration (optional) link.

Deployment Steps

 

Step 1: Install Syslog Data Connector

  • Navigate to Microsoft Sentinel workspace > Content management > Content hub.
  • Search for 'Syslog' and install it. This deploys the Syslog via AMA data connector.

    syslog-DataConnector.png

     

     

  • Open the connector page for Syslog via AMA.

    Syslog-via-AMA-page.png

     

     

Step 2: Create a DCR (if you don't already have one)

  • Use the same location as your Log Analytics workspace.
  • Add the Linux machine as a resource.
  • Collect facility log_local7 and set the minimum log level to be collected.

    create-dcr1.png

     

    create-dcr2.png

     

    create-dcr3.png

     

Below is an example ARM template for the DCR configuration:


{
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "dataCollectionRules_ya_dcr_syslog_name": {
            "defaultValue": "ya-dcr-syslog",
            "type": "String"
        },
        "workspaces_ya_ama_externalid": {
            "defaultValue": "/subscriptions/xxxxxxxxxxxxxxxxxx/resourceGroups/ya-faz-sentinel-ama/providers/Microsoft.OperationalInsights/workspaces/ya-ama",
            "type": "String"
        }
    },
    "variables": {},
    "resources": [
        {
            "type": "Microsoft.Insights/dataCollectionRules",
            "apiVersion": "2023-03-11",
            "name": "[parameters('dataCollectionRules_ya_dcr_syslog_name')]",
            "location": "westeurope",
            "tags": {
                "createdBy": "Sentinel"
            },
            "kind": "Linux",
            "properties": {
                "dataSources": {
                    "syslog": [
                        {
                            "streams": [
                                "Microsoft-Syslog"
                            ],
                            "facilityNames": [
                                "local7"
                            ],
                            "logLevels": [
                                "Notice",
                                "Warning",
                                "Error",
                                "Critical",
                                "Alert",
                                "Emergency"
                            ],
                            "name": "sysLogsDataSource-1039681479"
                        },
                        {
                            "streams": [
                                "Microsoft-Syslog"
                            ],
                            "facilityNames": [
                                "nopri"
                            ],
                            "logLevels": [
                                "Emergency"
                            ],
                            "name": "sysLogsDataSource-1697966155"
                        }
                    ]
                },
                "destinations": {
                    "logAnalytics": [
                        {
                            "workspaceResourceId": "[parameters('workspaces_ya_ama_externalid')]",
                            "name": "DataCollectionEvent"
                        }
                    ]
                },
                "dataFlows": [
                    {
                        "streams": [
                            "Microsoft-Syslog"
                        ],
                        "destinations": [
                            "DataCollectionEvent"
                        ]
                    }
                ]
            }
        }
    ]
}
 

Step 3: Install the Log Collector on Linux

Run the following command to install and configure the log collector:

sudo wget -O Forwarder_AMA_installer.py https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/DataConnectors/Syslog/Forwarder_AMA_installer.py && sudo python3 Forwarder_AMA_installer.py
  
 

Step 4: Configure FortiAnalyzer

After completing the setup on the Linux VM, configure your FortiAnalyzer device to forward Syslog messages over TCP port 514. Use the following settings:

  config system log-forward
      edit 1
          set mode forwarding
          set fwd-max-delay realtime
          set server-name "linux syslog"
          set server-addr "Linux VM IP address"
          set fwd-server-type syslog
          set fwd-reliable enable
          set fwd-facility local7
          set signature 6581725315585679982
      next
  end
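The facility and minimum level chosen in the DCR and the fwd-facility set above meet in the syslog PRI value, computed as PRI = facility × 8 + severity. As a quick sanity check, local7 (facility 23) at level notice (severity 5) gives PRI 189:

```shell
# Decode a syslog PRI value into its facility and severity numbers.
# local7 (23) and notice (5) give PRI 189, i.e. the <189> prefix seen
# on FortiAnalyzer-forwarded messages.
pri=189
facility=$((pri / 8))   # 189 / 8 = 23 -> local7
severity=$((pri % 8))   # 189 % 8 = 5  -> notice
echo "facility=$facility severity=$severity"
```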
  

 

Validation and Troubleshooting

 

  • From the FortiAnalyzer CLI, use the following command to verify the log forwarding status:
diagnose test application logfwd 4
Visit the link for more details.
  • Restart rsyslog
sudo systemctl restart rsyslog
 
  • Validate that the syslog daemon is running on the TCP port and that the AMA is listening, by reviewing the configuration file /etc/rsyslog.conf. After verification, use the following command to confirm:
netstat -lnptv
port-validation-ama.png

 

  • Use this command to capture messages sent from a logger or connected device:
tcpdump -i any port 514 -A -vv &
       After completing the validation, stop the tcpdump process by typing fg followed by Ctrl+C.
  • Run the Sentinel troubleshooting script to check whether the connector is installed correctly:
sudo wget -O Sentinel_AMA_troubleshoot.py https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/DataConnectors/Syslog/Sentinel_AMA_troubleshoot.py&&sudo python3 Sentinel_AMA_troubleshoot.py --SYSLOG
 

troubleshooting-ama.png

 

  •  Confirm that the Data Collection Rule (DCR) is correctly assigned and that logs are being ingested into the Syslog table.
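For example, a quick query in the Log Analytics workspace confirms ingestion, assuming the facility configured earlier (local7):

```
Syslog
| where Facility == "local7"
| sort by TimeGenerated desc
| take 10
```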

syslog-dataconnector-validation.png

 

 

 

syslog.png

 

  • From the Azure portal, navigate to the DCR's Monitoring > Metrics section and set the metric to "Log Ingestion per Minute" to validate the log flow.

    DCR-metrics.png

     

    For additional details on DCR validation, review the link.

    You can visit the official Microsoft Sentinel documentation link to explore more about syslog ingestion with Microsoft Sentinel via AMA.

Log Ingestion API with Fluent Bit

 

Data Flow

 

Fluent Bit is a lightweight, open-source telemetry agent designed to efficiently collect, process, and forward logs, metrics, and traces with minimal resource usage and seamless ecosystem integration. Learn more

The Azure Logs Ingestion plugin allows Fluent Bit to send logs to Azure Sentinel via the Logs Ingestion API, directing data to supported Azure tables or custom tables you define. More details here.

To integrate FortiAnalyzer with Sentinel via Logs Ingestion API, install Fluent Bit on a dedicated Linux machine and ensure the following components are configured (in addition to a Log Analytics Workspace):

  • Data Collection Endpoint (DCE): defines where and how telemetry data, like logs and metrics, is sent for processing and ingestion into Azure services. It acts as a connection point for data collection.

  • Data Collection Rule (DCR): specifies how data should be collected, transformed, and sent to a destination, such as Log Analytics workspaces or storage. It defines the data sources, destinations, and any processing rules applied to the incoming data.

Once Fluent Bit receives logs from FortiAnalyzer via the syslog daemon, it forwards the logs to the Data Collection Endpoint (DCE) using HTTPS requests. The incoming data is then processed and transformed based on the configurations defined in the Data Collection Rule (DCR) before being ingested into the destination, such as a Log Analytics Workspace.
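Each of those HTTPS requests targets an endpoint of a fixed shape built from the DCE URI, the DCR immutable ID, and the stream name; the values below are placeholders for illustration:

```shell
# Assemble the Logs Ingestion API endpoint URL (placeholder values).
dce_url="https://my-dce.westeurope-1.ingest.monitor.azure.com"
dcr_id="dcr-00000000000000000000000000000000"   # DCR immutable ID
stream="Custom-fazsyslog_CL"                    # stream declared in the DCR
url="${dce_url}/dataCollectionRules/${dcr_id}/streams/${stream}?api-version=2023-01-01"
echo "$url"
```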

For further details about the Log Ingestion API, visit the following link.

 

FAZ-Fluentbit-Dataflow.png

 

Deployment and Setup

 

Prerequisites

  • Log Analytics Workspace link.
  • Microsoft Sentinel added to Log Analytics Workspace link.
  • A Microsoft Entra application to authenticate against the API link and:
    • A service principal on the Microsoft Entra application
    • A secret for the Microsoft Entra application
  • A data collection endpoint (DCE) in the same region as the Log Analytics workspace, to receive data link.
  • Grant the app Contributor permissions on:
    • The Log Analytics workspace
    • The resource group for data collection rules
    • The resource group for data collection endpoints

You can use a PowerShell script to create and configure the previous requirements link.

  • Dedicated Linux VM link.
  • FortiGate with FortiAnalyzer Integration (optional) link.

Deployment Steps

 

Step 1: Create a DCR and a Custom Table (DCR-based)

  • Navigate to Log Analytics Workspace > Settings > Tables, then select Create > New custom log (DCR-based).

 

customtable1.png

  • Create a new DCR and assign it to your custom table.

    create-dcr-from-customtable.png

     

  • Use the sample file to define the schema for your custom table. Ensure the sample aligns with the structure of the Syslog messages forwarded from FortiAnalyzer. The sample below conforms to RFC 5424.
{
  "pri": "189",
  "host": "172.19.0.4",
  "app": "-",
  "pid": "-",
  "msgid": "-",
  "message": "- logver=706003401 timestamp=1734059922 devname=\"ya-fgt\" devid=\"FGVM4VTM24000495\" vd=\"root\" date=2024-12-13 time=03:18:42 eventtime=1734088722709530851 tz=\"-0800\" logid=\"0001000014\" type=\"traffic\" subtype=\"local\" level=\"notice\" srcip=172.19.0.4 srcport=7634 srcintf=\"root\" srcintfrole=\"undefined\" dstip=168.63.129.16 dstport=32526 dstintf=\"port1\" dstintfrole=\"undefined\" srccountry=\"Reserved\" dstcountry=\"United States\" sessionid=1391 proto=6 action=\"close\" policyid=0 service=\"tcp/32526\" trandisp=\"noop\" app=\"tcp/32526\" duration=1 sentbyte=2662 rcvdbyte=351 sentpkt=7 rcvdpkt=4"
}
  • Apply transformations to extract columns from the message field.
source
| extend
    Date = extract(@"date=(\S+)", 1, message),
    Time = extract(@"time=(\S+)", 1, message),
    EventTime = extract(@"eventtime=(\S+)", 1, message),
    Timestamp = extract(@"timestamp=(\d+)", 1, message),
    LogId = extract(@"logid=""([^""]+)""", 1, message),
    DeviceName = extract(@"devname=""([^""]+)""", 1, message),
    DeviceId = extract(@"devid=""([^""]+)""", 1, message),
    Vd = extract(@"vd=""([^""]+)""", 1, message),
    Tz = extract(@"tz=""([^""]+)""", 1, message),
    LogType = extract(@"type=""([^""]+)""", 1, message),
    Subtype = extract(@"subtype=""([^""]+)""", 1, message),
    Level = extract(@"level=""([^""]+)""", 1, message),
    SrcIp = extract(@"srcip=(\d+\.\d+\.\d+\.\d+)", 1, message),
    SrcPort = extract(@"srcport=(\d+)", 1, message),
    SrcIntf = extract(@"srcintf=""([^""]+)""", 1, message),
    SrcIntfRole = extract(@"srcintfrole=""([^""]+)""", 1, message),
    DstIp = extract(@"dstip=(\d+\.\d+\.\d+\.\d+)", 1, message),
    DstPort = extract(@"dstport=(\d+)", 1, message),
    DstIntf = extract(@"dstintf=""([^""]+)""", 1, message),
    DstIntfRole = extract(@"dstintfrole=""([^""]+)""", 1, message),
    SrcCountry = extract(@"srccountry=""([^""]+)""", 1, message),
    DstCountry = extract(@"dstcountry=""([^""]+)""", 1, message),
    SessionId = extract(@"sessionid=(\d+)", 1, message),
    Proto = extract(@"proto=(\d+)", 1, message),
    Action = extract(@"action=""([^""]+)""", 1, message),
    PolicyId = extract(@"policyid=(\d+)", 1, message),
    Service = extract(@"service=""([^""]+)""", 1, message),
    TranDisp = extract(@"trandisp=""([^""]+)""", 1, message),
    App = extract(@"app=""([^""]+)""", 1, message),
    Duration = extract(@"duration=(\d+)", 1, message),
    SentByte = extract(@"sentbyte=(\d+)", 1, message),
    RcvdByte = extract(@"rcvdbyte=(\d+)", 1, message),
    SentPkt = extract(@"sentpkt=(\d+)", 1, message),
    RcvdPkt = extract(@"rcvdpkt=(\d+)", 1, message)
| project
TimeGenerated, Date, Time, EventTime, Timestamp, LogId, DeviceName, DeviceId, Vd, Tz, LogType, Subtype, Level, SrcIp, SrcPort, SrcIntf, SrcIntfRole, DstIp, DstPort, DstIntf, DstIntfRole, SrcCountry, DstCountry, SessionId, Proto, Action, PolicyId, Service, TranDisp, App, Duration, SentByte, RcvdByte, SentPkt, RcvdPkt
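The extract() calls above all follow the same key=value pattern, so a single forwarded line can also be spot-checked with standard shell tools; the line below is a trimmed sample for illustration only:

```shell
# Pull two fields out of a FortiGate key=value log line with sed.
line='srcip=172.19.0.4 srcport=7634 dstip=168.63.129.16 action="close" level="notice"'
srcip=$(printf '%s\n' "$line" | sed -n 's/.*srcip=\([0-9.]*\).*/\1/p')
action=$(printf '%s\n' "$line" | sed -n 's/.*action="\([^"]*\)".*/\1/p')
echo "srcip=$srcip action=$action"
```

For the sample line this prints srcip=172.19.0.4 action=close.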

customtable-transformation-editor.png

 

 

  • You can create multiple custom tables attached to the same DCR. An example ARM template for such a DCR:
{
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "dataCollectionRules_ya_dcr_fazsyslog_name": {
            "defaultValue": "ya-dcr-fazsyslog",
            "type": "String"
        },
        "dataCollectionEndpoints_ya_dce_log_ingestion_externalid": {
            "defaultValue": "/subscriptions/xxxxxxxxxxxxxxxxxx/resourceGroups/ya-faz-fluentbit/providers/Microsoft.Insights/dataCollectionEndpoints/ya-dce-log-ingestion",
            "type": "String"
        },
        "workspaces_ya_faz_fluentbit_externalid": {
            "defaultValue": "/subscriptions/xxxxxxxxxxxxxxxxxxx/resourceGroups/ya-faz-fluentbit/providers/microsoft.operationalinsights/workspaces/ya-faz-fluentbit",
            "type": "String"
        }
    },
    "variables": {},
    "resources": [
        {
            "type": "Microsoft.Insights/dataCollectionRules",
            "apiVersion": "2023-03-11",
            "name": "[parameters('dataCollectionRules_ya_dcr_fazsyslog_name')]",
            "location": "westeurope",
            "identity": {
                "type": "SystemAssigned"
            },
            "properties": {
                "dataCollectionEndpointId": "[parameters('dataCollectionEndpoints_ya_dce_log_ingestion_externalid')]",
                "streamDeclarations": {
                    "Custom-fazsyslog_CL": {
                        "columns": [
                            {
                                "name": "TenantId",
                                "type": "string"
                            },
                            {
                                "name": "SourceSystem",
                                "type": "string"
                            },
                            {
                                "name": "TimeGenerated",
                                "type": "datetime"
                            },
                            {
                                "name": "Computer",
                                "type": "string"
                            },
                            {
                                "name": "EventTime_UTC",
                                "type": "datetime"
                            },
                            {
                                "name": "Facility",
                                "type": "string"
                            },
                            {
                                "name": "HostName",
                                "type": "string"
                            },
                            {
                                "name": "SeverityLevel",
                                "type": "string"
                            },
                            {
                                "name": "SyslogMessage",
                                "type": "string"
                            },
                            {
                                "name": "HostIP",
                                "type": "string"
                            },
                            {
                                "name": "MG",
                                "type": "string"
                            },
                            {
                                "name": "CollectorHostName",
                                "type": "string"
                            },
                            {
                                "name": "Type",
                                "type": "string"
                            },
                            {
                                "name": "_ResourceId",
                                "type": "string"
                            }
                        ]
                    },
                    "Custom-faztest_CL": {
                        "columns": [
                            {
                                "name": "TimeGenerated",
                                "type": "datetime"
                            },
                            {
                                "name": "pri",
                                "type": "string"
                            },
                            {
                                "name": "host",
                                "type": "string"
                            },
                            {
                                "name": "app",
                                "type": "string"
                            },
                            {
                                "name": "pid",
                                "type": "string"
                            },
                            {
                                "name": "msgid",
                                "type": "string"
                            },
                            {
                                "name": "message",
                                "type": "string"
                            }
                        ]
                    },
                    "Custom-faztransform_CL": {
                        "columns": [
                            {
                                "name": "TimeGenerated",
                                "type": "datetime"
                            },
                            {
                                "name": "pri",
                                "type": "string"
                            },
                            {
                                "name": "host",
                                "type": "string"
                            },
                            {
                                "name": "app",
                                "type": "string"
                            },
                            {
                                "name": "pid",
                                "type": "string"
                            },
                            {
                                "name": "msgid",
                                "type": "string"
                            },
                            {
                                "name": "message",
                                "type": "string"
                            }
                        ]
                    },
                    "Custom-table1_CL": {
                        "columns": [
                            {
                                "name": "TimeGenerated",
                                "type": "datetime"
                            },
                            {
                                "name": "pri",
                                "type": "string"
                            },
                            {
                                "name": "host",
                                "type": "string"
                            },
                            {
                                "name": "app",
                                "type": "string"
                            },
                            {
                                "name": "pid",
                                "type": "string"
                            },
                            {
                                "name": "msgid",
                                "type": "string"
                            },
                            {
                                "name": "message",
                                "type": "string"
                            }
                        ]
                    }
                },
                "dataSources": {},
                "destinations": {
                    "logAnalytics": [
                        {
                            "workspaceResourceId": "[parameters('workspaces_ya_faz_fluentbit_externalid')]",
                            "name": "4c11d0df4293420da6212e470364eaae"
                        }
                    ]
                },
                "dataFlows": [
                    {
                        "streams": [
                            "Custom-fazsyslog_CL"
                        ],
                        "destinations": [
                            "4c11d0df4293420da6212e470364eaae"
                        ],
                        "transformKql": "source\n| where SyslogMessage startswith \"logver=\"\n| extend \n    SourceIP = extract(@\"srcip=(\\d+\\.\\d+\\.\\d+\\.\\d+)\", 1, SyslogMessage),\n    DestinationIP = extract(@\"dstip=(\\d+\\.\\d+\\.\\d+\\.\\d+)\", 1, SyslogMessage),\n    SourcePort = extract(@\"srcport=(\\d+)\", 1, SyslogMessage),\n    DestinationPort = extract(@\"dstport=(\\d+)\", 1, SyslogMessage),\n    DeviceId = extract(@\"devid=\"\"([^\"\"]+)\"\"\", 1, SyslogMessage),\n    Severity = extract(@\"level=\"\"([^\"\"]+)\"\"\", 1, SyslogMessage)\n| project TimeGenerated,SourceIP, DestinationIP, SourcePort, DestinationPort, DeviceId, Severity\n",
                        "outputStream": "Custom-fazsyslog_CL"
                    },
                    {
                        "streams": [
                            "Custom-faztest_CL"
                        ],
                        "destinations": [
                            "4c11d0df4293420da6212e470364eaae"
                        ],
                        "transformKql": "source | extend TimeGenerated = now()",
                        "outputStream": "Custom-faztest_CL"
                    },
                    {
                        "streams": [
                            "Custom-faztransform_CL"
                        ],
                        "destinations": [
                            "4c11d0df4293420da6212e470364eaae"
                        ],
                        "transformKql": "source\n| extend TimeGenerated = now(),\n    DestinationIP = extract(@\"dstip=(\\d+\\.\\d+\\.\\d+\\.\\d+)\", 1, message),\n    SourcePort = extract(@\"srcport=(\\d+)\", 1, message)\n",
                        "outputStream": "Custom-faztransform_CL"
                    },
                    {
                        "streams": [
                            "Custom-table1_CL"
                        ],
                        "destinations": [
                            "4c11d0df4293420da6212e470364eaae"
                        ],
                        "transformKql": "source\n| extend\n    Date = extract(@\"ate=(\\S+)\", 1, message),\n    Time = extract(@\"time=(\\S+)\", 1, message),\n    EventTime = extract(@\"eventtime=(\\S+)\", 1, message),\n    Timestamp = extract(@\"timestamp=(\\d+)\", 1, message),\n    LogId = extract(@\"logid=\"\"([^\"\"]+)\"\"\", 1, message),\n    DeviceName = extract(@\"devname=\"\"([^\"\"]+)\"\"\", 1, message),\n    DeviceId = extract(@\"devid=\"\"([^\"\"]+)\"\"\", 1, message),\n    Vd = extract(@\"vd=\"\"([^\"\"]+)\"\"\", 1, message),\n    Tz = extract(@\"tz=\"\"([^\"\"]+)\"\"\", 1, message),\n    LogType = extract(@\"type=\"\"([^\"\"]+)\"\"\", 1, message),\n    Subtype = extract(@\"subtype=\"\"([^\"\"]+)\"\"\", 1, message),\n    Level = extract(@\"level=\"\"([^\"\"]+)\"\"\", 1, message),\n    SrcIp = extract(@\"srcip=(\\d+\\.\\d+\\.\\d+\\.\\d+)\", 1, message),\n    SrcPort = extract(@\"srcport=(\\d+)\", 1, message),\n    SrcIntf = extract(@\"srcintf=\"\"([^\"\"]+)\"\"\", 1, message),\n    SrcIntfRole = extract(@\"srcintfrole=\"\"([^\"\"]+)\"\"\", 1, message),\n    DstIp = extract(@\"dstip=(\\d+\\.\\d+\\.\\d+\\.\\d+)\", 1, message),\n    DstPort = extract(@\"dstport=(\\d+)\", 1, message),\n    DstIntf = extract(@\"dstintf=\"\"([^\"\"]+)\"\"\", 1, message),\n    DstIntfRole = extract(@\"dstintfrole=\"\"([^\"\"]+)\"\"\", 1, message),\n    SrcCountry = extract(@\"srccountry=\"\"([^\"\"]+)\"\"\", 1, message),\n    DstCountry = extract(@\"dstcountry=\"\"([^\"\"]+)\"\"\", 1, message),\n    SessionId = extract(@\"sessionid=(\\d+)\", 1, message),\n    Proto = extract(@\"proto=(\\d+)\", 1, message),\n    Action = extract(@\"action=\"\"([^\"\"]+)\"\"\", 1, message),\n    PolicyId = extract(@\"policyid=(\\d+)\", 1, message),\n    Service = extract(@\"service=\"\"([^\"\"]+)\"\"\", 1, message),\n    TranDisp = extract(@\"trandisp=\"\"([^\"\"]+)\"\"\", 1, message),\n    App = extract(@\"app=\"\"([^\"\"]+)\"\"\", 1, message),\n    Duration = extract(@\"duration=(\\d+)\", 1, 
message),\n    SentByte = extract(@\"sentbyte=(\\d+)\", 1, message),\n    RcvdByte = extract(@\"rcvdbyte=(\\d+)\", 1, message),\n    SentPkt = extract(@\"sentpkt=(\\d+)\", 1, message),\n    RcvdPkt = extract(@\"rcvdpkt=(\\d+)\", 1, message)\n| project\n    TimeGenerated,\n    Date,\n    Time,\n    EventTime,\n    Timestamp,\n    LogId,\n    DeviceName,\n    DeviceId,\n    Vd,\n    Tz,\n    LogType,\n    Subtype,\n    Level,\n    SrcIp,\n    SrcPort,\n    SrcIntf,\n    SrcIntfRole,\n    DstIp,\n    DstPort,\n    DstIntf,\n    DstIntfRole, \n    SrcCountry,\n    DstCountry,\n    SessionId,\n    Proto,\n    Action,\n    PolicyId,\n    Service,\n    TranDisp,\n    App,\n    Duration,\n    SentByte,\n    RcvdByte,\n    SentPkt,\n    RcvdPkt\n\n",
                        "outputStream": "Custom-table1_CL"
                    }
                ]
            }
        }
    ]
}

 

 

Step 2: Configure Access Control

 

  • Navigate to Access Control (IAM) section for the DCR.
  • Add role assignment.
  • Select: Monitoring Metrics Publisher > Next.
  • Select User, group, or service principal for Assign access to and choose Select members.
  • Choose the application that you created and then click Select to confirm assignment.

Step 3: Install and configure Fluent Bit on the Linux VM

 

  • Run the installation script for the latest version:
curl https://raw.githubusercontent.com/fluent/fluent-bit/master/install.sh | sh
 
  • Start Fluent-Bit
sudo systemctl start fluent-bit
 
  • Update apt database
sudo apt-get update

Refer to the Fluent Bit Installation Guide for more details link.

  • Edit the parsers.conf file and add a parser for syslog RFC 5424:
sudo nano /etc/fluent-bit/parsers.conf
 
[PARSER]
  Name         mysyslog-rfc5424
  Format       regex
  Regex        ^<(?<pri>[0-9]+)>1 (?<time>[^ ]+) (?<host>[^ ]+) (?<app>[^\s]+) (?<pid>[^\s]+) (?<msgid>[^\s]+) (?<extradata>[^\]]*\])?(?<message>.+)$
  Time_Key     time
  Time_Format  %Y-%m-%dT%H:%M:%S%z
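To sanity-check the field order the parser expects (PRI, version, timestamp, host, app, pid, msgid, then the message), a header can be split positionally; the message below is a shortened, hypothetical example:

```shell
# Split an RFC 5424 header into a couple of its positional fields.
msg='<189>1 2024-12-13T03:18:42Z 172.19.0.4 - - - - logver=706003401'
pri=$(printf '%s\n' "$msg" | sed -n 's/^<\([0-9]*\)>1 .*/\1/p')
host=$(printf '%s\n' "$msg" | awk '{print $3}')
echo "pri=$pri host=$host"
```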
 

 

  • Edit the fluent-bit.conf file and configure Fluent Bit to forward logs to the Log Analytics workspace:

    sudo nano /etc/fluent-bit/fluent-bit.conf
    
     
The screenshots below illustrate the required configuration parameters for client_id, client_secret, dce_url, and dcr_id.

 

app-registeration.png

 

client-secret.png

 

dce-url.png

 

dcr-id.png

 

  • You can find below an example configuration file:
[INPUT]
    Name   syslog
    Mode   udp
    Listen 0.0.0.0
    Port   514
    Parser mysyslog-rfc5424
    Tag    faz

[OUTPUT]
    Name            azure_logs_ingestion
    Match           faz
    client_id       **************************
    client_secret   **************************
    tenant_id       **************************
    dce_url         https://ya-dce-log-ingestion-ebl4.westeurope-1.ingest.monitor.azure.com
    dcr_id          dcr-f43cbb987d6c45efa8319f5d0c0c1aee
    table_name      table1_CL
    time_generated  true
    time_key        TimeGenerated
    Compress        true
Step 4: Configure FortiAnalyzer to forward logs to the Fluent Bit Linux machine
 
config system log-forward
    edit 1
        set mode forwarding
        set fwd-max-delay realtime
        set server-name "dce"
        set server-addr "fluent-bit IP address"
        set server-port 514
        set fwd-server-type syslog
        set fwd-syslog-format rfc-5424
        set signature 3799479601930374274
    next
end
 

For more details about log ingestion API deployment, refer to the Azure Documentation link

 

Validation and Troubleshooting

  • Start Fluent Bit
sudo systemctl start fluent-bit
 
  • Check Fluent Bit status
systemctl status fluent-bit
 

The output should be similar to this:


 fluent-bit.service - Fluent Bit
     Loaded: loaded (/usr/lib/systemd/system/fluent-bit.service; disabled; preset: enabled)
     Active: active (running) since Thu 2024-12-05 10:27:05 UTC; 43min ago
       Docs: https://docs.fluentbit.io/manual/
   Main PID: 1903 (fluent-bit)
      Tasks: 4 (limit: 19120)
     Memory: 16.0M (peak: 18.5M)
        CPU: 1.423s
     CGroup: /system.slice/fluent-bit.service
             └─1903 /opt/fluent-bit/bin/fluent-bit -c //etc/fluent-bit/fluent-bit.conf

Dec 05 11:10:30 ya-fluentbit fluent-bit[1903]: [0] cpu.local: [[1733397029.646640947, {}], {"cpu_p"=>0.000000, "user_p"=>0.000000, "system_p"=>0.000000, "cpu0.p_cpu"=>0.000000, "cpu0.p_user"=>0.000000, "cpu0.p_system"=>0.000000, "cpu1.p>
Dec 05 11:10:31 ya-fluentbit fluent-bit[1903]: [0] cpu.local: [[1733397030.646687865, {}], {"cpu_p"=>0.000000, "user_p"=>0.000000, "system_p"=>0.000000, "cpu0.p_cpu"=>0.000000, "cpu0.p_user"=>0.000000, "cpu0.p_system"=>0.000000, "cpu1.p>
Dec 05 11:10:32 ya-fluentbit fluent-bit[1903]: [0] cpu.local: [[1733397031.646783683, {}], {"cpu_p"=>0.000000, "user_p"=>0.000000, "system_p"=>0.000000, "cpu0.p_cpu"=>0.000000, "cpu0.p_user"=>0.000000, "cpu0.p_system"=>0.000000, "cpu1.p>
Dec 05 11:10:33 ya-fluentbit fluent-bit[1903]: [0] cpu.local: [[1733397032.646658300, {}], {"cpu_p"=>0.000000, "user_p"=>0.000000, "system_p"=>0.000000, "cpu0.p_cpu"=>0.000000, "cpu0.p_user"=>0.000000, "cpu0.p_system"=>0.000000, "cpu1.p>
Dec 05 11:10:34 ya-fluentbit fluent-bit[1903]: [0] cpu.local: [[1733397033.646643817, {}], {"cpu_p"=>0.000000, "user_p"=>0.000000, "system_p"=>0.000000, "cpu0.p_cpu"=>0.000000, "cpu0.p_user"=>0.000000, "cpu0.p_system"=>0.000000, "cpu1.p>
Dec 05 11:10:35 ya-fluentbit fluent-bit[1903]: [0] cpu.local: [[1733397034.646651935, {}], {"cpu_p"=>0.250000, "user_p"=>0.000000, "system_p"=>0.250000, "cpu0.p_cpu"=>0.000000, "cpu0.p_user"=>0.000000, "cpu0.p_system"=>0.000000, "cpu1.p>
 
  • Restart Fluent Bit:
sudo systemctl restart fluent-bit
 
  • Tail the service logs for troubleshooting:
sudo journalctl -u fluent-bit -f
 

 

  • Table validation from the Log Analytics workspace:

table-validation-log-ingestion-api.png

 

  • DCR Metrics Validation

     

dcr-metrics-log-ingestion-api.png

 

Fluentd Plugin

 

Data Flow

 

Please note that Microsoft has announced the deprecation of the HTTP Data Collector API. This API will no longer function as of September 14, 2026. As a result, Fluentd integration scenarios relying on this API will also cease to function on the same date. The recommended replacement is the Logs Ingestion API, which offers enhanced capabilities for log integration moving forward.

Starting with version 7.4.0, FortiAnalyzer supports log forwarding to a Log Analytics workspace and other public cloud services through Fluentd. You can visit the link for more details.

FortiAnalyzer seamlessly integrates with Microsoft Sentinel, offering enhanced support through log streaming to multiple destinations using the Fluentd output plugin. Fluentd, an open-source data collector, serves as a comprehensive solution that unifies the process of collecting and consuming data. For additional details, please check the following link.

This integration enables log forwarding to public cloud services. The plugin aggregates semi-structured data in real time and transmits the buffered data to Azure Log Analytics.

FortiGate establishes communication with FortiAnalyzer and transmits logs via TCP port 514. FortiAnalyzer, using Fluentd as a data collector, then aggregates, filters, and securely transmits the data to the Azure Log Analytics workspace.

Fluentd sends logs to a Log Analytics workspace in Azure Monitor by using the HTTP Data Collector API. This involves creating a POST request to the following URL:


https://<workspace-id>.ods.opinsights.azure.com/api/logs?api-version=2016-04-01
 

For additional details, you can refer to the provided link
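As a sketch of that API contract (FortiAnalyzer's Fluentd plugin performs this signing internally; the workspace ID, shared key, and Log-Type values below are placeholders for illustration only), each POST request is authorized with a SharedKey header built from an HMAC-SHA256 signature of the request metadata:

```python
import base64
import hashlib
import hmac
from datetime import datetime, timezone

def build_signature(workspace_id, shared_key, body, date_rfc1123):
    """Build the SharedKey Authorization header for the HTTP Data Collector API."""
    string_to_sign = (
        f"POST\n{len(body)}\napplication/json\n"
        f"x-ms-date:{date_rfc1123}\n/api/logs"
    )
    decoded_key = base64.b64decode(shared_key)  # the primary key is base64-encoded
    signature = base64.b64encode(
        hmac.new(decoded_key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    ).decode()
    return f"SharedKey {workspace_id}:{signature}"

# Placeholder values -- substitute your real workspace ID and primary key
workspace_id = "00000000-0000-0000-0000-000000000000"
shared_key = base64.b64encode(b"dummy-primary-key").decode()
body = '[{"Message": "sample FortiGate log"}]'
date_rfc1123 = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")

auth = build_signature(workspace_id, shared_key, body, date_rfc1123)
url = (f"https://{workspace_id}.ods.opinsights.azure.com"
       "/api/logs?api-version=2016-04-01")
headers = {
    "Content-Type": "application/json",
    "Authorization": auth,
    "Log-Type": "FortiLogs",   # target table becomes FortiLogs_CL
    "x-ms-date": date_rfc1123,
}
# requests.post(url, data=body, headers=headers)  # send when ready
```

The Log-Type header determines the custom table name in the workspace (the API appends the _CL suffix).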

The seamless integration of Fluentd with FortiAnalyzer removes the need for an additional proxy server, streamlining the installation process of a data collector between FortiAnalyzer and the Azure Log Analytics workspace. This approach offers an efficient way to manage log transmission and analysis.

FAZ-Fuentd-DataFlow.png

 

Deployment and Setup

 

Prerequisites

  • Log Analytics Workspace link.
  • Microsoft Sentinel onboarded with the Log Analytics Workspace link.
  • FortiGate with FortiAnalyzer Integration (optional) link.

No configuration for data connector is required for the FortiAnalyzer integration, as Fluentd will directly transmit logs to the Log Analytics Workspace. Additional guidance on this step is available in the link.

 

Configuration Steps

 

Step 1: Create an output profile

  • Navigate to System Settings -> Advanced -> Log Forwarding -> Output Profile and create a new output profile.

FAZ_outputprofile.PNG

 

  • Specify the type as "Azure Log Analytics" and use the default configuration. Then enter the Workspace ID in the Customer ID field and the primary key value in the shared_key field.
  • Retrieve the Workspace ID and primary key from Settings -> Agents, as illustrated in the screenshot.

loganalyticsworkspace-id-key.PNG

Step 2: Create a new log forwarding

  • Navigate to System Settings -> Advanced -> Log Forwarding -> Settings.

FAZ-logforwarding-settings.PNG

     

  • Configure the remote server type as "Forward via Output Plugin" and select your designated output profile.

FortiAnalyzer can ingest logs into the log analytics workspace using the Apache access log format. However, extracting the essential data from the message still requires additional steps.

One approach is to utilize Azure functions for this purpose. For instance, to extract the Source Information (SrcInf) from the message, you can employ the following query and subsequently save it as a function:

Table_name
| extend SrcInf = extract(@'srcintf=\"(\S+)\"', 1, Message)
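Building on the same technique, multiple fields can be extracted in a single query and saved as one function. The field names below assume FortiGate's key=value syslog format and are illustrative:

```
Table_name
| extend SrcInf = extract(@'srcintf=\"(\S+)\"', 1, Message),
         SrcIP  = extract(@'srcip=(\S+)', 1, Message),
         Action = extract(@'action=\"(\S+)\"', 1, Message)
```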

 

Validation and Troubleshooting

 
  • To verify Fluentd write status, execute the command:
diagnose test application fwdplugind 4
 
  • To ensure the presence of Fluentd log files, utilize the following command:
diagnose sql fluentd log-tail
 
  • Enable Fluentd logging with the command:
diagnose test application fwdplugind 201 log enable
 
  • After one minute, disable logging with the command:
diagnose test application fwdplugind 201 log disable
 
  • To display processed events, use the command:
diagnose sql fluentd log-tail

FAZ-diagnose.PNG

 

  • Review the received logs from the Log Analytics Workspace, as depicted in the screenshot. 

 

loganalyticsworkspace-logs-verification.PNG

 

Log Filtering

 

Log forwarding to Microsoft Sentinel can lead to significant costs, making it essential to implement an efficient filtering mechanism.

  • FortiAnalyzer Log Filtering

    FortiAnalyzer provides an intuitive graphical user interface (GUI) for managing and optimizing log forwarding to the Log Analytics workspace. Users can set up device-specific filters based on configurable criteria and apply free-text filtering directly from the GUI, simplifying the customization of log forwarding.

FAZ-log-filtering.PNG
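The same filters can also be set from the FortiAnalyzer CLI. A minimal sketch, assuming a log-forward entry with ID 1 (the filter field and value are examples, and option names may vary by FortiAnalyzer version):

```
config system log-forward
    edit 1
        set log-filter-status enable
        set log-filter-logic and
        config log-filter
            edit 1
                set field level
                set oper =
                set value "warning"
            next
        end
    next
end
```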

     

     

  • FortiGate Log Filtering

On FortiGate devices, log forwarding settings can be adjusted directly via the GUI. Users can:

- Enable or disable traffic logs.
- Forward logs to FortiAnalyzer or a syslog server.
- Specify the desired severity level.

 

For more advanced filtering, FortiGate's CLI provides enhanced flexibility, enabling tailored filtering based on specific values.
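As a minimal sketch of such CLI-based filtering (the values are examples, and available options vary by FortiOS version):

```
config log fortianalyzer filter
    set severity warning
    set forward-traffic disable
    set local-traffic disable
end
```

Here `severity warning` forwards only warning-level logs and above, while the traffic toggles exclude entire log categories from being sent to FortiAnalyzer.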

For detailed guidance on log filtering and optimization, refer to the following resources:

Log FortiAnalyzer filter

Exclude specific logs to be sent to FortiAnalyzer from FortiGate.

Minimize the forwarded logs from FortiGate to FortiAnalyzer