A guide to sending your logs from FortiAnalyzer to Microsoft Sentinel using the Azure Monitor Agent (AMA).
Forwarding logs to FortiAnalyzer (FAZ) or a dedicated logging server is a widely recommended best practice to ensure centralized visibility, efficient monitoring, and enhanced threat analysis. However, some clients may require forwarding these logs to additional centralized hubs, such as Microsoft Sentinel, for further integration with their broader SIEM solutions. This dual-forwarding approach provides redundancy and advanced analytics, and supports diverse compliance and operational needs.
This guide presents three distinct scenarios for integrating FortiAnalyzer with Microsoft Sentinel, leveraging the following methods:
- Syslog forwarding to a Linux log collector running the Azure Monitor Agent (AMA)
- Fluent Bit with the Azure Logs Ingestion API
- The FortiAnalyzer Fluentd output plugin (FortiAnalyzer 7.4.0 and later)
Each approach is designed to meet specific use cases, ensuring seamless log forwarding and enhanced visibility within your security ecosystem.
To ingest syslog from FortiAnalyzer into Microsoft Sentinel, a dedicated Linux machine is configured to serve as a proxy server for log collection and forwarding to the Microsoft Sentinel workspace.
The Linux machine is structured with two key components:
Syslog Daemon (Log Collector): Utilizing either rsyslog or syslog-ng, this daemon performs dual functions: it receives the syslog messages sent by FortiAnalyzer on port 514 and relays them to the locally installed Azure Monitor Agent.
Azure Monitor Agent (AMA): The agent parses the logs and then sends them to your Microsoft Sentinel (Log Analytics) workspace over HTTPS (TCP 443).
This setup also requires a Data Collection Rule (DCR) to define the syslog facilities and log levels to collect, as well as the destination Log Analytics workspace.
For more details, please review this link.
Prerequisites
Deployment Steps
Step 1: Install Syslog Data Connector
Step 2: Create a DCR (if you do not already have one)
Below is an ARM template example for the DCR configuration:
{
"$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"dataCollectionRules_ya_dcr_syslog_name": {
"defaultValue": "ya-dcr-syslog",
"type": "String"
},
"workspaces_ya_ama_externalid": {
"defaultValue": "/subscriptions/f7f4728a-781f-470f-b029-bac8a9df75af/resourceGroups/ya-faz-sentinel-ama/providers/Microsoft.OperationalInsights/workspaces/ya-ama",
"type": "String"
}
},
"variables": {},
"resources": [
{
"type": "Microsoft.Insights/dataCollectionRules",
"apiVersion": "2023-03-11",
"name": "[parameters('dataCollectionRules_ya_dcr_syslog_name')]",
"location": "westeurope",
"tags": {
"createdBy": "Sentinel"
},
"kind": "Linux",
"properties": {
"dataSources": {
"syslog": [
{
"streams": [
"Microsoft-Syslog"
],
"facilityNames": [
"local7"
],
"logLevels": [
"Notice",
"Warning",
"Error",
"Critical",
"Alert",
"Emergency"
],
"name": "sysLogsDataSource-1039681479"
},
{
"streams": [
"Microsoft-Syslog"
],
"facilityNames": [
"nopri"
],
"logLevels": [
"Emergency"
],
"name": "sysLogsDataSource-1697966155"
}
]
},
"destinations": {
"logAnalytics": [
{
"workspaceResourceId": "[parameters('workspaces_ya_ama_externalid')]",
"name": "DataCollectionEvent"
}
]
},
"dataFlows": [
{
"streams": [
"Microsoft-Syslog"
],
"destinations": [
"DataCollectionEvent"
]
}
]
}
}
]
}
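If you prefer the CLI to the portal, the template above can be deployed with the Azure CLI. The resource group, file name, association name, and resource IDs below are placeholders, and the DCR-to-VM association command assumes the monitor-control-service extension is installed (az extension add --name monitor-control-service); adjust them to your environment.
# Deploy the DCR from the saved template file
az deployment group create \
  --resource-group ya-faz-sentinel-ama \
  --template-file dcr-syslog.json
# Associate the DCR with the Linux collector VM so the Azure Monitor Agent picks it up
az monitor data-collection rule association create \
  --name "faz-syslog-dcra" \
  --rule-id "/subscriptions/<sub-id>/resourceGroups/ya-faz-sentinel-ama/providers/Microsoft.Insights/dataCollectionRules/ya-dcr-syslog" \
  --resource "/subscriptions/<sub-id>/resourceGroups/ya-faz-sentinel-ama/providers/Microsoft.Compute/virtualMachines/<linux-vm-name>"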
Step 3: Install the log collector on Linux
Run the following command to install and configure the log collector:
sudo wget -O Forwarder_AMA_installer.py https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/DataConnectors/Syslog/Forwarder_AMA_installer.py && sudo python Forwarder_AMA_installer.py
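Once the installer finishes, a few quick sanity checks on the collector can confirm that both daemons are up and listening. These are only a sketch: the service and port names below follow Microsoft's documented defaults (514 for the syslog listener, 28330 for the local AMA endpoint) and may differ between agent versions.
# Both the syslog daemon and the Azure Monitor Agent should be active
sudo systemctl status rsyslog azuremonitoragent --no-pager
# 514 = syslog listener opened by the installer, 28330 = local port rsyslog forwards to for AMA
sudo ss -lnptu | grep -E ':514|:28330'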
Step 4: Configure FortiAnalyzer
After completing the setup on the Linux VM, configure your FortiAnalyzer to forward syslog messages over TCP port 514. Use the following settings:
config system log-forward
edit 1
set mode forwarding
set fwd-max-delay realtime
set server-name "linux syslog"
set server-addr "Linux VM IP address"
set fwd-server-type syslog
set fwd-reliable enable
set fwd-facility local7
set signature 6581725315585679982
next
end
Verify log forwarding from the FortiAnalyzer CLI:
diagnose test application logfwd 4
On the Linux collector, restart rsyslog and confirm that it is listening on port 514 and receiving traffic from FortiAnalyzer:
sudo systemctl restart rsyslog
netstat -lnptv
tcpdump -i any port 514 -A -vv &
If logs still do not reach Sentinel, run the Microsoft troubleshooting script:
sudo wget -O Sentinel_AMA_troubleshoot.py https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/DataConnectors/Syslog/Sentinel_AMA_troubleshoot.py && sudo python3 Sentinel_AMA_troubleshoot.py --SYSLOG
For additional details on DCR validation, review the link
You can visit the official Microsoft Sentinel documentation link to explore more about syslog ingestion with Microsoft Sentinel via AMA.
Fluent Bit is a lightweight, open-source telemetry agent designed to efficiently collect, process, and forward logs, metrics, and traces with minimal resource usage and seamless ecosystem integration. Learn more
The Azure Logs Ingestion plugin allows Fluent Bit to send logs to Azure Sentinel via the Logs Ingestion API, directing data to supported Azure tables or custom tables you define. More details here.
To integrate FortiAnalyzer with Sentinel via Logs Ingestion API, install Fluent Bit on a dedicated Linux machine and ensure the following components are configured (in addition to a Log Analytics Workspace):
Data Collection Endpoint (DCE): defines where and how telemetry data, like logs and metrics, is sent for processing and ingestion into Azure services. It acts as a connection point for data collection.
Data Collection Rule (DCR): specifies how data should be collected, transformed, and sent to a destination, such as Log Analytics workspaces or storage. It defines the data sources, destinations, and any processing rules applied to the incoming data.
Once Fluent Bit receives logs from FortiAnalyzer via the syslog daemon, it forwards the logs to the Data Collection Endpoint (DCE) using HTTPS requests. The incoming data is then processed and transformed based on the configurations defined in the Data Collection Rule (DCR) before being ingested into the destination, such as a Log Analytics Workspace.
For further details about the Logs Ingestion API, visit the following link.
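To illustrate what the plugin does under the hood, here is a hedged sketch of a raw Logs Ingestion API call made with curl; the tenant ID, application credentials, DCE URL, DCR immutable ID, and stream name are placeholders, and jq is assumed to be installed.
# Request a token for the Azure Monitor ingestion scope using a service principal
TOKEN=$(curl -s -X POST "https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token" \
  --data-urlencode "client_id=<app-client-id>" \
  --data-urlencode "client_secret=<app-client-secret>" \
  --data-urlencode "grant_type=client_credentials" \
  --data-urlencode "scope=https://monitor.azure.com/.default" | jq -r .access_token)
# Post a JSON array of records to the DCE, addressed to a stream declared in the DCR
curl -X POST \
  "https://<dce-name>.<region>.ingest.monitor.azure.com/dataCollectionRules/<dcr-immutable-id>/streams/Custom-table1_CL?api-version=2023-01-01" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '[{"TimeGenerated":"2024-12-13T03:18:42Z","pri":"189","host":"172.19.0.4","message":"logver=... sample ..."}]'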
Prerequisites
You can use a PowerShell script to create and configure the previous requirements: link.
Deployment Steps
Step 1: Create the DCR and a custom table (based on the DCR)
The following is a sample record received from FortiAnalyzer after the syslog header has been parsed; its fields (pri, host, app, pid, msgid, message) become the columns of the custom table:
{
"pri": "189",
"host": "172.19.0.4",
"app": "-",
"pid": "-",
"msgid": "-",
"message": "- logver=706003401 timestamp=1734059922 devname=\"ya-fgt\" devid=\"FGVM4VTM24000495\" vd=\"root\" date=2024-12-13 time=03:18:42 eventtime=1734088722709530851 tz=\"-0800\" logid=\"0001000014\" type=\"traffic\" subtype=\"local\" level=\"notice\" srcip=172.19.0.4 srcport=7634 srcintf=\"root\" srcintfrole=\"undefined\" dstip=168.63.129.16 dstport=32526 dstintf=\"port1\" dstintfrole=\"undefined\" srccountry=\"Reserved\" dstcountry=\"United States\" sessionid=1391 proto=6 action=\"close\" policyid=0 service=\"tcp/32526\" trandisp=\"noop\" app=\"tcp/32526\" duration=1 sentbyte=2662 rcvdbyte=351 sentpkt=7 rcvdpkt=4"
}
The KQL transformation below extracts the individual FortiGate fields from the message column; it is the same transformation used for the Custom-table1_CL stream in the DCR that follows:
source
| extend
Date = extract(@"date=(\S+)", 1, message),
Time = extract(@"time=(\S+)", 1, message),
EventTime = extract(@"eventtime=(\S+)", 1, message),
Timestamp = extract(@"timestamp=(\d+)", 1, message),
LogId = extract(@"logid=""([^""]+)""", 1, message),
DeviceName = extract(@"devname=""([^""]+)""", 1, message),
DeviceId = extract(@"devid=""([^""]+)""", 1, message),
Vd = extract(@"vd=""([^""]+)""", 1, message),
Tz = extract(@"tz=""([^""]+)""", 1, message),
LogType = extract(@"type=""([^""]+)""", 1, message),
Subtype = extract(@"subtype=""([^""]+)""", 1, message),
Level = extract(@"level=""([^""]+)""", 1, message),
SrcIp = extract(@"srcip=(\d+\.\d+\.\d+\.\d+)", 1, message),
SrcPort = extract(@"srcport=(\d+)", 1, message),
SrcIntf = extract(@"srcintf=""([^""]+)""", 1, message),
SrcIntfRole = extract(@"srcintfrole=""([^""]+)""", 1, message),
DstIp = extract(@"dstip=(\d+\.\d+\.\d+\.\d+)", 1, message),
DstPort = extract(@"dstport=(\d+)", 1, message),
DstIntf = extract(@"dstintf=""([^""]+)""", 1, message),
DstIntfRole = extract(@"dstintfrole=""([^""]+)""", 1, message),
SrcCountry = extract(@"srccountry=""([^""]+)""", 1, message),
DstCountry = extract(@"dstcountry=""([^""]+)""", 1, message),
SessionId = extract(@"sessionid=(\d+)", 1, message),
Proto = extract(@"proto=(\d+)", 1, message),
Action = extract(@"action=""([^""]+)""", 1, message),
PolicyId = extract(@"policyid=(\d+)", 1, message),
Service = extract(@"service=""([^""]+)""", 1, message),
TranDisp = extract(@"trandisp=""([^""]+)""", 1, message),
App = extract(@"app=""([^""]+)""", 1, message),
Duration = extract(@"duration=(\d+)", 1, message),
SentByte = extract(@"sentbyte=(\d+)", 1, message),
RcvdByte = extract(@"rcvdbyte=(\d+)", 1, message),
SentPkt = extract(@"sentpkt=(\d+)", 1, message),
RcvdPkt = extract(@"rcvdpkt=(\d+)", 1, message)
| project
TimeGenerated, Date, Time, EventTime, Timestamp, LogId, DeviceName, DeviceId, Vd, Tz, LogType, Subtype, Level, SrcIp, SrcPort, SrcIntf, SrcIntfRole, DstIp, DstPort, DstIntf, DstIntfRole, SrcCountry, DstCountry, SessionId, Proto, Action, PolicyId, Service, TranDisp, App, Duration, SentByte, RcvdByte, SentPkt, RcvdPkt
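If you prefer not to use the portal, the custom table itself can also be created from the Azure CLI. This is a minimal sketch that assumes the table name table1_CL with the raw columns from the sample record above; the resource group and workspace names are placeholders.
az monitor log-analytics workspace table create \
  --resource-group ya-faz-fluentbit \
  --workspace-name ya-faz-fluentbit \
  --name table1_CL \
  --columns TimeGenerated=datetime pri=string host=string app=string pid=string msgid=string message=string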
Below is an ARM template example for the DCR, including the custom stream declarations and the KQL transformations:
{
"$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"dataCollectionRules_ya_dcr_fazsyslog_name": {
"defaultValue": "ya-dcr-fazsyslog",
"type": "String"
},
"dataCollectionEndpoints_ya_dce_log_ingestion_externalid": {
"defaultValue": "/subscriptions/xxxxxxxxxxxxxxxxxx/resourceGroups/ya-faz-fluentbit/providers/Microsoft.Insights/dataCollectionEndpoints/ya-dce-log-ingestion",
"type": "String"
},
"workspaces_ya_faz_fluentbit_externalid": {
"defaultValue": "/subscriptions/xxxxxxxxxxxxxxxxxxx/resourceGroups/ya-faz-fluentbit/providers/microsoft.operationalinsights/workspaces/ya-faz-fluentbit",
"type": "String"
}
},
"variables": {},
"resources": [
{
"type": "Microsoft.Insights/dataCollectionRules",
"apiVersion": "2023-03-11",
"name": "[parameters('dataCollectionRules_ya_dcr_fazsyslog_name')]",
"location": "westeurope",
"identity": {
"type": "SystemAssigned"
},
"properties": {
"dataCollectionEndpointId": "[parameters('dataCollectionEndpoints_ya_dce_log_ingestion_externalid')]",
"streamDeclarations": {
"Custom-fazsyslog_CL": {
"columns": [
{
"name": "TenantId",
"type": "string"
},
{
"name": "SourceSystem",
"type": "string"
},
{
"name": "TimeGenerated",
"type": "datetime"
},
{
"name": "Computer",
"type": "string"
},
{
"name": "EventTime_UTC",
"type": "datetime"
},
{
"name": "Facility",
"type": "string"
},
{
"name": "HostName",
"type": "string"
},
{
"name": "SeverityLevel",
"type": "string"
},
{
"name": "SyslogMessage",
"type": "string"
},
{
"name": "HostIP",
"type": "string"
},
{
"name": "MG",
"type": "string"
},
{
"name": "CollectorHostName",
"type": "string"
},
{
"name": "Type",
"type": "string"
},
{
"name": "_ResourceId",
"type": "string"
}
]
},
"Custom-faztest_CL": {
"columns": [
{
"name": "TimeGenerated",
"type": "datetime"
},
{
"name": "pri",
"type": "string"
},
{
"name": "host",
"type": "string"
},
{
"name": "app",
"type": "string"
},
{
"name": "pid",
"type": "string"
},
{
"name": "msgid",
"type": "string"
},
{
"name": "message",
"type": "string"
}
]
},
"Custom-faztransform_CL": {
"columns": [
{
"name": "TimeGenerated",
"type": "datetime"
},
{
"name": "pri",
"type": "string"
},
{
"name": "host",
"type": "string"
},
{
"name": "app",
"type": "string"
},
{
"name": "pid",
"type": "string"
},
{
"name": "msgid",
"type": "string"
},
{
"name": "message",
"type": "string"
}
]
},
"Custom-table1_CL": {
"columns": [
{
"name": "TimeGenerated",
"type": "datetime"
},
{
"name": "pri",
"type": "string"
},
{
"name": "host",
"type": "string"
},
{
"name": "app",
"type": "string"
},
{
"name": "pid",
"type": "string"
},
{
"name": "msgid",
"type": "string"
},
{
"name": "message",
"type": "string"
}
]
}
},
"dataSources": {},
"destinations": {
"logAnalytics": [
{
"workspaceResourceId": "[parameters('workspaces_ya_faz_fluentbit_externalid')]",
"name": "4c11d0df4293420da6212e470364eaae"
}
]
},
"dataFlows": [
{
"streams": [
"Custom-fazsyslog_CL"
],
"destinations": [
"4c11d0df4293420da6212e470364eaae"
],
"transformKql": "source\n| where SyslogMessage startswith \"logver=\"\n| extend \n SourceIP = extract(@\"srcip=(\\d+\\.\\d+\\.\\d+\\.\\d+)\", 1, SyslogMessage),\n DestinationIP = extract(@\"dstip=(\\d+\\.\\d+\\.\\d+\\.\\d+)\", 1, SyslogMessage),\n SourcePort = extract(@\"srcport=(\\d+)\", 1, SyslogMessage),\n DestinationPort = extract(@\"dstport=(\\d+)\", 1, SyslogMessage),\n DeviceId = extract(@\"devid=\"\"([^\"\"]+)\"\"\", 1, SyslogMessage),\n Severity = extract(@\"level=\"\"([^\"\"]+)\"\"\", 1, SyslogMessage)\n| project TimeGenerated,SourceIP, DestinationIP, SourcePort, DestinationPort, DeviceId, Severity\n",
"outputStream": "Custom-fazsyslog_CL"
},
{
"streams": [
"Custom-faztest_CL"
],
"destinations": [
"4c11d0df4293420da6212e470364eaae"
],
"transformKql": "source | extend TimeGenerated = now()",
"outputStream": "Custom-faztest_CL"
},
{
"streams": [
"Custom-faztransform_CL"
],
"destinations": [
"4c11d0df4293420da6212e470364eaae"
],
"transformKql": "source\n| extend TimeGenerated = now(),\n DestinationIP = extract(@\"dstip=(\\d+\\.\\d+\\.\\d+\\.\\d+)\", 1, message),\n SourcePort = extract(@\"srcport=(\\d+)\", 1, message)\n",
"outputStream": "Custom-faztransform_CL"
},
{
"streams": [
"Custom-table1_CL"
],
"destinations": [
"4c11d0df4293420da6212e470364eaae"
],
"transformKql": "source\n| extend\n Date = extract(@\"ate=(\\S+)\", 1, message),\n Time = extract(@\"time=(\\S+)\", 1, message),\n EventTime = extract(@\"eventtime=(\\S+)\", 1, message),\n Timestamp = extract(@\"timestamp=(\\d+)\", 1, message),\n LogId = extract(@\"logid=\"\"([^\"\"]+)\"\"\", 1, message),\n DeviceName = extract(@\"devname=\"\"([^\"\"]+)\"\"\", 1, message),\n DeviceId = extract(@\"devid=\"\"([^\"\"]+)\"\"\", 1, message),\n Vd = extract(@\"vd=\"\"([^\"\"]+)\"\"\", 1, message),\n Tz = extract(@\"tz=\"\"([^\"\"]+)\"\"\", 1, message),\n LogType = extract(@\"type=\"\"([^\"\"]+)\"\"\", 1, message),\n Subtype = extract(@\"subtype=\"\"([^\"\"]+)\"\"\", 1, message),\n Level = extract(@\"level=\"\"([^\"\"]+)\"\"\", 1, message),\n SrcIp = extract(@\"srcip=(\\d+\\.\\d+\\.\\d+\\.\\d+)\", 1, message),\n SrcPort = extract(@\"srcport=(\\d+)\", 1, message),\n SrcIntf = extract(@\"srcintf=\"\"([^\"\"]+)\"\"\", 1, message),\n SrcIntfRole = extract(@\"srcintfrole=\"\"([^\"\"]+)\"\"\", 1, message),\n DstIp = extract(@\"dstip=(\\d+\\.\\d+\\.\\d+\\.\\d+)\", 1, message),\n DstPort = extract(@\"dstport=(\\d+)\", 1, message),\n DstIntf = extract(@\"dstintf=\"\"([^\"\"]+)\"\"\", 1, message),\n DstIntfRole = extract(@\"dstintfrole=\"\"([^\"\"]+)\"\"\", 1, message),\n SrcCountry = extract(@\"srccountry=\"\"([^\"\"]+)\"\"\", 1, message),\n DstCountry = extract(@\"dstcountry=\"\"([^\"\"]+)\"\"\", 1, message),\n SessionId = extract(@\"sessionid=(\\d+)\", 1, message),\n Proto = extract(@\"proto=(\\d+)\", 1, message),\n Action = extract(@\"action=\"\"([^\"\"]+)\"\"\", 1, message),\n PolicyId = extract(@\"policyid=(\\d+)\", 1, message),\n Service = extract(@\"service=\"\"([^\"\"]+)\"\"\", 1, message),\n TranDisp = extract(@\"trandisp=\"\"([^\"\"]+)\"\"\", 1, message),\n App = extract(@\"app=\"\"([^\"\"]+)\"\"\", 1, message),\n Duration = extract(@\"duration=(\\d+)\", 1, message),\n SentByte = extract(@\"sentbyte=(\\d+)\", 1, message),\n RcvdByte = extract(@\"rcvdbyte=(\\d+)\", 1, message),\n SentPkt = extract(@\"sentpkt=(\\d+)\", 1, message),\n RcvdPkt = extract(@\"rcvdpkt=(\\d+)\", 1, message)\n| project\n TimeGenerated,\n Date,\n Time,\n EventTime,\n Timestamp,\n LogId,\n DeviceName,\n DeviceId,\n Vd,\n Tz,\n LogType,\n Subtype,\n Level,\n SrcIp,\n SrcPort,\n SrcIntf,\n SrcIntfRole,\n DstIp,\n DstPort,\n DstIntf,\n DstIntfRole, \n SrcCountry,\n DstCountry,\n SessionId,\n Proto,\n Action,\n PolicyId,\n Service,\n TranDisp,\n App,\n Duration,\n SentByte,\n RcvdByte,\n SentPkt,\n RcvdPkt\n\n",
"outputStream": "Custom-table1_CL"
}
]
}
}
]
}
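Note that the dcr_id value Fluent Bit expects later is the DCR's immutable ID (the dcr-xxxxxxxx string), not its resource name. One way to retrieve it, assuming the monitor-control-service CLI extension is installed, is:
az monitor data-collection rule show \
  --resource-group ya-faz-fluentbit \
  --name ya-dcr-fazsyslog \
  --query immutableId -o tsv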
Step 2: Configure Access Control
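As a sketch of this step, the Microsoft Entra application used by Fluent Bit needs the Monitoring Metrics Publisher role on the DCR (or on a parent scope) so it can post to the Logs Ingestion endpoint; the IDs below are placeholders.
az role assignment create \
  --assignee "<app-client-id>" \
  --role "Monitoring Metrics Publisher" \
  --scope "/subscriptions/<sub-id>/resourceGroups/ya-faz-fluentbit/providers/Microsoft.Insights/dataCollectionRules/ya-dcr-fazsyslog"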
Step 3: Install and configure Fluent Bit on a Linux VM
curl https://raw.githubusercontent.com/fluent/fluent-bit/master/install.sh | sh
sudo systemctl start fluent-bit
sudo apt-get update
Refer to the Fluent Bit installation guide for more details: link.
Edit the parsers file and add a parser for RFC 5424 syslog messages:
sudo nano /etc/fluent-bit/parsers.conf
[PARSER]
Name mysyslog-rfc5424
Format regex
Regex ^<(?<pri>[0-9]+)>1 (?<time>[^ ]+) (?<host>[^ ]+) (?<app>[^\s]+) (?<pid>[^\s]+) (?<msgid>[^\s]+) (?<extradata>[^\]]*\])?(?<message>.+)$
Time_Key time
Time_Format %Y-%m-%dT%H:%M:%S%z
Edit the fluent-bit.conf file and configure Fluent Bit to forward the logs to the Log Analytics workspace:
sudo nano /etc/fluent-bit/fluent-bit.conf
[INPUT]
Name syslog
Mode udp
Listen 0.0.0.0
Port 514
Parser mysyslog-rfc5424
Tag faz
[OUTPUT]
Name azure_logs_ingestion
Match faz
client_id **************************
client_secret **************************
tenant_id **************************
dce_url https://ya-dce-log-ingestion-ebl4.westeurope-1.ingest.monitor.azure.com
dcr_id dcr-f43cbb987d6c45efa8319f5d0c0c1aee
table_name table1_CL
time_generated true
time_key TimeGenerated
Compress true
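Before pointing FortiAnalyzer at the collector, you can restart Fluent Bit and inject a local RFC 5424 test message with logger (util-linux) to confirm that the syslog input and the Azure output are wired up end to end; a minimal sketch:
sudo systemctl restart fluent-bit
# Send one RFC 5424-formatted message over UDP to the local syslog input
logger --rfc5424 -d -n 127.0.0.1 -P 514 "fluent-bit ingestion test"
# Watch the service log for azure_logs_ingestion activity or errors
sudo journalctl -u fluent-bit -n 20 --no-pager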
After Fluent Bit is up, configure FortiAnalyzer to forward its logs to the Fluent Bit collector in RFC 5424 format on port 514 (UDP, matching the syslog input above):
config system log-forward
edit 1
set mode forwarding
set fwd-max-delay realtime
set server-name "dce"
set server-addr "fluent-bit IP address"
set server-port 514
set fwd-server-type syslog
set fwd-syslog-format rfc-5424
set signature 3799479601930374274
next
end
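To confirm that FortiAnalyzer traffic is actually reaching the Fluent Bit VM, a quick capture on the syslog port (UDP 514 in this setup) can help:
sudo tcpdump -i any udp port 514 -nn -c 5 -A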
For more details about log ingestion API deployment, refer to the Azure Documentation link
Start Fluent Bit and verify that the service is running:
sudo systemctl start fluent-bit
systemctl status fluent-bit
The output should be similar to this:
fluent-bit.service - Fluent Bit
Loaded: loaded (/usr/lib/systemd/system/fluent-bit.service; disabled; preset: enabled)
Active: active (running) since Thu 2024-12-05 10:27:05 UTC; 43min ago
Docs: https://docs.fluentbit.io/manual/
Main PID: 1903 (fluent-bit)
Tasks: 4 (limit: 19120)
Memory: 16.0M (peak: 18.5M)
CPU: 1.423s
CGroup: /system.slice/fluent-bit.service
└─1903 /opt/fluent-bit/bin/fluent-bit -c //etc/fluent-bit/fluent-bit.conf
Dec 05 11:10:30 ya-fluentbit fluent-bit[1903]: [0] cpu.local: [[1733397029.646640947, {}], {"cpu_p"=>0.000000, "user_p"=>0.000000, "system_p"=>0.000000, "cpu0.p_cpu"=>0.000000, "cpu0.p_user"=>0.000000, "cpu0.p_system"=>0.000000, "cpu1.p>
Dec 05 11:10:31 ya-fluentbit fluent-bit[1903]: [0] cpu.local: [[1733397030.646687865, {}], {"cpu_p"=>0.000000, "user_p"=>0.000000, "system_p"=>0.000000, "cpu0.p_cpu"=>0.000000, "cpu0.p_user"=>0.000000, "cpu0.p_system"=>0.000000, "cpu1.p>
Dec 05 11:10:32 ya-fluentbit fluent-bit[1903]: [0] cpu.local: [[1733397031.646783683, {}], {"cpu_p"=>0.000000, "user_p"=>0.000000, "system_p"=>0.000000, "cpu0.p_cpu"=>0.000000, "cpu0.p_user"=>0.000000, "cpu0.p_system"=>0.000000, "cpu1.p>
Dec 05 11:10:33 ya-fluentbit fluent-bit[1903]: [0] cpu.local: [[1733397032.646658300, {}], {"cpu_p"=>0.000000, "user_p"=>0.000000, "system_p"=>0.000000, "cpu0.p_cpu"=>0.000000, "cpu0.p_user"=>0.000000, "cpu0.p_system"=>0.000000, "cpu1.p>
Dec 05 11:10:34 ya-fluentbit fluent-bit[1903]: [0] cpu.local: [[1733397033.646643817, {}], {"cpu_p"=>0.000000, "user_p"=>0.000000, "system_p"=>0.000000, "cpu0.p_cpu"=>0.000000, "cpu0.p_user"=>0.000000, "cpu0.p_system"=>0.000000, "cpu1.p>
Dec 05 11:10:35 ya-fluentbit fluent-bit[1903]: [0] cpu.local: [[1733397034.646651935, {}], {"cpu_p"=>0.250000, "user_p"=>0.000000, "system_p"=>0.250000, "cpu0.p_cpu"=>0.000000, "cpu0.p_user"=>0.000000, "cpu0.p_system"=>0.000000, "cpu1.p>
After any configuration change, restart Fluent Bit and follow its logs:
sudo systemctl restart fluent-bit
sudo journalctl -u fluent-bit -f
Finally, validate the custom table from the Log Analytics workspace.
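Beyond the portal, the ingested rows can also be spot-checked from the CLI. This sketch assumes the log-analytics extension is installed (az extension add --name log-analytics) and uses the workspace's customer ID (GUID) as a placeholder.
az monitor log-analytics query \
  -w "<workspace-customer-id>" \
  --analytics-query "table1_CL | take 10" \
  --output table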
Please note that Microsoft has announced the deprecation of the HTTP Data Collector API. This API will no longer function as of September 14, 2026. As a result, Fluentd integration scenarios relying on this API will also cease to function on the same date. The recommended replacement is the Logs Ingestion API, which offers enhanced capabilities for log integration moving forward.
Starting from version 7.4.0, FortiAnalyzer supports log forwarding to a Log Analytics workspace and other public cloud services through Fluentd. You can visit the link for more details.
FortiAnalyzer seamlessly integrates with Microsoft Sentinel, offering enhanced support through log streaming to multiple destinations using the Fluentd output plugin. Fluentd, an open-source data collector, serves as a comprehensive solution that unifies the process of collecting and consuming data. For additional details, please check the following link.
This integration enables log forwarding to public cloud services. The plugin efficiently aggregates semi-structured data in real time and transmits the buffered data to Azure Log Analytics.
FortiGate communicates with FortiAnalyzer and transmits its logs over TCP port 514. FortiAnalyzer, leveraging Fluentd as a data collector, then aggregates, filters, and securely transmits the data to the Azure Log Analytics workspace.
Fluentd sends logs to a Log Analytics workspace in Azure Monitor by using the HTTP Data Collector API. This involves creating a POST request to the following URL:
https://<log-analytics-workspace-id>.ods.opinsights.azure.com/api/logs?api-version=2016-04-01
For additional details, you can refer to the provided link
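For illustration only (as noted above, this API is deprecated), the request that Fluentd builds can be reproduced with curl and openssl; the workspace ID, shared key, custom log type, and body below are placeholders.
WORKSPACE_ID="<log-analytics-workspace-id>"
SHARED_KEY="<workspace-primary-key>"   # base64 key from the workspace Agents page
LOG_TYPE="FortiAnalyzer"               # becomes the <LOG_TYPE>_CL custom table
BODY='[{"Message":"logver=706003401 devname=\"ya-fgt\" type=\"traffic\" ..."}]'
DATE_RFC1123=$(LC_ALL=C date -u +"%a, %d %b %Y %H:%M:%S GMT")
CONTENT_LENGTH=${#BODY}
STRING_TO_SIGN=$(printf 'POST\n%s\napplication/json\nx-ms-date:%s\n/api/logs' "$CONTENT_LENGTH" "$DATE_RFC1123")
# Sign the request with the workspace shared key (HMAC-SHA256 over the string to sign)
HEX_KEY=$(printf '%s' "$SHARED_KEY" | base64 -d | xxd -p -c 256)
SIGNATURE=$(printf '%s' "$STRING_TO_SIGN" | openssl dgst -sha256 -mac HMAC -macopt "hexkey:$HEX_KEY" -binary | base64)
curl -X POST "https://$WORKSPACE_ID.ods.opinsights.azure.com/api/logs?api-version=2016-04-01" \
  -H "Authorization: SharedKey $WORKSPACE_ID:$SIGNATURE" \
  -H "Log-Type: $LOG_TYPE" \
  -H "x-ms-date: $DATE_RFC1123" \
  -H "Content-Type: application/json" \
  -d "$BODY"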
The seamless integration of Fluentd with FortiAnalyzer removes the need for an additional proxy server, streamlining the installation process of a data collector between FortiAnalyzer and the Azure Log Analytics workspace. This approach offers an efficient way to manage log transmission and analysis.
Prerequisites
No data connector configuration is required for the FortiAnalyzer integration, as Fluentd transmits the logs directly to the Log Analytics workspace. Additional guidance on this step is available in the link.
Configuration Steps
Step 1: Create an output profile
Step 2: Create a new log forwarding
FortiAnalyzer can ingest logs into the log analytics workspace using the Apache access log format. However, extracting the essential data from the message still requires additional steps.
One approach is to utilize Azure functions for this purpose. For instance, to extract the Source Information (SrcInf) from the message, you can employ the following query and subsequently save it as a function:
Table_name
| extend SrcInf = extract(@'srcintf=\"(\S+)\"', 1, Message)
To troubleshoot Fluentd log forwarding on FortiAnalyzer, the following CLI commands are useful:
diagnose test application fwdplugind 4
diagnose sql fluentd log-tail
diagnose test application fwdplugind 201 log enable
diagnose sql fluentd log-tail
Log forwarding to Microsoft Sentinel can lead to significant costs, making it essential to implement an efficient filtering mechanism.
FortiAnalyzer Log Filtering
FortiAnalyzer provides an intuitive graphical user interface (GUI) for managing and optimizing log forwarding to the Log Analytics Workspace. FortiAnalyzer allows users to set up device-specific filters based on configurable criteria. Additionally, users can apply free-text filtering directly from the GUI, simplifying the process of customizing log forwarding.
On FortiGate devices, log forwarding settings can be adjusted directly via the GUI. Users can:
- Enable or disable traffic logs.
- Forward logs to FortiAnalyzer or a syslog server.
- Specify the desired severity level.
For more advanced filtering, FortiGate's CLI provides enhanced flexibility, enabling tailored filtering based on specific values.
For detailed guidance on log filtering and optimization, refer to the following resources:
- Exclude specific logs from being sent to FortiAnalyzer from FortiGate.
- Minimize the logs forwarded from FortiGate to FortiAnalyzer.