FortiAnalyzer
Article Id 191436

Description


This article describes how to troubleshoot the Cloud-out connector in FortiAnalyzer.

 

Scope

 

FortiAnalyzer.

Solution

 

  1. Verify the storage connector service license. To verify the license, use the CLI command 'diagnose fmupdate dbcontract'. Ensure that 'SCPC' is listed under 'Contract' and check its expiry date.

 

FAZ-VMTM00000000 [SERIAL_NO]

  AccountID: user@aaa.com
  Industry:  Test
  Company:   AAA
  Contract:  6
    COMP-1-20-20230831
    ENHN-1-20-20230831
    FMWR-1-06-20230831
    FRVS-1-06-20230831
    SCPC-1-06-20230831
    SPRT-1-20-20230831 

 

Ensure that 'Upload logs to cloud storage' is not greyed out at System Settings -> Device Log Settings.


  
  2. Verify CA certificates. Verify that the required CA certificates have been imported at System Settings -> CA Certificates. Before logs can be uploaded to cloud storage using the Amazon S3, Azure Blob, or Google connectors, the cloud provider's CA certificate(s) must be imported into FortiAnalyzer.
    Third-party CA certificates, for example GlobalSign and CyberTrust, may be required.

    Check with the cloud storage provider to see which CA certificates are supported. In Amazon S3's case, the root CA 'Starfield Services Root Certificate Authority - G2' must be imported into FortiAnalyzer; it can be downloaded from https://www.amazontrust.com/repository/.
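    If it is unclear which root CA a provider's endpoint chains to, one way to check is to inspect the certificate chain from any workstation with OpenSSL (this is not a FortiAnalyzer command, and the endpoint below is only an example for an S3 bucket in us-west-1):

    openssl s_client -connect s3.us-west-1.amazonaws.com:443 -showcerts </dev/null

    The issuer of the last certificate in the printed chain identifies the root CA that needs to be imported under System Settings -> CA Certificates.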
 
 
  3. Check the policies related to Amazon S3 access. Check that the IAM user or role has the s3:GetBucketPolicy permission to view the bucket policy and the s3:PutBucketPolicy permission to edit it.
    FortiAnalyzer uses Rclone to manage files on cloud storage; refer to the Rclone documentation for the permissions Rclone requires. A quick way to confirm access with the same credentials from a workstation is shown below.
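    The following is an optional sanity check using the AWS CLI from a workstation configured with the same access key and secret access key that the connector will use; the bucket name and test file are placeholders, not values from this article:

    aws s3api get-bucket-policy --bucket <bucket-name>          <----- Requires s3:GetBucketPolicy.
    aws s3 cp ./test.txt s3://<bucket-name>/test.txt            <----- Write test (requires s3:PutObject).
    aws s3 rm s3://<bucket-name>/test.txt                       <----- Remove the test object again.

    If these commands fail, fix the IAM policy before troubleshooting further on the FortiAnalyzer side.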

  4. Verify Fabric Connector configuration. Verify the fabric connector settings at Fabric View -> Fabric Connectors and make sure the status is set to 'ON'. Below are the sample details required for each cloud provider:

 

AWS S3:
Provider: AWS.
Region: AWS region (ex. us-west-1).
Access Key ID: IAM user account access key.
Secret Access Key: IAM user account secret access key.

Azure Blob:
Storage Account Name: Microsoft Azure account name.
Account Key: Microsoft Azure account key.

Google:
Cloud Project Number: Google account project number.
Service Account Credentials: Google account JSON key.
Cloud Location: Bucket locations (ex. us-east1).
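
For reference, FortiAnalyzer drives the actual upload with Rclone (its generated configuration lives at /drive0/private/rclone.cfg, as visible in the debug output further below). The connector settings above map roughly to standard Rclone remote definitions; the sketch below only illustrates that mapping and is not FortiAnalyzer's real generated configuration:

[aws-s3-example]
type = s3
provider = AWS
region = us-west-1
access_key_id = <IAM user access key ID>
secret_access_key = <IAM user secret access key>

[azure-blob-example]
type = azureblob
account = <storage account name>
key = <account key>

[google-example]
type = google cloud storage
project_number = <project number>
service_account_file = <path to JSON key>
location = us-east1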

Use the CLI command 'diagnose test application uploadd 62 <connector> <remote path>' to perform an upload test to cloud storage.

A dummy file will appear in cloud storage if the upload is successful.

Example:
 
FAZ # diagnose test application uploadd 62 faz-s3bucket fazapacstorage
s1) copy file. uuid[b9647d0c-32b1-11eb-8bfa-0a17955e07c8]
s-) result not ready. uuid[b9647d0c-32b1-11eb-8bfa-0a17955e07c8]
s2) rc=0 message[success]

  5. Verify unit log settings configuration.
 
At System Settings -> Device Log Settings, ensure that 'Upload logs to cloud storage' has been selected. Choose the 'Cloud Storage Connector' created earlier.
In the 'Remote Path' box, type the bucket or container name from the storage account.
 
 
Run the following debug commands during the upload process. If the upload succeeds, output similar to the following appears in the debug:
 
FAZ # diagnose debug application uploadd 8
FAZ # diagnose debug enable

cmd_proxy:97: 1622182565 cmd "/usr/local/bin/rclone --config=/drive0/private/rclone.cfg copy /drive0/private/uploadd_repo_cloud_storage/Amazon S3 storage - fortiazure fortiazure:fazapacstorage" sent successfully! uuid=2f1ff2f4-bf7c-11eb-843e-0050568ab88d
 
If the debug shows success but the log file does not appear in the cloud bucket, create a new cloud storage connector and delete the old configuration at System Settings -> Device Log Settings.
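 
To double-check from the cloud provider side which files have actually landed in the bucket, generic provider CLI commands can be run from a workstation (the bucket, container, account names, and keys are placeholders; these are not FortiAnalyzer commands):
 
aws s3 ls s3://<bucket-name>/ --recursive                                                                        <----- Amazon S3.
az storage blob list --account-name <account> --account-key <key> --container-name <container> --output table   <----- Azure Blob.
gsutil ls -r gs://<bucket-name>/                                                                                 <----- Google Cloud Storage.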
 

 
 
  6. Useful CLI commands for troubleshooting the FortiAnalyzer Cloud-out connector.
 
diagnose fmupdate dbcontract                                       <----- Verify storage connector service license.
diagnose test application uploadd 6                                <----- Cloud storage backlog.
diagnose test application uploadd 62 <connector> <remote path>     <----- Upload test.
diagnose test application uploadd 63                               <----- Cloud storage usage info.
diagnose debug application uploadd 8                               <----- Cloud storage upload debug.