Support Forum
Mohammed_Waked
New Contributor

get logs from fortianalyzer

Hello guys, I have this part of a script where I try to retrieve all logs from FortiAnalyzer and save them to a text file.

When I run the script, I only get the first 100 logs and cannot retrieve any others. Note that when I set the offset value to 500 manually, it returned 100 logs different from the first 100. So changing the offset manually works, but when the script does it, it only ever gets 100 logs.

Also, when I run a debug on the FortiAnalyzer, it shows the script's requests arriving: the first request returns the first 100 logs, but every subsequent request returns no logs.

 

import json
import time

import requests

# -----------------------
# Step 4: Retrieve logs using pagination with limit = 100
# -----------------------
all_logs = []
offset = 0
limit = 100  # Page size: fetch 100 logs per request
max_attempts = 3  # Maximum attempts to retry fetching each page

print("\nRetrieving logs in pages...")
while offset < total_logs:
    logs_payload = {
        "id": "123456789",
        "jsonrpc": "2.0",
        "method": "get",
        "params": [
            {
                "apiver": 3,
                "offset": offset,
                "limit": limit,
                "url": f"/logview/adom/root/logsearch/{tid}"
            }
        ],
        "session": session_token
    }

    # Retry mechanism for fetching logs
    attempt = 0
    while attempt < max_attempts:
        logs_resp = requests.post(BASE_URL, json=logs_payload, verify=False)
        logs_data = logs_resp.json()

        # Debugging: print the raw API response for this page
        print(f"API response at offset {offset}: {logs_data}")

        # Ensure the data exists in the response
        if not logs_data.get("result"):
            print(f"Error fetching logs at offset {offset}, response: {logs_data}")
            attempt += 1
            time.sleep(2)  # Wait before retrying
            continue

        data = logs_data["result"].get("data", [])
        print(f"Retrieved {len(data)} logs at offset {offset}")

        if not data:
            # If no data is returned, break the retry loop and move to the next offset
            break

        all_logs.extend(data)
        break  # Exit the retry loop if data is fetched successfully

    if attempt == max_attempts:
        print(f"Giving up on offset {offset} after {max_attempts} failed attempts")

    offset += limit  # Move to the next batch

print(f"\nTotal logs retrieved: {len(all_logs)}")

# Save all logs to a text file in JSON format
output_filename = "all_logs.txt"
with open(output_filename, "w", encoding="utf-8") as f:
    json.dump(all_logs, f, indent=4)

print(f"\nSaved {len(all_logs)} logs to {output_filename}")
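The root cause may well be on the FortiAnalyzer side (for example, the state of the logsearch task behind the tid), but one thing worth fixing in the loop itself is that it keeps advancing offset all the way to total_logs even when every page after the first comes back empty. Below is a minimal, hedged sketch that factors the pagination into a helper that stops on the first empty page and can be unit-tested without a live FortiAnalyzer; fetch_page is a hypothetical stand-in for the requests.post call, not part of the FortiAnalyzer API.

```python
import time
from typing import Callable, List


def fetch_all_pages(fetch_page: Callable[[int, int], list],
                    total: int, limit: int = 100,
                    max_attempts: int = 3, delay: float = 0.0) -> List:
    """Paginate through up to `total` records, `limit` at a time.

    fetch_page(offset, limit) should return the list of records for that
    page, or raise on a transient error. Each page is retried up to
    max_attempts times; if a page comes back empty, we stop instead of
    looping all the way to `total`.
    """
    records = []
    offset = 0
    while offset < total:
        for _attempt in range(max_attempts):
            try:
                page = fetch_page(offset, limit)
                break
            except Exception:
                time.sleep(delay)  # back off before retrying
        else:
            page = []  # all retries failed: treat as end of data
        if not page:
            break  # no more data returned: stop paginating
        records.extend(page)
        offset += limit
    return records


# Example with a fake data source standing in for FortiAnalyzer:
DATA = [{"log": i} for i in range(250)]

def fake_fetch(offset, limit):
    return DATA[offset:offset + limit]

logs = fetch_all_pages(fake_fetch, total=len(DATA), limit=100)
print(len(logs))  # 250
```

Wiring this into the script would mean wrapping the existing requests.post call in a small function with the (offset, limit) signature; the FortiAnalyzer debug output then shows exactly which offsets were actually requested.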

 

Jean-Philippe_P
Moderator

Hello Mohammed_Waked, 

 

Thank you for using the Community Forum. I will seek to get you an answer or help. We will reply to this thread with an update as soon as possible. 

 

Thanks, 

Jean-Philippe - Fortinet Community Team
Jean-Philippe_P
Moderator

Hello,

 

We are still looking for an answer to your question.

 

We will come back to you ASAP.

 

Thanks,

Jean-Philippe - Fortinet Community Team
vraev
Staff

Hi @Mohammed_Waked ,

 

Please review:
https://community.fortinet.com/t5/Support-Forum/get-logs-from-fortianalyzer/td-p/379622

{
    "id": "123456789",
    "jsonrpc": "2.0",
    "method": "get",
    "params": [
        {
            "apiver": 3,
            "limit": 1000,
            "offset": 0,
            "url": "/logview/adom/{{adom}}/logsearch/{{tid}}"
        }
    ],
    "session": "{{session}}"
}
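For reference, a payload like the one above can be built programmatically so the adom, tid, and session values come from variables rather than string templates. This is an illustrative sketch; the "root", "12345", and "token" values below are placeholders, not real credentials.

```python
import json


def build_logsearch_payload(adom, tid, session, limit=1000, offset=0):
    """Build the JSON-RPC 'get' request for a logsearch task's results."""
    return {
        "id": "123456789",
        "jsonrpc": "2.0",
        "method": "get",
        "params": [
            {
                "apiver": 3,
                "limit": limit,
                "offset": offset,
                "url": f"/logview/adom/{adom}/logsearch/{tid}",
            }
        ],
        "session": session,
    }


payload = build_logsearch_payload("root", "12345", "token")
print(json.dumps(payload, indent=2))
```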

V.R.