What We’re Testing
The Audit Log page provides three client-side export buttons — CSV, JSON, and PDF — rendered in the page header. All three export functions operate on the `logs` array currently held in React state (`AuditLogPage.tsx`). They do not make a second API call; they serialise whatever is on the current page.
Key implementation details:
- CSV (`exportCSV`): constructs a comma-separated string with six columns: Time, Action, Resource Type, Resource ID, User ID, Details. Each cell is double-quote-escaped. Uses `URL.createObjectURL` to trigger a browser download as `audit-log.csv`.
- JSON (`exportJSON`): serialises the `logs` array with `JSON.stringify(logs, null, 2)` and downloads as `audit-log.json`.
- PDF (`exportPDF`): uses `jsPDF` (lazy-imported) to produce a landscape A4 document with four columns: Time, Action, Resource, Details. The header line reads “Audit Log Report” and includes a “Generated: / Total events: N” subtitle. Downloads as `audit-log.pdf`.
Because exports reflect only the current page (50 rows by default), testing exports on a filtered view verifies that the filters affect export output.
There is also a separate server-side export endpoint at `POST /api/export` (handled by `handlers/export-data.ts`), but the Audit Log page does not use it — the three buttons are purely client-side operations on the Loki-fetched data already in memory.
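The pure serialisation part of the CSV export can be sketched as follows. This is a hypothetical reconstruction from the behaviour described above (the real `exportCSV` in `AuditLogPage.tsx` may differ in detail); the `AuditLog` shape mirrors the fields shown in ST2.

```typescript
// Hypothetical sketch of the client-side CSV serialisation described above:
// six columns, standard double-quote escaping. Not the actual dashboard code.

interface AuditLog {
  id: string;
  action: string;
  resource_type: string;
  resource_id: string;
  details: Record<string, unknown>;
  user_id: string;
  created_at: string;
}

// Standard CSV escaping: wrap the value in double quotes and double any
// embedded double quotes.
const cell = (v: string): string => `"${v.replace(/"/g, '""')}"`;

function buildCsv(logs: AuditLog[]): string {
  const header = ["Time", "Action", "Resource Type", "Resource ID", "User ID", "Details"]
    .map(cell)
    .join(",");
  const rows = logs.map((log) =>
    [
      log.created_at,
      log.action,
      log.resource_type,
      log.resource_id,
      log.user_id,
      JSON.stringify(log.details), // Details column is a JSON string
    ]
      .map(cell)
      .join(",")
  );
  return [header, ...rows].join("\n");
}
```

In the page itself the resulting string is then handed to `URL.createObjectURL` on a `Blob` to trigger the `audit-log.csv` download; that browser-only step is omitted here so the sketch stays runnable anywhere.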
Your Test Setup
| Machine | Role |
|---|---|
| ⊞ Win-A | Browser (dashboard) — all export actions are browser-side |
Prerequisites:
- At least five audit events loaded on the page
- Browser downloads are not blocked (check that `login.quickztna.com` is not in the browser’s blocked-downloads list)
ST1 — Export as CSV and Verify Structure
What it verifies: The CSV export downloads a file with the correct six-column header and one data row per visible log entry.
Steps:
1. On ⊞ Win-A, navigate to `https://login.quickztna.com/audit-log`. Wait for the table to load.
2. Note the number of rows visible in the table (up to 50, or the total event count if fewer than 50).
3. Click the CSV button in the top-right of the page.
4. The browser downloads `audit-log.csv`. Open it in Notepad or any text editor.
5. Verify the first line (header row):

   ```
   "Time","Action","Resource Type","Resource ID","User ID","Details"
   ```

6. Count the data rows (lines after the header). The count must equal the number of rows visible in the table.
7. Inspect the first data row:
   - Column 1 (Time): an ISO 8601 timestamp, e.g. `"2026-03-17T09:00:00.000Z"`
   - Column 2 (Action): an audit action string, e.g. `"auth.login"`
   - Column 3 (Resource Type): a resource type, e.g. `"user"`
   - Column 4 (Resource ID): a UUID or empty string
   - Column 5 (User ID): a UUID or empty string
   - Column 6 (Details): a JSON string, e.g. `"{""email"":""you@example.com"",""mfa"":false}"`

Note: double quotes inside values are escaped as `""` (standard CSV escaping).
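A minimal illustration of that escaping rule in reverse, useful when spot-checking a single cell by hand. `unquoteCell` is a hypothetical helper, not part of the dashboard, and not a full CSV parser:

```typescript
// Undo standard CSV cell quoting: strip the surrounding double quotes, then
// collapse each doubled quote ("") back to a single quote. Illustration only;
// it assumes the input is exactly one quoted cell.
function unquoteCell(cellText: string): string {
  return cellText.slice(1, -1).replace(/""/g, '"');
}
```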
Pass: Header matches exactly; data row count equals visible table row count; timestamps are ISO strings; Details is a JSON-escaped string.
Fail / Common issues:
- Browser blocks the download — check `chrome://settings/content/automaticDownloads` or equivalent. Allow downloads from `login.quickztna.com`.
- CSV has fewer rows than the table — some `log` entries may have a null or undefined `action` field; the table filters these with `logs.filter(log => log && log.action)` but the CSV export iterates all `logs`. This is an edge case; confirm by checking whether the difference equals the count of null-action entries.
- Details column shows `"{}"` or `"—"` — the details object was empty for that event. This is correct behaviour.
ST2 — Export as JSON and Verify Structure
What it verifies: The JSON export downloads a valid JSON array where each element has all expected AuditLog fields.
Steps:
1. On ⊞ Win-A, with the Audit Log page loaded (same state as ST1), click the JSON button.
2. The browser downloads `audit-log.json`. Open it in a text editor or browser JSON viewer.
3. Confirm the file is a JSON array (starts with `[` and ends with `]`).
4. Inspect the first element. It must have these fields:

   ```json
   {
     "id": "1742000000000000000",
     "action": "auth.login",
     "resource_type": "user",
     "resource_id": "abc123",
     "details": { "email": "you@example.com", "mfa": false },
     "user_id": "abc123",
     "created_at": "2026-03-17T09:00:00.000Z"
   }
   ```

5. Confirm the array length equals the number of rows shown in the table.
6. Validate the JSON is well-formed using PowerShell:

   ```powershell
   $content = Get-Content "$env:USERPROFILE\Downloads\audit-log.json" -Raw
   $parsed = $content | ConvertFrom-Json
   Write-Host "Entry count: $($parsed.Count)"
   Write-Host "First action: $($parsed[0].action)"
   Write-Host "First created_at: $($parsed[0].created_at)"
   ```
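If PowerShell is not available, the same structural checks can be run with Node. `validateExport` is a hypothetical helper that enforces this sub-test's assertions on the parsed file contents:

```typescript
// Parse the exported JSON text and assert the ST2 expectations: the document
// is an array, and every entry carries action, resource_type, and created_at.
function validateExport(jsonText: string): { count: number; firstAction?: string } {
  const parsed = JSON.parse(jsonText);
  if (!Array.isArray(parsed)) throw new Error("export is not a JSON array");
  for (const entry of parsed) {
    for (const field of ["action", "resource_type", "created_at"]) {
      if (!(field in entry)) throw new Error(`entry missing ${field}`);
    }
  }
  return { count: parsed.length, firstAction: parsed[0]?.action };
}

// Usage (Node):
//   import { readFileSync } from "fs";
//   const { count, firstAction } = validateExport(
//     readFileSync(`${process.env.USERPROFILE}/Downloads/audit-log.json`, "utf8")
//   );
```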
Pass: JSON parses without error; entry count matches the table; every entry has `action`, `resource_type`, and `created_at` fields.
Fail / Common issues:
- `ConvertFrom-Json` throws a parse error — the `details` field may contain a raw string (not an object) if the Loki entry was malformed. This is a data quality issue, not an export bug.
- `id` field contains a large integer string (e.g., `"1742000000000000000"`) rather than a UUID — this is expected. The `id` is the Loki nanosecond timestamp used as a pseudo-ID, as documented in `services/loki.ts`.
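If you want to cross-check a pseudo-`id` against its entry's `created_at`, the nanosecond string can be converted to a date. `lokiIdToDate` is a convenience sketch for manual verification, not dashboard code:

```typescript
// Convert a Loki nanosecond-timestamp pseudo-ID (19-digit string) to a Date.
// Dropping the last six digits yields milliseconds, which sidesteps the
// precision loss Number() would incur on a 19-digit integer.
function lokiIdToDate(id: string): Date {
  return new Date(Number(id.slice(0, -6)));
}
```

The resulting date should land within a second or so of the entry's `created_at`.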
ST3 — Export as PDF and Verify Content
What it verifies: The PDF export downloads a landscape-orientation document with the correct title, subtitle, column headers, and data rows.
Steps:
1. On ⊞ Win-A, with the Audit Log page loaded, click the PDF button.
2. The browser downloads `audit-log.pdf` (jsPDF loads lazily on first click; allow 2-3 seconds).
3. Open `audit-log.pdf` in a PDF viewer (Edge, Acrobat, or browser built-in).
4. Verify the first page:
   - Title text: “Audit Log Report”
   - Subtitle includes “Generated:” followed by a date/time and “Total events:” followed by the total count
   - Four column headers: “Time”, “Action”, “Resource”, “Details”
   - A horizontal rule under the column headers
   - Data rows below, one per log entry
5. If there are more than approximately 37 entries on a landscape A4 page (190mm usable height / 5mm per row), confirm a second page is generated.
6. Verify that “Total events: N” in the subtitle matches the `total` count shown in the page heading (this is the Loki total, not just the current page).
Pass: PDF opens without error; correct title and subtitle; four columns are present; rows are populated.
Fail / Common issues:
- PDF button does nothing for 10+ seconds — jsPDF failed to lazy-load. Check the browser console (F12 → Console) for a network error on the jsPDF chunk.
- “Total events: 0” in the subtitle — the `total` state was 0 when the export ran. This happens if the page had not fully loaded. Wait for the table to render before clicking PDF.
- Text truncation in the Details column — the PDF export slices the `details` JSON to 80 characters (`JSON.stringify(log.details).slice(0, 80)`). This is by design and is not a defect.
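The page-count arithmetic from step 5 can be made explicit. The 37-rows-per-page figure is taken from the step above and treated here as an assumption about the layout, not a jsPDF constant:

```typescript
// Expected page count for the landscape A4 export, assuming roughly 37 data
// rows fit per page (~190 mm usable height at ~5 mm per row, minus header
// space). Even an empty export produces one page with title and headers.
function expectedPdfPages(entries: number, rowsPerPage = 37): number {
  return Math.max(1, Math.ceil(entries / rowsPerPage));
}
```

A full 50-row page should therefore produce a two-page PDF.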
ST4 — Export Respects Active Filters
What it verifies: When a filter (action category or date range) is active, the export contains only the filtered rows currently visible in the table, not all events.
Steps:
1. On ⊞ Win-A, open the Filters panel and select “Today” as the date preset.
2. Note the (smaller) event count in the heading.
3. Export as CSV by clicking the CSV button.
4. Open the downloaded `audit-log.csv` and count the data rows.
5. Confirm the row count matches the filtered count displayed in the heading (not the unfiltered total).
6. Confirm every row’s Time column contains today’s date:

   ```powershell
   $csv = Import-Csv "$env:USERPROFILE\Downloads\audit-log.csv"
   $today = (Get-Date).ToString("yyyy-MM-dd")
   $csv | ForEach-Object {
       if ($_.Time -notlike "*$today*") {
           Write-Host "FAIL: $($_.Time) is not today"
       }
   }
   ```

7. Repeat steps 3-6 with the JSON and PDF exports.
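For the JSON export, the same today-only check can be sketched in TypeScript (`entriesNotOnDay` is a hypothetical helper). One caveat worth keeping in mind for both variants: `created_at` is a UTC timestamp, so events close to local midnight can legitimately carry the previous day's UTC date even though the “Today” filter matched them:

```typescript
// Return the created_at values of any entries whose UTC date prefix does not
// match the expected day ("YYYY-MM-DD"). An empty result means every exported
// entry falls on that day.
function entriesNotOnDay(entries: { created_at: string }[], day: string): string[] {
  return entries.filter((e) => !e.created_at.startsWith(day)).map((e) => e.created_at);
}
```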
Pass: All exported rows are from today; export row count matches the filtered table count.
Fail / Common issues:
- Export contains more rows than the filtered count — the export always serialises the full `logs` React state for the current page, which is the same data as the filtered table. If counts differ, the table’s filter render (`.filter(log => log && log.action)`) may be excluding null-action entries that the CSV export includes.
- Export contains zero rows despite events visible in the table — this would indicate a state timing issue. Try exporting again after the page fully stabilises.
ST5 — Export Page 2 Contains Correct Events
What it verifies: Navigating to page 2 and exporting gives page-2 entries, not page-1 entries — confirming exports are tied to the current React page state.
Steps:
1. On ⊞ Win-A, ensure more than 50 events exist (generate them in the ST4 section of the previous chapter if needed).
2. Load the Audit Log page. Note the first entry’s `created_at` timestamp (the most recent event).
3. Click Next to advance to page 2. Wait for the table to reload.
4. Note the first entry on page 2 — it should be older than the entries on page 1.
5. Click the JSON button to export page 2.
6. Open `audit-log.json` and check the first entry’s `created_at`:

   ```powershell
   $j = Get-Content "$env:USERPROFILE\Downloads\audit-log.json" -Raw | ConvertFrom-Json
   Write-Host "Page-2 first entry: $($j[0].created_at)"
   ```

7. Confirm the `created_at` is earlier than the page-1 first entry recorded in step 2.
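The comparison in step 7 can be made explicit with a tiny helper (`isOlder` is illustrative only):

```typescript
// True when the page-2 export's first created_at predates the page-1 value
// noted in step 2. Date.parse handles the ISO 8601 strings and makes the
// chronological intent explicit.
function isOlder(page2First: string, page1First: string): boolean {
  return Date.parse(page2First) < Date.parse(page1First);
}
```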
Pass: Page-2 export contains events that are older than those on page 1; the export reflects the current page, not all pages.
Fail / Common issues:
- Page-2 export contains the same events as page 1 — the `logs` state was not updated before the export triggered. Ensure the page-2 load completed (the table must show different rows) before clicking the export button.
Summary
| Sub-test | What it proves | Key assertion |
|---|---|---|
| ST1 | CSV export has correct header and row count | Six-column header; row count equals visible table rows; timestamps are ISO strings |
| ST2 | JSON export is valid and complete | Parses without error; entry count matches table; all AuditLog fields present |
| ST3 | PDF export renders with correct layout | Title, subtitle with total count, four columns, data rows |
| ST4 | Exports respect active filters | Exported rows match filtered count; no out-of-range events included |
| ST5 | Exporting from page 2 gives page-2 events | Page-2 entries are older than page-1 entries |