
Add auto subscription, retries for content retrieval and doc update
ddbnl committed Apr 13, 2022
1 parent 0b0d9f2 commit 0a0b6be
Showing 17 changed files with 217 additions and 155 deletions.
14 changes: 14 additions & 0 deletions ConfigExamples/azureLogAnalytics.yaml
@@ -0,0 +1,14 @@
collect: # Settings determining which audit logs to collect and how to do it
contentTypes:
Audit.General: True
Audit.AzureActiveDirectory: True
Audit.Exchange: True
Audit.SharePoint: True
DLP.All: True
skipKnownLogs: True
resume: True
output:
azureLogAnalytics:
enabled: True
workspaceId: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
sharedKey: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
4 changes: 0 additions & 4 deletions ConfigExamples/fileOutput.yaml
@@ -1,16 +1,12 @@
log:
path: 'collector.log'
collect:
contentTypes:
Audit.General: True
Audit.AzureActiveDirectory: True
Audit.Exchange: True
Audit.SharePoint: True
DLP.All: True
autoSubscribe: True
skipKnownLogs: True
resume: True
hoursToCollect: 24
output:
file:
enabled: True
12 changes: 3 additions & 9 deletions ConfigExamples/filteredFileOutput.yaml
@@ -1,25 +1,19 @@
log:
path: 'collector.log'
collect:
contentTypes:
Audit.General: True
Audit.AzureActiveDirectory: True
Audit.Exchange: True
Audit.SharePoint: True
DLP.All: True
autoSubscribe: True
skipKnownLogs: True
resume: True
hoursToCollect: 24
# Collect logs concerning spoofing prevention in Audit.General, deleted files from Audit.SharePoint
# and login failures from Audit.AzureActiveDirectory
filter:
Audit.General:
- Policy: Spoof
Policy: Spoof
Audit.AzureActiveDirectory:
- Operation: UserLoginFailed
Operation: UserLoginFailed
Audit.SharePoint:
- Operation: FileDeleted
Operation: FileDeleted
# Output only to file
output:
file:
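Conceptually, each entry under `filter` is a field/value pair that a retrieved log must match before it is written to an output; with the config above, only Audit.SharePoint logs whose Operation is FileDeleted would pass. A heavily hedged sketch of that idea (the exact matching semantics, e.g. exact versus substring comparison, are an assumption and not taken from the collector's code):

```python
def passes_filter(log_record, filters):
    """Return True if the log matches every configured field/value pair.

    Sketch only: assumes a case-insensitive substring match, which may
    differ from the collector's actual filter implementation.
    """
    for field, expected in filters.items():
        if str(expected).lower() not in str(log_record.get(field, '')).lower():
            return False
    return True

# Example based on the Audit.SharePoint filter above.
sharepoint_filter = {'Operation': 'FileDeleted'}
print(passes_filter({'Operation': 'FileDeleted', 'Site': 'example'}, sharepoint_filter))  # True
print(passes_filter({'Operation': 'FileAccessed'}, sharepoint_filter))                    # False
```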
6 changes: 4 additions & 2 deletions Source/config.yaml → ConfigExamples/fullConfig.yaml
@@ -8,8 +8,10 @@ collect: # Settings determining which audit logs to collect and how to do it
Audit.Exchange: True
Audit.SharePoint: True
DLP.All: True
maxThreads: 20
autoSubscribe: True # Automatically subscribe to collected content types
maxThreads: 50
retries: 3 # Times to retry retrieving a content blob if it fails
retryCooldown: 3 # Seconds to wait before retrying retrieving a content blob
autoSubscribe: True # Automatically subscribe to collected content types. Never unsubscribes from anything.
skipKnownLogs: True # Remember retrieved log ID's, don't collect them twice
resume: True # Remember last run time, resume collecting from there next run
hoursToCollect: 24 # Look back this many hours for audit logs (can be overwritten by resume)
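The new `retries` and `retryCooldown` options describe a simple retry-with-cooldown loop around each content blob download. A minimal sketch of what those two settings imply, assuming a hypothetical `requests`-based download helper (not the collector's actual code):

```python
import time

import requests


def retrieve_content_blob(url, headers, retries=3, retry_cooldown=3):
    """Attempt to download a content blob, retrying on failure.

    Sketch only: 'retries' and 'retry_cooldown' mirror the config options
    above; the real collector's internals may differ.
    """
    last_error = None
    for attempt in range(1 + retries):
        try:
            response = requests.get(url, headers=headers, timeout=30)
            response.raise_for_status()
            return response.json()
        except requests.RequestException as error:
            last_error = error
            if attempt < retries:
                time.sleep(retry_cooldown)  # retryCooldown: seconds between attempts
    raise last_error
```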
4 changes: 0 additions & 4 deletions ConfigExamples/graylog.yaml
@@ -1,16 +1,12 @@
log:
path: 'collector.log'
collect:
contentTypes:
Audit.General: True
Audit.AzureActiveDirectory: True
Audit.Exchange: True
Audit.SharePoint: True
DLP.All: True
autoSubscribe: True
skipKnownLogs: True
resume: True
hoursToCollect: 24
output:
graylog:
enabled: False
13 changes: 4 additions & 9 deletions ConfigExamples/prtg.yaml
@@ -1,14 +1,9 @@
log:
path: 'collector.log'
collect:
contentTypes:
Audit.General: True
Audit.AzureActiveDirectory: True
Audit.Exchange: True
Audit.SharePoint: True
DLP.All: True
autoSubscribe: True
skipKnownLogs: True
skipKnownLogs: False # Take all logs each time to count the number of active filter hits each interval
resume: False # Take all logs each time to count the number of active filter hits each interval
hoursToCollect: 1 # Period over which to alert, e.g. failed AAD logins over the last hour
# The PRTG output defines channels which have filters associated to them. The output of the channel will be
@@ -20,12 +15,12 @@ output:
- name: Deleted Sharepoint files
filters:
Audit.SharePoint:
- Operation: FileDeleted
Operation: FileDeleted
- name: Failed Azure AD logins
filters:
Audit.AzureActiveDirectory:
- Operation: UserLoginFailed
Operation: UserLoginFailed
- name: Spoof attempts prevented
filters:
Audit.General:
- Policy: Spoof
Policy: Spoof
Binary file removed Linux/AuditLogCollector
185 changes: 97 additions & 88 deletions README.md
@@ -1,28 +1,41 @@
# Office365 API audit log collector

Collect Office365 and Azure audit logs through their respective APIs. No prior knowledge of APIs is required,
onboarding and script usage are described below. There is a GUI for Windows. Currently supports the following outputs:
- Azure Analytics Workspace (OMS)
- Graylog (or any other source that accepts a simple socket connection)
- File

Simply download the executable(s) you need from the Windows or Linux folder:
# Office365 audit log collector

Collect/retrieve Office365, Azure and DLP audit logs, optionally filter them, then send them to one or more outputs such as file, PRTG, Azure Log Analytics or Graylog.
Onboarding is easy and takes only a few minutes (steps described below). There are Windows and Linux executables, and an optional GUI for Windows only.
Easy configuration with a YAML config file (see the 'ConfigExamples' folder for reference).
If you have any issues or questions, feel free to create an issue in this repo.
- The following Audit logs can be extracted:
- Audit.General
- Audit.AzureActiveDirectory
- Audit.Exchange
- Audit.SharePoint
- DLP.All
- The following outputs are supported:
- Azure Analytics Workspace (OMS)
- PRTG Network Monitor
- Graylog (or any other source that accepts a simple socket connection)
- Local file

Simply download the executable you need from the Windows or Linux folder and copy a config file from the ConfigExamples folder that suits your needs:
- Windows:
- GUI - Office Audit Log Collector.exe
- GUI for collecting audit logs AND subscribing to audit log feeds (see onboarding instructions below)
- Office Audit Log Collector.exe
- Command line tool for collecting audit logs (see syntax below)
- Office Audit Log Subscriber.exe
- Command line tool for subscribing to audit logs feeds (see onboarding instructions below)
- GUI-OfficeAuditLogCollector.exe
- GUI for collecting audit logs and subscribing to audit log feeds
- OfficeAuditLogCollector.exe
- Command line tool for collecting audit logs and (automatically) subscribing to audit log feeds
- Linux:
- OfficeAuditLogCollector
- Command line tool for collecting audit logs (see syntax below)
- OfficeAuditLogSubscriber
- Command line tool for subscribing to audit logs (see onboarding instructions below)
- Command line tool for collecting audit logs and (automatically) subscribing to audit log feeds

Find onboarding instructions and more detailed usage instructions for the executables below.

For a full audit trail, schedule the collector to run on a regular basis (preferably at least once every day). Previously
retrieved logs can be remembered to prevent duplicates. Consider using the following parameters in the config file for a robust audit trail:
- skipKnownLogs: True (prevent duplicates)
- hoursToCollect: 24 (the maximum, or a number larger than the number of hours between runs, for safety overlap)
- resume: False (don't resume where the last run stopped, have some overlap in case anything was missed for any reason)
See below for more detailed instructions on the config file.

For a full audit trail, schedule the script to run on a regular basis (preferably at least once every day). The last
run time is recorded automatically, so that when the script runs again it starts retrieving audit logs from when it last ran.
Feel free to contribute other outputs if you happen to build any.
Lastly, feel free to contribute other outputs if you happen to build any. Also open to any other useful pull requests!
See the following link for more info on the management APIs: https://msdn.microsoft.com/en-us/office-365/office-365-management-activity-api-reference.

## Roadmap:
@@ -32,6 +45,9 @@ See the following link for more info on the management APIs: https://msdn.micros
- Create a tutorial for automatic onboarding + docker container for the easiest way to run this

## Latest changes:
- Added PRTG output
- Added filters
- Added YAML config file
- Added a GUI for Windows
- Added executables for Windows and Linux
- Added Azure Log Analytics Workspace OMS output
@@ -48,94 +64,87 @@ See the following link for more info on the management APIs: https://msdn.micros

- Ad-lib log retrieval;
- Scheduling regular execution to retrieve the full audit trail.
- Output to PRTG for alerts on audit logs

## Features:

- Subscribe to the audit logs of your choice through the subscription script;
- Subscribe to the audit logs of your choice through the --interactive-subscriber switch, or automatically when collecting logs;
- Collect General, Exchange, SharePoint, Azure Active Directory and/or DLP audit logs through the collector script;
- Output to file or to a Graylog input (i.e. send the logs over a network socket)
- Output to file, PRTG, Azure Log Analytics or to a Graylog input (i.e. send the logs over a network socket).

## Requirements:
- Office365 tenant;
- Azure app registration created for this script (see instructions)
- AzureAD tenant ID;
- Client key of the new Azure app registration;
- Azure app registration created for this script (see onboarding instructions)
- Secret key (created in the new Azure app registration, see instructions);
- App permissions to access the APIs for the new Azure application (see instructions);
- Subscription to the APIs of your choice (General/Sharepoint/Exchange/AzureAD/DLP, run AuditLogSubscription script and follow the instructions).
- Subscription to the APIs of your choice (use autoSubscribe option in the config file to automate this).

## Instructions:

### Onboarding:
- Create an app registration:
- Create the app registration itself under Azure AD (own tenant only works fine for single tenant)
- Create app secret (only shown once upon creation, store it somewhere safe)
- Grant your new app permissions to read the Office APIs:
- Graph: AuditLog.Read.All
- Office 365 Management APIs: ActivityFeed.Read
- Office 365 Management APIs: ActivityFeed.ReadDlp
### Onboarding (one time only):
- Make sure Auditing is turned on for your tenant!
- https://docs.microsoft.com/en-us/microsoft-365/compliance/turn-audit-log-search-on-or-off?view=o365-worldwide
- Use these instructions: https://docs.microsoft.com/en-us/microsoft-365/compliance/turn-audit-log-search-on-or-off?view=o365-worldwide
- If you had to turn it on, it may take a few hours to process
- Use the 'AuditLogSubscriber' script to subscribe to the audit APIs of your choice
- You will need tenant id, client key and secret key for this
- Simply follow the instructions
- You can now run the script and retrieve logs.

- Create App registration:
- Azure AD > 'App registrations' > 'New registration':
- Choose any name for the registration
- Choose "Accounts in this organizational directory only (xyz only - Single tenant)"
- Hit 'register'
- Save 'Tenant ID' and 'Application (Client) ID' from the overview page of the new registration; you will need them to run the collector
- Create app secret:
- Azure AD > 'App registrations' > Click your new app registration > 'Certificates and secrets' > 'New client secret':
- Choose any name and expiry date, then hit 'Add'
- The actual key is shown only once upon creation; store it somewhere safe. You will need it to run the collector.
- Grant your new app registration 'application' permissions to read the Office APIs:
- Azure AD > 'App registrations' > Click your new app registration > 'API permissions' > 'Add permissions' > 'Office 365 Management APIs' > 'Application permissions':
- Check 'ActivityFeed.Read'
- Check 'ActivityFeed.ReadDlp'
- Hit 'Add permissions'
- Subscribe to audit log feeds of your choice
- Set 'autoSubscribe: True' in the YAML config file to automate this.
- OR use the '--interactive-subscriber' parameter when executing the collector to manually subscribe to the audit APIs of your choice
- You can now run the collector and retrieve logs.
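For context, `autoSubscribe` (and the `--interactive-subscriber` switch) boil down to calling the Management Activity API's `subscriptions/start` endpoint once per content type. A hedged sketch of that call, using the documented client-credentials flow rather than this repo's actual code (the placeholder IDs and the `requests` library are assumptions):

```python
import requests

TENANT_ID = 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'   # 'Tenant ID' from the app registration overview
CLIENT_ID = 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'   # 'Application (Client) ID'
SECRET_KEY = '...'                                    # client secret created above

# Acquire a token for the Office 365 Management APIs (client credentials grant).
token_response = requests.post(
    f'https://login.microsoftonline.com/{TENANT_ID}/oauth2/token',
    data={'grant_type': 'client_credentials',
          'client_id': CLIENT_ID,
          'client_secret': SECRET_KEY,
          'resource': 'https://manage.office.com'})
token = token_response.json()['access_token']

# Start a subscription for each content type you want to collect.
for content_type in ['Audit.General', 'Audit.AzureActiveDirectory', 'Audit.Exchange',
                     'Audit.SharePoint', 'DLP.All']:
    requests.post(
        f'https://manage.office.com/api/v1.0/{TENANT_ID}/activity/feed/subscriptions/start',
        params={'contentType': content_type},
        headers={'Authorization': f'Bearer {token}'})
```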


### Running the collector:

Running from the GUI should be self-explanatory. It can run once or on a schedule. Usually you will want to use the
command-line executable with a config file, and schedule it for periodic execution (e.g. through cron, Windows Task
Scheduler, or a PRTG sensor).

To run the command-line executable use the following syntax:

OfficeAuditLogCollector(.exe) %tenant_id% %client_key% %secret_key% --config %path/to/config.yaml%

To create a config file you can start with the 'fullConfig.yaml' from the ConfigExamples folder. This has all the
possible options and some explanatory comments. Cross-reference with a config example using the output(s) of your choice, and you
should be set.

### (optional) Creating an Azure Log Analytics Workspace (OMS):

If you are running this script to get audit events in an Azure Analytics Workspace you will need a Workspace ID and a shared key.
Create a workspace from "Create resource" in Azure (no configuration required). Then get the ID and key from "Agent management".
You do not need to prepare any tables or other settings.

If you are running this script to get audit events in an Azure Analytics Workspace you will need a Workspace ID and a shared key.
- Create a workspace from "Create resource" in Azure (no configuration required);
- Get the ID and key from "Agent management";
- You do not need to prepare any tables or other settings.
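Behind the scenes, the workspace ID and shared key from "Agent management" are used to sign requests to the workspace's HTTP Data Collector API. A minimal sketch of that standard signing scheme (based on Microsoft's documented Data Collector API, not this project's implementation; the `OfficeAuditLogs` log type is an example name):

```python
import base64
import hashlib
import hmac
import json
from datetime import datetime, timezone

import requests


def post_to_log_analytics(workspace_id, shared_key, records, log_type='OfficeAuditLogs'):
    """Send a list of dicts to an Azure Log Analytics workspace (sketch only)."""
    body = json.dumps(records)
    rfc1123_date = datetime.now(timezone.utc).strftime('%a, %d %b %Y %H:%M:%S GMT')
    # Signature: HMAC-SHA256 over the canonical request string, keyed with the shared key.
    string_to_sign = f'POST\n{len(body)}\napplication/json\nx-ms-date:{rfc1123_date}\n/api/logs'
    signature = base64.b64encode(
        hmac.new(base64.b64decode(shared_key),
                 string_to_sign.encode('utf-8'),
                 digestmod=hashlib.sha256).digest()).decode()
    headers = {'Content-Type': 'application/json',
               'Authorization': f'SharedKey {workspace_id}:{signature}',
               'Log-Type': log_type,            # becomes the custom table name (suffix _CL)
               'x-ms-date': rfc1123_date}
    uri = f'https://{workspace_id}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01'
    return requests.post(uri, data=body, headers=headers)
```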

### (optional) Creating a PRTG sensor

To run with PRTG you must create a sensor:
- Copy the OfficeAuditLogCollector.exe executable to the "\Custom Sensors\EXE" sub folder of your PRTG installation
- Create a device in PRTG with any host name (e.g. "Office Audit Logs")
- Create an 'EXE/Script Advanced' sensor on that device and choose the executable you just copied
- Enter parameters, e.g.: "*tenant_id* *client_key* *secret_key* --config *full/path/to/config.yaml*"
(use full path, because PRTG will execute the script from a different working directory)
- Copy prtg.yaml from ConfigExamples and modify at least the channel names and filters for your needs.
- Set the timeout of the script to something generous that suits the number of logs you will retrieve.
Probably at least 300 seconds. Run the script manually first to check how long it takes.
- Match the interval of the sensor to the number of hours of logs to retrieve. If your interval is 1 hour, hoursToCollect
in the config file should also be set to one hour.
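For reference, an 'EXE/Script Advanced' sensor reads a JSON document from the executable's stdout, with one result entry per channel; the channel names configured in prtg.yaml would surface like this. A hedged sketch of PRTG's documented output format (example values only, not this collector's exact output):

```python
import json

# Each configured channel reports the number of matching audit logs in the interval.
sensor_result = {
    'prtg': {
        'result': [
            {'channel': 'Deleted Sharepoint files', 'value': 2},
            {'channel': 'Failed Azure AD logins', 'value': 7},
            {'channel': 'Spoof attempts prevented', 'value': 0},
        ]
    }
}
print(json.dumps(sensor_result))  # PRTG parses this from stdout
```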

### (optional) Creating a Graylog input

If you are running this script to get audit events in Graylog you will need to create a Graylog input. If not, just skip this.

If you are running this script to get audit events in Graylog you will need to create a Graylog input.
- Create a 'raw/plaintext TCP' input;
- Enter the IP and port you want to receive the logs on (you can use these in the script);
- All other settings can be left default.
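With a raw/plaintext TCP input, each audit log is just a line of JSON written to a socket at the IP and port you configured. A minimal sketch of that idea (illustrative only; the address and port are the same example values used elsewhere in this README):

```python
import json
import socket

GRAYLOG_ADDRESS = ('10.10.10.1', 6000)  # the raw/plaintext TCP input created above


def send_to_graylog(log_record):
    """Send one audit log record as a newline-terminated JSON string (sketch)."""
    with socket.create_connection(GRAYLOG_ADDRESS, timeout=10) as connection:
        connection.sendall((json.dumps(log_record) + '\n').encode('utf-8'))


send_to_graylog({'Operation': 'UserLoginFailed', 'UserId': 'user@example.com'})
```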


### Running the script:

- Retrieve all logs and send to a network socket / Graylog server:
`python3 AuditLogCollector.py 'tenant_id' 'client_key' 'secret_key' --exchange --dlp --azure_ad --general --sharepoint -p 'random_publisher_id' -g -gA 10.10.10.1 -gP 6000`

#### Script options:
```
usage: AuditLogCollector.py [-h] [--general] [--exchange] [--azure_ad]
[--sharepoint] [--dlp] [-p publisher_id]
[-l log_path] [-f] [-fP file_output_path] [-g]
[-gA graylog_address] [-gP graylog_port]
tenant_id client_key secret_key
positional arguments:
tenant_id Tenant ID of Azure AD
client_key Client key of Azure application
secret_key Secret key generated by Azure application
optional arguments:
-h, --help show this help message and exit
--general Retrieve General content
--exchange Retrieve Exchange content
--azure_ad Retrieve Azure AD content
--sharepoint Retrieve SharePoint content
--dlp Retrieve DLP content
-r Resume looking for content from last run time for each content type (takes precedence over -tH and -tD)
-tH Number of hours to go back and look for content
-tD Number of days to go back and look for content
-p publisher_id Publisher GUID to avoid API throttling
-l log_path Path of log file
-f Output to file.
-fP file_output_path Path of directory of output files
-a Output to Azure Log Analytics workspace
-aC ID of log analytics workspace.
-aS Shared key of log analytics workspace.
-g Output to graylog.
-gA graylog_address Address of graylog server.
-gP graylog_port Port of graylog server.
-d Enable debug logging (large log files and lower performance)
```