[DOCS-13652] Add Sentinel Azure tables instructions #35260
Merged
Commits (10, all by maycmlee):
- 4933499 add azure table tab
- 17eba14 switch tab
- 68aa8fa small edit
- 50c364b fix link
- 3d0c561 apply suggestions
- c9bcd6b reorder links
- 18a7666 update
- f0b1c54 small edit
- 497e6fd Apply suggestions from code review
- 626a9c5 Merge branch 'master' into may/sentinel-azure-tables
Conversations
@@ -23,21 +23,117 @@
1. [Add Microsoft Sentinel][6] to the workspace.
1. [Create a Data Collection Endpoint (DCE)][7].
1. [Create a Log Analytics Workspace][8] in the workspace if you haven't already.
1. Follow the instructions for the type of table to which you want to send data.
{{< tabs >}}
{{% tab "Azure Table" %}}
1. Create a JSON file for your Data Collection Rule (DCR) parameters. See [Data collection rule (DCR)][1] for more information.
   - In the `streamDeclarations` property, you must list all log fields you want mapped to the corresponding Azure table columns. See [Stream declarations][2] for more information.
   - In the `transformKql` property, you must list all log fields that are dropped and not mapped to the table. See [Data flow properties][3] for more information.
   - **Note**: Each log field must be listed in one of these properties, either `streamDeclarations` or `transformKql`; otherwise, the log is dropped. See [Monitor DCR data collection in Azure Monitor][4] for how to set up an alert when logs are dropped.
   - For example, this JSON file (`dcr-commonsecuritylog.json`) adds the log fields to be mapped to the [`CommonSecurityLog`][5] table:
     ```json
     {
       "location": "eastus",
       "kind": "Direct",
       "properties": {
         "dataCollectionEndpointId": "<DCE_RESOURCE_ID>",
         "streamDeclarations": {
           "Custom-CommonSecurityLog": {
             "columns": [
               { "name": "TimeGenerated", "type": "datetime" },
               { "name": "DeviceVendor", "type": "string" },
               { "name": "DeviceProduct", "type": "string" },
               { "name": "DeviceVersion", "type": "string" },
               { "name": "DeviceEventClassID", "type": "string" },
               { "name": "Activity", "type": "string" },
               { "name": "LogSeverity", "type": "string" },
               { "name": "SourceIP", "type": "string" },
               { "name": "DestinationIP", "type": "string" },
               { "name": "Message", "type": "string" },
               { "name": "source_type", "type": "string" },
               { "name": "path", "type": "string" },
               { "name": "timestamp", "type": "string" }
             ]
           }
         },
         "destinations": {
           "logAnalytics": [
             {
               "workspaceResourceId": "<WORKSPACE_RESOURCE_ID>",
               "name": "LogAnalyticsDest"
             }
           ]
         },
         "dataFlows": [
           {
             "streams": ["Custom-CommonSecurityLog"],
             "destinations": ["LogAnalyticsDest"],
             "transformKql": "source | project-away source_type, path, timestamp",
             "outputStream": "Microsoft-CommonSecurityLog"
           }
         ]
       }
     }
     ```
   - Replace the placeholders:
     - `<DCE_RESOURCE_ID>` with the ID of the DCE resource you created in step 2. Run the [`az monitor data-collection endpoint show`][9] command to get the DCE resource ID. For example:
       ```shell
       az monitor data-collection endpoint show \
         --name "<DCE_NAME>" \
         --resource-group <RESOURCE_GROUP> \
         --subscription <SUBSCRIPTION_ID> \
         --query "id"
       ```
     - `<WORKSPACE_RESOURCE_ID>` with the ID of the Log Analytics Workspace you created in step 3. Run the [`az monitor log-analytics workspace show`][10] command to get the workspace resource ID. For example:
       ```shell
       az monitor log-analytics workspace show \
         --workspace-name "<WORKSPACE_NAME>" \
         --resource-group <RESOURCE_GROUP> \
         --subscription <SUBSCRIPTION_ID> \
         --query "id"
       ```
   - See [CommonSecurityLog Columns][6] for a full list of `CommonSecurityLog` table columns.
   - See [Supported Azure Tables][7] for all available tables to which you can send data.
1. Run the [`az monitor data-collection rule create`][8] Azure CLI command to create a DCR with the JSON file you created in the previous step. For example, with the `dcr-commonsecuritylog.json` example file:
   ```shell
   az monitor data-collection rule create \
     --resource-group "myResourceGroup" \
     --location "eastus" \
     --name "myCollectionRule" \
     --subscription "mysubscription" \
     --rule-file "\path\to\json\dcr-commonsecuritylog.json"
   ```

[1]: https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview#data-collection-rule-dcr
[2]: https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-rule-structure#stream-declarations
[3]: https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-rule-structure#data-flow-properties
[4]: https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-monitor
[5]: https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/commonsecuritylog
[6]: https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/commonsecuritylog#columns
[7]: https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview#supported-tables
[8]: https://learn.microsoft.com/en-us/cli/azure/monitor/data-collection/rule?view=azure-cli-latest#az-monitor-data-collection-rule-create
[9]: https://learn.microsoft.com/en-us/cli/azure/monitor/data-collection/endpoint?view=azure-cli-latest#az-monitor-data-collection-endpoint-show
[10]: https://learn.microsoft.com/en-us/cli/azure/monitor/log-analytics/workspace?view=azure-cli-latest#az-monitor-log-analytics-workspace-show
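Before creating the DCR, you can sanity-check locally that every field in a sample log satisfies the contract described above: each field must either be declared as a column in `streamDeclarations` or be removed by the `project-away` clause in `transformKql`, otherwise the log is dropped. The following Python sketch is illustrative only; the `unmapped_fields` helper and the trimmed-down DCR are hypothetical, not part of any Azure SDK or CLI:

```python
def unmapped_fields(dcr: dict, stream: str, sample_log: dict) -> set:
    """Return sample-log fields that are neither mapped to a column
    nor dropped by project-away (hypothetical helper for illustration)."""
    props = dcr["properties"]
    columns = {c["name"] for c in props["streamDeclarations"][stream]["columns"]}
    dropped = set()
    for flow in props["dataFlows"]:
        if stream in flow["streams"]:
            kql = flow.get("transformKql", "")
            if "project-away" in kql:
                # Naive parse of "source | project-away a, b, c"
                dropped |= {f.strip() for f in kql.split("project-away", 1)[1].split(",")}
    return set(sample_log) - columns - dropped

# Trimmed-down stand-in for the dcr-commonsecuritylog.json example above
dcr = {
    "properties": {
        "streamDeclarations": {
            "Custom-CommonSecurityLog": {
                "columns": [
                    {"name": "TimeGenerated", "type": "datetime"},
                    {"name": "Message", "type": "string"},
                ]
            }
        },
        "dataFlows": [
            {
                "streams": ["Custom-CommonSecurityLog"],
                "destinations": ["LogAnalyticsDest"],
                "transformKql": "source | project-away source_type, path",
                "outputStream": "Microsoft-CommonSecurityLog",
            }
        ],
    }
}

sample_ok = {
    "TimeGenerated": "2024-07-22T11:47:51Z",
    "Message": "ok",
    "source_type": "opw",
    "path": "/var/log/app.log",
}
print(unmapped_fields(dcr, "Custom-CommonSecurityLog", sample_ok))  # set()
print(unmapped_fields(dcr, "Custom-CommonSecurityLog",
                      {**sample_ok, "hostname": "web-1"}))  # {'hostname'}
```

Any field names the helper returns would cause Azure Monitor to drop the log, per the note above.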
{{% /tab %}}
{{% tab "Custom table" %}}
1. In the Log Analytics Workspace, navigate to **Settings** > **Tables**.
1. Click **+ Create**.
1. Define a custom table (for example, `MyOPWLogs`).
   - **Notes**:<br>- After the table is configured, the prefix `Custom-` and suffix `_CL` are automatically appended to the table name. For example, if you defined the table name in Azure to be `MyOPWLogs`, the full table name is stored as `Custom-MyOPWLogs_CL`. You must use the full table name when you set up the Observability Pipelines Microsoft Sentinel destination.<br>- The full table name can be found in the resource JSON of the DCR under `streamDeclarations`.
1. Select **New Custom Log (DCR-based)**.
1. Click **Create a new data collection rule** and select the DCE you created earlier.
1. Click **Next**.
1. Upload a sample JSON log. For this example, the following JSON is used for the **Schema and Transformation**, where `TimeGenerated` is required:
   ```json
   {
     "TimeGenerated": "2024-07-22T11:47:51Z",
     "event": {}
   }
   ```
1. Click **Create**.
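The naming convention in the note above can be sketched as a one-line helper. This is purely illustrative (the function is made up, not part of any Azure SDK); it just encodes the stated `Custom-` prefix and `_CL` suffix rule:

```python
def full_custom_table_name(short_name: str) -> str:
    """Derive the stored table name from the name defined in Azure.

    Per the note above, Azure prepends "Custom-" and appends "_CL"
    to a DCR-based custom table name. (Hypothetical helper for
    illustration only.)
    """
    return f"Custom-{short_name}_CL"

print(full_custom_table_name("MyOPWLogs"))  # Custom-MyOPWLogs_CL
```

The returned value is what you enter as the table name in the Observability Pipelines Microsoft Sentinel destination.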
{{% /tab %}}
{{< /tabs >}}
1. In Azure, navigate to **Microsoft Entra ID**.
1. Click **Add** > **App Registration**.
1. Click **Create**.
Do we support all native Azure Tables or is there a limitation, say by MS Sentinel?
@dd-sebastien-lb do you know?
Further down, we link to that page that Vlad had linked to: https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview#supported-tables
I didn't manually test all standard tables, but I see no reason why this wouldn't work.
@p-parekh do you have specific tables you want to test?